‘There’s no transparency’: Utility data keeps data centers from linking up with clean power

A Microsoft data center in Wyoming (Credit: Microsoft)

As data center growth surges in regions across the U.S., technology companies racing to capitalize on the demand for artificial intelligence are jockeying over limited access to power. But there’s another dynamic at play: these firms don’t want just any power. It needs to be clean, too, in order to meet their climate goals.

Developers like Clearway Energy want to meet that demand. The company is one of the largest operators of solar, wind, and battery storage projects in North America with more than 11 GW of capacity. But guarded utility data, the company claims, is keeping clean energy developers from aligning efforts with the large load customers, like data centers, that are attempting to connect to the grid.

There’s “strong interest” for developers of clean energy projects and data centers to link up, according to Chris Barker, Clearway’s managing director of transmission and grid integration. However, “there’s no transparency” from electric utilities around where those large loads are going.

Barker made the comments during an interconnection workshop hosted by the Federal Energy Regulatory Commission. This particular panel aimed to unpack data transparency issues between developers and utilities in order to improve the interconnection process. While the workshop primarily focused on generation interconnection challenges, the conversation shifted to load interconnection and data center demand, as most energy industry discussions seem to these days.

“If we’re talking about transparency for generation interconnection information, there is certainly not that level of transparency for load,” Barker said. “I think we would need to seize the opportunity for some of that thinking for new large load.”

According to an EPRI study, data centers could consume up to 9% of U.S. electricity generation by 2030 — more than double the amount currently used. This could create regional supply challenges, among other issues, the group said. Pacific Gas & Electric, which provides power to Silicon Valley, expects to add around 3.5 GW of data center load on its own system over that period.

The substantial load growth pushed PG&E to implement a cluster study process for interconnecting large customers, similar to the approach it took 15 years ago to manage the onslaught of renewable energy projects entering the queue. Martin Wyspianski, PG&E’s vice president of electric engineering, said the process aims to square local generation needs with deliverability.

A uniform standard for studying and data sharing around large load interconnections would provide greater clarity for the clean energy developer community, Barker said, by informing “sensible” project development.

“We would love to see what PG&E is doing nationwide,” Barker said.

Clean energy has found some success in the data center realm, however.

In May, Duke Energy announced agreements with Amazon, Google, Microsoft and Nucor to significantly accelerate clean energy deployments in the Carolinas. The proposed Accelerating Clean Energy (ACE) tariffs would enable those large customers to directly support carbon-free energy generation investments through financing structures and contributions that address project risk, lowering the costs of emerging technologies. ACE tariffs would also facilitate onsite generation at customer facilities, participation in load flexibility programs and investments in clean energy assets.

The ACE framework also would include a Clean Transition Tariff (CTT) – a feature enabling Duke Energy to provide individualized portfolios of new carbon-free energy to commercial and industrial customers. The CTT would match clean-energy generation and customer load. This would be a voluntary program for larger customers seeking to advance their clean energy goals, and it would include protections for non-participating customers, Duke Energy said.

Oracle, meanwhile, is designing a gigawatt-scale data center that would be powered by a trio of small modular reactors, company chairman and chief technology officer Larry Ellison recently told investors. The cloud services giant currently has 162 data centers live and under construction worldwide. The largest of these is 800 MW, and Oracle will soon begin construction of data centers of more than a gigawatt. Ellison did not elaborate on the location of the data center or any construction timelines.

Data center developers increasingly view around-the-clock nuclear power as a good match for their similarly around-the-clock needs. For example, advanced nuclear company Oklo recently said it has non-binding letters of intent for about 1,350 MW of microreactor capacity, a large majority of that for data center customers.

Aside from nuclear, geothermal has also emerged as a potential solution for energy-hungry data centers. Google recently entered into an agreement with Berkshire Hathaway electric utility NV Energy to power its Nevada data centers with about 115 MW of geothermal energy, and Houston-based geothermal startup Sage Geosystems and Meta Platforms recently announced an agreement to deliver up to 150 MW of new geothermal baseload power to support the latter’s data center growth.

Originally published in POWERGRID International.