The EPRI-led DCFlex will be a testbed for utilities and tech companies to explore ways to make the facilities a boon for the grid.
Photo credit: Shutterstock
Data centers are unique loads. Computational demand can be shifted in both time and space, making them candidates as flexible grid assets. But how much can utilities rely on them?
A new research and development coalition led by the Electric Power Research Institute aims to answer that question by coordinating flexibility demonstrations at a variety of data centers around the country. The group’s goal, EPRI said this week, is to develop replicable strategies that “enable operational and deployment flexibility, streamline grid integration, and transition backup power solutions to grid assets.”
Known as DCFlex, the project will involve flexibility “hubs” to demonstrate the use of technologies like hydrogen or long-duration energy storage, and to test the limits of data centers as grid resources.
The group includes large utilities, RTOs and ISOs, as well as hyperscalers and other major computing players.
Microsoft and Amazon are absent from the list, though EPRI said they are in “active discussions” with other hyperscalers, along with “other key data center developers.” EPRI’s emerging technologies executive Anuja Ratnayake said they “anticipate more stakeholders joining in the days and weeks to come.”
Utilities and grid operators need a lot more flexibility. By 2030, the Department of Energy expects an additional 200 gigawatts of peak demand in the U.S. as aging power plants retire, electrification spreads, and data center capacity expands. Utilities are increasingly open to creative approaches to managing that peak – which, if the hubs prove valuable, could include flexible data centers.
According to Ratnayake, the focus is less on ramping data center demand up and down, and more on “the flexibility potential of the total load the grid has to manage.”
“Most data centers today are passive loads,” she said. “But between their backup solutions, computational flexibility, and auxiliary power flexibility, there’s the potential to be a prosumer and be a part of the shared energy economy.”
However, the coalition is offering few technical specifics. Caroline Golin, Google’s global head of energy market development, listed technologies like “biodiesel, hydrogen, advanced energy storage systems, and artificial intelligence itself” in her LinkedIn post about the initiative. But Ratnayake told Latitude that the technologies for testing have not yet been determined.
Deployment of the hubs will begin as early as 2025, with testing running through 2027.
Golin described the five to 10 hubs as “living laboratories to explore strategies for integrating data centers with the grid.” Those hubs, she added, will be “setting the stage for potential grid-connected operations in the future.”
Ratnayake confirmed that the initiative will run demonstrations on a range of types and sizes of data centers, in order to “best match grid needs with the flexibility capabilities of different data center business operations.” The group will also work in multiple electricity markets, and experiment with different types of backup fuel sources.
The demonstrations will not be limited to AI-specific data centers, Ratnayake added. The intent, she said, “is to develop design criteria, program frameworks, etc. that address the largest data center types,” which are likely to handle a range of workloads, from video streaming to large language model training to crypto mining.
According to EPRI, electricity usage by hyperscalers more than doubled between 2017 and 2021 — and it expects data centers to consume up to 9% of U.S. electricity by 2030.
This new industry coalition is the result of discussions with the Department of Energy and other stakeholders. Industry recommendations for managing data center energy demand, which were presented to Secretary Jennifer Granholm in July, included closer collaboration between the players now represented by DCFlex.
Some data center operators are already starting to experiment more with flexibility. Verrus, a company from Sidewalk Infrastructure Partners, built a computing and energy management platform that dynamically controls data center demand. And Soluna focuses on batchable computing to match renewable energy production.
And Google has been testing demand response capabilities across pilots in the U.S., Asia, and Europe. In January of this year, a Google data center in Nebraska responded to a request from the Omaha Public Power District to ramp down during Winter Storm Gerri, which helped keep local lights (and heat) on when the utility’s thermal power plants went offline.
The Nebraska data center story was profiled on the new season of Where the Internet Lives, co-produced by Latitude Studios.
As Google’s data and climate software lead Savannah Goodman told Latitude Media last fall, scaling data center demand response depends on “new policy frameworks that better incentivize this type of participation from large scale energy buyers.”
The coalition may also represent an opportunity to build utilities' comfort with new tools. Kleber Costa, chief commercial officer of AES Clean Energy, which builds solar-plus-storage projects for many of the hyperscalers and is not a member of the coalition, said in June that he was unsure how far the industry would be willing to push flexibility, given data center operator reluctance. (Costa was not commenting on the EPRI coalition specifically.)
“Data center operators do not like to be flexible — in the same way that if you ask a power plant operator, they don’t like to ramp their plants up and down,” Costa said at the time. Industry embrace of those tools, he added, may ultimately just come down to price.
In a matter of months it has become clear that flexibility is precious to utilities — and therefore to the data center developers that need power as well. Today, AES said it is leading co-location efforts with hyperscalers. It is using AI for flexibility at many of those projects, including at the Baldy Mesa project in California that opened this summer.
And as grid constraints complicate the expansion of data centers, EPRI’s Ratnayake said the tech industry is increasingly eager to work with utilities. “Data centers are leading the load growth trajectory at this point in time and the technology industry’s willingness to collaborate with the energy sector is proving a unique opportunity,” she said.
Editor's note: This piece was updated on November 4 to clarify that AES is not a member of DCFlex, and to add some context to Kleber Costa's quote on data center operator reluctance to be flexible.