
Energy is now the ‘primary bottleneck’ for AI

A year ago, there was a scramble for chips. Now, it’s for energy.

Listen to the episode on Apple Podcasts or Spotify.
A large data center and cloud computing facility. Photo credit: Costfoto / NurPhoto via Getty Images

A year ago, access to chips was the biggest concern for tech companies scaling AI. Now, it’s getting enough clean power.

That was the consensus at a recent Transition-AI event featuring top experts in data center design, power delivery, and market analysis.

“Six to 12 months ago, I think a lot of companies were concerned about chip availability, but I think everyone has come around to realize that that is not going to be the biggest constraint,” said Brian Janous, co-founder of Cloverleaf Infrastructure. It is “absolutely power.”

We’re in the midst of a data center boom. Microsoft plans to double new data center capacity every six months. According to a leaked internal presentation, the company has built AI clusters using GPU chips in 98 locations around the world. Other hyperscale data center operators are expected to double capacity every four years, mostly to meet the demand for AI.

“If you look at that quarter over quarter, year over year, [hyperscalers] are all accelerating in investment. It's a great leading indicator of how much power, how many chips they're going to plug in in the future,” said Janous.

Janous is the former VP of energy at Microsoft. Seeing the wave of new electricity demand coming from hyperscale computing and manufacturing, he co-founded Cloverleaf earlier this year to acquire “ready-to-build” sites near load pockets that will feature batteries, renewables, and grid-enhancing technologies.

Data centers are an energy efficiency success story. Over the last 25 years, internet traffic has climbed more than 500x while data center electricity use has remained flat. The servers and energy infrastructure have gotten much more efficient, and the biggest tech companies have focused on powering those warehouse-scale computers with renewables.

That experience makes data center energy use hard to predict. After the IEA projected that data center energy use would double in the next two years, researcher Jonathan Koomey told Latitude Media that everyone needed to “calm the heck down.”

But Janous believes the AI arms race will be different. A year and a half after the public launch of ChatGPT kicked off fierce competition over generative AI among top tech companies, a scramble for power is underway.

“What happens if NVIDIA comes out with a chip that's twice as efficient? Well, then Meta or Microsoft or Google will plug in twice as many chips. If they have the power, they're going to build as big of a model as they can build,” he said.

John Belizaire, CEO of data center developer Soluna, said the design needs of data centers are changing: “It's important for me to stress that energy is the primary bottleneck.”

“When you look at today's data center design, it's designed for traditional compute processes and servers that use a 10th of the power that this new technology uses,” he said. “It's at that level where the entire building and facility will just be AI processing. How are we going to power that?”

Soluna is building smaller, modular data centers that run “batchable” processes and can shift computing loads based on the availability of variable clean energy. The company partners with developers and independent power producers (IPPs) to co-locate data centers with renewables and to buy their curtailed energy.

“That is what we're calling renewable computing. The concept of bringing computing to the location where these power plants are located…using the computing as a battery. It's becoming clearer that that is the future of energy. The convergence between energy and computing is inevitable at this point,” explained Belizaire.
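The load-shifting idea behind “renewable computing” can be sketched as a simple scheduling problem: deferrable compute jobs are assigned to the hours when curtailed renewable output would otherwise go to waste. The sketch below is purely illustrative, not Soluna’s actual system; all names, numbers, and the greedy heuristic are hypothetical.

```python
def schedule_batch_jobs(curtailed_mw, jobs):
    """Assign deferrable compute jobs to hours with surplus renewable power.

    curtailed_mw: forecast curtailed renewable output per hour, in MW.
    jobs: list of (job_id, power_draw_mw, hours_needed) tuples.
    Returns {job_id: [hour indices]}; a job may get fewer hours than
    requested if the surplus runs out.
    """
    remaining = list(curtailed_mw)  # surplus power left in each hour
    plan = {}
    # Greedy heuristic: place the largest loads first so they can claim
    # the hours with the biggest surplus before smaller jobs fragment it.
    for job_id, draw, hours_needed in sorted(jobs, key=lambda j: -j[1]):
        assigned = []
        for hour, surplus in enumerate(remaining):
            if len(assigned) == hours_needed:
                break
            if surplus >= draw:        # enough curtailed energy this hour
                remaining[hour] -= draw
                assigned.append(hour)
        plan[job_id] = assigned
    return plan

# Hypothetical example: an overnight wind surplus absorbs two AI batches.
forecast = [0, 0, 120, 150, 90, 0]    # MW of curtailment in each hour
jobs = [("train_a", 100, 2), ("train_b", 50, 3)]
print(schedule_batch_jobs(forecast, jobs))
# → {'train_a': [2, 3], 'train_b': [3, 4]}
```

In this toy run, the 100 MW job lands in the two highest-surplus hours, and the 50 MW job fills in around it; the job that can’t be fully served simply waits, which is exactly the “computing as a battery” trade Belizaire describes.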

That convergence is also creating friction for hyperscale data center developers looking to build or expand facilities quickly: utilities are planning new gas plants to serve them.

Duke Energy, Georgia Power, and the Tennessee Valley Authority are collectively proposing 11 gigawatts of new gas plants to meet load increases over the next few years. Microsoft, which has seen its emissions rise by 30% because of AI, is pushing back on Georgia Power’s plan to build combined-cycle fossil gas plants, arguing the utility is undervaluing renewable energy and overestimating data center load.

“The rush to new gas for many utilities is real. It's kind of their fallback option for what to build when they see new load coming online,” said Michelle Solomon, a senior policy analyst at Energy Innovation.

New industrial activity and electrification are set to double load growth over the next five years. Solomon says serving data centers is the perfect opportunity to “learn to grow again” without building lots of new gas.

Data centers currently make up about 1% of total U.S. power demand. “To reach a net-zero economy, we need total electricity demand to double, or even triple or more, by 2050,” Solomon said. “So while these are significant increases in electricity, we actually need a lot more.”

Solomon pointed to Xcel Energy, which is expecting a large increase in power demand in the coming years, but is looking to make the most of its current infrastructure. The utility plans to retire all of its coal plants by the early 2030s by bringing new renewable sources online at retired coal sites, utilizing existing interconnection points. 

“AI is not a special unicorn that requires a gas plant to meet its electricity demand. So utilities, again, want to just build the thing that they know how to build, but a new gas plant doesn't get built overnight either. So I think we need to be looking at what are all the options for building new energy quickly. A lot of it depends on getting more out of our existing grid so that we can use cleaner resources,” said Solomon.

Available options include batteries, behind-the-meter renewables, grid-enhancing technologies to unlock new capacity, and demand response capabilities from data center operators. All three panelists agreed that the technologies exist to serve new loads without relying on new gas.

“I mean, ultimately it's about how do we sort of even mimic flexible loads? Even if I'm a data center and I need 24/7 uptime, there's a lot of resources that data centers already have on their sites. They have generators, they have batteries. Those can be optimized and they can be expanded and they can be used to provide flexibility back to the grid,” said Janous.

“A lot of this is just having more of a conversation between customers and utilities. And from my experience, those conversations are not happening at the level and depth they need to happen.”

Listen to the full conversation on The Carbon Copy podcast.


