Why a Google data center architect thinks AI energy fears are overblown

Urs Hölzle thinks the worst-case scenarios are not based in reality.

Published October 18, 2024

Photo credit: Google

In January, when the International Energy Agency warned that data center electricity consumption could double in two years, it set off a mild panic about the energy and emissions impact of artificial intelligence. 

Since then, power supply has eclipsed chip supply as the biggest restriction for scaling AI. And the local impacts of rapid data center expansion are often acute: grid capacity constraints, lifelines for aging fossil plants, and fights over who should pay for infrastructure upgrades.

But in its latest World Energy Outlook, the agency implied that it isn’t actually that worried about the impact of AI-related data center demand on a macro scale, saying that it still only amounts to a small sliver of projected electricity growth across industry, electric vehicles, and heating and cooling.

Urs Hölzle, one of the early architects of Google’s data centers, isn’t convinced by the most radical demand projections for AI computing either. McKinsey’s estimate is a particularly aggressive example: it projects that data centers will consume up to 12% of America’s electricity by 2030, up from 4% last year.
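For a sense of scale, here is a rough back-of-envelope sketch of what that jump would mean in absolute terms. The total US consumption figure of roughly 4,000 TWh per year is an outside approximation, not from the article, and is held constant for simplicity.

```python
# Rough scale of the McKinsey projection cited above.
# Assumption (not from the article): US electricity consumption is on the
# order of 4,000 TWh per year and is treated as flat through 2030.
US_ANNUAL_TWH = 4_000

share_last_year = 0.04   # ~4% of US electricity last year
share_2030 = 0.12        # up to 12% by 2030 in McKinsey's projection

print(f"Data centers today:  ~{US_ANNUAL_TWH * share_last_year:.0f} TWh/yr")
print(f"McKinsey 2030 case:  ~{US_ANNUAL_TWH * share_2030:.0f} TWh/yr")
print(f"Implied growth:      ~{share_2030 / share_last_year:.0f}x")
```

Under that assumption, the projection implies data center consumption rising from roughly 160 TWh to roughly 480 TWh per year, a tripling in absolute terms.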

“A lot of times when these forecasts are done, they come up with alarming numbers,” he said on a recent episode of the Where the Internet Lives podcast. “They assume that technology remains constant and that demand is increasing exponentially.” (Where the Internet Lives is a Latitude Studios partner podcast, produced with Google.) 

In 2022, Hölzle co-authored a paper that identified technical solutions that could cut the energy demand of machine learning by a factor of 100. He believes many predictions fail to account for these improvements.

“The world is not static,” he said. “We're really still at the beginning of understanding how these things work and finding ways to make them more efficient. Pretty much every new time we're building a new version of Gemini, that version is substantially more efficient in its training or in its serving than the previous one.”
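Hölzle’s point about static-technology assumptions can be illustrated with a toy projection. All of the numbers below are hypothetical, chosen only to show the shape of the argument, not drawn from the article or from any published forecast.

```python
# Toy illustration of Hölzle's argument: the same demand growth looks very
# different depending on whether per-unit-of-compute efficiency is assumed
# frozen or improving. All numbers are made up for illustration.

YEARS = 6
DEMAND_GROWTH = 2.0     # hypothetical: compute demand doubles each year
EFFICIENCY_GAIN = 1.6   # hypothetical: energy per unit of compute falls 1.6x/yr

energy_static = 1.0     # index: year-0 energy use = 1.0
energy_improving = 1.0

for year in range(1, YEARS + 1):
    energy_static *= DEMAND_GROWTH                        # technology frozen
    energy_improving *= DEMAND_GROWTH / EFFICIENCY_GAIN   # efficiency keeps pace
    print(f"year {year}: static tech = {energy_static:6.1f}x, "
          f"improving tech = {energy_improving:5.1f}x")
```

With these made-up figures, the frozen-technology scenario ends up at roughly 64 times year-zero energy use after six years, while the improving-technology scenario lands below 4 times. That gap is the effect Hölzle argues alarming forecasts tend to miss.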

Another reason he believes the worst scenarios won’t play out: it’s a matter of business survival.

“We can do a lot better. We have to find ways to double the speed, to halve the cost [of computing], so that we actually survive. In the end, you have a business. You can't afford your costs to go up exponentially. So you're really motivated to find ways to solve that,” he said.

Hölzle, the eighth employee at Google, was a key figure in the company’s early data center and server designs. Today, he’s a fellow at the company, focused more on research, including work on improving the energy efficiency of computing in the AI era.

“AI is a huge computational problem,” he said. “You need a supercomputer to make a new model like Gemini. And then that supercomputer runs for weeks or months to just build this one model. And then you also need a lot of hardware to serve it efficiently, to actually answer questions using that model.” 

While we’re still “in the very early days of AI,” he added, we’re already learning how to both train models and operate servers “much more efficiently.”

Still, aggregate power demand is rising steadily, driven in part by machine learning and AI workloads. And there’s an open question about whether improved efficiency will simply enable faster adoption of the technology, fueling even more data center buildout.

“What happens if NVIDIA comes out with a chip that's twice as efficient?” asked Brian Janous, Microsoft’s former VP of energy, at a recent Latitude Media event. “Well, then Meta or Microsoft or Google will plug in twice as many chips. If they have the power, they're going to build as big of a model as they can build.”
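Janous’s rebound argument can be sketched numerically as well, again with hypothetical figures: a fixed power budget and a doubling of chip efficiency, neither taken from the event.

```python
# Hypothetical rebound scenario: the power budget, not the chip supply,
# is the binding constraint. Numbers are illustrative only.
power_budget_mw = 100    # hypothetical fixed grid allocation for one site
compute_per_mw = 1.0     # index: compute delivered per MW with today's chips

compute_before = power_budget_mw * compute_per_mw
compute_after = power_budget_mw * (compute_per_mw * 2)  # chips get 2x as efficient

print(f"Compute within the same power budget: {compute_before:.0f} -> {compute_after:.0f} (index)")
# Energy use is unchanged; the efficiency gain is spent on bigger models.
```

In this framing, efficiency gains do not shrink the site’s energy use; they are absorbed as additional compute within the same power envelope.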

Hölzle believes it is “rational” to be worried about AI energy usage. But like a lot of AI enthusiasts, he also believes that the net benefit to society will outweigh the energy cost of computing.

“AI is just the next step of being able to optimize things…to make buildings lighter, to make cars lighter, or to find better ways of doing batteries that use less material,” he explained. “And so if you have AI…you make other processes more efficient, and therefore reduce the energy by using compute to run the rest of the world more efficiently.”

Listen to the full episode where Hölzle is featured on Apple Podcasts or Spotify.


This story borrows from an interview that appeared on Where the Internet Lives, a Latitude Studios partner podcast. Where the Internet Lives is a show about the unseen world of data centers, produced by Latitude Studios and Google. Follow on Apple, Spotify, or wherever you get your shows.
