Interview

Data center growth ‘dwarfs’ anything we’ve seen in computing history

Peter Freed, who built Meta’s early energy team, riffs on demand projections, development models, and capacity trends.

Published December 6, 2024
Listen to the episode on Apple Podcasts or Spotify.

A Meta data center under construction in Arizona (Photo credit: Around the World Photos)

Back in 2018, when Peter Freed’s team at Meta was scrambling to secure power for the company’s latest data center, they called it “the wall”: a surge in computing demand that seemed insurmountable at the time.

Today’s demand, though, “dwarfs the wall,” said Freed, a founding partner at Near Horizon Group and Meta’s former director of energy strategy, speaking with me at our Transition-AI conference this week. “The wall is a tiny little hump.”

Freed has watched the data center industry transform from a relatively straightforward real estate business into one of the most important drivers of electricity demand in the modern economy. When he joined Meta in 2014, the company was still building one data center at a time. It had completed three facilities with hardly anyone dedicated to energy strategy — it was just something Meta dealt with after construction.

But as projects got bigger and Meta started building multiple data centers in parallel, energy became central to development decisions. And now the rise of artificial intelligence is accelerating these trends at a pace that would have been unimaginable just a few years ago.

"It is not an exaggeration to say that the industry looks entirely different today than it did 10 years ago," Freed told me.

Freed is “extremely bullish” on data center growth as our digital lives expand. Add in the computing demands of artificial intelligence — both for training large models and running inference — and we’re entering what he calls an “unprecedented level of demand from the sector.”

This surge comes just as the economy is electrifying everything else as well. "We just spent two and a half years passing a variety of federal legislation to incentivize electrifying darn near everything we can — and it's working," Freed said. "So how do we think about this moment in time where we've got the immediacy of the data center load growth coupled with a longer trajectory growth in electrification across the economy? Very interesting times."

In this wide-ranging interview at Latitude Media's Transition-AI conference, Freed discusses how the AI era is reshaping the data center landscape, why traditional development models are breaking down, and what it means for energy markets. (Our conversation has been edited for clarity and brevity.)

A report from McKinsey released earlier this year anticipated that the U.S. will see 50 gigawatts of demand by the end of the decade. When you look at numbers like that, what signals are you seeing now that lead you to believe that is possible?

Peter Freed: Looking at the trajectory of announcements from the hyperscalers, we are seeing more announcements this year than we've ever seen before. However, many of these announced projects were likely already in planning cycles. Data center planners refresh their demand projections yearly and build to them. What's probably happening is that projects that would have been announced or deployed in 2025 are now being pulled into 2024.

I believe 2025 is going to be a pivotal year, when many projects initiated in 2024 will begin breaking ground. Given a three- to four-year deployment timeline for data centers, those loads will really begin coming online in 2027 and 2028. That's the beginning of an interesting window that coincides with economy-wide electrification. The back end of that window is when we expect to see new clean firm power sources like nuclear and other advanced energy technologies come online.

Many people think we're going to see these multi-gigawatt scale campuses. Some think the industry will be limited to 200-megawatt conventional building blocks. What size do you think we'll see and what are the impacts on the grid?

Peter Freed: I'm somewhat skeptical that we will see broad deployment of the mega-campus design. While it's not a new concept — all hyperscalers have pursued it at some point with 800-megawatt to several-gigawatt campus sizes — it hasn't taken off previously because it's extremely challenging to develop. Getting electric utilities, entitlements, enough land, and workforce for large-scale infrastructure is very difficult.

The industry has settled into a typical hyperscale data center building block size of about 200 megawatts. You often see 400, occasionally 600 or more, but it's usually in these 200-megawatt increments. There's a reason for that — it's pushing up against the limit of what's relatively easy to develop.

Peter Freed and Stephen Lacey in conversation at Transition-AI 2024 (Photo credit: Anne Bailey)

About a year ago, there was this idea that putting more GPUs in one place would yield better training outcomes for generative AI models. But that wasn't based in computer science — it was more of a Silicon Valley mentality of "we used 30,000 GPUs to train the last model, what if we got to a million?" Now people are questioning whether that's true, and we're starting to see data suggesting it might not be. My guess is that we'll figure out distributed training: putting GPUs in different places while still running large-scale training across them.

Once we solve the computer science challenge that currently requires putting compute-intensive resources in a single place, we'll probably return to the mean of something like a 200-megawatt building block. It's much easier to do. Will we see any mega-campuses? Certainly. We already saw the Entergy-Meta project in Louisiana, which appears from the regulatory filing to be a multi-gigawatt campus. We'll see a few more of those, but I don't think it's going to become the dominant model.

Let's talk about how utilities are responding. What are the most common acute challenges that you're hearing from utilities that you're helping them solve?

Peter Freed: I'd love to say it was cool and innovative technology deployments, but the truth is they're just trying to figure out how real this demand signal is. I've been in and around the data center business for 15 years, and I have not seen anything like the level of speculative behavior that we see in the marketplace right now.

The load-side interconnection process — and frankly, "process" is probably too strong a word — is not sophisticated, and it hasn't needed to be for a long time. Load has been relatively flat for 20-ish years, so utilities were happy to have the economic development that came with load growth. The barriers to entry are low. In many cases, the load-side interconnection is just a spreadsheet on someone's computer.

That doesn't work in an environment where two people and their dog in a pickup truck are now deciding they're going to be data center developers. Utilities are completely inundated with requests, especially in hot markets. They're trying to figure out what to build. I can say with high confidence that the projections we're seeing will be wrong, both because projections are always wrong, and because we're seeing duplicative requests.

First and foremost, utilities are trying to figure out what's happening and how to respond. What should they build? What goes into resource planning? How do they address specific constraints on the system while fulfilling their primary responsibility of providing reliable power to all customers?

Secondly, they're trying to figure out how to take advantage of this opportunity using their available resources. While some utilities are hesitant about more data centers due to capacity constraints, many utilities in the United States would welcome them, recognizing the benefits for local communities. I think we'll see more deployments in markets that historically haven't had data centers, and that's certainly going to be part of the path forward.

If we have this window from 2027 to the early 2030s to serve this new demand, what technologies do you think will be able to move through that window? What clean firm power options are you focused on?

Peter Freed: Let's talk about nuclear first. Clean firm power in general is on the other side of that window, and I'm doing a lot of work personally to try and accelerate deployment timelines for nuclear, particularly large-scale projects. While there's quite a lot of interest in small modular reactors, given the scale of what we need to do, just building multiple AP1000 reactors is not the worst idea. But even in the most optimistic scenarios, we're still talking about 2032 and beyond.

So for that 2027-2032 window, we need to make some interesting decisions. The first thing I think about is how to use the existing grid better than we currently do. There's a suite of grid-enhancing technologies that we've known about but have been hard to deploy economically. This moment of tremendous load growth creates real opportunities to unlock these technologies' potential if we can clearly demonstrate their ability to deploy additional capacity for data centers or other uses.


We still have a translation problem — utilities aren't necessarily thinking about this, and neither are the companies looking to develop data centers. We need to figure out ways to demonstrate that deploying specific technologies in specific locations could unlock additional data center capacity.

I've been looking at interesting analyses from both Rhodium and McKinsey about headroom inside the existing natural gas fleet in the United States. We've got somewhere between 100 and 200 gigawatts of available gas capacity on the system that we can work with. If we start thinking about addressing capacity constraints through load flexibility or battery deployments, we can unlock some of that available capacity much sooner to power these facilities.
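To make the headroom arithmetic concrete, here is a rough back-of-envelope sketch. The fleet size, capacity factors, and curtailment share below are illustrative assumptions for the sake of the example, not figures from Freed or from the Rhodium and McKinsey analyses he references:

    # Back-of-envelope sketch of gas-fleet "headroom" for new data center load.
    # All numbers are illustrative assumptions, not figures from the interview
    # or the analyses it cites.

    GAS_FLEET_GW = 490.0   # assumed US gas nameplate capacity, in gigawatts
    CURRENT_CF = 0.40      # assumed current average capacity factor of the fleet
    FEASIBLE_CF = 0.70     # assumed sustainable upper bound on average utilization

    # Average-energy headroom: extra load the fleet could serve, on average,
    # if it were run harder.
    headroom_gw = GAS_FLEET_GW * (FEASIBLE_CF - CURRENT_CF)

    # Load flexibility is what makes that headroom usable: a data center that
    # curtails during the most constrained hours avoids adding to system peaks.
    CURTAIL_SHARE = 0.05                          # assumed share of hours curtailed
    hours_curtailed = round(8760 * CURTAIL_SHARE)  # roughly 438 hours per year

    print(f"Estimated average-energy headroom: {headroom_gw:.0f} GW")
    print(f"Hours of curtailment implied per year: {hours_curtailed}")

With these assumed inputs the headroom works out to about 147 gigawatts, which happens to land inside the 100-to-200-gigawatt range Freed cites; the point of the sketch is only that modest changes in average utilization of a large fleet translate into very large amounts of servable load.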

What's your outlook on geothermal? Is the path to scaling geothermal for data centers more or less complicated than nuclear?

Peter Freed: I love geothermal. I think it's the clean, firm technology that people talk about least. It's certainly simpler and probably cheaper than building new nuclear, at least in the beginning. The challenge is geography. Fervo Energy gets a lot of attention for demonstrating that you can generate power with a dry well by pumping fluid down. But they're still mostly limited to existing geothermal basins in the Mountain West.

There's a company called Sage Geosystems, which Meta recently announced a deal with, that I like because it pushes the geography out. While it's not viable for the whole country, you are getting east of the Rockies, which is an important development. To the extent that you're looking to deploy into markets where that's an acceptable technology, great.

There is a far-future geothermal technology called superhot rock, which basically means drilling super deep, and you can do that anywhere. But that's even further out than new nuclear development.

Will the benefits of AI to the energy system and society outweigh the challenges and constraints?

Peter Freed: I am incredibly bullish on AI as a technology. I think we're at the tip of the iceberg in terms of capability. These technologies, especially the generative models, were largely science fair projects inside tech companies until 24 months ago. Now we have the full weight of the best engineering talent on the planet focused on making better products.

When I was at Meta, we had an internal AI research function where many folks were passionate about climate issues and wanted to help. One challenge we found was a missing translation function: people who knew energy and climate didn't understand what AI was capable of or how to use it, while the AI researchers who were passionate about climate didn't really understand the energy issues they should be working on.

Those barriers are starting to break down as we're seeing significant venture capital flowing into the intersection of AI and energy and climate. But I think it's still early days. Personally, I believe there will be substantial benefits across the energy sector from deploying this technology that will likely outweigh the costs of running the models, especially as we see significant efficiency gains over time.
