
Explaining the ‘Watt-Bit Spread’

A data center developer on why power prices are too low — and the consequences for the data center boom.


Photo credit: Sander van der Werf / Shutterstock


Every data center company is after one thing right now: power. Electricity used to be an afterthought in data center construction, but in the AI arms race access to power has become critical because more electrons means more powerful AI models.

But how and when these companies will get those electrons is unclear. Utilities have been inundated with new load requests, and it takes time to build new capacity.

Given these uncertainties, how do data center companies make the high-stakes decisions about how much to build? How sustainable is the rate of construction? And how much will these data center companies pay for electricity?

In this episode, Shayle talks to Brian Janous, co-founder and chief commercial officer at data center developer Cloverleaf Infrastructure. Brian recently explained how he thinks about these questions in a LinkedIn post titled “The Watt-Bit Spread,” which argues that the value of watts is incredibly high right now, and the cost of those watts is too low. Shayle and Brian cover topics like:

  • The demand uncertainty and high costs that are making data center companies hesitant to build
  • How the skills required for data center development have shifted from real estate and fiber to energy
  • Why higher power prices are needed to incentivize new generation
  • Potential solutions for better pricing electricity and speeding up the construction of new generation

Recommended resources

  • Latitude Media: AES exec on data center load: 'It's like nothing we’ve ever seen'
  • Latitude Media: Mapping the data center power demand problem, in three charts
  • Latitude Media: Are we thinking about the data center energy problem in the right ways?
  • Catalyst: Can chip efficiency slow AI's energy demand?
  • Catalyst: Under the hood of data center power demand
  • Sequoia Capital: AI’s $600B Question

Catalyst is brought to you by EnergyHub. EnergyHub is working with more than 70 utilities across North America to help scale VPP programs to manage load growth, maximize the value of renewables, and deliver flexibility at every level of the grid. To learn more about their Edge DERMS platform and services, go to energyhub.com.


Transcript

Latitude Media: podcasts at the frontier of climate technology.

Shayle Kann: I'm Shayle Kann and this is Catalyst.

Brian Janous: I don't know that there's any energy conversion that creates a greater return than turning an electron into a bit.

Shayle Kann: You can't really understand the dynamics of AI and energy until you understand the Watt-Bit Spread. I'm Shayle Kann. I invest in revolutionary climate technologies at Energy Impact Partners. Welcome. So the world of energy is usually one where change occurs kind of slowly, but once momentum builds, it becomes this incredibly powerful force that changes the world. Just taking one example of that, let's talk about the rise of natural gas for power generation in the US, which has been a big story, but in the year 2000, natural gas made up about 17% of power generation here in the US. By 2020 it was 40%. So by the standards of this market, that's a seismic shift, but that's also 20 years to get a 2.3x market share growth. Contrast that with the pace of change that we're seeing today in the AI landscape, and you can start to understand where the nexus of these two things—compute and power—is showing all sorts of tension.

We've talked about it before on this podcast, numerous times. It's emerged as clearly the biggest issue in the electricity sector right now and arguably one of, if not the biggest issue in the AI sector as well. But amidst the frenzy and the hype, of course, what I feel like has been missing is sort of a unified narrative of what's actually going on and the underlying drivers of everybody's behavior. Yes, utilities are inundated with large load requests and yes, access to power is probably the critical driver of data center growth today. But what is defining the amount of growth that we should expect to see? What is defining how those customers will pay and how much they're willing to pay and whether it is sustainable, what they're building right now?

My friend Brian Janous has a theory on this that he calls the Watt-Bit Spread. He wrote a really good piece on this on LinkedIn a while back that I encourage you to check out. I found it to be a pretty helpful heuristic as I read news articles and talk to folks who are in this industry day to day. So I wanted to talk through it with him. Brian is of course the co-founder of Cloverleaf Infrastructure, which develops power ready sites for large loads like data centers, but he previously spent over a decade as the VP of energy at Microsoft. Here's Brian. Brian, welcome back.

Brian Janous: Thanks, Shayle. Appreciate you having me.

Shayle Kann: All right, so I was thinking about this the other day, once in a while you talk to somebody who has been toiling on a thing or working on a thing, I guess for a very long time that was sort of out in the outskirts of importance of some market, or at least in the general public's understanding and has been building up this big base of experience and knowledge in that space and then all of a sudden something happens in the world and it rockets to the epicenter of everybody's attention. I feel like that's you, right? You've been focused on data centers and energy for what, 12 years or something like that?

Brian Janous: Yeah, 13 years at least.

Shayle Kann: Yeah, and it's been like maybe it wasn't, I don't know, you could tell me, right? Maybe it wasn't a momentary like oh my God, all of a sudden this is the thing. But it's certainly been, I don't know, the slope has curved upwards for the past couple years at least. And I'm curious what that experience has been like for you.

Brian Janous: Yeah, I mean I remember when I first joined Microsoft and it was actually right about this time, 2011, it still wasn't even clear to me why that job existed. I really thought maybe I would do it for a couple of years and then go on and find something else to do because I mean who really wanted to be the energy person at a tech company, didn't even really understand why that was that important.

Shayle Kann: And you were a cost center, right?

Brian Janous: Yeah, yeah. We were just a cost center. We were just paying utility bills. It wasn't particularly strategic for the company. And I remember I used to tell people on my team, or when I'd be recruiting people, this is a really cool job, it's great working for a tech company, but energy's never going to be a priority. No one's ever going to think about it that much. And I remember an old manager telling me, Steve Ballmer probably thinks about energy for one minute a year. And I was like, yeah, that sounds about right. That's probably all the head space he needs to give it. I certainly would wager that Satya [Nadella]'s thinking about energy a lot more than one minute a year. It truly has become existential for anyone in this space, whether you're a big tech company or a provider of infrastructure, whether you're selling GPUs or equipment that goes in data centers. The ability to acquire the energy needed to build out this infrastructure has become so critical to any industry that touches cloud and AI.

Shayle Kann: Yeah. All right, so you're now at the epicenter of this madness and we've talked before on this podcast a little bit about it, but it's been a little while. And so I guess I want to start by bringing us up to speed on what you're seeing in the trenches, so to speak today. What's changed over the past, I don't know, six to nine months at this nexus of data centers and energy?

Brian Janous: Well, the market is still clearly very hot. There's huge demand for data center capacity, though there is a caveat in there. I feel like a lot of the big players right now are struggling with something very similar to what we struggled with back in the 2010s when we were building out cloud infrastructure, which is exactly how much infrastructure any one company should build, because you're sort of building for your own stack. I mean, keep in mind, Microsoft and Meta are not Equinix; they don't build data center capacity and then lease it out. They're largely running first-party platforms. So that makes demand planning pretty tricky, because even if you had extraordinarily high conviction, as we did in the early 2010s, that the cloud market was going to be very large, we did not have incredible conviction on exactly what Azure's market share would be versus AWS, versus GCP.

Shayle Kann: My friend David Cahn, different Kann, but David Cahn from Sequoia coined this term, the $600 billion question, talking about this. If you add up all of the CapEx announced and planned from the major players in data centers, it's about $600 billion. And the question is, for any given one of them with some share of that $600 billion, how much revenue can they attain? You need $600 billion plus in revenue, or ideally $600 billion plus in earnings, to make up for that CapEx. And so it's a classic: you've got a land grab combined with a tragedy of the commons, and figuring that out is super hard.

Brian Janous: It's really hard. And what happened during the 2010s is that the colocation market sort of filled the gap, because every company underinvested in its infrastructure, which is why Microsoft and Amazon and Google lease capacity from the likes of Vantage and CyrusOne and Equinix. But the difference between that era and this era is that the skill set needed to fill the gap now is not the skill set that was used in the prior era, which was being really good at real estate and fiber.

And if you look at the makeup of those colocation companies, they're largely real estate and fiber dominated in terms of the talent and the skills, because energy wasn't a challenge when we were building 50 megawatt or hundred megawatt data centers. Now the world has changed, and those companies that filled the gap in the last era are really not particularly well-equipped to fill it in this era, because the depth of energy talent at those companies is actually quite thin. And so it's really changed the problem set: it's no longer solving for can I find real estate in some proximity to Northern Virginia, but where can I get a gigawatt of power, which is a completely different challenge.

Shayle Kann: Does it turn out, we've talked about this before, that on the stack-ranked list of important factors in data center siting, power shot to the top and remains at the top now, but the other ones, fiber and land and water and labor, are just less important than everyone considered them to be five or 10 years ago? Or are they just as important, and there just happens to be one thing that's more important than the rest of them?

Brian Janous: I mean, I think they're all still important. We're still going to have demand for services with low latency. In some ways labor has actually become more important, in that if the data centers we're building today are 10x the data centers we were building just a few years ago, the amount of construction labor required is enormous. And so I think in a lot of ways people are underestimating the importance of labor, and that is going to become a challenge, because there has been this tendency to say, well, we'll just go out in the middle of nowhere and build these giant data centers, which-

Shayle Kann: Because of the power problem, right?

Brian Janous: Because of the power problem.

Shayle Kann: Northern Virginia is densely populated, but middle of nowhere, Southwestern United States where you might have a gigawatt of power capacity is not.

Brian Janous: Right. But if too many people try to do that all at once, you're going to create a labor problem and you're going to have significant delays in getting a lot of this infrastructure stood up. Yeah.

Shayle Kann: Yeah. All right, so it's clearly still dynamic. I mean, I sort of interrupted you, but you were basically saying it's still a hot market, everybody's still trying to build. If you're a power provider, you're still getting inundated with large load interconnection requests, more so than ever perhaps. But there is this demand planning challenge. Is that manifesting in any change in strategy for those who are trying to build data centers? Are they pulling back? Are they paring their plans? Are they just full steam ahead and hoping?

Brian Janous: Yeah, it's probably somewhere in the middle. I think there is some pullback, not because there's not conviction that the opportunity is there, but rather because, for any one individual player, the ability to commit billions of dollars to electric utilities to build out more grid infrastructure is a hard pill to swallow if you're not fully convinced that you have a customer on the other side of that to receive it. So that's the challenge I think the industry's in right now: there is still some hesitation when it actually comes time to write that check.

And utilities are getting a little more savvy about really holding feet to the fire for some of these companies. Some of them are big tech companies, and some of them are just two guys with a truck that decided they were going to be data center developers, and they go get a queue position and they have to pay 10,000 dollars, that's it, to get in the queue, which is shockingly low. But utilities have just never had to deal with this before, having this much large load coming in at once. So I think in some ways it looks a little bit like a pullback when it's really just uncertainty. I think the conviction is still very much there in the market that the AI market is going to be huge and we need a ton of infrastructure, but when it comes down to writing those checks, it can get pretty problematic at the scale we're talking about.

Shayle Kann: Yeah. You and I have talked about this before, so correct me where I've gotten your explanation wrong, but my recollection is that in previous waves here, the way this market was mostly structured for large, hyperscale data centers was: either a hyperscaler themselves, so Meta, Amazon, Google, Microsoft, would build a data center for their own demand, in which case they have, one presumes, pretty good visibility into how much they're going to need. To your point, if anything, they're maybe conservative historically and they don't build enough, but that means they have especially good visibility that we know we're going to need this. Or you had one of the colos, one of those companies that you mentioned before and then a few others, who were not really building on spec. They would go where one of the hyperscalers already had nearby sites, or where they had signaled that they had demand, and so on.

And so nobody was really building what I would think of as a merchant data center, spending a lot of money on it and then leasing out the capacity. That wasn't happening a whole lot. But as this wave has taken hold, it feels like that started to happen, and that results in, you'd assume, a bigger pipeline fallout over time. That's part of what utilities are concerned about. If you're AEP and you have 80 gigawatts of large load requests, you know not all of that is real, but it's kind of hard to figure out which parts of it are and aren't. So to the extent that there's a pullback, I guess the hope would be that we go back to a market where there's reasonable certainty about demand for any given site that is getting developed, and so you can be reasonably certain that if you get the power, this scarce asset, then it will be sufficiently utilized.

Brian Janous: Yeah, I think the problem we're seeing today, though, is that in a world of relative power abundance, all you needed to do was get a piece of land. You didn't actually have to spend a whole lot of money to, in essence, pre-provision a data center that could be built in, say, 18 months. So you were always sort of 18 months out from being able to push bits out into the world for relatively low cost. I mean, the cost of land is a very small contribution to the overall cost of a data center.

Shayle Kann: Can we pause on that for one second? Because we talked about that. I don't think everybody actually recognizes how big a delta there is between the cost of the land and the cost of the shell and the cost of the data center itself. Can you just quickly walk through those economics?

Brian Janous: Sure. Yeah. I mean, on a total cost basis over a fifteen-year period, TCO is the way we think about it in this world, land is about 1% of the cost to operate a data center, the full stack with servers and everything. So it's pretty inconsequential, and going out and pre-positioning land is sort of an easy decision for any provider to make. Where it gets problematic, where we're at today, is in a world of constraint. It's not just land. It is land plus clear line of sight to power in that same sort of 18 to 24 month time horizon. And if that necessitates the utility building new generation, building new transmission, acquiring more substation capacity and transformers, the cost now goes up considerably. It's not just that 1% anymore.

You're making multi-billion dollar commitments to utilities in purchase and cancellation agreements. So that sort of changes the game. It makes it much more difficult to, in essence, pre-provision for that power, not to mention the complexities on the utility planning side, the tariffs and the regulations and the technology, everything that's required to move a piece of dirt into a piece of powered land into a data center. That whole equation has become substantially more complex, and so it makes it much more difficult for a colocation company, or two guys in a truck, to get to a point where they have a piece of dirt that can house a data center in 18 months.
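Brian's claim that land is roughly 1% of a data center's fifteen-year total cost of ownership can be sanity-checked with a back-of-envelope sketch. Every number below is an illustrative assumption chosen for scale, not a figure from the episode:

```python
# Back-of-envelope check: land as a share of 15-year data center TCO.
# All inputs are assumed, illustrative magnitudes.

capacity_mw = 100                  # IT capacity of the facility (assumed)
land_cost = 50e6                   # large site purchase (assumed)
shell_per_mw = 12e6                # building, electrical, cooling (assumed)
servers_per_mw = 30e6              # IT hardware per refresh cycle (assumed)
refreshes = 3                      # ~5-year server refresh over 15 years
power_price = 60.0                 # $/MWh (assumed)
utilization = 0.8                  # average load factor (assumed)

capex = land_cost + (shell_per_mw + servers_per_mw * refreshes) * capacity_mw
energy = capacity_mw * 15 * 8760 * utilization * power_price
tco = capex + energy

print(f"15-year TCO: ${tco / 1e9:.1f}B; land share: {land_cost / tco:.1%}")
```

With these assumptions the servers dominate, and land lands well under 1% of the total, which is why pre-positioning land alone was such a cheap option in the era of power abundance.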

Shayle Kann: So we started to talk a little bit about the economics. You coined this phrase, the Watt-Bit Spread, which I found to be a really useful heuristic for understanding the underlying economics that are driving all the craziness we're seeing right now. So I want to talk about what the Watt-Bit Spread is, and then a little bit about what it tells us about the behavior in the market that we are seeing and that we might see. So first, maybe describe what the Watt-Bit Spread is.

Brian Janous: Sure. So the concept came from thinking about spark spreads in the power world: the delta between the cost of an MMBtu of gas and the value of the electricity you can create with that gas. And it's the same concept here. The entire data center ecosystem is really about taking watts and turning them into bits, and so there's a value in that conversion, that energy conversion from electrons to light. And that value is pretty substantial. I mean, I don't know that there's any energy conversion that creates a greater return than turning an electron into a bit. Not to mention the fact that the ability to acquire and collect those electrons then in some ways creates a moat against other folks that would want to try to do the same thing. So the more electrons you can get a hold of, especially in the AI world, the more value you can create, because you can create bigger training models and you can deploy more inferencing.

And so that value of a watt has increased substantially in the last, let's say, 18 to 24 months, as a lot of companies realized, wow, if I can plug in more GPUs, I can build a pretty powerful moat. And so we've seen huge demand, obviously, for getting a hold of these watts. But what you haven't seen is a commensurate increase in the price of the watts. If you look at your standard utility tariff, it's not like the price of electricity has skyrocketed as the demand has increased, and that actually is a problem, in that the market is not accurately reflecting the value of those watts. And when I say a problem, I don't mean that rates should go up for residential customers and all consumers just because a bunch of AI companies want power, obviously we don't want to see that. It's a problem in that if the market is not sending the right sort of price signal to a utility or to an IPP, then we're going to build less energy infrastructure and therefore plug in fewer GPUs.
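The spark-spread analogy Brian draws can be made concrete with a toy calculation. Both functions and every price below are illustrative assumptions, not market data or figures from the episode:

```python
# Toy version of the analogy: the spark spread is the margin from
# converting gas into electricity; the "watt-bit spread" is the analogous
# margin from converting electricity into sellable compute.

def spark_spread(power_price, gas_price, heat_rate=7.0):
    """Margin in $/MWh: power price minus fuel cost (gas $/MMBtu x MMBtu/MWh)."""
    return power_price - gas_price * heat_rate

def watt_bit_spread(compute_revenue_per_mwh, power_price):
    """Margin in $/MWh of input electricity when watts are monetized as bits."""
    return compute_revenue_per_mwh - power_price

# Assumed: $3.50/MMBtu gas, $50/MWh power, and compute that monetizes at
# $400 per MWh of electricity consumed.
print(spark_spread(50, 3.50))     # -> 25.5
print(watt_bit_spread(400, 50))   # -> 350
```

Under these assumed numbers the conversion margin from watts to bits dwarfs the generator's margin from gas to watts, which is the gap Brian argues current tariffs fail to price.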

Shayle Kann: So your basic premise here is that the willingness to pay for watts or watt-hours is higher than the current price of those watts or watt-hours. The value you can generate with them is higher than the current price, and so one could raise prices if it unlocked more supply. You're saying that that spread, the Watt-Bit Spread, is high, basically?

Brian Janous: Exactly, yes. If the true value of those watts were reflected to electric utilities and to IPPs, then they would be incentivized to build more infrastructure faster, or even go further upstream to, say, Hitachi and the production of transformers, right, or switchgear. Those, too, have more value than their current market price reflects.

Shayle Kann: How do we know how high? Because one of the challenges here, to your point before about demand uncertainty, my suspicion is there's also a lot of revenue uncertainty, and so right now there is this arms race and everybody wants, if you can get watts, you'll pay a premium and there's a lot of money sloshing around into that space. But is there some risk that the underlying Watt-Bit Spread is not as high as it seems today? Could it be artificially inflated today by just where we are in the cycle, or do you think that's a sustainable thing?

Brian Janous: It's possible, though I suspect not. My fundamental belief is that the demand for compute, let's just say through 2030, will exceed the available power in the market, and therefore the marginal value of the next watt, or frankly the next gigawatt, that you can produce will remain quite high for some time, because there will be a shortage of power available to plug in GPUs over that period. Now, at some point we will get back to some level of equilibrium where the market starts to settle out, but just given the time dimensions of what it takes to build out energy infrastructure, I feel pretty convinced that we're going to be in a period of shortage, and therefore every marginal gigawatt is going to have substantial value to some player in the market.

Shayle Kann: Sure, though putting myself in the shoes of the infrastructure provider, that timeline worries me. You're asking me to build more power generation, you're asking me to build more transmission, you're asking me to spend billions of dollars in CapEx on things that are supposed to last 20, 30, 40 years. And if we have visibility and good certainty into willingness to pay out to 2030, I mean, I'm going to generate two, three years of value at that price by the time I actually build anything out. And so I need to feel, as the infrastructure provider, that I have pretty high confidence these assets I build are not going to be stranded or overpriced or whatever.

The economic value proposition that I'm offering needs to be sound for a longer time into the future than that. And that feels like where there's a little bit of a disconnect in the market. I think there's a couple of places where there's a disconnect, because it's easy to just say, okay, well, raise prices, use that money to build more infrastructure, and everything is going to get solved here. But as you know as well as I do, the electricity market is not structured entirely that way. So one of the ways in which I feel like there's a disconnect is just on that timeline. So I wonder how you think about it: how do you incentivize the construction of new long-term infrastructure for an uncertain market?

Brian Janous: Yeah. So when I say that, the way I think about the spread is that it's really about the value of capacity in a given year. The value of a megawatt in 2027 is worth more than the value of a megawatt in 2032, because there's an assumption that by 2032 power will probably be more abundant, that we'll have sort of run through this cycle. So the real question of how you monetize this in a way that's rational for all actors is: as a utility, I should look for ways to make investments in my system that allow me to accelerate the delivery of capacity, and find entities that are willing to pay for that capacity in a given year at a higher price relative to delivery five years later.

That doesn't mean they're paying more on a per megawatt hour basis forever, but it means they're paying, in essence, higher demand charges to recover the cost of that infrastructure so that they can get plugged in sooner, because that load's not going to go away. I mean, data centers don't really get turned off. Once you have the customer, they're going to be there. So that investment will be, in the utility parlance, used and useful for its useful life. It's really about the timing of when I deliver that first electron; that's what's being mispriced right now.
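Brian's point that a megawatt online in 2027 is worth more than one online in 2032 can be sketched as a discounting exercise. The scarcity values per year and the discount rate below are assumptions for illustration only, not numbers from the episode:

```python
# Toy time-value-of-capacity calculation: assign each year an assumed
# scarcity value for a unit of capacity, then discount the remaining
# stream back for different online dates.

SCARCITY = {2027: 300, 2028: 250, 2029: 200, 2030: 150, 2031: 100, 2032: 50}
RATE = 0.08  # discount rate (assumed)

def pv_of_capacity(online_year, base_year=2027):
    """Present value of capacity that starts serving load in online_year."""
    return sum(v / (1 + RATE) ** (y - base_year)
               for y, v in SCARCITY.items() if y >= online_year)

early = pv_of_capacity(2027)
late = pv_of_capacity(2030)
print(f"online 2027: {early:.0f}, online 2030: {late:.0f}, "
      f"early premium: {early - late:.0f}")
```

With these assumed values, the earlier online date is worth several times the later one in present-value terms; that gap is the premium a capacity buyer could rationally pay in higher demand charges to get plugged in sooner.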

Shayle Kann: You also, when you put together this Watt-Bit Spread piece, described it in terms of manufacturing theory in a way that I thought was useful to think about how different actors in this equation are thinking of it in different economic terms. So can you walk through the, who is operating how?

Brian Janous: Yeah. So in terms of manufacturing, there are sort of two different models. One is lean manufacturing, which says hold as little inventory as possible. Everything's about just-in-time, and if delivery lead times extend, that's fine; the customers will just have to wait, but we're going to improve margins by not having a bunch of excess inventory. The other model is the theory of constraints, which says no, what you need to focus on is throughput. That's how you create enterprise value, and therefore you need to find the areas of the supply chain or the manufacturing process that tend to become constrained and make sure that you are always building extra capacity, so more labor, more overhead, for those particular points of the system. So if I am downstream in the system, so I'm data center operators, I'm cloud providers, I'm Nvidia, I'm OpenAI, I'm very much thinking about the world in terms of the theory of constraints, because again, the cost of the base infrastructure for my end product is actually really low.

When I say base infrastructure, I'm talking about land, power, a data center shell. Those things are relatively inexpensive in the grand scheme of operating my business, so I would be willing to pay a premium for those things to go faster. Now, if I hop over into the utility world, going back to what we were talking about just before, a utility sells an electron in 2027 for the same price it sells it for in 2030. What incentive does the utility have to go faster? What incentive does the utility have to stockpile 345 kV transformers? None. All they do is say, well, okay, lead times are longer, so it used to be that I could connect a customer in two years and now it's seven years. And that's okay in sort of the utility world right now, but it's not ultimately in their best interest. They should want to go faster and get more customers, but their business model is not designed to operate in this theory-of-constraints sort of world.

Shayle Kann: I think that's mostly true, though a lot of people, I know you know this, don't appreciate all of the constraints, not to reuse that term in a different context, that utilities have on them. Yes, generally speaking, they're incentivized to spend CapEx and earn their regulated rate of return on that CapEx, and then minimize costs so that the profits they actually gain are as high as possible. But they are subject to regulators who are unique and specific and have different things that they require. They also have to place the highest premium on reliability and resiliency and all these things. It doesn't necessarily cut entirely against what you were saying, but my experience of watching this all play out over the last couple of years has been: okay, a bunch of huge customers started showing up on utilities' doorsteps faster than they ever had before.

Utilities started by saying, okay, I'll put you through my normal process, which led to lead times getting longer and longer and longer. But they're adjusting to that now. And so I'm curious what you think about this. We've seen some of it publicly. There was a settlement recently, I can't remember which utility, in MISO territory, between a bunch of the hyperscalers and the utility around what a new large load interconnection protocol would look like. I think we're starting to see more innovation around this, and I'm curious what you've seen out there so far and what you think it should look like.

Brian Janous: Yeah, I mean, there's definitely been some work that utilities have done. AEP certainly has been a leader in this, around guaranteeing cost recovery. I mean, going back to the issue you were talking about before: if I build this, you better come, you better show up, and I'm going to hold you to some minimum amount, some minimum take-or-pay for what I build. And that's caused some consternation with the hyperscalers, because going back to what we were talking about before, they don't have perfect visibility into their future demand, and so if they have to sign up for long-term agreements that guarantee some minimum payment, that's a little problematic. But that to me is just sort of the blocking and tackling that has to happen when you're going through this new wave of actual electric demand growth. Utilities obviously are going to be conservative.

They're going to want to guarantee that cost recovery. What we haven't really seen from utilities, at least not to any significant degree, is innovation around how do I start to capture some of these excess rents that the market is offering me, in terms of customers saying, "hey, if you go faster, I would be willing to pay a premium." So for instance, say I'm a utility and I've been looking at, let's take one of your portfolio companies, Form Energy. I've been looking at doing a Form Energy project for the last five years and I just haven't been able to figure out what to do with it, or how I'm going to get the cost recovery, because particularly if I'm doing any sort of early technology that has some cost premium, I've now got to recover that across my entire rate base, and if I don't have any load growth, it's hard to justify making those investments as a utility.

But today, if I can look at a suite of technologies that maybe were on my five-year roadmap and say, well, how would a Form Energy battery, or how would a LineVision dynamic line rating deployment, allow me to accelerate the delivery of electrons to a particular subset of customers who are willing to pay to get power sooner? Now I can think differently about how I do cost recovery, and it can be more targeted toward these customers who have been saying, hey, give me power in 2027, not 2030. And so that's the innovation that we haven't quite seen yet, and this is what I described in that Watt-Bit Spread piece as the advanced grid tariff.

Thank you to Katie Fehrenbacher for that name. We were talking at a conference a couple of months ago about what we should call this thing, and she came up with "advanced grid tariff," and I like it. It's basically the same concept we have around green tariffs: if I'm a customer, like Google in Nevada, and I'm willing to pay for something different (in that case a different quality of service: I want power from this Fervo geothermal project, not just power from the grid), then by all means you should be able to do that, and the commission should approve deals like that because it's not harming the other ratepayers.

Now, this is the same concept, but it's really about capacity and how do we get electrons flowing sooner on the system? So what investments could utilities make that guarantee full cost recovery from this subset of customers, but allow them to get more out of the existing system faster? And a lot of this is going to be things like grid enhancing technologies, storage as transmission, because these are the things that can go quick.

Shayle Kann: I think what you're describing, at the highest level, is that because of this Watt-Bit Spread, because there's extra value, extra willingness to pay on the customer's side, utilities can take advantage of that by saying, okay, let's have that customer subsidize the cost of a bunch of things that will get them power sooner, which is the main thing they care about. For that equation to pencil, given how the electricity market works, you want those assets to benefit not just that customer. I think this is implied in what you're saying, but I want to make it explicit: the benefit should accrue to other ratepayers as well, ultimately in the form of either higher reliability or, more likely, lower cost. You could imagine that if the willingness to pay is high enough, the data center customers end up subsidizing lower-cost electricity for the rest of the ratepayers. I don't know how far that can go, but that's where you'd want it to land, I think.

Brian Janous: Yeah, because ultimately load growth is good for everyone. Having a larger electric grid is inherently good. Having more data centers on the grid is good because they use power 8,760 hours a year, so there's cost recovery across more infrastructure in more hours. It's inherently a good thing. The challenge is that first dollar that gets spent: how does it get recovered, and is that recovery guaranteed? That's where people start to get nervous, saying, oh, utilities are going to start spending billions of dollars on infrastructure. But they're also going to get billions of dollars of new revenue on the other side of that.

And so we have to look at the long game: what is actually good for the system, for all ratepayers, for reliability, and for cost? Because I completely agree that we don't want rates going up for average consumers just because utilities are trying to serve data centers. We have to design rates to ensure we're doing appropriate cost allocation, which is something utilities have done for decades; we understand how to do cost allocation and assign it to different rate classes. In this case, though, we're saying utilities should actually start to put a premium on time and the value of capacity, and then find customers willing to pay that premium so they can actually move faster.

Shayle Kann: You mentioned this before, but I've seen it a few times as well: to the extent that utilities or others float concepts like this, there's a bunch of consternation from the customer set about the cost being proposed to them. Do you think they're just positioning, and if you forced it they would cave? Or has the customer not woken up to the actual scale of the Watt-Bit Spread yet?

Brian Janous: I think a little bit of it is positioning, and a little bit of it is maybe just a lack of understanding of the nature of building out large infrastructure. Again, there are a lot of players in this space that aren't particularly sophisticated when it comes to energy systems and utility rates and regulations. There are a lot that are; many of the big tech companies, especially, have extraordinarily talented energy teams and know how to work with utilities and regulators. But if you've got some company that was going to operate a Bitcoin mine, or was going to do a green hydrogen project, and now has decided, no, I'm actually an AI data center company, then when you start getting questions from utilities about the cost of this infrastructure, sometimes people's eyes start to bug out. They see numbers that end in billions and think, whoa, I'm getting in a lot deeper than I thought.

But I think that just shows a lack of understanding of the overall value of that electricity. Yes, the cost is very high when we're talking about several hundred megawatts or multiple gigawatts of power, but look at the revenue on the other side, or even just at the overall CapEx deployment. The cost of a data center today, in terms of the full-stack CapEx all the way through the GPUs, is about $25 million a megawatt, or $25 billion a gigawatt. So if you go to a utility and say, "Hey, I want a gigawatt of power," and they say, "Okay, that's fine, sign here," that's going to be a billion dollars of infrastructure. That sounds like a big number, but in the context of the $24 billion that's going to follow it in capital deployed, it's actually not that big. You really have to understand it in the full context of the CapEx deployment opportunity.
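The scale comparison Brian makes here is simple enough to verify with a quick back-of-envelope sketch. The dollar figures are his rough estimates from the conversation, not precise market data:

```python
# Back-of-envelope check: full-stack data center CapEx (through GPUs)
# at roughly $25M per megawatt, versus roughly $1B of utility
# infrastructure for a gigawatt of power. Figures are Brian's estimates.
MW_PER_GW = 1_000
capex_per_mw = 25e6                        # ~$25M/MW full-stack CapEx

total_capex_gw = capex_per_mw * MW_PER_GW  # ~$25B per gigawatt
grid_infra = 1e9                           # ~$1B grid infrastructure

share = grid_infra / total_capex_gw
print(f"Total CapEx per GW: ${total_capex_gw / 1e9:.0f}B")
print(f"Grid infra share:   {share:.0%}")
```

On these numbers, the utility's billion dollars is about 4% of the total capital stack, which is the sense in which the interconnection cost is "not that big."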

And now that's just the CapEx deployment. That's not the revenue and the margins downstream. So the overall value of all of that stuff behind that first billion dollars is enormous. And so as I've talked to utilities, I've very much been an advocate of, you should be putting more stringent requirements on companies that are requesting power, whether it's a hyperscaler or a developer like my company, Cloverleaf Infrastructure, or the two guys in a truck. If you're saying you're going to build a gigawatt data center or you're holding onto a gigawatt of power in queue position, at some point you need to be able to point to billions of dollars of capital that you have access to, to build out that infrastructure. And if you can't do that, you're probably not a serious player in this space.

Shayle Kann: All right, Brian, that's all the time we've got now. Fascinating times in this market as ever. I appreciate you helping me make sense of it, and I'm sure we're going to do it again soon because we're going to hear about the five gigawatt data center that's getting developed in Abilene, Texas or whatever.

Brian Janous: Exactly, I'm sure we will. Always great to catch up, Shayle. Thanks for having me on.

Shayle Kann: Brian Janous is the co-founder and chief commercial officer at Cloverleaf Infrastructure. This show is a production of Latitude Media. You can head over to latitudemedia.com for links to today's topics. Latitude is supported by Prelude Ventures. Prelude backs visionaries accelerating climate innovation that will reshape the global economy for the betterment of people and planet. Learn more at preludeventures.com. This episode was produced by Daniel Waldorf, with mixing by Roy Campanella and Sean Marquand, and theme song by Sean Marquand. I'm Shayle Kann, and this is Catalyst.
