The transformative grid tech no one is talking about

Accelerated computing could have a big impact on utility operations — maybe even a bigger impact than artificial intelligence.

Published August 21, 2024

Photo credit: Josh Edelson / AFP via Getty Images

The energy world has spent 2024 giddy about the potential technological breakthroughs enabled by artificial intelligence. But accelerated computing, AI’s lesser-known companion, is also likely to dramatically impact how electric utilities operate. 

Accelerated computing splits an application’s data-intensive work from everything else and runs it on a separate piece of hardware known as an accelerator. Graphics processing units, or GPUs, are common accelerators because they can perform many operations simultaneously, rather than one at a time like a central processing unit.
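
To make that division of labor concrete, here is a minimal, hypothetical sketch (not from Nvidia or the interview) of the same batch of small matrix solves written once for a CPU and once for an accelerator; the GPU path assumes the open-source CuPy library and a CUDA-capable card are available.

```python
import numpy as np

def solve_batch_cpu(a_batch, b_batch):
    # Serial CPU path: solve each small linear system one after another.
    return np.stack([np.linalg.solve(a, b) for a, b in zip(a_batch, b_batch)])

def solve_batch_gpu(a_batch, b_batch):
    # Accelerated path: the same math as a single batched call that a GPU
    # array library (here, the NumPy-compatible CuPy) runs in parallel.
    import cupy as cp
    a_gpu = cp.asarray(a_batch)            # copy the inputs to the accelerator
    b_gpu = cp.asarray(b_batch)
    x_gpu = cp.linalg.solve(a_gpu, b_gpu)  # thousands of solves at once
    return cp.asnumpy(x_gpu)               # copy the result back to the host

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(size=(10_000, 8, 8))    # 10,000 independent 8x8 systems
    b = rng.normal(size=(10_000, 8, 1))
    x = solve_batch_cpu(a, b)              # swap in solve_batch_gpu on a GPU machine
    print(x.shape)                         # (10000, 8, 1)
```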

Recently, chip manufacturers like Nvidia have been working to increase the speed and processing power of GPUs without driving up costs — with major benefits for a system’s energy efficiency.

“For the first time, you've got technology and compute resources like GPUs that are price-performant for this type of work,” said Marc Spieler, senior managing director of global energy industry at Nvidia, in an episode of the With Great Power podcast.  

This development will have a significant impact on how the power grid is managed, Spieler added; many potential optimizations are held back today simply because they are so compute-intensive. His team is working with utilities to speed up grid simulations, which model how the power grid will perform under different conditions and with different equipment. Those simulations are limited by processing power and can often take longer than a day to complete.

“Today you can simulate a portion of the grid with so many interconnections…and it runs on CPUs and it can only go so big before you’ve outgrown a single computer,” said Spieler. “With accelerated computing, you can actually add way more aspects to that, [like modeling power flow from distributed energy resources]. With a bidirectional grid, now all of a sudden you want to be able to see many different options.”
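
As a rough illustration of the kind of work such a simulation parallelizes, the sketch below screens thousands of hypothetical injection scenarios on a made-up four-bus network with a DC power flow approximation. The network, limits, and numbers are invented for illustration and are not taken from any utility’s model; the batched linear algebra is the sort of computation a GPU array library could take over.

```python
import numpy as np

# Toy 4-bus network: (from_bus, to_bus, susceptance in per unit). Bus 0 is the slack bus.
LINES = [(0, 1, 10.0), (1, 2, 8.0), (0, 3, 12.0), (2, 3, 6.0)]
N_BUS = 4
FLOW_LIMIT = 1.0  # hypothetical per-unit thermal limit on every line

def reduced_b_matrix():
    # Standard DC power flow susceptance (B) matrix, with the slack bus removed.
    b = np.zeros((N_BUS, N_BUS))
    for i, j, y in LINES:
        b[i, i] += y
        b[j, j] += y
        b[i, j] -= y
        b[j, i] -= y
    return b[1:, 1:]

def screen_scenarios(injections):
    # injections: (n_scenarios, N_BUS - 1) net power at each non-slack bus
    # (load, rooftop solar, EV charging, and so on), in per unit.
    b_red = reduced_b_matrix()
    theta = np.linalg.solve(b_red[None, :, :], injections[:, :, None])[:, :, 0]
    theta = np.concatenate([np.zeros((len(injections), 1)), theta], axis=1)  # slack angle = 0
    flows = np.stack([y * (theta[:, i] - theta[:, j]) for i, j, y in LINES], axis=1)
    return np.abs(flows) > FLOW_LIMIT  # True wherever a line would exceed its limit

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    scenarios = rng.uniform(-0.5, 0.5, size=(100_000, N_BUS - 1))  # 100,000 injection patterns
    overloads = screen_scenarios(scenarios)
    print(f"{overloads.any(axis=1).mean():.1%} of scenarios overload at least one line")
```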

Accelerated computing also allows for day-of decision-making, which can quickly improve operations, he added — and that’s crucial given the urgency of the energy transition. Spieler said that utilities in states with renewables and electric vehicle mandates, such as Portland General Electric in Oregon and Southern California Edison and Pacific Gas & Electric in California, are leading adoption of accelerated computing.

“Some of these utilities that really are trying to incorporate more renewables onto the grid in a very fast way have to be innovative,” he said. 

But that’s not the case everywhere. Spieler lives in Texas, and is concerned about how rising electricity demand will impact utility infrastructure. His Tesla, for instance, charges at 11 kilowatts from a neighborhood transformer rated at 25 kilowatts.

“I’m thinking, there’s four houses attached to this. What’s going to happen when we have four Teslas plugged in?” he said. “It’s going to start to raise issues unless we’re able to send communication really fast, faster than electrons.”
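
The arithmetic behind that concern is simple. This back-of-the-envelope sketch uses the 11-kilowatt charger and 25-kilowatt transformer figures from the interview, plus an assumed, purely illustrative household base load.

```python
TRANSFORMER_KW = 25.0   # neighborhood transformer rating cited in the interview
EV_CHARGER_KW = 11.0    # per-vehicle charging draw cited in the interview
HOUSES = 4              # homes sharing the transformer
BASE_LOAD_KW = 2.0      # assumed non-EV household load; illustrative only

ev_load = HOUSES * EV_CHARGER_KW               # 44 kW if all four vehicles charge at once
total_load = ev_load + HOUSES * BASE_LOAD_KW   # EV charging plus everything else
print(f"EV load alone: {ev_load:.0f} kW vs. a {TRANSFORMER_KW:.0f} kW transformer rating")
print(f"Total load: {total_load:.0f} kW "
      f"({total_load / TRANSFORMER_KW:.0%} of transformer capacity)")
```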

That’s an area where he sees accelerated computing as having the potential to help.

“Looking at that data and coming up with better plans and operational procedures, I think will save some of these companies tremendous amounts of money, and it'll save people a tremendous amount of heartache,” said Spieler. 

For the full conversation with Marc Spieler on accelerated computing, listen to his interview on season 3 of With Great Power.


With Great Power is a show about the people building the future grid, today. It's a co-production of GridX and Latitude Studios. Subscribe on Apple, Spotify, or anywhere you get your shows.

Transcript

Brad Langley: In the late 2010s, big oil and gas companies started ramping up investments in alternative energy sources like renewables and hydrogen.

News anchor:  One of the biggest oil companies in the world is going solar, sort of. The new clean energy project happening right in Chevron's Bay Area backyard.

Brad Langley: It was a big deal for an industry that had long denied the connection between fossil fuels and climate change.

Marc Spieler: They were making substantial investments in geothermal work, hydrogen, wind, and other things. And I think when you start to see companies as large as Shell and Chevron and BP, and others start to make that transition, you realize that the industry is going to evolve.

Brad Langley: Marc Spieler was working for the oil and gas titan Halliburton at the time, and when he saw electric vehicles start to hit the consumer market, he knew it was time to make a change.

Marc Spieler: When we started to see Tesla and other EVs start to become more mainstream, it was clear that this transition was going to happen, and it was going to start happening a lot faster than what we'd seen in the past.

Brad Langley: Marc wanted to work on the energy transition, but he didn't want to build renewables or clean energy resources. He wanted to work on the technology that would drive down the cost of those resources and make them attractive business investments.

Marc Spieler: My father and my brother are both engineers. My father's an electrical engineer, so I always knew that I wanted to be in technology, and then the energy side of things came through just experience.

Brad Langley: So in 2019, after 13 years at Halliburton, he joined NVIDIA, the technology and chip-making giant. That year, NVIDIA announced a new line of semiconductor chips, specifically designed for artificial intelligence processing. The new chips promised to unlock more data processing power without driving up costs.

Marc Spieler: I think what technology has done, and especially AI and digital technology, is it's making these technologies much more affordable and more realistic to reduce the cost. And therefore, as you look at the energy transition, just like anything else, right? First, you have to be able to solve the problem, but then you have to make it affordable.

Brad Langley: For utilities and clean energy companies, advances in semiconductor chips over the past few years have made it easier to develop and adopt products that support cleaner resources, and that is exactly what Marc works on today.

Marc Spieler: Our job is to basically help energy companies develop and create, and move energy as efficiently as possible, using everything from high-performance computing to AI to visualization, to do it as environmentally friendly as possible.

Brad Langley: This is With Great Power, a show about the people building the future grid today. I'm Brad Langley. Some people say utilities are slow to change, that they don't innovate fast enough. And while it might not always seem like the most cutting-edge industry, there are lots of really smart people working really hard to make the grid cleaner, more reliable, and customer-centric. This week I'm talking with Marc Spieler, Senior Managing Director for the Global Energy Industry at NVIDIA. Marc and his team support energy companies with physical and digital technology solutions like semiconductor chips and software platforms, and he's part of the group leading the AI revolution. Marc refers to his work at NVIDIA as building an ecosystem, because NVIDIA partners with other technology companies to offer utilities solutions. Not everything needs to be built from scratch. It's an approach that will speed up digitalization.

Marc Spieler: If you can create an ecosystem of third parties that are all working together and you're able to leverage both software and hardware platforms from other people who have already developed solutions, that you can then build on top of and take those software stacks and repurpose them for industry-specific solutions. All of a sudden, you can move that technology ahead considerably faster than if you're developing tools that already exist in the market today.

Brad Langley: I spoke with Marc about the work NVIDIA is doing in the power sector and how artificial intelligence is changing utility operations, and we talked about what the rise in data centers, the supporting infrastructure for AI processing means for utilities. But first, I asked Marc how his work in other parts of the energy sector impacts his work with utilities. How does your work in other areas of energy influence your work with electric utilities today?

Marc Spieler: I think there's a lot to be learned from. Energy is an industry of industries, right? There's retail, there's manufacturing, there's exploration, there's a lot of different areas that really cross paths with the work that we do today. Everything from data center opportunities for high-performance computing to edge computing and understanding things at the edge to gas stations and retail down to a smart meter. The use cases differ, but the underlying technology is similar. And what this has allowed us to do is think about how we're solving these problems in other industries, and in my case, how did we solve them in oil and gas, and how can we apply them to utilities? And then, I've learned considerable amounts in the last five years, as every time I go back to corporate and I meet with my peers or I talk with my peers from other industries, it's amazing to me the work that's being done and how things like autonomous cars can be applied to autonomous grids.

Right. And I heard one of your previous sessions on virtual power plants and other innovative technologies. Once again, the goal is an autonomous grid, right? And so how do we train software stacks and put it out throughout the grid to be able to predict what's going to happen and have the grid respond based on things that it's seeing, measuring waveform data, no different than different sensors that sit in a car, and be able to react to it? Having seen it before because it was trained on data that you've either collected in the field or you've created synthetically and you've exposed it to.

Brad Langley: So, digging into some of your current work today, we've all seen the headlines that electricity demand is on the rise. We actually had Rob Gramlich from Grid Strategies on the show earlier this season to talk about this very topic. But I want to talk to you specifically about data centers. One of the biggest challenges data centers face is interconnections. Why has that become such an issue?

Marc Spieler: Data centers are growing in electric consumption considerably; new data centers are much more dense than previous ones. Accelerated computing requires a higher energy footprint per server, but it reduces considerable amounts of servers. And so from that perspective, I think as we start to add data centers and AI data centers around the country, a lot of these utilities were not prepared for this spike in AI that we're seeing today. And once again, I don't necessarily think that it's a bad thing. I think eventually, if we look back in history 10 years from now, we'll find that AI has solved a lot of problems that were energy intense, that we were able to reduce.

But when you train these models and as you get started in the AI journey, it is compute-intensive. But five or six short years ago, it wouldn't have been affordable because you would've needed to do this with a different type of technology that would've required lots more power, much bigger facilities, and significantly more time. So I think as the data centers grow, the utilities are going to have to get a better understanding and a better forecast from data center providers as to where they want to be. I think we need to work closer with permitting and figuring out how do we get permits to build new substations and transmission lines to get power to the right place. I also think we're going to find that a lot of data centers are going to start to look at potentially generating their own power as part of the data center.

Brad Langley: We're starting to see data center owners prioritize locations where they can secure their energy needs, like picking one utility's territory over another. Have you seen this? And if so, what does it mean for utilities?

Marc Spieler: I've not seen it directly, but I've heard utility executives talk a lot about it. And I think what it's going to mean is how nimble are they in their grid? How advanced is their grid, and how fast are they able to build new interconnections and add not only data centers but also the required generation to their grid in order to make it happen? I think it's going to be, I don't want to say it's going to be an issue. All utilities want load growth, so load growth is good for the utility, and at the same time, sustainable load growth. Somebody in the industry once told me, "A good customer is a predictable customer."

Because that way, you can plan for it. And so predictable load growth is a really good thing. I also think that data centers will serve as an asset to the grid if properly worked between the data center provider and the utility. And I know that organizations like EPRI are starting to look into what that looks like. But yeah, I think people who have spare capacity right now are going to win data centers. And what we might see is a lot of smaller data centers that pop up because somebody will have a couple of hundred megawatts of capacity, and you can put a data center here for that very fast versus large mega data centers that could be 800, 900 megawatts or even a gigawatt.

Brad Langley: Maybe dig into this piece a little bit. What do you think it looks like for data centers to be an asset to the grid and expand on that point, if you would?

Marc Spieler: Data centers are a tremendous resource, meaning that they have a lot of infrastructure to support 24 by seven operations at five nines uptime. And so not only do they have batteries and they have generators and all of these things, but those costs are covered by the people using those data centers. They're a resource that they buy. And I think at some point you're going to be able to incentivize data center providers to be a variable asset on the grid, where they take additional capacity from the grid and charge batteries when you have excess capacity, and where they discharge back to the grid.

No different than what homes can do with their electricity today. You'll see data center companies willing to do the same thing because A, it'll probably move them up in line for interconnections if they're willing to commit to making investments in battery storage or other generation that can be an asset to the grid so that on peak days, maybe they can run independently for a while and give that capacity back to the grid or even run down some of their batteries as long as they're able to know, collect enough data, and be able to predict with 100% certainty that they're not going to run out of power and cause an issue for their data center.

But I think those discussions are underway, and I think that the people working in the data center space and the utilities are smart enough to figure this out, and those organizations that are willing to step up and be a resource will get great deals on electricity and be a great resource for their utility partners.

Brad Langley: Maybe, taking a quick step back, why are we seeing such an explosion of data centers? And as part of that, talk about how computing power is evolving. Why is this influx in data centers so necessary?

Marc Spieler: Well, I think our friends at OpenAI created something or made something available in the last couple of years that has really changed the way in which people are doing computing. I think generative AI, it was demonstrated that it can solve incredibly challenging problems at a reasonable cost. It all comes down to return on investment. And I think for the first time, you've got technology and compute resources like GPUs that are price performant for this type of work, and the outcomes are written into these algorithms and other things are able to solve problems at a speed that makes it usable for people and provides significant benefit. And therefore, I think the whole industry, everybody's jumped into this, and therefore we're seeing this huge uptick right now because it's new. I do think it'll level out over time, but probably will get, replaced is not the right word, but there will be something else that comes after that. This is the industrial revolution; we've had a few of them, right? This one just happens to be digital.

Brad Langley: If we kind of think, if we move beyond just AI, though, I know AI is a key driver today, but maybe talk a little bit more about what's driving increased processing power in applications outside of just AI?

Marc Spieler: Accelerated computing has been around for a while, and I would say NVIDIA has led that charge since the early 2000s, right? We've done accelerated computing for graphics right before that, but then once we started to get into high-performance computing, we were really able to demonstrate that anything that can be parallelized can be accelerated for a fraction of the cost and a fraction of the power. And so, while I think AI is going to continue to be huge and replace some traditional workflows, I do believe that any engineer out there typically wants to rely on first principles-based physics to get the source of truth. And so, even those applications can be accelerated. They're highly parallelizable. And what we're seeing is that those products can become larger and larger. So even when you think about grid simulation, right? We're working with a few companies right now to accelerate their grid simulators because today you can simulate a portion of the grid with so many interconnections and this and that, and it runs on CPUs, and it can only go so big before you've outgrown a single computer.

Well, with accelerated computing, you can actually add way more aspects to that. And when you're dealing with a bidirectional grid, now all of a sudden you want to be able to see how many different options, and if you sub-optimize the next two steps on the grid, can you actually get way better performance over the course of the next day? And so there's so many things that you want to try in optimization that's not AI-based; it's just compute-intensive. And what I would tell you is typically those things wouldn't be affordable, but even more so by the time you simulate them, the day was over, and getting an answer for today, tomorrow doesn't help you. And so, I think AI has now opened a lot of people's eyes to: How do we do things differently? And it's a tough industry, and I admire those people, and there's a few of them out there who are just leading the innovation in this space with great tenacity in a culture that doesn't reward as much as other industries.

Brad Langley: Do you care to highlight some of those ones you think are acting with great tenacity and being very innovative? Who stands out to you?

Marc Spieler: I've seen a lot of great things coming out of the West Coast utilities. Portland General Electric is doing some innovative things. Southern California Edison, great, Pacific Gas & Electric. I've had great conversations with the Exelon teams. I think some of these utilities that really are trying to incorporate more renewables onto the grid in a very fast way have to be innovative. They're in states that are requiring faster adoption of EVs and renewables, and therefore they're responding accordingly and looking to partner to find solutions. I look in my backyard and I look up at the transformer behind my house, and it's a 25 kW transformer, and I know that the Tesla in my garage is 11 kW, right?

And I'm thinking, "Okay, there's four houses attached to this. What's going to happen when we have four Teslas plugged in?" It's going to start to raise issues unless we're able to send communication really fast, faster than electrons, into what's coming and how to do things differently. We're going to start to see failures. And who knew that vegetation management was one of the largest operating costs of any utility out there? And being in Houston, we just had the hurricane come through, and trees fell and power lines were down for a week plus. And we've got access to satellite data, lidar data, all kinds of stuff. And looking at that data and coming up with better plans and operational procedures, I think will save some of these companies tremendous amounts of money, and it'll save people a tremendous amount of heartache.

Brad Langley: How are utilities using software to solve problems that they previously wouldn't have relied on software to solve?

Marc Spieler: Many industries have always built solutions that fit a certain purpose. And I think if you go back in time, you can look at Sony, right? Sony created the cassette player, the Walkman. We probably both had one back in the day, and that was great. It was an analog tape player, worked great; you could take it with you; it played music. And then they came out with the Discman, which was digital, right? That was their first approach to digital, it was great. You could skip songs, better quality, all of those things, and it was digital. But then Steve Jobs turned the world upside down when he came out with the iPod, right? And this was the first software-defined piece of equipment that I would say at scale, where you could get software updates and you stayed connected to a partner. In this case, it was Apple.

But basically, as they made developments and improvements, you could improve the piece of hardware that you had. And you went from a hardware-centric solution to a software-centric solution. And we see this now across all industries, including automotives, probably the largest, right? All of these autonomous vehicles and EVs and all of those will have a software layer, and they will update when you park in the garage at home, and you'll get better fuel efficiency, you'll get better safety. As algorithms adjust, they can push out improvements, and you can choose whether to adopt them or not. So I anticipate that the utilities are going to start really thinking about what does the software-defined grid look like. We roll out infrastructure, and that's how utilities make money, right? Return on capital deployed, and they push out huge amounts of infrastructure, and that infrastructure is not going to reduce; it's actually going to increase as we electrify.

But the question is, how do you reduce the aspect of stranded assets? Where that hardware rolls out, but five years later, the world's changed. We have more EVs than we expected; we have more of this than we expected, and now no longer does that piece of hardware meet the needs of the environment it was put into. And that's where software-defined will change that. Can we do things with software to change the way in which that product works, allow them to do over-the-air updates, and all of a sudden accommodate it? I believe at some point, a lot of this infrastructure will be just like a peripheral on your computer; whether it be a mouse or a keyboard, or some other device you plug in, you'll update a piece of software and it'll just operate in a different way. And I think the utilities are trying to understand what that looks like for them.

I think the incumbent software providers and infrastructure providers are working fast to understand what that looks like. But I anticipate the requirement for software is going to grow exponentially, and anything that has sensors in the future, or doesn't have sensors yet, will be required to have some sort of software layer on it that's not firmware but truly software, so that you can continue to improve it after it's deployed and use it to feed digital twins. And it will have some layer of openness so that multiple different component manufacturers can all be integrated into an overall digital twin of the grid, with interoperability between vendors.

Brad Langley: We call this show With Great Power, which is a nod to the energy industry. It's also a famous Spider-Man quote: "With great power comes great responsibility." So what superpower do you bring to the energy transition?

Marc Spieler: I think I'm a big ideas kind of guy who has been lucky enough to surround myself with extremely smart engineers. That is, I think, the benefit that I bring to the industry in this role. There is no shortage of PhD smart people at NVIDIA, and I have, I think, the ability to not only see what could be done and tie it back to a use case in a different industry that's been solved, but then to gain the trust of very smart people within my organization to come together and collaborate to solve those problems.

Brad Langley: Awesome. Well, Marc, thank you so much for your time. I really enjoyed our conversation.

Marc Spieler: Thank you. I really appreciate it, Brad.

Brad Langley: Marc Spieler is the Senior Managing Director for the Global Energy Industry at NVIDIA. With Great Power is produced by GridX in partnership with Latitude Studios. Delivering on the clean energy future is complex; GridX exists to simplify the journey. GridX is the enterprise rate platform that modern utilities rely on to usher in our clean energy future. We design and implement emerging rate structures, and we increase consumer investment in clean energy, all while managing the complex billing needs of a distributed grid.

Our production team includes Erin Hardick and Mary Catherine O'Connor; Anne Bailey is our senior editor. Stephen Lacey is our executive editor. The original theme song is from Sean Marquand; Roy Campanella mixed the show. The GridX production team includes Jenny Barber, Samantha McCabe, and me, Brad Langley. If this show is providing value for you, and we really hope it is, we'd love it if you could help us spread the word. You can rate and review us at Apple and Spotify, or you can share a link with a friend, colleague, or the energy nerd in your life. As always, thanks for listening. I'm Brad Langley.
