Photo credit: Nasir Kachroo / NurPhoto via Getty Images // AI-generated image credit: Gold Flamingo
Nvidia is the darling of the artificial intelligence boom.
Its supercomputers fuel OpenAI’s ChatGPT — the chatbot that marked AI’s mainstream debut — as well as the operations of an ever-growing number of other companies. When CEO Jensen Huang went all-in on AI a full decade ago, the chipmaker effectively cornered the market before most people even realized there was a market at all.
In an astonishingly short period, Nvidia has evolved from niche chipmaker to household name; its quarterly earnings routinely outperform expectations. Last spring, the company’s market value leapt past $1 trillion. But away from the headlines, the company is also undergoing a quieter, slower evolution: into a potentially formidable player in the energy sector.
While the company’s presence at the grid edge — via, for instance, the integration of its platform into a smart chip — may seem sudden, Nvidia itself sees the move less as a foray into a new industry, and more as a logical extension of the work it has always done.
But the company still needs to convince the conservative utility industry. Accordingly, Nvidia is approaching the space in a calculated way, knowing that utilities tend to move slowly and carefully in their adoption of new digital technologies.
Nvidia entered the space by building relationships with organizations, including the National Renewable Energy Laboratory and the Electric Power Research Institute, said Marc Spieler, senior managing director for Nvidia’s global energy industry practice.
The company’s approach is to convince influential players of the value of its technology and branch out from there. These relationships, Spieler said, allow the company to “overcome hurdles” it would face in approaching individual utilities directly, such as likely skepticism about an unendorsed technological advance.
“It’s a conscious strategy,” said Spieler. “What's incredibly important to us as we go to market by industry is identifying who are the industry leaders in these key areas, who are trusted advisors.”
At the same time, Nvidia believes uncertainty about the future grid is key to its appeal. In an industry defined by hard infrastructure, Nvidia’s technology is software-defined — meaning it can take on new and different applications simply through software updates. That nimbleness, at least in Spieler’s view, leaves the tech prepared to weather whatever twists the energy transition brings.
A couple of years ago, the company started focusing on the utility industry: both its problems and AI’s promise to help solve them.
The move began, perhaps counterintuitively, as a response to oil and gas company clients diversifying their portfolios. Nvidia has long served the fossil fuel sector (though recently the public-facing narrative of this work centers on emissions reduction), but it wasn’t until a few years ago that those clients started dipping into renewables, storage, and electric vehicles.
The Nvidia team, Spieler said, realized the grid would soon become “exponentially more complex.”
At that point, Nvidia got in touch with NREL, a long-time collaborator. Those conversations, Spieler said, represented the inflection point when Nvidia saw the issues — and therefore the potential — of the evolving grid. NREL then put Nvidia in touch with Utilidata. The two companies have since announced a partnership to develop a grid-edge module that combines Nvidia’s chip and Jetson platform with Utilidata’s distributed AI software.
Grid-edge applications haven’t required that Nvidia change anything about the design of the chips that have been such a staple of the computing world: “It’s a common platform,” said Spieler. “We’ve got lots of different types of silicon to meet the needs.”
However, Nvidia does customize chipsets for specific use cases. In order to be useful in the utility space, Spieler said, chips have to be connectable to a metrology board so that Nvidia “can do real-time ingestion of electric waveform data into the chip to be processed at full resolution in milliseconds.”
“We want to be able to detect if a line drops and cut off electricity before it hits the ground,” said Spieler. If the data is processed in the cloud, he added, “it may not be fast enough.”
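To make that latency argument concrete, here is a minimal, purely illustrative sketch of the kind of per-cycle check an edge device might run on raw voltage samples. The sampling rate, trip threshold, and simulated fault below are assumptions for illustration, not details of the module Nvidia and Utilidata are building.

```python
# Illustrative sketch of edge-side fault detection on raw voltage waveform samples.
# All parameters (sampling rate, threshold, fault behavior) are assumptions for
# illustration -- they are not details of Nvidia's or Utilidata's actual module.
import numpy as np

SAMPLE_RATE_HZ = 7_680                  # 128 samples per 60 Hz cycle (assumed)
CYCLE_SAMPLES = SAMPLE_RATE_HZ // 60
NOMINAL_RMS_V = 120.0
TRIP_THRESHOLD_V = 0.5 * NOMINAL_RMS_V  # trip if RMS collapses below 50% of nominal

def synthetic_waveform(seconds=0.5, fault_at=0.3):
    """Generate a 60 Hz voltage waveform that loses amplitude at `fault_at` seconds,
    standing in for the samples a metrology board would stream to the chip."""
    t = np.arange(0, seconds, 1.0 / SAMPLE_RATE_HZ)
    v = NOMINAL_RMS_V * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)
    v[t >= fault_at] *= 0.05            # simulated conductor drop: voltage collapses
    return t, v

def detect_fault(t, v):
    """Slide over the waveform one cycle at a time and return the start time of the
    first cycle whose RMS falls below the trip threshold, or None if none does."""
    for start in range(0, len(v) - CYCLE_SAMPLES + 1, CYCLE_SAMPLES):
        window = v[start:start + CYCLE_SAMPLES]
        rms = np.sqrt(np.mean(window ** 2))
        if rms < TRIP_THRESHOLD_V:
            return t[start]
    return None

if __name__ == "__main__":
    t, v = synthetic_waveform()
    fault_time = detect_fault(t, v)
    if fault_time is not None:
        # One 60 Hz cycle is ~16.7 ms, so a local decision can land within a few
        # cycles of the fault; a round trip to the cloud would add tens to hundreds
        # of milliseconds on top of that, which is Spieler's point about speed.
        print(f"Fault detected at ~{fault_time * 1000:.1f} ms; open the circuit locally.")
    else:
        print("No fault detected.")
```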
Several thousand of the units developed with Utilidata (specifically “meter collar” add-ons to existing meters) are set to be deployed in the field in 2024, Spieler said, probably in the first half of the year. But at the beginning of 2025, facilitated by three infrastructure law grants, the company will be deploying 150,000 additional units. Nvidia is accelerating its customization of that latter batch of modules, so that the company can embed them into the meters themselves.
“We hope to have additional grants as we move forward, because with each additional grant comes more use cases that can be shared across utilities,” said Spieler.
A year from now, he anticipates having multiple OEMs — be they meter vendors or companies placing devices on transformers or transmission lines — all incorporating chips at the grid edge.
“Then from there, it will scale very quickly,” Spieler said.
And Nvidia knows how to scale; the company has essentially cornered the market for AI computing, offering everything from chips to key software. According to research firm Omdia, the company accounted for 78% of AI chip sales in 2023, up from 60% in 2022.
Nvidia doesn’t only work with third-party companies like Utilidata, though. Much of its work involves collaborating closely with utilities themselves — and importantly, with the research organizations that utilities rely upon.
Nvidia and Utilidata’s grid-edge advisory board is a who’s-who of AI-curious utilities and energy companies, including founding members American Electric Power, Duquesne Light Company, GM, Holy Cross Energy, PPL, Silicon Valley Clean Energy, and Sunrun.
The three infrastructure law grants facilitating the rapid scale-up of smart meters that use Nvidia’s platform were all awarded in the last few months: to Duquesne Light, Portland General Electric, and Commonwealth Edison. (Nvidia hasn’t touted its involvement in the projects, preferring to take a back seat and let the utilities themselves lead with DOE and the public.)
Spieler said the advisory board has already been integral to helping Nvidia communicate “what they’re trying to do,” including in early grant conversations with DOE. The government, he said, seems to recognize that “there needs to be a step-change” in the development of software technology for the grid.
Beyond the meter, Nvidia’s technology is also increasingly being used elsewhere on the grid edge, including for visual inspections of infrastructure (via video cameras on drones or trucks), vegetation management, and grid simulations.
“As a platform company, we don’t develop a lot of end solutions. We provide the infrastructure, the platform, to develop on,” Spieler said. “So as we work directly with utilities, sometimes they have people that can actually develop software and AI platforms; others do not, and they require a third party to come in and help them.”
In the latter cases, Nvidia works with consulting companies like Deloitte and Accenture “to help develop and solve these problems where the utilities may not have the internal expertise,” he added.
Daisy Dunlap, an analyst with Latitude Intelligence, said that many legacy utilities are caught between enthusiasm about how AI can facilitate the grid’s transformation, and fear about how integrating the new tech could lead to problems with data access or cybersecurity.
And an upcoming report from the Latitude Intelligence team found that roughly 70% of utilities don’t have a centralized strategy for evaluating where and when to use AI. The timing of Nvidia’s move into the space, though, makes the company well-positioned to provide help, according to Matt Casey, managing director of Latitude Intelligence.
Casey added that his team’s research showed that utilities can be hesitant about working with AI companies “that aren't specifically utility-focused or don't have a track record in the space given the complexities of utility business models and, more broadly, grid operations.”
According to Casey, if Nvidia wants to encourage utilities to overcome their reservations, the company’s task is to “demonstrate they understand utilities' operational challenges.”
Asked about the tenor of his conversations with utilities about this new technology, Spieler said that “change is hard,” especially “for an industry that is not rewarded when things are running great and punished when things don’t run.”
The biggest barrier to the integration of AI and smart chips like Nvidia’s on the grid, Spieler said, is industry mindset.
“Deployment, that’s not hard,” he said. “It's an industry that's been pretty consistent for 100 years, right? And now we're going to give them insights that they've never seen before. And as they evolve their business models, that's really what's going to take the most amount of time.”
Still, the company’s work with utilities remains a tiny part of what Nvidia does, even in the energy realm. Especially as AI continues to advance — and continues to gobble up more computing power and energy with each improvement — Spieler said one of the company’s central goals is to constantly improve the technology’s efficiency and lower its energy demands.
This is the central push-pull of the company’s position: as the energy demands, and attendant emissions, of AI grow, so too do the opportunities for the kind of efficiency gains that Nvidia in particular has the chops to deliver.
Dion Harris is the director of accelerated solutions for Nvidia’s data center portfolio, which means he works with the company’s high-performance computing end-user segments, spanning everything from supercomputing to energy. His main purview is accelerating the overall throughput and efficiency of those systems. And while his work began as a push for continuous performance improvements, energy efficiency gains have become an important knock-on effect. Prior to Nvidia, Harris spent five years at Pacific Gas & Electric, working primarily on process improvement.
“Now we see a lot more customers — whether data center or application users — that are actually adopting accelerated computing, not just for the performance benefits, but for the energy efficiency benefits as well,” Harris said.
Asked whether the company’s work with the energy sector — and work prioritizing energy efficiency writ large — reflects a larger philosophical shift in priorities for the company, Harris said the thesis of “eliminate waste everywhere” is one that is “embedded” in the DNA of Nvidia. And it manifests not only in the prioritization of energy efficiency for data centers, but also in areas like digital twins, including for the grid.
Indeed, Harris said grid modeling and management is a “key area where we think there’s lots of opportunity to really drive efficiency, and [leverage] tools like AI.”
The company is seeing particular enthusiasm from its energy sector clients for a suite of technologies that combines digital twins and AI. For instance, Nvidia has a partnership with Siemens Gamesa to model an offshore wind farm.
“They were able to leverage our technology — both AI and the digital twins — to optimize the production and the output of that wind farm,” said Harris, adding that doing so has cost optimization benefits as well. Nvidia also has a partnership with General Atomics to create a digital twin of a fusion reactor.
“We do believe AI is actually an enabler to go and reclaim a lot of power usage and power consumption,” Harris said.
This mentality also characterizes Nvidia’s continued work with the fossil fuel sector, said Spieler, who spent over 13 years at Halliburton before joining Nvidia in 2019. On the section of the Nvidia website dedicated to energy, the company’s work with fossil fuel companies still dominates. And while Spieler said the company is “aggressively” looking for partners focused on cutting emissions, reducing emissions from traditional fossil fuels is a critical piece of that effort.
“Their goal is to reduce cost. When they reduce cost, it means they drill less holes,” said Spieler. He added that “leveraging accelerated computing and technology to reduce the amount of wells you have to drill” is a way to “keep energy costs low while reducing the environmental impacts.”
However, nothing about the adoption of AI by the energy sector is certain. And therein lies another element of Nvidia’s pitch.
“The reason Nvidia is the company we are, is because we’re a software-defined company,” said Spieler. “We constantly are updating software stacks that can be upgraded on the GPU, and therefore when we find better algorithms or we solve cybersecurity problems, we can do an update to the software and improve the performance.”
Energy infrastructure, though, is both expensive and deployed for 10 to 20 years. And utilities have historically tended toward single-use firmware, said Spieler. This software is “not easy to update — and typically doesn’t get updated.”
But that may need to change, especially given the rapid increases in electricity demand expected in the coming years, and the limited resource capacity on the grid in the United States in particular.
“Being able to predict what the future is going to hold is near-impossible,” Spieler added, which is why software-defined infrastructure can allow utilities a certain flexibility, and even be “how they’re going to stay in front of this uncertainty.”
“Today, load forecasting may be important to certain companies,” he added. “But tomorrow, it could be energy equity.” Nvidia’s several utility partners today all have different priorities. But once the company develops software to meet the needs of one, it will be available to all.
This is because Nvidia operates an open platform, which Spieler said appeals to the industry and regulators alike. He likens the setup to Android’s app store: unlike on a closed, proprietary system, individual companies can develop new applications on the platform, which can then be deployed broadly. At the time of publication, roughly 17,000 start-ups are building on Nvidia’s platform; so far, only about 400 of those are energy companies.
“These guys aren’t very apt to try new things, because they built a system that tends to work well almost all of the time,” said Spieler. “However, they've not seen the complexity that they now are anticipating. And I think that's opening up a lot of minds.”
“The uncertainty I think has reached a level that they’ve never seen before.”
Editor's note: This story was updated on February 21 to 1) correct the fact that it is Commonwealth Edison, not Con Edison, that received DOE funding, and 2) to reflect Utilidata's updated description of its platform.