Utilities are awash in data. How can they unlock it faster?
There are many forces that could hold back AI in the power system: computing infrastructure, power availability, regulation, and corporate inertia.
The biggest one? Access to good data. Utilities and grid operators are awash in data. But getting access to it, or making sense of it, is very difficult.
For a better understanding of how to change that, we turn to someone who spends a lot of his time in the so-called data cloud: Titiaan Palazzi, the head of power and utilities at Snowflake.
“Data naturally ends up in different boxes, in different silos,” he said. “And when you then want to ask questions of the data, it becomes really hard. You can’t ask questions across the enterprise.”
In 2018, Palazzi co-founded Myst AI with Pieter Verhoeven, an engineer who built critical demand response applications for the Nest Thermostat. Myst was focused on AI-driven time-series forecasting for the grid.
“In the energy industry, there is a lot of time-series data coming from the grid. At the same time, using AI for forecasting is quite challenging because every time you need to create a new prediction, you need to have the latest data. And so from an engineering perspective, it was quite complicated to do,” said Palazzi.
Palazzi and Verhoeven arrived at Snowflake after Myst was acquired by the company last year.
This week, we feature a conversation with Snowflake's Titiaan Palazzi on busting data silos, some early wins for AI in the power sector, and what phase of the transition we're in.
This episode is brought to you by The Big Switch. In a new 5-episode season, we’re digging into the ways batteries are made and asking: what gets mined, traded, and consumed on the road to decarbonization? Listen on Apple Podcasts, Spotify, or wherever you get your shows.
Stephen Lacey: Last week, we featured an interview with Brian Janous, a former Microsoft VP who had a front-row seat to the AI energy boom and all the grid constraints coming with it. It was one of a few conversations we recorded at DISTRIBUTECH on the artificial intelligence theme. Look, if you've been listening to this show for a while, it's no secret that we have AI on the brain at Latitude Media, because there's legitimately real commercial activity happening. In fact, just this week, we reported on NVIDIA and Utilidata's partnership with meter maker Aclara to roll out embedded AI to smart meters. But the pathways to getting AI embedded across the power system are not simple. A smart meter with a hundred times the processing power is a very cool technology, but utilities have to make the case to pay for it, and their track record for making good use of the previous generation of smart meters is spotty.
There are a lot of things that could hold back AI. The biggest is such a common problem that it's almost become cliché. It comes back to the way data is managed and shared across a utility, or the lack of sharing. If you talk to any vendor, this is one of the biggest sources of frustration. So, for a better understanding of how to change that, I turn to a guy who spends a lot of his time in the so-called data cloud: Titiaan Palazzi, the head of power and utilities at Snowflake.
Titiaan Palazzi: Any kind of data naturally ends up in different boxes and different silos, and when you then want to ask questions of the data, that becomes really hard. You can't ask questions across the enterprise. Snowflake is a data cloud platform and we sit on top of Amazon, Microsoft, and Google helping companies to bring all their data together to then create value out of it.
Stephen Lacey: Snowflake is one of the hottest tech companies that you may not have heard of. It has a $53 billion market cap with thousands of customers. Titiaan arrived there after his company, Myst AI, was acquired by Snowflake last year. In 2018, Titiaan co-founded Myst with Pieter Verhoeven, who built some critical demand response applications for the Nest Thermostat, and Myst was focused on time series forecasting for the grid.
Titiaan Palazzi: Over the last 30 years, linear regression or regression models were one of the main ways in which forecasting was done, so you would actually specify specific weights for every factor to determine the forecast.
Stephen Lacey: Let's say you want to anticipate the output of a wind farm. Under linear regression, you apply weights to things like wind speed, temperature, and blade performance, then do some statistical analysis to make a prediction. But time is also an important factor, and machine learning made it easier to integrate temporal factors into forecasts.
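To make that concrete, here is a minimal sketch of the classic regression approach, with synthetic data and made-up feature names for a hypothetical wind farm; none of this is from the episode:

```python
# Minimal sketch of the classic regression approach to wind-farm
# forecasting: hand-picked features, explicit learned weights.
# Feature names, ranges, and data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0, 25, n),    # wind speed (m/s)
    rng.uniform(-10, 35, n),  # air temperature (C)
])
# Synthetic target: output in MW for a notional 200 MW wind park
y = np.clip(8.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 5, n), 0, 200)

model = LinearRegression().fit(X, y)
# The "weights" described above are the fitted coefficients:
print(dict(zip(["wind_speed", "temperature"], model.coef_)))
print("forecast for 12 m/s, 5 C:", model.predict([[12.0, 5.0]]))
```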
Titiaan Palazzi: What you saw was that a lot of the progress made in natural language processing, the ability for an AI algorithm to understand words, carried over to time series data, because just as the order of words in a sentence matters, the order of timestamps matters when you're forecasting something like energy demand. So, these were models in which you would not specify any hard-coded weights, but you would basically tell the model, "Here are all the things that will have an influence on the thing we're trying to predict, such as the output of that 200 megawatt wind park. Come up with a prediction."
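As an illustration of the kind of sequence model Palazzi is describing, here is a minimal PyTorch sketch; the architecture, window size, and the synthetic "demand" series are all illustrative assumptions, not his company's actual models:

```python
# Minimal sketch of a sequence model for time-series forecasting,
# in the spirit of the NLP-to-time-series shift described above.
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):             # x: (batch, timesteps, features)
        out, _ = self.lstm(x)         # timestep order matters here,
        return self.head(out[:, -1])  # like word order in a sentence

# Toy training loop on a synthetic sine-wave "demand" series
t = torch.arange(0, 200, 0.1)
series = torch.sin(t)
window = 48  # look back 48 steps to predict the next value
X = torch.stack([series[i:i + window]
                 for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
```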
Stephen Lacey: What they found was that these AI-driven time series models could improve accuracy by 30 to 50%, and that caught the attention of Snowflake where Titiaan and his co-founder are now working on product development and go-to-market strategies in energy.
Titiaan Palazzi: I think that's a big shift that has taken place: in production, by companies that are serving real customers, AI is used much more commonly. It's a shift from hard-coded, pre-specified, fully visible models to machine learning and AI.
Stephen Lacey: But these models do have an Achilles heel. You need access to lots of clean data, and we still have a long way to go to unlock their full potential.
Titiaan Palazzi: In the energy industry, there is a lot of time series data coming from the grid or power generation or electric cars or even from electricity markets. At the same time, using AI for forecasting is quite challenging because every time you need to create a new prediction, you need to have the latest data. So, from an engineering perspective, it was quite complicated to do.
Stephen Lacey: This is The Carbon Copy. I'm Stephen Lacey. This week, a conversation with Snowflake's Titiaan Palazzi on busting data silos, some early wins for AI in the power sector, and what phase of the transition we're in.
Where are we today in the advancement and adoption of AI? What's your read on the technological moment we're in right now?
Titiaan Palazzi: I would say that there are a few real changes happening. One of them is that we're seeing a new type of model, generative AI, perform extremely well, and what that means is that some of the previous ways in which AI/ML were done might disappear or become less relevant. So, for example, it's quite possible that in the not-too-distant future you might actually do forecasting by asking an LLM, "Can you predict the next few weeks of data?" whereas to date, you would use very different algorithms for that.
I think the other big thing that's happening is that previously, whenever you wanted to use an AI model, you would have to write code, or maybe you could use a user interface. Now we're able to use AI with just natural language. And because of that broadening of the aperture, that ability for people to ask questions in plain English or any other language of their choice, you're seeing an opening up of the user base, and that actually leads to the models becoming better. So, it's a cyclical effect, and I think it's quite powerful and will accelerate development.
Stephen Lacey: Now, you're at Snowflake after Myst AI was acquired last year. What is Snowflake's approach to building out the data cloud and how does energy actually fit into that?
Titiaan Palazzi: Yes. In companies of all types, data gets siloed as the company grows. It's a bit like entropy. It happens naturally unless you try to prevent it. And Snowflake's aim and vision is to mobilize that data, so that business leaders can ask questions that span across the business. For example, which parts of our grid should we do maintenance first to minimize the chance that there is an outage? Or how much capital should we allocate to make sure that we meet the risk requirements of a market like ERCOT?
The Snowflake data cloud sits on top of the three cloud hyperscalers, so we don't own our own data centers. As companies use this data cloud and store all their data in the cloud, that unlocks workloads like AI and machine learning, applications that run on their data, and visualizations. What we really unlock is the ability to combine IT data, so data around unit sales or supply chains, with OT data, which is data from devices, like wind turbines or the grid. Combining those two things unlocks a lot of valuable insights for executives to take action on.
Stephen Lacey: So, the utility industry is one of the greatest data generators of any industry. If I'm sitting inside a utility, let's say, I'm an engineer or a program manager and I need access to a certain data stream to make sense of it, what are the current limitations today? What do I see? What do I have access to? What are the limits? And then, what does a data cloud functionally deliver that's different?
Titiaan Palazzi: Yes, I totally agree. There is so much data in the power and utility sector. To give two examples: in electricity markets, you get a timestamped value every five minutes, and when you look at things like the grid or power generation, you might have data for every second or millisecond that something is in operation. So, completely agreed, huge volumes of data. That's also, I think, where the challenge sometimes lies.
Let's say that you are a retail energy provider and you want to forecast electricity demand for your customers for the next few weeks, to make sure that you buy adequate supply. What I've seen is that it can often take three, six, or nine months to get access to all the right data, such as smart meter data for your customers, and then only a couple of weeks or months to actually build and deploy a predictive model. So, what you see is that utilities and other companies in the power sector often need to spend much more time on data collection and cleaning than on the actual AI/ML work. I think one of the real challenges is to bring all the data into one place and establish a strong foundation on top of which you can build predictive models and other AI.
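Once the data is finally in one place, the modeling step itself can be fairly compact. Here is a minimal sketch of the kind of demand forecast he describes; the columns, synthetic data, and model choice are hypothetical, and in practice the joined frame would come from a warehouse query rather than be built locally:

```python
# Sketch of the modeling step once smart-meter and weather data sit
# side by side. All names and data here are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

idx = pd.date_range("2024-01-01", periods=24 * 90, freq="h")
meter = pd.DataFrame({"load_mw": 100 + 20 * (idx.hour >= 17)}, index=idx)
weather = pd.DataFrame({"temp_c": 10 + 10 * (idx.month % 12 / 6)}, index=idx)

df = meter.join(weather)                       # the "one place" step
df["hour"] = df.index.hour
df["dow"] = df.index.dayofweek
df["load_lag_24h"] = df["load_mw"].shift(24)   # yesterday, same hour
df = df.dropna()

X, y = df[["temp_c", "hour", "dow", "load_lag_24h"]], df["load_mw"]
model = GradientBoostingRegressor().fit(X[:-24 * 7], y[:-24 * 7])
print("one-week backtest MAE:",
      (model.predict(X[-24 * 7:]) - y[-24 * 7:]).abs().mean())
```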
Stephen Lacey: Why is this a problem for AI, this lack of clean data or lack of access to data? I mean, it's very obvious on its face that in order to build powerful models, you need a lot of information, but what are some examples of how this holds AI applications back?
Titiaan Palazzi: Well, I think the real problem is that as part of the energy transition, there are so many challenges we need to solve really fast. For example, I work with a number of distribution system operators and transmission system operators that need to deal with enormous amounts of new interconnection requests for solar, wind, and battery storage. The reality is that many of them don't really have visibility into their network in a way that allows them to dynamically manage new resources being added while ensuring that the grid operates reliably. As a result, what they have to do right now is essentially limit, with all kinds of extra safety margins, how much can be added. So, in some ways, the energy transition is being held back by energy companies' lack of access to that data.
Many of the grid operators I work with around the world are trying to add more solar, wind, battery storage, and other renewables to their grids, as well as more flexible demand, and one of the key challenges they face is that they don't have good visibility into what's actually happening on the grid. These companies are typically engaged in enormous programs with thousands of people just to make sure that there are sensors on the grid and that the data from those sensors is captured somewhere in the cloud, so that they can run analytics on top of it. That's just one example of where access to data is holding us back.
Stephen Lacey: Well, let's go to a couple of examples. You've worked with a range of utilities, retail energy providers, and, as you said, grid operators on using AI for asset management and forecasting. What are the most compelling applications you're seeing today?
Titiaan Palazzi: When I think of all these use cases, I typically think of three areas. The first is assets and operations, so everything to do with the physical infrastructure: grid, power generation, et cetera. The next is finance and markets, so everything to do with power markets. And the third is everything around the customer, customer 360: making sure that every individual who is part of this energy transition is treated in the right way.
First, in assets and operations, two examples come to mind. One is we work with a retail energy provider in the Midwest that serves more than a million customers with electricity, gas, and distributed energy resources. So, they have an offering, for example, for rooftop solar and I believe about 20,000 of their customers have rooftop solar through the energy retailer. Now, sometimes issues crop up with rooftop solar. There might be soiling on the panel, so the panel might be dirty, or maybe the wires weren't connected properly and the panel isn't actually producing, or maybe a tree grew and now there's more shading.
Historically, this company basically had to wait until a customer called and said, "I'm looking at my bill and it doesn't look like the solar is working as it used to. Can you come and investigate?" So, based on the smart meter data coming from the solar panels, they built a predictive model that basically says, "Here's how much solar we would expect based on the location of the system, the tilt of the roof, these kinds of things. Here's what we're actually seeing." If there is a big discrepancy, the customer service team gets an alert, and they will either call the customer or send a crew to go and check. Now, often before the customer has even noticed that something is going on with their bill, the retail energy provider has already contacted them.
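The logic behind that alert is simple once an expected-output model exists. Here is a minimal sketch of the expected-vs-actual check; the function name and threshold are illustrative, not the provider's actual system:

```python
# Sketch of the expected-vs-actual check described above.
# Threshold and naming are illustrative assumptions.
def should_alert(expected_kwh: float, actual_kwh: float,
                 threshold: float = 0.25) -> bool:
    """Return True if actual output trails expectation enough to flag."""
    if expected_kwh <= 0:
        return False  # no expected production (e.g., nighttime), nothing to flag
    shortfall = (expected_kwh - actual_kwh) / expected_kwh
    return shortfall > threshold

# e.g., the model expects 30 kWh today from location/tilt/irradiance,
# but the meter only saw 18 kWh: a 40% shortfall, so alert customer service.
assert should_alert(expected_kwh=30.0, actual_kwh=18.0)
```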
Another example in the same group: we work with one of the biggest renewable asset owners in the world, Lightsource, which is part of bp. They operate more than five gigawatts of solar globally, and a big problem for utility-scale solar is hailstorms. The bulk of insurance claims for solar asset owners is actually hail damage to panels, just breaking the glass. So, they developed a predictive system that incorporates hail warnings from a variety of weather sources, and when hail is expected near a utility-scale solar system, they turn the panels, many of which are mounted on single-axis trackers, so that they are positioned vertically and the hail can't break them. That way they avoid huge outages, as well as the financial cost of replacing the panels.
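As a sketch of what that stow decision might look like in code: the warning schema, probability threshold, and lead time below are all hypothetical, not Lightsource's actual system:

```python
# Sketch of the hail-stow decision described above: if any forecast
# source predicts likely hail near a site soon enough to matter,
# command a vertical stow. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class HailWarning:
    site_id: str
    probability: float  # 0..1, from a weather data provider
    minutes_out: int    # lead time until expected hail

def should_stow(warnings: list[HailWarning], site_id: str,
                prob_threshold: float = 0.4,
                max_lead_min: int = 120) -> bool:
    return any(w.site_id == site_id
               and w.probability >= prob_threshold
               and w.minutes_out <= max_lead_min
               for w in warnings)

warnings = [HailWarning("tx-solar-01", probability=0.7, minutes_out=45)]
if should_stow(warnings, "tx-solar-01"):
    print("stow trackers vertically")  # a real system would call the tracker/SCADA API
```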
Another example is more on the wholesale market side. A utility we work with closely, with about a million electric and gas customers, collects the smart meter (AMI) data for all their customers in Snowflake. That allows them to create electricity demand forecasts for the next few weeks based on all that data. What is amazing is that this data science team, which typically has very little interaction with the business, suddenly got a call from the chief financial officer, because by improving the predictions they saved more than $5 million in a single month by avoiding exposure to a big real-time price spike.
Stephen Lacey: It's a good list, because it illustrates just how wide the applications are. If you think about where adoption is in the power sector broadly, with one end of the spectrum being the enhancement phase and the other end full automation, human out of the loop, where are we on that spectrum, and what do you think the eventual phase looks like?
Titiaan Palazzi: Well, the robots are not yet taking over. Utilities are not known to be the fastest adopters, so although there are studies saying that a very high percentage of utilities already use AI or see its importance, I think the number of utilities that have AI use cases fully productionized and live is relatively small. I think that is a reflection of a few things. First, as we said earlier, it's sometimes hard to get all the data. Second, utilities don't always have the most sophisticated technical teams. A third thing might be culture.
What I see in other industries is that it's really powerful for an organization to have the culture and the technical systems that allow staff to experiment and make small prototypes. It's now so easy to put things together with LLMs and existing tools that teams that can explore, try new things, and see what sticks have a real advantage. I think there's a long way to go, and what we will see is AI driving efficiency throughout the process. It will be used in all kinds of business divisions, both in standalone applications sold by software companies and in homegrown solutions built on companies' own cloud platforms, and I'm excited to see what's coming.
Stephen Lacey: Let's go deeper on what needs to happen inside a utility to grapple with that transition and actually make appropriate investments. Many utilities are partnering with outside teams. Some are hiring their own in-house data science teams and AI experts, some of whom don't have experience in the power sector and are just now figuring out how to apply AI to it. So, you have a number of different approaches inside utilities. What are the most common approaches now, and are there any you think are particularly effective when it comes to team building, and then eventually testing out products, collaborating, and building a sandbox approach so teams can figure out how to actually scale an application?
Titiaan Palazzi: Yes. I think there are a few key components. One is, as you mentioned, unlocking a data foundation in which all the data is there, so that you can actually run models. The second is around experimentation. Something specific I've seen work very well is to embed teams of data scientists and software engineers with the lines of business, to make sure that a data scientist is actually sitting with a trader for a week, or with the grid management team, or with a line crew, maybe even going out with them, to get a clearer sense of the issues they grapple with. Part two of that is setting up those technical teams to experiment quickly. For example, creating sandbox environments that don't have the same requirements as the fully scaled production systems serving the utility and its customers, in which teams can deploy new applications within a matter of days, initially used by just a small subset of users, to see what works and what doesn't.
The final thing I'll say on accelerating AI adoption is that data sharing in the utility industry is not yet delivering on its full promise. One of the amazing things about some parts of the utility business is that they're actually not competitive. As an example, in California, the smart meter data held by the three big investor-owned utilities should actually be shared with a variety of different players. Distributed energy companies like Sunrun or SunPower, for example, can really benefit from having smart meter data for their customers, so that they can make them better offerings. And there are technical ways, including through Snowflake, to make it very easy to share that data from one organization to another while maintaining all the right levels of security, so that you don't need to duplicate data and send it around, which today still sometimes happens through email or FTP. That way you can actually solve use cases that sit not within the bounds of a single company but within the bounds of the industry.
Stephen Lacey: We're here at DISTRIBUTECH, where there are 17,000 people and hundreds of companies, and if you walk around the floor, you suddenly see AI slapped on everything. I'm sure for some companies it's more of a cynical play, but in reality, a lot of these companies are starting to build out AI offerings as an extension of what they were already doing. When you look at the tech platforms of some of these third-party service providers, how much are you seeing AI getting integrated into their products and teams?
Titiaan Palazzi: I think broadly, AI is being infused into all kinds of solutions for all business units, and I think that will only accelerate. You see it show up in surprising ways. For example, utilities that have handwritten maintenance reports for things like substations or lines can now digitize all that data using the power of LLMs. And now they have a queryable database that they might ask: what are the most common fault mechanisms we see in our substations, and which substations should we do preventative maintenance on next? So, even areas where it historically seemed like AI would never get there are now increasingly accessible to it.
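A minimal sketch of that pattern, extraction plus a queryable store, might look like the following; the `transcribe_report` helper is a placeholder for whatever OCR/LLM service is used, and the schema is hypothetical:

```python
# Sketch of turning scanned maintenance reports into queryable rows.
# `transcribe_report` is a placeholder, not a real API; wire it to your
# OCR/LLM provider. Schema and prompt are illustrative assumptions.
import sqlite3

EXTRACTION_PROMPT = (
    "Extract substation_id, inspection_date, and fault_type from this "
    "maintenance report. Respond with a JSON object using those keys."
)

def transcribe_report(image_bytes: bytes) -> dict:
    # Placeholder: send the scan plus EXTRACTION_PROMPT to your
    # OCR/LLM provider and parse the JSON it returns.
    raise NotImplementedError

db = sqlite3.connect("maintenance.db")
db.execute("""CREATE TABLE IF NOT EXISTS reports
              (substation_id TEXT, inspection_date TEXT, fault_type TEXT)""")

# Once loaded, "questions" become plain SQL, e.g. the one from the episode:
most_common_faults = db.execute(
    "SELECT fault_type, COUNT(*) FROM reports "
    "GROUP BY fault_type ORDER BY COUNT(*) DESC"
).fetchall()
```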
Stephen Lacey: If you think about, let's say, the next decade of advancement in AI, which is a very long time... I mean, we see such radical improvements in the technology in a 12-month timeframe, so a decade is quite long, but if you think about how it'll play out in the power sector where you see more conservative adoption, what will hold it back and what will accelerate it?
Titiaan Palazzi: Yeah. Well, one thing that could hold it back is regulatory changes, if those occur at the national or supranational level, on what we can do with AI. One thing specific to energy: over the last 15 to 20 years, while the amount of compute and data storage has increased enormously, in some cases more than 10x depending on the geography and timeframe, data center energy consumption has barely budged, mostly thanks to the increasing energy efficiency of data centers. And it looks like that trend may end. In the coming years, data center energy efficiency might not improve much, while our use of data centers is rapidly growing, in part because of the generative AI hype. So, one thing that could hold AI back is whether we can build enough data centers with access to, ideally, zero-carbon power.
In terms of acceleration, I think a lot of that is already happening, and I am frequently amazed by how quickly new types of models are coming out and what they can do, whether it's most recently OpenAI's Sora model, which can generate one-minute videos from just a few lines of text, or all the things that we will see next.
Stephen Lacey: So, if we look at what could hold back AI, it's chip availability and power infrastructure, and there are a lot of questions about what the power demands of data centers look like as AI use expands. So, that has a lot of people hand-wringing about the energy intensity of the data center industry and whether it will cause us to run in place and whether we'll need a lot more clean resources just to make up for AI computational infrastructure, but then there are all these other great benefits that AI can unlock for the grid and even unlock for the benefit of data centers. What do you think about the net impact of AI in the energy space, both as potentially an exponential energy consumer and as an unlock for clean resources?
Titiaan Palazzi: I think we might see an increase in energy consumption from AI. I've seen some utilities, in their integrated resource plans, mention growth of 2x, 5x, even 10x in the next decade, driven primarily by more data center build-out. First, I'm not sure that will happen. History tells us that we're not always very good at long-term forecasting, so maybe energy consumption won't increase that much. But the other thing is that AI has so many applications for driving energy efficiency and reducing emissions, whether it's in how we manage global supply chains, how we route planes and ships, or in all the opportunities within the energy system to run things more efficiently. So, I generally think we should proceed with the technology innovations we're seeing; in fact, we may not really have full control over stopping them anyway. And I believe the advances we'll get in emission and energy consumption reductions will far outweigh the increased energy consumption of data centers running AI.
Stephen Lacey: Titiaan Palazzi, thank you so much.
Titiaan Palazzi: Thanks, Stephen, a pleasure to be here.
Stephen Lacey: That is it for the show. The Carbon Copy is a production of Latitude Media. It's produced and written by me, and Sean Marquand is our technical director; he also mixes the show and wrote our theme song. Go to latitudemedia.com to get all of our stories, and if you want your stories on the go, subscribe to our companion podcast, The Latitude. You can find it anywhere you get your shows. We also have transcripts of this podcast and Catalyst at Latitude Media. Subscribe to our newsletter and you'll get it all in your inbox. Thanks to Prelude Ventures for being a supporter of Latitude Media. Prelude backs visionaries accelerating climate innovation that will reshape the global economy for the betterment of people and the planet; you can learn more about their portfolio at preludeventures.com. If you have feedback on the show, want to send us ideas, or just have thoughts to share, send us a note at editors@latitudemedia.com, and we will catch you next week. I'm Stephen Lacey. This is The Carbon Copy.