As hyperscalers scramble to build new data centers, experts agree that energy is a major bottleneck — but the technologies exist to serve new loads cleanly.
Data centers are an impressive energy success story. Over the last 25 years, internet traffic has climbed more than 500 times while data center electricity use has remained flat.
The servers and energy infrastructure have gotten wildly more efficient, and the biggest tech companies have focused on powering those warehouse-scale computers with renewables.
But a lot of people are suddenly alarmed about data centers again, as energy demand for AI surges.
Data centers are getting built so fast that many utilities are pushing for lots of new fossil gas plants to serve them. And while tech companies have made strong progress building renewables to match data center load, grid constraints are making that harder. The window to fully decarbonize the grid is already narrow; this surge may make it harder to squeeze through.
So, are growing concerns over AI’s power demand justified? How much are data centers contributing to America’s growing hunger for electricity? And what technologies and grid management techniques can address it?
This week, we’ve assembled a group of experts to answer those questions: Brian Janous, co-founder of Cloverleaf Infrastructure; Michelle Solomon, senior policy analyst at Energy Innovation; and John Belizaire, CEO of data center developer Soluna.
This conversation was part of Latitude Media’s Transition-AI series. Watch the full event here.
Stephen Lacey: Brian Janous is co-founder of Cloverleaf Infrastructure. He's the former VP of energy at Microsoft, where he had a front-row seat to the rise of ChatGPT and how it complicated clean energy procurement. He's now focused on unlocking new grid capacity for utilities, hopefully without relying on fossil fuels. Brian, good to see you. Thanks for being here.
Brian Janous: Thanks Stephen. Happy to be on.
Stephen Lacey: Michelle Solomon is a senior policy analyst at Energy Innovation. Michelle co-authored multiple new reports on load growth and the many ways utilities can meet rising demand without building gas plants. And we'll talk more about that research. Hi, Michelle.
Michelle Solomon: Hi. Thanks so much for having me.
Stephen Lacey: Great to see you. And John Belizaire is the CEO of Soluna. John is focused on building data centers for batch processing that utilize excess renewables, focusing a lot on crypto mining and AI. And John, we're delighted to see you. Thanks for being here.
John Belizaire: Fantastic to be here. Thanks for having me.
Stephen Lacey: Okay, let's start with the load growth picture generally first, and then we'll get a little bit deeper into data centers. So questions about the energy intensity of data centers are wrapped up in this bigger trend. Power demand is surging in different pockets of the country and many utilities are suddenly revising their resource planning, sometimes in really dramatic ways. Michelle, over to you. How acute are these demand increases and where are data centers fitting in?
Michelle Solomon: Yeah, that's a great question. A recent report by Grid Strategies, a grid consulting firm, found that demand forecasts are rising: utilities' projections of demand growth over the next five years jumped from 2.6% to 4.7%. So somewhere around a 1% increase in electricity demand per year across the country is now predicted, and that's reflected in filings to the Federal Energy Regulatory Commission.
They found that peak demand in 2028 is forecast to increase by about 17 gigawatts. For context, total peak demand right now is around 742 gigawatts, so that's about 2% of our total peak demand showing up in the forecasts. AI right now is using about 1% to 2% of our electricity, and some reports expect that to roughly triple by 2030. So these are big increases in demand; 1% of total US electricity demand is not a small amount. But to put that in broader context, to reach a net-zero economy we need total electricity demand to double or even triple by 2050. So while these are significant increases, we actually need a lot more. And for us at Energy Innovation, we see this as a good moment to start learning how to grow again, so that we're prepared for the much bigger increases in electricity demand that we need to decarbonize.
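A quick back-of-the-envelope check of the figures Michelle cites, using only the numbers from the conversation (a sketch, not a sourced model):

```python
# Figures as cited in the conversation; treat them as approximate.
peak_now_gw = 742       # current total US peak demand, GW
peak_growth_gw = 17     # forecasted peak demand growth by 2028, GW

print(f"Share of today's peak: {peak_growth_gw / peak_now_gw:.1%}")  # ~2.3%

# Grid Strategies' five-year cumulative growth forecast, before and after.
for label, five_year in [("old", 0.026), ("new", 0.047)]:
    annual = (1 + five_year) ** (1 / 5) - 1
    print(f"{label} forecast: ~{annual:.2%} per year")  # ~0.51% -> ~0.92%
```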
Stephen Lacey: And as you said, you're making the case for utilities to learn to grow again. What does that mean? What are some examples of utilities meeting this new demand by growing the right way, and of utilities growing the wrong way?
Michelle Solomon: So just to start out, what do we mean by the right way and the wrong way? Many of these utilities have their own stated carbon and decarbonization goals, so in some ways this is learning how to grow again the right way to meet their own goals. Of course, there are also US and state carbon emission reduction goals that need to be met. So growing the right way means growing in a way that meets those goals, and also in a way that reduces costs for customers and reduces health impacts on the communities hosting power plants. That's what we mean by growing the right way. As for examples of growing well, in terms of meeting those carbon goals and saving customers money, one really great example is Xcel Energy.
They do see a large forecasted increase in electricity demand, but they still plan to retire all of their coal plants by the early 2030s and install significant amounts of new renewables. One thing that's really great about Xcel's approach is that they're planning to make the most of existing resources by bringing new clean energy, like wind, solar, and batteries, online at retiring and retired coal sites, reusing those existing interconnection points; they're looking to bring gigawatts of new clean electricity online that way. They also incorporate significant amounts of demand response, energy efficiency, and distributed resources to drive down peak electricity demand, which is traditionally the hardest to meet and drives the need for new resources. On the other hand, we see some utilities in the Southeast failing to ask what specific changes are needed to meet the overall changes on the grid.
Duke Energy, for instance, includes plans for nearly nine gigawatts of new gas, and almost seven of those gigawatts are proposed as combined-cycle power plants, which are designed to run a high percentage of the time, not to meet new peaks in demand. Not to mention that gas has historically not been an especially reliable resource during times of grid stress, particularly in winter. Their plans also arbitrarily cap the amount of battery storage that can be deployed; when places like California and Texas are deploying five or more gigawatts a year, you can see that's not an innovative approach to meeting this new demand or their carbon goals. Instead, they're falling back on their knee-jerk reaction of building new gas plants.
Stephen Lacey: And I know Microsoft recently filed comments suggesting that utilities are overstating some of the data center demand that's to come, so we can talk a little bit about that. Brian, over to you. You worked at Microsoft for more than a decade, procuring clean energy for corporate operations and data centers. How did that become more challenging after the public release of ChatGPT? How dramatically did it increase computing needs, and a year and a half on, how has that challenge grown?
Brian Janous: Yeah, I think it happened at a very unique time, because if you look back to the prior decade, and you were referencing all those stories, we were talking about how the internet was going to gobble up all the world's power. But something else was happening during that decade: the founding of the cloud. If you go back to the early days, around 2010, there was still a lot of computation happening on premise. So there was a period, roughly 2010 to 2020, when data centers on the whole had relatively flat load growth, but it wasn't because there wasn't significant load growth happening under the hood. What was happening is you had a lot of businesses shutting down their own very inefficient data centers that were using multiples of the electricity a hyperscale data center would use, and the hyperscale data center could absorb that with significantly less electricity consumption for more computation, right?
So you had a significant increase in computation over that period with almost no increase in electricity. A lot of people attribute that to efficiency, as if data centers were simply getting more efficient, but really what was happening is that workloads were moving environments. There was a form of efficiency there, in that you were shutting down old, very inefficient, on-prem data centers and moving them to the cloud. But that obscures what was actually happening, which is that data centers for hyperscale cloud were still growing at double-digit percentages year over year. Part of the problem was that this was happening early on [inaudible 00:11:10] relatively low denominator. So in the 2010 timeframe, Microsoft, Amazon, and Google maybe each had a few hundred megawatts of data center capacity. By the end of that decade, they were all sitting on a few gigawatts of capacity. And then you inject this new form of demand in terms of AI.
So you have continued double-digit growth rates in native cloud demand, and now you're adding the element of AI on top of that. That growth rate becomes pretty material in terms of the tranche size of each annual increase, right? We're not talking about adding tens of megawatts a year; you're talking about adding hundreds of megawatts a year, if not gigawatt scale per year. That becomes really material for utilities. So the timing of it all came together: data centers from a cloud standpoint were already starting to stress utilities, then you inject AI, and then of course you add a significant increase in the onshoring of manufacturing at the same time. A whole lot of things happened at once that have made this problem acute. And I think that's why the comparison between what was being said early this century and what's happening now is maybe a little different.
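For illustration, here is the compounding effect Brian describes: the same double-digit growth rate that adds tens of megawatts on a small base adds hundreds of megawatts at gigawatt scale. The starting size and growth rate below are hypothetical, chosen only to match the "few hundred megawatts to a few gigawatts over a decade" arc he sketches:

```python
# Hypothetical: a hyperscaler fleet compounding at a double-digit rate.
capacity_mw = 300    # assumed footprint circa 2010, a few hundred MW
growth = 0.20        # assumed 20% annual growth (illustrative)

for year in range(2010, 2024):
    added_mw = capacity_mw * growth   # the annual "tranche" utilities must serve
    capacity_mw += added_mw
    if year in (2010, 2016, 2023):
        print(f"{year}: base {capacity_mw/1000:.2f} GW, added {added_mw:.0f} MW")
# The annual addition grows from ~60 MW to ~640 MW on the same growth rate.
```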
Stephen Lacey: Talk about what that means for actual clean energy procurement, and how it informed the model you've developed at Cloverleaf: ready-to-build sites to meet these acute loads from factories and data centers.
Brian Janous: So the challenge for this industry has always been about speed, scale, and sustainability. They want to go really fast. They want to plug in as many servers as possible. They want to grow big, and what's defined as big is very different today than it was five years ago. That's creating a new challenge, because utilities are not used to customers calling and asking for a gigawatt, right? That's just shockingly large for most utility staffs to think about.
But they're still very committed to delivering on their sustainability goals. And so as you were noting, this is creating a challenge for a lot of utilities that are trying to meet this near term demand of speed and scale, but also keeping their eye on the ball around what these customers actually want. And so at Cloverleaf, what we're focused on is really working with utilities to build that roadmap and help them understand that this demand is real. And if you're not getting asked for this today, you will be getting asked for it tomorrow. And that customers are going to continue looking for new locations around the country to site not just data centers, but also things like chip fabs and factories. And so utilities have to prepare and really change their mentality around the fact that we are in an era of growth. And most people that work at a utility today have never worked at a utility during an era of growth.
Stephen Lacey: John, over to you. You're a former software entrepreneur, you're now a data center CEO. What did you see playing out at this intersection of computing and renewables that led you to form Soluna in 2018?
John Belizaire: What we were seeing was this interesting phenomenon in the energy space. I've spent the last five years using all of my understanding of software to learn as much as I can about energy. And what became clear was that energy, especially renewable energy, has a really big problem. It can produce incredible sustainable resources, but those tend to be intermittent. As a result, you get mismatches between that production and the demand for it, and so you have wasted energy. And there was this interesting idea we started to think about: computing has effectively infinite demand, and that demand is increasing over time. In fact, five years ago we projected that a small slice of the computing space would come to dominate the demand of the entire space, and that is coming true. What's interesting is if you look at those two phenomena, one is energy production that is increasing in capability and falling in cost, and the other is computing taking off and growing transformationally into all sorts of innovative things.
If you put those two things together, essentially converge them, you can create something innovative. That's what we're calling renewable computing: bringing computing to the locations where these power plants sit, to help drive more of both. More renewable energy, because as the grid sees that demand, it can bring more of these sustainable power plants online; and less concern about intermittency, because the computing acts as a battery, a digital battery if you will. And computing gets access to more sustainable power, so it can continue to grow. That convergence is what we saw five years ago, and it's becoming clearer that it is the future of energy. The convergence between energy and computing is inevitable at this point.
Stephen Lacey: Yeah, and how does that actually work? What compute workloads are you batch processing, what does that limit you to, and then how are you actually contracting for those excess renewables?
John Belizaire: Great question. Because the energy is intermittent, we focus on compute that is resilient to that intermittency. So we look for software and computing that is essentially batchable. Think about Netflix. Everybody watches Netflix or your favorite streaming channel. That's a very real-time process: you need the data center, or network of data centers, serving it up to you to be always on, because you don't want that movie to stop or that show to end. We don't focus on those types of applications. So you won't see Netflix, your ERP system, or your e-commerce system in our data centers. What you will see is a machine learning process that's trying to find different applications for a molecule. You might find a security process that's supporting the addition of a new block to the Bitcoin blockchain.
You might find a new inferencing process that's helping to improve customer service in a telecommunications system. Basically, anything that is resilient to power loss is what we call a batchable process: things you can run in stages, pause or put to sleep, and then wake back up when the power is available again. It turns out that distributed computing has plenty of applications like that, and what we do is deploy those types of computing in specialized facilities built for batchable processes. How do we get our power? We go to large utility-scale power plants on grid systems. We typically look at existing ones that experience high degrees of curtailment, both technical and economic, whether due to congestion on the lines, pricing dynamics, et cetera. These power plants are unable to monetize all of the power they produce.
For years the focus has been, "We need to get more power lines built, or get very large batteries built behind there, or advance the battery space." And it turns out the solution to that problem was always there: computing was a perfect solution to it. You can do most batchable computing just about anywhere. So we bring the computing to the location and build the facilities right there. Then we sign power purchase agreements with those power plants, or through a retail partner, that allow us to get access to that curtailed energy. We set a floor price, and below that price we buy that energy and convert it into a global resource. That's the essence of our business. We're essentially a curtailment mitigation solutions company on the one hand, and we're delivering sustainable cloud and compute resources on the other.
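To make the model concrete, here is a minimal sketch of the dispatch logic John describes: batchable jobs run only when energy clears below a floor price, and checkpoint and pause otherwise. The price feed, floor value, and job interface are hypothetical illustrations, not Soluna's actual systems:

```python
# A minimal sketch of floor-price dispatch for batchable compute.
FLOOR_PRICE = 15.0  # $/MWh: assumed floor below which curtailed energy is bought

class BatchJob:
    """A checkpointable workload that can pause and resume without losing work."""
    def __init__(self, name: str):
        self.name = name
        self.running = False

    def resume(self) -> None:
        self.running = True
        print(f"{self.name}: resuming from last checkpoint")

    def pause(self) -> None:
        self.running = False
        print(f"{self.name}: checkpointing and pausing")

def dispatch(job: BatchJob, nodal_prices: list[float]) -> None:
    """Run the job only in intervals where energy clears at or below the floor,
    i.e., when the co-located plant would otherwise curtail output."""
    for price in nodal_prices:
        if price <= FLOOR_PRICE and not job.running:
            job.resume()
        elif price > FLOOR_PRICE and job.running:
            job.pause()

# Example: hourly prices swinging between curtailment (cheap) and scarcity.
dispatch(BatchJob("molecule-screening"), [5.0, 3.0, 42.0, 60.0, 8.0])
```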
Stephen Lacey: So there are a lot of people here just trying to get their arms around this problem, and how much of a problem it is. Suddenly AI has thrust data centers into this conversation around US grid decarbonization. There are some who think it's maybe overblown, pointing to those historical improvements in the operation of data centers, though as you said, Brian, maybe that's not all it appears to be. There are others who say utilities are using this as an excuse, particularly southeastern utilities, to propose a lot of new gas plants that aren't necessarily designed to meet these peak loads. Others say AI is just different: it's going to require exponentially more energy. Where do each of you fall on this spectrum of concern, as people try to figure out just how substantial this issue is? Brian, we'll go to you and then we'll go down the line.
Brian Janous: As I alluded to before, I do think it's very real, and I do think the most extreme projections are overblown. On the other end, the projections I've seen on the low end are pretty off base as well. We've got a great indicator of investment and growth in this space: the capital expenditure of all the big cloud companies. If you look at that quarter over quarter, year over year, they are all accelerating investment. It's a great leading indicator of how much power and how many chips they're going to plug in in the future, and they're all betting really big on very large AI models. That's maybe one of the differences between AI and what we've seen in native cloud: there is very much an incentive to use more electricity. There's an incentive to build bigger models, to plug in more GPUs.
And I had someone ask me the other day, "Well, what happens if NVIDIA comes out with a chip that's twice as efficient?" And my response was, "Well, then Meta or Microsoft or Google will plug in twice as many chips." If they have the power, they're going to build as big a model as they can build. That's where I think AI is a little different from what we've seen. Not to say there aren't incentives for efficiency, because there very much are. But I believe those incentives for efficiency are a function of the fact that these companies don't believe utilities are going to be able to make enough power to meet their demands. So they are incentivized to make models, chips, and data centers as efficient as possible. I don't think that's because they particularly want to use less electricity; I think it's because they're very concerned there won't be enough electricity for them to use.
Stephen Lacey: Which is the biggest constraint, Brian? Chips, chip availability or power availability?
Brian Janous: Absolutely power. I think 6 to 12 months ago a lot of companies were concerned about chip availability, but I think everyone has come around to realizing that chips are not going to be the biggest constraint.
Stephen Lacey: John, what about you? From a data center perspective, how significant is this challenge?
John Belizaire: The biggest mistake that we, the folks in the power business on this call, can make is to underestimate this situation. It's a big issue, it's a big problem. I want to make two comments. Harken back to the internet phase: yes, there was a massive explosion in the potential of the internet, and because the business models were not clear, there was a massive decrease in that momentum. Then suddenly you had a massive explosion again with the growth of cloud, as Brian walked us through. You always go through those cycles, where improvements in innovation and technology lessen the power need on a per-unit basis. But because of Wright's Law, you then get huge increases in the use of the technology. What's different this time is that you're taking a technology that isn't immature.
It's actually quite mature; it's been used for a long time in the background, and we've found a way to democratize it. So now anyone can do AI. Everyone will do AI. And the bulk of the AI process is a compute-intensive process. The more you integrate it into basically everything we do, and enterprises and individuals begin to adopt it, with all sorts of talk about individual modeling and that sort of thing, the more explosive the need for that compute becomes, because it will be a ubiquitous process. And underpinning all of that is energy. We need to find a way to produce more of it, and we need to do it in a more sustainable way. When you look at today's data center design, it's designed for traditional compute processes and servers that use a tenth of the power this new technology uses.
This new technology uses 10 times the power, 10 times the power density, of a [inaudible 00:25:08] facility. There are designs for whole new data centers. We're working on our own design, and some of the big hyperscalers are working on theirs. These facilities will only be doing AI; the entire building will just be AI processing. How are we going to power that? How are we going to support that from a power perspective? This is a big issue. I liken it to Hurricane Sandy. I'm a New Yorker, and many moons ago we had the governor of the state get up on TV and say, "Hey, I know every time we say buckle down for the hurricane, you guys are like, 'All right,' and then nothing happens. I'm telling you, this time something's going to happen and you better be ready for it. So hunker down."
And he was right. Hurricane Sandy was devastating. I mean, there were people who died during the storm because they just didn't take it seriously. And so I think it's super important for us to not sleep on this one. This one's really important. And we have to use innovative ways to solve this problem. We need to rethink the process for bringing energy systems online. The entire process for that. I'm sure we're going to talk about that later, but it's important for me to stress that energy is the primary bottleneck, and I think it could actually be good for us to find solutions to this problem because it will increase energy abundance and more likely increase sustainable energy abundance.
Stephen Lacey: Michelle, over to you. Let's talk about the gas piece here, when we ask how substantial this problem is. In the Southeast, we've seen Duke, TVA, and Georgia Power propose around 11 gigawatts of gas plants, most of them combined cycle. How real is this dash to gas, and how do we think about it in the context of the newly proposed federal emission standards on gas plants?
Michelle Solomon: That's a great question. As you mentioned, the utilities in the Southeast are proposing somewhere around 11 gigawatts of new gas, so I would say the rush to new gas is real for many utilities. It's their fallback option for what to build when they see new load coming online. In the context of the new EPA rules, the new gas plants they regulate are combined-cycle plants running above a 40% capacity factor, which will need to install CCS by the early 2030s. So there's a conflation of two issues here: we need to build new electricity generation, but by having these plants in their integrated resource plans, it perhaps gives the utilities a little cover for pushing back on those EPA rules. So I think there's some reason to believe the timing of this panicked conversation around load growth, along with the proposed new combined-cycle plants, could have something to do with those proposed EPA rules.
And then on the question of whether we need new gas to meet AI demand: AI is not a special unicorn that requires a gas plant to meet its electricity demand. I think there are questions around the speed at which this new demand is coming online. Utilities, again, want to build the thing they know how to build, but a new gas plant doesn't get built overnight either. So I think we need to look at all the options for building new energy quickly, and a lot of that depends on getting more out of our existing grid so that we can use cleaner resources.
Stephen Lacey: So let's turn to the solution sets available today. Brian and John, you talked about how you're pursuing your different sets of technologies, and I want to dig into them a little more. And then Michelle, I'd like your big picture on what utilities should be emphasizing; you talked about the example of Xcel, and maybe we can borrow from that as well. There are lots of compelling tech solutions here to mitigate demand. John, you talked about batchable computing and demand response to make data centers more reactive to grid conditions and connect them to renewables. And there are lots of clean solutions like batteries, behind-the-meter clean energy, and grid-enhancing technologies. How do you think about mixing and matching these technologies, and which stand out as the most impactful? Brian, we'll go to you. I know it's very site specific in how you're developing a particular site. How do you think about these sets of technologies and what is most available and impactful today?
Brian Janous: Yeah, and it is very site specific and location specific. It's application specific as well, as John was alluding to with things like dispatchable workloads, which are not the sum total of all workloads; there are workloads that do still need 24/7 uptime. But there are a lot of different ways to achieve that. One of the challenges we have in the industry right now is that most of these data center operators are desperate for capacity, so they go to utilities asking for firm service, delivered 24/7. And that defaults utilities into thinking, "Okay, I guess we can build some gas plants to solve that," rather than really peeling back the onion and asking, "What are all the solutions and tools we have at our disposal?" Which includes things like grid-enhancing technologies, getting more out of the existing system, as Michelle was saying, and using storage more intelligently across the system to create more flexibility.
I mean, ultimately it's about how we can even mimic flexible loads. Even if I'm a data center that needs 24/7 uptime, there are a lot of resources data centers already have on their sites: they have generators, they have batteries. Those can be optimized, expanded, and used to provide flexibility back to the grid. So a lot of this is just having more of a conversation between customers and utilities, and from my experience those conversations are not happening at the level and depth they need to. Everyone needs to recognize that we're trying to manage growth alongside peak system loads, which is really what we're solving for. There's plenty of power most of the time; there are a number of hours a year when we don't have enough power. That's really what we're trying to solve for.
And so do you build a combined-cycle plant that needs to run 40% to 60% of the time, or do you find a way to deliver more flexibility into the system? And I think that's the challenge that we have right now on how we meet this load growth in a way that, again, achieves the speed and scale demands of a lot of these customers and the imperatives with that, which are all good. I mean, we talk about this as this impending doom, but when we talk about load growth in the United States, it's a great thing. It means jobs, it means economic development, it means GDP growth. All these things are really good. It's just a matter of how we do it and can we do it in a way that meets the business demand and also keeps us on a trajectory to decarbonize the electricity system.
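As a rough illustration of the flexibility Brian describes, here is a sketch of how a data center's existing on-site batteries can make a firm load look flexible during the handful of peak hours that drive new capacity needs. All sizes are hypothetical:

```python
# Hypothetical: a 100 MW data center using on-site batteries to reduce
# the load the grid must serve during a peak event, without curtailing IT.
DC_LOAD_MW = 100.0
BATTERY_POWER_MW = 60.0     # discharge capability of on-site batteries
BATTERY_ENERGY_MWH = 240.0  # stored energy (4 hours at rated power)

def grid_draw_mw(event_hours: float) -> float:
    """Net draw the grid sees if batteries ride through the whole event."""
    sustained_mw = min(BATTERY_POWER_MW, BATTERY_ENERGY_MWH / event_hours)
    return DC_LOAD_MW - sustained_mw

for hours in (2, 4, 8):
    print(f"{hours}h peak: grid serves {grid_draw_mw(hours):.0f} MW "
          f"of {DC_LOAD_MW:.0f} MW")  # 40, 40, then 70 MW as the event lengthens
```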
Stephen Lacey: And knowing what you know Brian about how to serve these pockets of demand, does it feel like we can serve them without building a lot of this new gas?
Brian Janous: I think we can. There are a lot of levers at our disposal. And I have empathy for my friends in the Southeast; I know they're under a lot of pressure, and they're at the tip of the spear here, because they're seeing not only a lot of demand from data center growth but also a lot of demand from factories. So they're under a tremendous amount of pressure to deliver for their shareholders and their states as much as they can. But there are other levers, and there are ways to manage this. It's going to take a little time to work through what these solutions are and how we can deliver at that scale. But I am optimistic that we will be able to create pathways where we continue to grow the grid and do it in a way that is truly sustainable.
Stephen Lacey: So John, over to you, and I'm going to weave in some comments from the audience as they come in. How much of this broader compute workload can you serve within these batchable data centers? And someone asks: what are you seeing in interest levels from customers for these batchable data centers? What's the split between customers who only need batchability and those who actually need real-time, 24/7 data centers?
John Belizaire: Great questions. Thanks for sending them in, folks. The first thing I'd say is you can serve all of it, if you take the approach we're taking, which is a lateral approach: rethinking data centers, rethinking the source of power, and rethinking demand response, all in one package, delivered as a behind-the-meter project specialized for these types of processes. Just look at our pipeline alone. We're a young company and growing, and we have over two gigawatts of this power that we are steadily developing and building out. And if you look at the overall demand for AI alone, it's expected to reach five gigawatts this year. We think there are about 300 terawatt-hours of this wasted energy available globally every single year. So to put that in context, this is not something scarce that we're carefully hunting down as a company.
We're basically just thinking differently about the power and how to deploy it. And batch processing is the fastest-growing slice of the pie, though not the biggest one, right? Everybody on this call is still using real-time processes; streaming is a big business, and that's not going away. But if you look at the whole pie of compute out there, the batchable portion is growing faster than the rest. It will quickly become a larger portion of the overall pie, and it will quickly have greater energy demands than the rest. So I think the utilities out there, the folks in the energy space, need to do what we're doing: think laterally, think differently about how you develop power. I know there are regulatory constraints and so forth, but I think every utility needs a person in charge of innovation.
And to think about how you respond to these data centers coming into your offices and saying, "We need 24 by 7 power." You should ask them, "Well, what are you putting in those data centers? Is it all real-time processes or do you have some batch stuff that we can sort of bifurcate with you and put a plan together where we might want to put some of that out in West Texas or in other parts where there's lots of power. We can't always use that power, but maybe you could put some of your data centers out there. Would you consider doing that?" Ask the data centers how they think about the role of the facility. Is it flexible? Can you actually make this a resource for us rather than just being 24/7? Are there parts of the facility that we can shut down, because that would be really helpful to us?
See, those types of conversations unlock innovative solutions to problems you're going to have as this demand continues to grow over time. As I said, it's not going away, so you're going to have to think differently about the processes you use. From a regulatory perspective, I think there are also opportunities to inject new protocols and processes that encourage the drivers of this new demand to go where it makes more sense for the grid design, so we can better leverage these more sustainable resources rather than build more gas plants and the like. So look, the answer to the question is: there's lots of this demand, plenty of it out there. The batchable share is smaller than traditional compute today, but watch out; it's going to become a much bigger part of it. And the other answer is lateral thinking. Think renewable computing: where can you put the computing such that it's renewable? Convergence is the way to think about that.
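For scale, a rough conversion of the figures John cites (back-of-the-envelope, using only the numbers from the conversation):

```python
# ~300 TWh/year of curtailed or otherwise wasted energy, per John.
wasted_twh = 300
hours_per_year = 8760

avg_gw = wasted_twh * 1000 / hours_per_year   # TWh -> GWh, divided by hours
print(f"{wasted_twh} TWh/yr is ~{avg_gw:.0f} GW of continuous power")  # ~34 GW

ai_gw_this_year = 5   # AI demand expected this year, per John
print(f"Roughly {avg_gw / ai_gw_this_year:.0f}x this year's projected AI demand")
```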
Stephen Lacey: Michelle, over to you. So you outlined some of the solutions that utilities should be evaluating in place of combined-cycle gas plants. But the reality is that we're dealing with this when we have a two and a half terawatt backlog of clean energy projects sitting in the interconnection queue. We have these grid constraints all over the country. And then grid planning issues that are compounding this problem. So as you think about that suite of solutions, whether it be clean generation or batteries, how do you imagine those being integrated given how difficult it is to get a lot of new clean resources on the grid right now?
Michelle Solomon: Yeah. The biggest bottleneck to getting new clean resources onto the grid right now is transmission capacity. That's part of the reason the interconnection queues are so stuck: we need transmission upgrades, and when a new generator applies to connect to the grid, the cost of any needed transmission upgrade often falls entirely on that generator, so projects drop out and don't get built. There are other things going on, but that's a pretty common one. So how do we add new transmission capacity quickly? A great way is grid-enhancing technologies. To call out one of them: we recently worked with UC Berkeley and GridLab on a report about advanced conductors and how they can increase grid capacity quickly. An advanced conductor is basically a transmission wire that can carry more current than a traditional conductor, and you can replace the existing conductor on a line with an advanced one in as little as 18 to 36 months, with far fewer permitting requirements.
We found that, by using advanced conductors, you could add up to four times the transmission capacity by 2035 that you otherwise could in a constrained transmission environment like the one we're in right now. So that's a really excellent way to get transmission capacity online quickly, and a great way to let generated electricity reach load centers more efficiently without even having to build new generation. I think advanced conductors are one to really look at, particularly when we're thinking about speed, because that's where we're really getting caught up right now. There's study after study showing that we can have a high-percentage clean grid; it's just a matter of getting those clean resources online fast enough. So expanding transmission capacity in the short term with advanced conductors and other grid-enhancing technologies, and expanding our interregional transmission buildout in the long term, is so critical.
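To see why conductor ampacity matters, here is a simple thermal-capacity calculation. The voltage and current ratings are illustrative assumptions (typical of a 230 kV line and a modern advanced conductor), not figures from the report Michelle mentions:

```python
import math

def line_capacity_mw(kv: float, amps: float, power_factor: float = 1.0) -> float:
    """Approximate three-phase transfer capacity: sqrt(3) * V * I * pf, in MW."""
    return math.sqrt(3) * kv * amps * power_factor / 1000

# Illustrative reconductoring of a 230 kV line: a conventional conductor rated
# around 900 A vs. an advanced conductor rated around 1800 A on the same towers.
print(f"Conventional: {line_capacity_mw(230, 900):.0f} MW")   # ~359 MW
print(f"Advanced:     {line_capacity_mw(230, 1800):.0f} MW")  # ~717 MW
```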
Stephen Lacey: Brian, how are you seeing some of the supply agreements change? We have a question here about the Amazon-Talen deal, and a question about on-site generation and how data centers are thinking differently about procurement. How are you seeing procurement and agreements change?
Brian Janous: Well, I think the Amazon-Talen deal is sort of a corollary to what John's talking about: go to where the power is. In John's case, he's saying go to where the clean power is; here you're saying go to where the capacity is, and you could argue whether it's clean or not. I mean, it's nuclear, it's zero carbon. It's not net new, so it's not bringing new capacity onto the grid. But it really reflects a shift in the industry. If you look at the history of the cloud industry, it started with a focus on fiber: get as close to the network as possible. That's why you have concentrations in places like Northern Virginia and Amsterdam, because those were some of the biggest network hubs in the world. Then it became about GDP and eyeballs: get as close to the people as you can.
Now it's really about power: get to where the power is. Because as John said, there's a convergence between energy and data. If you think about what data is, it's really just electricity turned into light; it's a form of energy conversion. So there's been this realization over the last 18 to 24 months that this whole business is really about getting access to power. And of course access to clean power is super important, because a lot of the companies driving this demand are very committed to their climate goals. I know there have been questions in the chat about whether that will take a backseat to growth, and I think that is a very real risk, because there's a huge economic pie sitting in front of all these tech companies, which is AI. Whoever gets there first and takes the biggest piece of that pie is going to reap rewards for potentially decades to come.
At the same time, I fully believe that the carbon commitments and the sustainability commitments that these companies have made are very real. And they are going to keep investing along those lines. It is going to force some more creativity about how they do that, because I also saw a question in the chat about their climate commitments and how do those comport with utilities building all of these gas plants?
And that's a very valid criticism, because there is in some ways a disconnect between the way we do carbon accounting for things like net-zero or even 24/7 claims, and the reality of managing peak capacity for new interconnections. So both things can be true: a company can be 100% zero carbon by one measure while a utility is building gas plants to support its load. And that's the problem, right? The carbon accounting rules don't necessarily connect the dots in a way that drives accountability where it needs to go, which is managing those peak capacity additions. That's where technology can come in, to say: okay, if that's really what we're trying to solve for, how can we work with our utilities to push them to deploy grid-enhancing technologies, and think about what we can do behind our own meter, so that we can collectively manage this growth in a sustainable way?
Stephen Lacey: We've had a bunch of questions come in about nuclear specifically, but I want to broaden it a little bit because I think there's this dichotomy in reaction to this issue. Those who say we need major tech breakthroughs. And those who say we have a lot of the technologies available today to meet this and relieve grid capacity constraints. And so there are a bunch of questions about what is the role of nuclear in serving these data centers? So I'll ask you all broadly, how much do we need to rely on something like advanced nuclear or even fusion? And then how much can we solve the problem with tech that's available today, and just more creative ways to match with renewables and again, relieve grid constraints? John, I'll start with you.
John Belizaire: I'll say it again: lateral thinking. We don't have to do rocket science to solve the problems facing us today; we just have to think differently about them. I'll give you an example. There is a massive queue of generation that wants to interconnect to the grid, all over the country. That's a major issue, and part of it is transmission; part of it is a disconnect between that generation capacity and demand. What if utilities did something as simple as this: if you're in the queue, and you get one of these data centers that keep calling me all the time to build their facility co-located with your power plant, I'll move you up in the queue. It's that simple. You can start to rethink how all of this plays out.
I've seen questions in the chat about the cycle time of these facilities and the innovation required to provide 24/7 service behind the meter. Push that pain point back on them, and you know what's going to happen? They will innovate and find new ways to live in that space and still be on at the levels they need, using technologies available now: grid resiliency, microgrids, all of these technologies we've been talking about for a long time that haven't really found a scalable home. Well, this is an opportunity to leverage all of those technologies and put them together. So the answer is no, we don't need to suddenly jump to fusion or fission to solve these issues. There are lots of solutions available to us right now, while this call, this meeting, is taking place.
We can stand up from our seats and go take off-the-shelf technology to solve these problems. But you have to create the right incentives. I said earlier we have two gigawatts of power available; none of that is nuclear. That's all green electrons produced by existing power plants. We have partners talking to us about power plants they plan to build, or, guess what, power plants they couldn't build before, until they thought about combining them with a data center in this way.
That's additionality: more electrons that we can now deploy, that weren't available before, because people are thinking differently about what a power plant is. A power plant is not just electrons that get produced in the hope they get monetized. A power plant can be the convergence of power and compute, and compute is the actual resource we're trying to turn on. If you combine those two things, you start to have solutions you can use to address these issues today. And look, over time we will be able to leverage some of these other technologies, but we don't need them right now.
Brian Janous: No, I agree. Listen, I'm incredibly bullish on nuclear and the role it's going to play in meeting future demands on the grid. But I think tech companies especially have probably been wildly optimistic about the time horizon on which that's going to happen, and I think we need to be thoughtful about the tools we have today. We've been talking about grid-enhancing technologies like reconductoring with new types of conductors; there's also dynamic line rating and topology optimization. There's a lot we can do, plus the access we have to storage technologies that continue to decline in cost year over year.
So I just think we need to be thoughtful about that bridge, about how we get from here to there. We should keep investing in nuclear, and I'm supportive of what Sam's doing with fusion and what Bill's doing in advanced fission. All those things are great; we just have to be realistic about what they're going to solve for, and they're not solving any problems this decade. That is a 2030s, and probably a late-2030s, opportunity. So we have to think a lot about how we invest in the things we have today to bridge to that future.
Michelle Solomon: I think we should avoid thinking about pairing a generator and a load in a one-to-one fashion. In the scenario John has described, where you're co-locating a data center with something like a renewable power plant, that's great: it avoids building new transmission to some extent, which is really fantastic. But we don't need a specific nuclear plant to run a specific data center; we need to look at the needs of the grid as a whole. We know we can operate a grid that is 80% or 90% clean with just solar, wind, and batteries, plus the existing firm capacity we have right now. And there are a lot of different clean firm options in development, including nuclear, long-duration storage, and geothermal. I don't think we need to get caught up on any single one of these technologies; just focus on what the grid [inaudible 00:50:23] as a whole.
Stephen Lacey: If you could make any one change, whether it be a policy, a regulatory change, or some kind of tech change, what would it be? And Michelle, I'll start with you. If you were a regulator for a day, what's the most important decision that you would make to ensure we meet this new demand as cleanly as possible?
Michelle Solomon: Yeah, I mean, again, one of the biggest questions we have right now is the speed at which this new load is going to come online. So if I were a regulator, I would really want to ask utilities what they are doing to use their existing resources as much as possible. Do they have existing interconnection rights where they can put new wind, solar, and batteries, so they don't have to wait as long in the interconnection queue? Which lines would be great candidates for advanced conductors to bring more transmission capacity online? Can they increase the size of already-planned clean projects that have interconnection points? How can they improve their interconnection process to move projects through faster? I would really focus on that set of questions if I were a regulator.
Stephen Lacey: Brian, could be a tech change or a regulatory change or something. Anything.
Brian Janous: I think it would be pushing data centers to become more dispatchable in their operations. We've seen that in other countries: as grids have become constrained and concentrated, regulators have turned to data centers and said, "Okay, what can you do? You already have some of these assets behind your meter. How can you innovate? How can you optimize?" Because as Michelle said, not all data centers can co-locate with generation. They have to think about how to become more a part of the grid, and part of the grid in the future means becoming more flexible and more dispatchable. That's what I would like to see happen. We're starting to see some of it, and I think that trend is going to increase dramatically in the next few years.
Stephen Lacey: John, final word.
John Belizaire: I would do a 10-ideas exercise, something we do at Soluna to solve hard problems. If you're a utility, get everybody in the room and come up with 10 really strong ideas that think laterally about the problem. If you can't come up with 10, come up with 20; if you can't come up with 20, come up with 30. There are no dumb ideas. And I guarantee that, wherever you're located, you'll come up with very clever ways to address the potential constraints in your particular part of the grid. So think differently about processes and relationships. Maybe data centers shouldn't be at population centers anymore; maybe they should all be behind the meter and be dispatchable processes. And try to think about how to create new protocols that encourage that behavior, that give you growth and help you address some of your constraints as well.
Stephen Lacey: John Belizaire is CEO of Soluna. Brian Janous is the co-founder of Cloverleaf Infrastructure. Michelle Solomon is a senior policy analyst at Energy Innovation. We are going to link to some resources in our wrap-up written by Michelle and her team at Energy Innovation. We'll also include links to Cloverleaf and to Soluna so you can check out more about them. Thank you all so much. This was such a good conversation. A lot of really creative ideas here and grounding on this issue. And of course, you can check out our coverage at latitudemedia.com. Check out our utility AI insights report at latitudemedia.com/research. We are covering this in all kinds of ways, both solutions for AI out in the power sector, and also solutions to the power demand issue. And we'll be covering this on our podcasts as well. So thank you all so much for joining us. We really appreciate it. I'm Stephen Lacey with Latitude Media.