Squeezed between the need for power and uncertainty about revenue, big tech companies are experiencing “some hesitation” about how much data center infrastructure they should build, said Brian Janous, the co-founder and chief commercial officer of data center developer Cloverleaf Infrastructure and former vice president of energy at Microsoft.
Data center planning is a complicated dance. On one side, big tech companies are scrambling for power to build more powerful AI models. On the other, uncertainty about when they’ll realize the revenue to cover the massive capital outlays of data center development is leading to “some pullback” — despite their conviction about the immense scale of the opportunity.
“For any one individual player, the ability to commit billions of dollars to electric utilities to build out more grid infrastructure is a hard pill to swallow if you’re not fully convinced that you have a customer on the other side of that to receive it,” said Janous, speaking on Catalyst.
Data centers faced a similar challenge back in the 2000s and 2010s when the demand for cloud computing was growing quickly but was still uncertain. To fill the gap at the time, big tech companies housed their servers at colocation facilities. But now big tech companies build data centers for themselves — and they aren’t sure quite how much to build.
Land used to be the most important factor in data center development, but that has changed. Janous explained that land is only 1% of the total cost of ownership of a data center over a 15-year period. Today, he added, it’s a matter of land “plus clear line of sight to power” in the next 18 months to two years.
That shift from land to energy requires new skillsets from data center developers. Before, developers needed to be “really good at real estate and fiber,” Janous said. “And if you look at the makeup of those colocation companies, they are largely real estate- and fiber-dominated in terms of the talent and the skills, because energy wasn’t a challenge when we were building 50-megawatt data centers or 100-megawatt data centers.”
Of course, land and labor still matter, Janous caveated. “Labor has actually become more important in that, if the data centers we’re building today are 10 times the data centers we were building just a few years ago, the amount of construction labor required is enormous,” he said. “I think in a lot of ways people are underestimating the importance of labor.”
But there are tradeoffs between land, labor, and power that complicate data center planning. For example, one might build a data center where land and solar power are cheap and plentiful, like in the desert, but where the labor market for skilled construction workers is tight.
The watt-bit spread
Janous’ surprising fix for these challenges: the price of data center electricity should go up. It’s an argument he laid out in an October LinkedIn post titled “The Watt-Bit Spread.”
“I don’t know that there’s any energy conversion that creates a greater return than turning an electron into a bit,” Janous said. The constant drive to build larger and more sophisticated AI models, he added, “will create a moat” that makes it harder for future companies to compete with the first movers’ technical and capital advantages. In short, more electricity means more powerful models — and a competitive advantage in the AI arms race.
While the value of watts today is up, the price of those watts is not. Electricity rates have not risen to reflect the AI boom, and Janous says this is a problem for utilities.
“The market is not accurately reflecting the value of those watts,” he said. “It’s a problem in that if it’s not sending the right sort of price signal to a utility or to an IPP, then we’re going to build less energy infrastructure and therefore plug in less GPUs.”
He’s quick to point out he doesn’t want rates to go up for residential customers. But he does want to incentivize the construction of new generation — and he thinks that data center customers could pay for that.
“If the true value of those watts were reflected to electric utilities and to IPPs, then they would be incentivized to build more infrastructure faster,” he said, adding that they would also be incentivized to stockpile critical equipment like transformers and switchgear.
The outlook from here
“My fundamental belief is that the demand for compute, let’s just say through 2030, will exceed the available power in the market,” Janous said. During this period of shortage, he added, every watt is worth more than it’s currently priced at.
Even though utilities have a growing line of data center developers at their door asking for power, building more generation to meet a short-term need is a tough sell. Utilities realize revenue from assets over decades, not months or years. Building generation to meet the current shortage could leave them with overpriced or stranded assets.
Moreover, the rate structure of power markets provides no incentive to rush.
“A utility sells an electron in 2027 for the same price it sells it in 2030,” Janous said. “So what is the incentive the utility has to go faster? What’s the incentive the utility has to stockpile 345-kilovolt transformers? None. All they do is say, well, okay, lead times are longer. So it used to be that I could connect a customer in two years, and now it’s seven years. And that’s okay in the utility world.”
Janous proposed a possible fix for the market — an “advanced grid tariff” (he credits Axios reporter Katie Fehrenbacher for the name). No such structure exists yet, but Janous outlined the basics: utilities would charge data centers higher rates in exchange for faster delivery of power, and the increased revenue would finance that faster delivery through novel technologies, such as dynamic line ratings or long-duration energy storage. Janous compared it to green tariffs, which allow large commercial or industrial customers to buy power from a specific renewable project.
For such a system to work, though, data center developers would need to reliably follow through on their projects.
“Some of them are big tech companies and some of them are just two guys with a truck that decided they were going to be data center developers and they go get [an interconnection queue] position,” Janous said. The queue is rife with speculation, and as a result, many developers will ultimately fall out of it.
“If you’re saying you’re going to build a gigawatt data center, or you’re holding onto a gigawatt of power in a queue position, at some point you need to be able to point to billions of dollars of capital that you have access to build out that infrastructure,” Janous added. “And if you can’t do that, you’re probably not a serious player in the space.”