The conversation around data center load flexibility has reached a fever pitch, largely precipitated by a Duke University paper earlier this year that argued a small amount of curtailment could free up massive amounts of capacity on the grid. Much of the conversation in the months since has centered on how data centers interact with the power system, via demand response, grid services, and new interconnection policies, all focused on getting data centers online faster.
But within the broader debate on whether data centers can truly operate flexibly in key markets, a quieter but potentially faster approach may be gaining momentum.
Hammerhead AI, a startup emerging from stealth mode today, is focused on finding unused power inside data centers and intelligently controlling how that power is used, in order to maximize AI throughput — while respecting existing power limits.
Founded by a team of former Autogrid executives, Hammerhead is launching with $10 million in seed funding, led by Buoyant Ventures; it has been supported by accelerator programs from both SE Ventures and Nvidia. The company is betting its tools can boost the output of existing data centers by around 30%, by optimizing token processing and power utilization.
“All data centers that we have seen and have researched are severely underutilized,” explained Hammerhead CEO Rahul Kar, who previously served as COO at Autogrid. Most data centers use only 30% to 40% of their allocated power, the result of a practice known as peak provisioning, which has long been considered necessary for protecting the “five nines” reliability promised in service level agreements.
That results in a massive gap between peak and average power usage in a data center, Kar added. When power was cheaper and more readily available, it would have been a tough sell to convince companies to give up that extra headroom, he explained. But in the current power crunch, hyperscalers, co-locators, and enterprises are all looking for ways to squeeze “more juice” out of their systems as quickly as possible.
“If there is slack power available within data centers, why do you need to go to the grid?” Kar asked. “Forget the grid. Forget the utility. You can do way more within the data centers to support all these different AI workloads, particularly inference workloads, which will be the money maker in the future.”
Hammerhead’s solution
Hammerhead’s approach targets “slack” or unused power using software and reinforcement learning (a subcategory of machine learning) to manage and orchestrate workloads and power allocation inside a data center. The goal: to safely unlock that reserved power without risking failure during peak times.
The company’s first product, ORCA (Orchestrated RL Control Agents), deploys many smart “agent” programs across a data center, wherever power is consumed or produced: cooling distribution units, racks, servers, and GPUs, for example.
Those individual pieces of software continuously monitor conditions in their corner of the data center and send the data back to a centralized control system that oversees the power usage of the whole facility, an approach known as a multi-agent formulation. That “brain” program coordinates the various agents, making bigger-picture decisions to balance power usage and ultimately ensure the data center is running at the highest capacity possible.
Over time, the agents learn how to make better choices, helping ORCA adapt quickly as conditions in the data center change.
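Hammerhead hasn't published ORCA's internals, but the agent-and-coordinator split described above can be sketched in a few lines. The sketch below is a hypothetical toy, not Hammerhead's system: it substitutes a simple proportional allocator for the reinforcement-learning policies, and all names (`Agent`, `allocate`, the device labels) are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    # Hypothetical per-device agent: reports its power demand and
    # receives a budget back from the central coordinator.
    name: str
    demand_w: float      # watts the device wants right now
    budget_w: float = 0.0

def allocate(agents: list[Agent], facility_cap_w: float) -> list[Agent]:
    """Toy central 'brain': split the facility's power cap across agents.

    If total demand fits under the cap, everyone gets what they asked for;
    otherwise budgets are scaled down proportionally so the cap is never
    exceeded — the 'respect existing power limits' constraint.
    """
    total = sum(a.demand_w for a in agents)
    scale = min(1.0, facility_cap_w / total) if total else 0.0
    for a in agents:
        a.budget_w = a.demand_w * scale
    return agents

# Example: three devices asking for 1.2 MW total under a 1 MW cap.
fleet = [
    Agent("rack-7", 400_000),
    Agent("cdu-2", 300_000),
    Agent("gpu-srv-9", 500_000),
]
allocate(fleet, 1_000_000)
```

In a learning-based version, each agent's policy would adjust its demand signal over time based on observed outcomes, which is where the "agents learn to make better choices" behavior comes from.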
Hammerhead is starting with deployments of ORCA in existing data centers, focusing on those serving AI workloads. But the company is also targeting greenfield opportunities, with the end goal of ensuring new data centers don’t come online running at the same low utilizations they have in the past, Kar said.
And it’s actually the co-location and enterprise data centers — not the hyperscalers — that are Hammerhead’s “primary focus,” he added. “They have a unique problem, where not only are they serving hyperscalers but they’re also serving this whole spate of other use cases, from high performance computing to batch inferencing.”
To date, Hammerhead has finished three demonstrations, which have shown “remarkable results,” Kar said. He declined to name specific data center partners.
The company is “hyper-focused on solving the problem for the data centers themselves,” he added, meaning it’s not working with utilities or power providers. Its business model is solely focused on operational efficiency: Hammerhead uses a performance-based revenue model, taking a share of the additional value or revenue created using the power its technology unlocks.
According to the company itself, in constrained markets, each additional megawatt of unlocked power can be worth as much as $50 million.
The map of data center flexibility
Hammerhead sits at a unique point in the broader landscape of data center power solutions, explained Laura Katzman, a partner at Buoyant Ventures, which focuses in part on efficient computing.
Buoyant’s map of that market, Katzman explained, has to date been made up of four main buckets: companies building efficiency tools to bring down the cost of data center operations (like HVAC cooling), those focused on grid services and flexibility (like EmeraldAI), energy optimization solutions, and workload and compute optimization solutions.
Looking at that market, Buoyant noticed that a lot of pitches offered “more of a point solution,” she added. “We kind of wondered how they would reach full scale.”
Hammerhead is part of a fifth, still emerging category for Buoyant, that Katzman sees as “full stack flexibility solutions.” The company is essentially orchestrating across the other four categories, she explained.
Figuring out exactly how to measure the increase in profits a data center sees from ORCA is still a work in progress. One potential metric is “dollars per token per watt,” in other words, how much value a data center gets out of each unit of AI work for the power it spends. Another is utilization, Katzman said: tracking how many of the megawatts under a data center’s management are actively being used.
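The figures scattered through the article are enough for a back-of-the-envelope illustration of what the utilization metric implies. The facility size below is hypothetical, and reading "a ~30% output boost" as 30% more usable power is an assumption for illustration only:

```python
# Back-of-the-envelope math from figures cited in the article.
ALLOCATED_MW = 100           # hypothetical facility power allocation
baseline_util = 0.35         # midpoint of the "30% to 40%" utilization Kar cites
boost = 0.30                 # ~30% output boost Hammerhead targets
value_per_mw = 50e6          # "$50 million" per unlocked MW in constrained markets

used_mw = ALLOCATED_MW * baseline_util       # power actually in use today
unlocked_mw = used_mw * boost                # extra power if output rises ~30%

print(unlocked_mw)                  # ~10.5 MW of additional usable power
print(unlocked_mw * value_per_mw)   # ~$525M of potential value, on those figures
```

Even on these rough assumptions, the arithmetic shows why a performance-based revenue share could be lucrative: a modest utilization gain at a single large site translates into a very large dollar figure.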
And though Hammerhead may seem to risk making itself redundant — what is its role if all future data centers are already optimized for power efficiency? — Katzman is confident in the company’s longevity. Greenfield opportunities are still a long way off, and Hammerhead’s first order of business is to meet the needs of its current brownfield customers and really prove its value proposition. “Then they’ll be able to scale, not only to greenfield, but also to all types of data centers that exist today,” Katzman added.
A lot of Buoyant’s confidence in Hammerhead is because of the startup’s founding team. The fact that they had already built and scaled Autogrid, then sold it to Schneider Electric, was key: They understand “the technical complexity of these large scale systems and how that interacts with some operational realities, and they have a lot of credibility with OEMs and utilities and data centers,” Katzman said.
Kar and Hammerhead aren’t the only Autogrid alums now working in the data center world: earlier this year Autogrid founder Amit Narayan founded Gridcare, a startup using AI to find additional capacity on the grid. “Hammerhead’s work to unlock stranded power for high-value AI workloads is deeply complementary to GridCARE’s mission of unlocking additional grid capacity for AI factories,” Narayan, whose early-stage venture fund Aina Climate also invested in Hammerhead, wrote on LinkedIn.
Hammerhead’s approach may appear at odds with flexibility startups like EmeraldAI, which is focused on enabling flexible, grid-responsive workloads via prioritization and real-time throttling. But Katzman believes there’s room for the tools to work in concert, enabling multi-layered flexibility with infrastructure that’s both optimized and responsive to grid signals.
“When you’re able to intelligently, in real time, orchestrate across both the energy side of things, backup generators, cooling and AI workloads, you’re able to unlock the capacity that the EmeraldAIs of the world don’t,” Katzman said. “I think these two tools can be very complementary, and they can work within the same data center environment.”