Chapter 3: Electrons! We Need More Electrons!
- Sean M. Walsh

- Dec 16, 2025
- 10 min read
Updated: Jan 15

The "thing behind the thing", a phrase credited to my friend and colleague James Rose, is the intensifying American electricity shortage: the hidden bottleneck that will determine who wins the race for AI supremacy.
"One can prophesy with a Daniel's confidence that skilled electricians will settle the battles of the near future. But this is the least. In its effect upon war and peace, electricity offers still much greater and more wonderful possibilities." – Nikola Tesla
"The biggest issue we are now having is not a compute glut, but its power. You may actually have a bunch of chips sitting in inventory that I can't plug in. In fact, that is my problem today." — Satya Nadella, CEO of Microsoft, November 2025
“I’d put my money on the sun and solar energy. What a source of power! I hope we don’t have to wait till oil and coal run out before we tackle that.” - Thomas Edison
Key Points:
1. The real race for AI supremacy, and with it American sovereignty, is actually the race to power all of the necessary computer servers. Electricity is the bottleneck.
2. This is because there is a litany of growth constraints on expanding datacenter electricity supply in the US, relative to the explosive growth of electricity demand.
3. Walsh has reexamined this problem from square one, and solved it with the novel invention of modular Solar Computing Clusters.
Key Stats:
44 Gigawatts - estimated US datacenter electricity shortage by 2028 (per Morgan Stanley).
31,500,000 - number of average American homes that together would consume 44GW of power.
1,400 Watts - approximate fully loaded electricity draw of a single Nvidia H100 GPU (per SemiAnalysis).
10,000,000 GPUs - estimated size of leading AI computing cluster by 2028 (per Situational Awareness).
14GW - approximate electrical footprint of single leading AI cluster, and the approximate power draw of states like Illinois, North Carolina, and New York.
100% - projected percentage of current US electricity production absorbed by AI compute by 2030 (per Situational Awareness).

Revelation In Redmond
When the CEO of Microsoft cannot get electricity for his Nvidia GPUs, you know there's a problem.

In late October 2025, Microsoft CEO Satya Nadella sat down for a podcast interview alongside OpenAI CEO Sam Altman. The host, Brad Gerstner, asked whether they agreed with Nvidia CEO Jensen Huang's recent assertion that there was "no chance" of a compute glut in AI over the next few years.
Nadella's response cut through the usual corporate optimism with unusual candor: "The biggest issue we are now having is not a compute glut, but it's power—it's sort of the ability to get the builds done fast enough close to power."
Then came the admission that would have been unthinkable just two years earlier: "You may actually have a bunch of chips sitting in inventory that I can't plug in. In fact, that is my problem today. It's not a supply issue of chips; it's actually the fact that I don't have warm shells to plug into."
Warm shells. The term sounds almost quaint—data center buildings with basic infrastructure in place but lacking the electrical connections and cooling systems required to actually operate. Microsoft, with its $3.8 trillion market cap and $80 billion annual AI investment budget, had purchased thousands of the most advanced AI processors ever manufactured. And they were sitting in warehouses, waiting for electricity.
This is the thing behind the thing, a phrase I credit to my friend and colleague James Rose. The AI race that dominates headlines—the competition for chips, for talent, for models, for data—obscures a more fundamental contest happening at the substrate layer of the entire digital economy. The real race is not for silicon. It's for watts.
Exponential Scaling Meets The Electric Grid

To understand the scale of the challenge, consider how computational power has grown. Leopold Aschenbrenner's analysis in "Situational Awareness" documents a roughly 0.5 orders of magnitude increase in AI training compute per year. That translates to roughly tripling every year, a tenfold increase every two years, and the trend shows no signs of slowing.
Each order of magnitude in compute requires roughly an order of magnitude more power. The GPT-4 training cluster, completed in 2022, used approximately 10,000 H100-equivalent GPUs and drew roughly 10 megawatts of power—enough for about 10,000 average American homes. The trend line points to clusters requiring 10 million H100-equivalents by 2028, drawing 10 gigawatts of continuous power. That's equivalent to the total electricity consumption of a medium-sized U.S. state like Illinois or North Carolina.
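The scaling arithmetic above can be sketched in a few lines. The per-GPU draw comes from the key stats (~1,400 W per fully loaded H100); the average-home figure and the `overhead` parameter are illustrative assumptions, and note that GPU draw alone gives ~14 MW for a 10,000-GPU cluster, in the same ballpark as the ~10 MW cited.

```python
# Back-of-envelope cluster power, using the ~1,400 W per H100 figure
# from the key stats. Home draw and overhead are assumptions.

H100_WATTS = 1_400       # approx. fully loaded draw per GPU (per SemiAnalysis)
AVG_HOME_WATTS = 1_200   # rough average continuous draw of a US home (assumption)

def cluster_power_gw(num_gpus: int, overhead: float = 1.0) -> float:
    """Continuous draw of a GPU cluster, in gigawatts.
    Set overhead > 1 to model cooling and networking (e.g. 1.3 for PUE 1.3)."""
    return num_gpus * H100_WATTS * overhead / 1e9

def homes_equivalent(gigawatts: float) -> int:
    """How many average homes the same power would supply."""
    return round(gigawatts * 1e9 / AVG_HOME_WATTS)

print(cluster_power_gw(10_000))       # GPT-4-era cluster: 0.014 GW (14 MW)
print(cluster_power_gw(10_000_000))   # projected 2028 cluster: 14.0 GW
```

At ten million GPUs the chip draw alone lands at 14 gigawatts, which is why the key stats compare a single leading cluster to the consumption of entire states.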
And that's just for a single training cluster. Factor in inference capacity, redundancy, multiple competing labs, and the reality becomes even more staggering. McKinsey projects total U.S. data center demand will grow from 25 gigawatts in 2024 to more than 80 gigawatts by 2030. By 2035, Deloitte estimates AI data centers alone could require 123 gigawatts—a thirtyfold increase from 2024's 4 gigawatts.
Here's the problem: U.S. electricity production has barely grown 5% in the last two decades. The grid was built for gradual, predictable growth in residential and commercial demand. It was not designed for concentrated, exponential industrial loads that can rival the consumption of entire states.
After 50 Years, Moore's Law Is Slowing Down
There's another factor accelerating this crisis that rarely makes headlines: Moore's Law is dying.

For fifty years, the semiconductor industry delivered roughly exponential improvements in computing performance per watt. Transistors shrank, efficiency improved, and the same power budget yielded ever more computation. This allowed data centers to grow modestly in physical footprint while their capabilities exploded.
That era is ending. As IEEE Spectrum reported, "the last 15 years have seen a big falloff in how much performance improves with each new generation of cutting-edge chips." Dennard scaling—the principle that smaller transistors use proportionally less power—effectively ended in the mid-2000s. The doubling period for peak-output efficiency has slowed from roughly 18 months to nearly three years.
The implications are profound. If you can't make chips more efficient as fast as before, you must deploy more chips to maintain consistent growth in computational power. More chips mean more data centers. More data centers mean more electricity—at accelerating rates.
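That tradeoff can be made concrete with a toy calculation: hold compute growth fixed and vary how often performance-per-watt doubles. The rates are illustrative, taken from the figures discussed above, not measured data.

```python
# How slower efficiency gains translate into faster power growth.
# Illustrative rates from the surrounding text, not measured data.

COMPUTE_GROWTH_PER_YEAR = 3.0   # ~0.5 orders of magnitude per year

def power_growth_per_year(efficiency_doubling_years: float) -> float:
    """Annual growth in electrical power needed to sustain compute growth,
    given how often performance-per-watt doubles."""
    efficiency_growth = 2 ** (1 / efficiency_doubling_years)
    return COMPUTE_GROWTH_PER_YEAR / efficiency_growth

print(power_growth_per_year(1.5))  # 18-month doubling: power grows ~1.9x/yr
print(power_growth_per_year(3.0))  # 3-year doubling:   power grows ~2.4x/yr
```

Under these assumptions, the slowdown from an 18-month to a 3-year efficiency doubling pushes annual power growth from roughly 1.9x to roughly 2.4x: the efficiency wall compounds directly into the electricity bill.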
The AI industry is hitting the efficiency wall precisely as demand is exploding. This isn't a temporary supply chain hiccup. It's a structural collision between the physics of semiconductors and the mathematics of exponential scaling.
And with it, a new economy was born.
The Anatomy of a Power Crisis
Morgan Stanley's analysis quantifies the approaching collision.

For the 2025-2028 period, they project approximately 65 gigawatts of U.S. data center power demand. Against this, they count available capacity: near-term grid access of 12-15 gigawatts, plus roughly 6 gigawatts of data centers under construction. The arithmetic yields a shortfall of 44-47 gigawatts.
To put that in perspective, 44 gigawatts is equivalent to the output of 44 large nuclear reactors, nearly half the capacity of the entire U.S. nuclear fleet. It's enough electricity to power more than 31 million American homes. And this gap must be closed in less than three years.
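Morgan Stanley's arithmetic, restated as a quick check using the figures quoted above:

```python
# Morgan Stanley's 2025-2028 supply/demand arithmetic, figures from the text.

demand_gw = 65                 # projected US data center power demand, 2025-2028
grid_access_gw = (12, 15)      # near-term grid access range
under_construction_gw = 6      # data centers already being built

# High grid access gives the low end of the shortfall, and vice versa.
shortfall = tuple(demand_gw - (g + under_construction_gw)
                  for g in reversed(grid_access_gw))
print(shortfall)  # (44, 47) gigawatts
```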
But here's where the problem becomes almost comically intractable: the infrastructure required to deliver that power simply cannot be built fast enough.
Transformer lead times have exploded. These critical components—which step voltage up and down at every major junction in the electrical grid—now require four to seven years to manufacture.
Nuclear power plants represent the most reliable form of baseload generation, but new plants won't provide meaningful capacity until the mid-2030s at the earliest. The regulatory and construction timeline for nuclear is measured in decades, not quarters.
Natural gas plants can be built faster—but as discussed in a future chapter, turbines have a 5-7 year lead time—and they require gas pipelines, which face their own permitting and construction bottlenecks.
Transmission line expansion confronts perhaps the most daunting timeline of all. New high-voltage transmission projects routinely face seven-year waits just for interconnection approval.
Grid interconnection backlogs now stretch 5-8 years in many regions, with some projects waiting seven years simply for permission to connect to the grid.
The best-case scenario, accounting for aggressive deployment of natural gas turbines, fuel cells, and conversion of bitcoin mining facilities, still leaves a gap of up to 20 gigawatts by 2028. That's the equivalent of 13-20 nuclear power plants worth of unmet demand—assuming everything else goes perfectly.
But, wait! There's more...

And, more...

Dark Towers in Silicon Valley
The AI electricity crisis is worsening.

In Santa Clara, California—the heart of Silicon Valley, minutes from Nvidia's headquarters—the shortage has already arrived.
Two newly constructed data centers sit completely idle. Digital Realty's four-story SJC37 facility and Stack Infrastructure's SVY02A campus were built to host tens of megawatts of high-density AI hardware. Together, they represent nearly 100 megawatts of ready capacity. The buildings are complete. The cooling systems are installed. The network connections are in place.
They're waiting for electricity.
Silicon Valley Power, the city's publicly owned utility, is racing to expand supply. The city has committed $450 million in grid upgrades scheduled for completion by 2028. But 2028 is three years away, and even then there's no guarantee the power will arrive on schedule. SVP told Bloomberg it is "sequencing power delivery among customers as new substations and transmission lines come online."
Sequencing. The word has an almost quaint quality to it, like rationing during wartime. In the world's premier technology hub, brand-new data centers may sit empty for years while the utility figures out how to actually deliver power to them.
The irony is tragic: Nvidia creates the most powerful AI chips ever manufactured, and its own hometown can't provide the electricity to run them.
The Three-Body Problem of AI Infrastructure

The data center power crisis is fundamentally a problem of mismatched timelines—what might be called the three-body problem of AI infrastructure.
Data centers can be built in 2-3 years. They're essentially sophisticated warehouses with specialized electrical and cooling systems. Construction is well-understood and can be parallelized across multiple sites.
Power generation now takes 3-10 years depending on the source.
Transmission and other grid infrastructure requires 5-15 years. What's more, new substations, high-voltage lines, and grid upgrades face labyrinthine permitting processes, environmental reviews, and right-of-way negotiations that can drag on for decades.
These three systems must grow together, but they operate on fundamentally incompatible timescales. You can build a data center before the power plant that feeds it exists, before the transmission lines that connect them are even approved. And that's exactly what's happening across America.
The Opportunity in the Gap
Every crisis contains an opportunity, and the scale of the opportunity here is extraordinary.
The 44+ gigawatt shortfall projected by 2028 represents not just unmet demand but a market gap of historic proportions. At current economics, each gigawatt of data center capacity requires roughly $50-60 billion in total capital expenditure.
The unfilled demand represents hundreds of billions of dollars in potential revenue for whoever can deliver power to where it's needed.
More importantly, the constraint is temporary only if solutions emerge. If the power gap persists, it doesn't just delay AI development—it determines who controls it. The labs and companies that can access power will advance. Those that cannot will fall behind. Nations that solve their power constraints will lead; those that don't will follow.
What's needed is not incremental improvement but architectural innovation—new approaches to the fundamental problem of delivering electricity to computation at unprecedented scale and speed.
The Solar Computing Cluster does exactly this.
The Modular Solar Computing Solution

I anticipated this shortage nearly a decade ago, when the contours of the AI revolution were just becoming visible. The constraints of the legacy grid were apparent. A collision was inevitable; only the timing was uncertain.
So I began developing a fundamentally different approach: solar-powered data centers designed from the ground up to bypass the bottlenecks that are now strangling the industry.
The key insight is that the constraints facing traditional data centers are artifacts of their architecture, not laws of physics. Conventional facilities require grid interconnection because they were designed around grid power. They need substations because they draw from centralized generation. They face years-long queues because they must navigate the same approval processes as everyone else.
My Solar Computing Clusters sidestep these constraints entirely. They generate power on-site through integrated solar arrays. They store energy locally through advanced battery systems. They operate independently of grid interconnection, bypassing the approval queues that trap conventional projects for years.
This isn't merely a green energy story—though the environmental benefits are substantial. It's a speed story. While competitors wait years for grid connections that may never arrive, Solar Computing Clusters can deploy in months. While traditional facilities negotiate with utilities over power allocations, these systems generate their own. While the industry argues over who bears the cost of transmission upgrades, this approach renders the question moot.
The technical challenges were substantial. Running high-density AI workloads requires consistent, reliable power—something solar alone cannot provide. Integrating generation, storage, and consumption into a single optimized system demanded years of engineering. Achieving cost parity with grid power required innovations across every component of the stack.
But the system works. And more importantly, it scales.
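For a flavor of the sizing problem, here is a minimal back-of-envelope for carrying a constant load on solar plus batteries. All parameters (capacity factor, night length, round-trip efficiency) are illustrative assumptions for a rough sketch, not the author's actual design.

```python
# Rough off-grid solar + battery sizing for a constant load.
# All parameters are illustrative assumptions, not a real design.

def size_system(load_mw: float, capacity_factor: float = 0.25,
                night_hours: float = 14, round_trip_eff: float = 0.9):
    """Panel capacity (MW) and storage (MWh) to carry a constant load.
    capacity_factor: fraction of nameplate a fixed array averages over a year.
    night_hours: longest stretch the battery must bridge without sun."""
    panel_mw = load_mw / (capacity_factor * round_trip_eff)
    storage_mwh = load_mw * night_hours / round_trip_eff
    return panel_mw, storage_mwh

panels, storage = size_system(load_mw=100)
print(round(panels), round(storage))  # ~444 MW of panels, ~1556 MWh of storage
```

The multiplier is the striking part: under these assumptions, a fixed array needs roughly 4-4.5x the load in nameplate panels plus overnight storage, which is why integrating generation, storage, and consumption into one optimized system is the core engineering problem.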
The Window
The AI industry has less than 36 months to close a 44-gigawatt power shortage. Legacy approaches cannot achieve this. The transformer shortage alone, with lead times of four to seven years for critical components, makes conventional solutions mathematically impossible within the timeframe.
This creates a window of extraordinary opportunity for alternative approaches. The companies that can deliver power to AI infrastructure faster than the grid can supply it will capture market share that might otherwise take decades to build. First movers will establish relationships with hyperscalers desperate for capacity. Early deployments will prove the technology and build the operational track record that unlocks institutional capital.
The financial opportunity is measured in hundreds of billions of dollars. But the strategic opportunity may be even larger. The entity that solves the power constraint doesn't just capture a market—it becomes essential infrastructure for the most transformative technology of our era.
In the next chapter, we'll examine the technical architecture of solar-powered data centers in detail: how they work, why they scale, and what it takes to deploy them at the speed the moment demands.
The race for AI supremacy will not be won by whoever has the best algorithms or the most training data. It will be won by whoever can actually turn on their computers.

Sources:
Microsoft CEO says the company doesn't have enough electricity to install all the AI GPUs in its inventory, Tom's Hardware
The Free World Must Prevail, Situational Awareness
Can Advanced Materials Address Moore's Law Slowdown, Electronic Design
Silicon Valley Datacenters Totaling Nearly 100MW Could Sit Empty For Years, Tom's Hardware
Three Body Problem, Wikimedia
Can US Infrastructure Keep Up With The AI Economy, Deloitte
2025: Year Of The Datacenter Mania, AI-Supremacy.com
Racing To The $1 Trillion Cluster, Situational Awareness
The Huge Problem With The AI Revolution, 44 Nuclear Power Plants By 2028, ZeroHedge
AI Datacenter Energy Dilemma, SemiAnalysis
How Datacenters And The Energy Sector Can Sate AI's Hunger For Power, McKinsey
US Datacenters To Require 22% More Grid-Based Power By End Of 2025, DatacenterDynamics




