
Introduction: The Quiet Crisis


How a retired aerospace engineer saw the coming collapse of American AI supremacy, and built the solution before most even knew there was a problem.


Source: @TheKobeiessiLetter , Dec 19, 2025

"The future is already here—it's just not evenly distributed." — William Gibson


While watching the sun melt into the horizon on a warm evening in 2018, on a terrace overlooking the Caribbean in Dorado Beach, Puerto Rico, I considered electricity.


Not in the abstract way most people probably think about it. I was thinking about electricity from an engineering perspective: as a highly perishable commodity upon which every facet of our modern world now relies. Dangerous, expensive to produce and store, difficult to move, controlled by powerful interests with sometimes questionable motives. 


I had spent the previous five years building and operating some of the largest Bitcoin mining facilities in the United States. Before that, I had spent years helping to manage a $2 billion Silicon Valley investment firm. Before that, I had over a decade of success with a variety of dot-com businesses. Before that, I had helped design the operating system and in-orbit antennas for DirecTV satellites. Before that, I had been a research programmer using Cray supercomputers at the National Center for Atmospheric Research, machines so powerful they required their own cooling systems and dedicated electrical substations. Before that, I'd studied at the University of Colorado Boulder and the United States Air Force Academy. And before all of that, I had grown up in a tiny Michigan farm town with a population of about 1,000.


In all of that work, spanning several decades, from supercomputers to satellites to cryptocurrency, one problem appeared again and again: underappreciated by nearly everyone, and served by infrastructure that was in many ways obsolete.


Electricity production and consumption.


Not the physics of it. The generation of it. The delivery of it. The reliability of it. The availability of it at the scale that serious computation demands. And the independence of it.


Every datacenter I had ever built, operated, or contracted faced the same fundamental constraint: each was completely dependent on, and tethered to, an electrical grid that was aging, congested, increasingly unreliable, and structurally hostile to the kind of continuous, high-density power that 21st-century advanced computation requires. We could buy the fastest processors money could build. We could write the most elegant algorithms our minds could conceive. But none of it mattered if the power flickered, if the utility delayed our interconnection request, if the grid simply couldn't deliver the megawatts we needed.


As I sat on that terrace watching the shimmering Atlantic swallow the sun, a question formed that would consume the next seven years of my life:


What if datacenters didn't need the grid at all?


This book exists because I found the answer.


Not a theoretical answer. Not a whitepaper answer. A real, operational, commercially proven answer that has been generating electricity and powering computation continuously for more than three years in the New Mexico desert, without drawing a single electron from the American electrical grid.


I named the invention a Solar Computing Cluster. It converts photons into computational power through a self-sufficient system that I’ve now protected with multiple issued U.S. patents covering every critical aspect of the system: the physical layout, the power management, the energy storage integration, the sizing parameters, the commercial applications.


For years now, whenever I make a video call from Puerto Rico to the team at my New Mexico off-grid datacenter campus, it works flawlessly, day or night. That small fact, mundane on the surface, represents something profound. It means that sovereign, grid-independent computation is no longer a dream or a projection. It is infrastructure that exists and works.


To make matters even more exciting, my electricity production cost is below $0.03 per kilowatt-hour, among the cheapest electricity anywhere in the United States. The datacenter operates 24 hours a day, 365 days a year, with uptimes that match or exceed those of grid-connected facilities. It uses battery technology that costs eighty percent less than lithium-ion, poses no fire risk, and recycles with ease.


But this book is not only about the technology I built. This book is also about why I built it. It is about the crisis that made it all necessary.


The United States faces an infrastructure emergency that few are discussing honestly.


We are in the early stages of the most consequential technological transformation since the invention of the atomic bomb: the rise of artificial intelligence as a general-purpose technology that will reshape every sector of the economy, every institution of government, every dimension of human life.

This transformation requires computation at a scale that is hard to fathom. Training a single frontier AI model will soon demand more electricity than some small countries consume in a year. The cumulative datacenter power required to support the AI revolution must grow from under 20 gigawatts today to somewhere between 100 and 450 gigawatts by the end of this decade. For context, average electrical demand across the entire United States is only about 450 gigawatts today.


The American electrical grid cannot deliver this power.


Not because of policy failures that could be reversed. Not because of temporary supply chain disruptions that will resolve. Because of structural, physical, mechanical constraints that cannot be wished away.


The grid is old. The average large power transformer in America has exceeded its design life by about 30%. Equipment installed when Woodrow Wilson was president still carries electricity to homes and businesses. The US electric grid has accumulated over $2 trillion in deferred maintenance: repairs that were postponed year after year until postponement became the operating assumption.


The grid is slow. Connecting a new power source or load to the grid now routinely takes three to seven years. The so-called interconnection queue contains 2,600 gigawatts of proposed projects, more than twice the total installed generation capacity of the United States, waiting for approval that may never come. Companies that need power in 2026 are being told to expect connections in 2031 or 2032.


The grid is constrained. The transformers that step voltage up and down for transmission and distribution cannot be manufactured fast enough. Ninety percent of large power transformers are imported, many from countries that may not have America's best interests at heart. Lead times exceed two years. Some require five years to procure.


The grid is contested. Communities across America are blocking grid expansion and datacenter construction. More than $160 billion in projects have been delayed or cancelled due to local opposition. Electricity bills in datacenter-heavy regions are rising fast enough to cause genuine financial hardship for residents who had no say in hosting these facilities.


And the grid is vulnerable. Chapter 8 of this book documents evidence that Chinese-manufactured transformers installed in American substations contain backdoor kill-switches that could allow hostile actors to disrupt electrical service at will. This is not speculation; it is the subject of active federal investigation and has prompted emergency procurement actions by utilities nationwide.


Against these constraints, consider the demand. Morgan Stanley projects a 44-gigawatt shortfall in AI datacenter electricity supply by 2028, equivalent to 33 million American homes. McKinsey estimates $6.7 trillion in required investment by 2030 just to keep pace with AI-driven demand. Anthropic, one of the leading AI research companies, warns that the United States needs at least 50 gigawatts of AI-dedicated capacity by 2028 merely to maintain technological parity with China.
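As a sanity check on that homes equivalence, here is a back-of-envelope calculation. The 44-gigawatt and 33-million-home figures come from the text above; the 8,760 hours-per-year constant and the resulting per-home consumption are my own arithmetic, offered only to show the numbers are mutually consistent with published US household averages of roughly 10,000 to 11,000 kWh per year:

```python
# Back-of-envelope check: does a 44 GW shortfall really equal ~33 million homes?
shortfall_gw = 44            # projected AI datacenter shortfall (from the text)
homes = 33_000_000           # claimed household equivalent (from the text)

# Spread the shortfall evenly across the homes: GW -> kW, then divide.
avg_kw_per_home = shortfall_gw * 1e6 / homes

# Convert that continuous draw into annual consumption (8,760 hours/year).
annual_kwh_per_home = avg_kw_per_home * 8760

print(f"{avg_kw_per_home:.2f} kW average draw per home")
print(f"{annual_kwh_per_home:,.0f} kWh per home per year")
```

The implied figure of roughly 1.3 kW of continuous draw, or on the order of 11,700 kWh per year, sits close to the published US household average, so the comparison holds up to rough scrutiny.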


China, for its part, reportedly deployed more than 400 gigawatts of new generation capacity over a recent span of less than 24 months. The United States managed several dozen.


As I write this in December 2025, China is widely believed to possess a record 3.75 terawatts of electricity generation capacity, nearly three times as much as the United States, which has roughly 1.3 terawatts.


The gap between what AI requires and what the grid can deliver grows wider every month. Every AI datacenter that fails to come online on schedule represents training runs that don't happen, models that don't ship, competitive advantages that accrue to America's rivals. In an industry where capabilities double every six months or less, a four-year delay in power availability isn't a setback; it's surrender.


This is the crisis. Not a crisis of algorithms or talent or investment. A crisis of electricity supply for AI datacenters. A crisis of infrastructure. A crisis so fundamental that most of the people building AI systems have not yet fully grasped its implications.


I wrote this book because I believe Americans deserve to understand the looming problem before it’s too late.


The chapters that follow will walk you through the evidence systematically. You will learn about the interconnection queue that has paralyzed grid expansion and stranded 2,600 gigawatts of generation capacity in bureaucratic limbo. You will learn about the transformer crisis, the critical shortage of electrical equipment that forms the backbone of power distribution, and the foreign supply chain dependencies that make this shortage a matter of national security. You will learn about the cascade of retirements removing reliable generation capacity from the grid faster than replacement capacity can be added. You will learn about the community opposition movement that has blocked billions of dollars in datacenter investment and shows no signs of slowing.


You will learn, in short, that the conventional, legacy pathways to powering America's AI future are closed. Not closing. Closed.


Then you will learn about a solution.


Solar Computing Clusters are not incremental improvements to existing approaches. They are a fundamental reconception of how computation and energy relate to each other. Instead of building datacenters first and hoping the grid will eventually deliver power, the Solar Computing Cluster produces its own power on-site, from sunlight, storing enough in batteries to operate continuously through nights and cloudy weather.


This architecture eliminates every bottleneck that has paralyzed legacy datacenter development:


No interconnection queue. The system doesn't connect to the grid, so it doesn't need permission from utilities or regulators who are years behind schedule.


No equipment shortages. The system uses lower voltages and a DC power architecture that bypasses the transformers backlogged in legacy supply chains.


No turbine lead times. The system doesn't use gas turbines that currently require four to seven years to procure.


No community opposition. Off-grid solar facilities don't raise electricity bills for urban residents or threaten property values in suburban neighborhoods.


No labor shortages. Solar Computing Clusters can be assembled by any of the millions of construction workers and low-voltage electricians in the US. They do not require the specially trained high-voltage electricians that legacy grid and datacenter systems demand.


And because the system produces electricity at a fraction of grid costs, the economics are extraordinary. The same kilowatt-hour that costs pennies to produce can be transformed into computation worth dollars—artificial intelligence workloads, cloud computing services, cryptocurrency mining. The spread between production cost and output value creates margins that would be impossible for grid-dependent operators to match.
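The spread described above can be made concrete with some illustrative margin math. The $0.03/kWh production cost is from the text; the revenue-per-kWh values and the $0.12/kWh grid price are hypothetical placeholders, not operating data, and the sketch deliberately ignores capital and hardware costs to isolate the energy-cost component:

```python
# Illustrative energy-cost margin for one kWh converted into computation.
# $0.03/kWh is the production cost stated in the text; all other numbers
# below are hypothetical, chosen only to show the shape of the spread.
PRODUCTION_COST_PER_KWH = 0.03

def gross_margin(revenue_per_kwh: float,
                 cost_per_kwh: float = PRODUCTION_COST_PER_KWH) -> float:
    """Energy-cost gross margin, as a fraction of revenue, per kWh consumed."""
    return (revenue_per_kwh - cost_per_kwh) / revenue_per_kwh

# Compare against a grid-dependent operator paying a hypothetical $0.12/kWh.
for revenue in (0.50, 1.00):   # hypothetical compute revenue per kWh consumed
    print(f"${revenue:.2f}/kWh revenue -> "
          f"off-grid margin {gross_margin(revenue):.0%}, "
          f"grid margin {gross_margin(revenue, 0.12):.0%}")
```

Even under these toy numbers, the point survives: when revenue per kilowatt-hour is an order of magnitude above production cost, a four-times-cheaper input price widens the margin gap in a way a grid-dependent competitor cannot close by efficiency alone.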


Researchers from Scale Microgrids, Stripe, and Paces published a whitepaper in December 2024 confirming what I had demonstrated years earlier: off-grid solar microgrids are "likely the only clean solution that could also achieve the scale and speed requirements" for AI infrastructure. They identified more than 1,200 gigawatts of suitable deployment capacity in the American Southwest alone—enough to power the entire projected growth of U.S. datacenters through 2030 four to forty times over.


What they described as a promising possibility, I had already designed, patented, and built at commercial scale.



I want to be clear about my purpose in writing this book.


I am an engineer. I have spent my career solving problems and building things. The technology I developed exists because I saw a structural failure in critical infrastructure. As luck would have it, I possess an unusual combination of experience in aerospace engineering, electricity systems, supercomputing, datacenter operations, renewable energy, and Silicon Valley tech investing, enough to understand both the problem and a pathway to a solution.


I am also a man of firm Christian faith. I believe the work we do in this world matters beyond the immediate economic returns generated. I believe sovereign electrical infrastructure can shape human advancement in ways that transcend quarterly earnings. I believe the families, communities, and nations that will depend on reliable computation in the decades ahead deserve infrastructure that is low-cost, decentralized, resilient, and independent.


The people I have worked with on this technology share that conviction. Tim Shaler, the CEO of 639 Solar in New Mexico, managed a $9 billion energy bond portfolio at PIMCO before he saw the technical and moral clarity of what we were building and invested a substantial portion of his personal wealth. 639 Solar’s construction manager, the journeyman electrician who helped construct our first microgrid campus, values working close to home because he has a child with serious medical needs; the stability we could offer changed his family's life. Ramez Naam, the renowned futurist and renewable energy expert, began as a skeptic, reviewed the operational data, and became an investor within days.


These are just a few of the people who've invested parts of their lives into datacenter infrastructure that works for America's future; infrastructure that exists because someone decided to build it before the crisis became undeniable.


I wrote this book because I believe the crisis is now close enough that more people need to understand it. The race for AI supremacy will not wait for the US electric grid to catch up. The companies, institutions, and nations that secure reliable computational power, which requires electricity, will shape the future. Those that don't will be shaped by it.


The question is not whether new approaches to powering computation will emerge. The physics and economics make that inevitable. The question is whether those approaches will be designed intentionally, deployed responsibly, and aligned with American interests, or whether they will be improvised in desperation while critical advantages slip away.


I have tried to answer that question with tangible construction in the beautiful New Mexico desert. The Solar Computing Clusters operating in New Mexico today represent a commercial-scale proof of concept that can scale to meet national needs. The patents securing this technology represent protection for the design space that any serious developer will eventually occupy.

The opportunity is real. The technology works. The window is open.


What remains is the decision to act.



The pages that follow will take you through the crisis and my solution in detail. You will encounter statistics, technical specifications, and financial projections. But I hope you will also encounter something else: a realization that the infrastructure we build today determines the possibilities available to our children and grandchildren.


Computation is the substrate on which 21st-century civilization runs: our communications, our commerce, our medicine, our defense, our science. The electricity that powers computation is no longer a commodity to be taken for granted. It is a strategic asset whose availability shapes what nations can achieve.


The United States faces a choice. We can continue hoping that the grid will somehow deliver the power that AI requires, despite overwhelming evidence that it cannot. Or we can build something new, computing clusters that produce their own power, operate independently of centralized systems, and scale to meet whatever demands the future brings.


I have made my choice. 


I built the prototype. 


I secured the patents. 


I successfully built two commercial-scale off-grid 100% solar/battery datacenters at enormous personal expense. 


I wrote this book.


Now the choice is yours.


Sean Walsh

Dorado Beach, Puerto Rico

December 2025







For more information on Solar Computing Clusters, my patent portfolio, and partnership/acquisition opportunities, email me or contact me on LinkedIn.




About the Author


Sean Walsh is a retired aerospace engineer, serial entrepreneur, inventor, and investor whose career spans three decades at the intersection of advanced computation and energy systems.


Walsh began his technical career building components for NASA and serving as a research programmer at the National Center for Atmospheric Research, writing code for Cray supercomputers. He has presented at NASA's Jet Propulsion Laboratory. Afterward, he developed satellite operating systems for DirecTV. During the dot-com era, he built digital platforms for Sony Pictures and Sony Music, co-founded Fotochatter (the second-largest mobile social network globally at its peak), and led online customer acquisition for Countrywide Financial, making him the eighth-largest advertiser on the Web for a time.


Later, in Silicon Valley, Walsh helped manage a $2B AUM investment firm overseeing a dozen portfolio companies; the firm was named Middle Market Firm of the Year during his tenure. He was an angel investor in Bitcoin mining starting in 2013 and built one of the largest cryptocurrency mining operations in the United States before exiting via public transaction in 2017.


In 2018, Walsh began inventing the technology for off-grid solar-powered computing clusters. He has been awarded seven U.S. patents covering the architecture, power management, energy storage integration, and commercial applications of Solar Computing Clusters. A version of this system has operated continuously at a commercial-scale facility in New Mexico for more than three years, at a company founded by Walsh, called 639 Solar.


Walsh holds an aerospace engineering degree from the University of Colorado, Boulder. He resides in Puerto Rico.

