I spent over two decades trading North American energy. I tracked power flows across every major grid in the country, modeled capacity markets, and watched utilities struggle to keep up with demand spikes they could see coming for years. The one thing I learned above everything else: the energy business is brutally, mercilessly slow. Permitting, financing, construction, interconnection: from the moment someone decides to build a new power plant to the day it generates its first kilowatt-hour, you’re looking at years. Sometimes a decade.
That slowness was manageable when demand was predictable and grew at 1% a year.
The AI Industry Built an Energy Crisis in About 18 Months
Here’s a number worth sitting with: U.S. electricity demand was essentially flat for fifteen years. From 2008 through 2022, the grid hummed along at roughly the same total consumption, give or take some weather variation. Efficiency gains offset economic growth, and the system held steady.
Then AI happened.
Goldman Sachs Research projects that data centers (the physical infrastructure running AI training and inference) could drive 3% annual electricity demand growth through 2030. That doesn’t sound dramatic until you model out what it means for infrastructure. A 3% annual demand increase, layered on top of 104 gigawatts of hydrocarbon plant retirements, means the U.S. needs approximately 232 gigawatts of new capacity by 2030. That’s the equivalent of adding more than two hundred large power plants, all in the next six years.
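For readers who want to check that arithmetic, here is a back-of-envelope sketch. The ~660 GW baseline peak demand is an assumption chosen to be consistent with the article's figures, not a number Goldman publishes this way:

```python
# Back-of-envelope check of the capacity math above (a sketch,
# not Goldman's model; BASELINE_PEAK_GW is an assumed figure).
BASELINE_PEAK_GW = 660        # assumed U.S. coincident peak demand
GROWTH_RATE = 0.03            # projected annual demand growth
YEARS = 6                     # 2024 -> 2030
RETIREMENTS_GW = 104          # hydrocarbon retirements cited above

# Compound 3% growth over six years, then add retired capacity
# that must be replaced on top of it.
demand_growth_gw = BASELINE_PEAK_GW * ((1 + GROWTH_RATE) ** YEARS - 1)
new_capacity_needed_gw = demand_growth_gw + RETIREMENTS_GW

print(f"Demand growth by 2030: {demand_growth_gw:.0f} GW")
print(f"New capacity needed:   {new_capacity_needed_gw:.0f} GW")
```

Under those assumptions, roughly 128 GW of growth plus 104 GW of replacement lands at the ~232 GW figure.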
Meanwhile, the lead time for a combined-cycle gas turbine is 3-5 years, minimum. The interconnection queue for new power plants and renewable projects has over 2,000 gigawatts waiting for approval. A new transmission line takes a decade from concept to energization, with permitting often consuming more than half that time.

CNBC put it bluntly in February 2026: “Electricity prices will keep rising on AI data center demand.” Goldman’s analysis suggests utility revenue requirements will rise about 10% a year to cover the infrastructure investment, translating to roughly 7% annual growth in real electricity prices, on top of general inflation.
Someone is going to pay for AI’s power appetite. Right now, the answer is: you are.
Why Building Our Way Out Won’t Work
When confronted with a capacity problem, the traditional utility response is reflexive: build more. More plants, more transmission lines, more substations, more copper and concrete. It’s the only playbook the industry has ever known.
The math on that playbook doesn’t work this time.
The DOE estimated that replacing America’s aging grid infrastructure alone (not adding capacity, just modernizing what’s there) would cost trillions of dollars. Adapting distribution systems just for EV charging and rooftop solar could add another trillion. And that was before factoring in the AI-driven demand surge that nobody fully anticipated.
These are cost problems, but they’re time problems too. Even if every new power plant started construction today and every transmission upgrade began permitting tomorrow, the capacity still wouldn’t arrive in time. The demand curve is running faster than the build curve, and the gap is widening.
There’s also a structural irony buried in this math that the industry doesn’t like to talk about: the grid operates at maximum capacity for only a small fraction of the year. On most days, most hours, there is enough generation. The grid’s problem is meeting peak demand, not average consumption: those brutal afternoons in July and August when everyone runs their AC simultaneously, and now increasingly those same hours when data centers are drawing full power.
Building an entirely new power plant and running it at 15-20% capacity utilization, just to handle those peaks, costs roughly $99 per kilowatt-year. That is a spectacularly inefficient use of capital. But it’s been the only answer available, so the industry accepted it.
There is another answer. It’s been sitting in driveways and basements across America for years, waiting to be deployed at scale.
The Solution Has Been Here for Eight Years. We’ve Been Building It.
When I founded Emporia in 2018, the company’s core insight wasn’t about building the cheapest energy monitor or the fastest EV charger. The insight was about the grid itself, and what it needed to become.
For the last century, the electrical grid operated on a one-way premise: centralized generation, transmitted outward, consumed passively. The grid’s job was to balance from the supply side. When demand went up, you ramped supply. When demand dropped, you curtailed generation. The home was entirely passive in this equation, just a load at the end of a wire.
That model is collapsing under the weight of a world with solar panels on rooftops, batteries in garages, and EVs in driveways. We’re sitting on top of an enormous distributed energy resource that the traditional grid architecture simply isn’t designed to use.
The solution to the AI grid crisis isn’t building more. It’s building smarter, and the intelligence lives at the edge.
Home Energy Management Systems, combined with distributed battery capacity, give us something we’ve never had before: the ability to balance the grid from both sides simultaneously.
Here’s how it works in practice. A home with a battery system and a smart energy management platform stores cheap electricity during the hours when the grid has plenty: midday when solar production peaks, overnight when industrial load falls. Then, during the 4 PM to 9 PM window when AI data centers are running full tilt and residential demand spikes and utilities fire up expensive peaker plants, those home batteries discharge. They power the home from stored energy instead of drawing from the grid.
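The daily rhythm described above can be sketched as a simple dispatch rule. The hours and the rule itself are illustrative assumptions; a real energy management platform would react to prices, solar output, and grid signals rather than a fixed clock:

```python
# A toy dispatch rule following the schedule described above.
# Hours and thresholds are illustrative, not any vendor's actual logic.
def dispatch(hour: int) -> str:
    """Return the battery action for a given hour of day (0-23)."""
    if 10 <= hour < 15:        # midday solar surplus: store cheap energy
        return "charge"
    if 16 <= hour < 21:        # 4-9 PM peak: power the home from storage
        return "discharge"
    if 0 <= hour < 5:          # overnight trough: top up if needed
        return "charge"
    return "idle"              # otherwise, stay out of the way

# One day's schedule, aggregated:
schedule = [dispatch(h) for h in range(24)]
print(schedule.count("charge"), "charging hours,",
      schedule.count("discharge"), "discharging hours")
```

The point is not the specific hours but the shape: absorb energy when the grid has a surplus, give it back during the evening peak.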
Multiply that across thousands of homes in a neighborhood, tens of thousands across a city, and you have a Virtual Power Plant: a distributed, software-orchestrated asset that looks exactly like a traditional power plant to the grid operator, but without a single turbine or transformer to build.
The DOE has quantified what this is worth. A 400-megawatt VPP costs $43 per kilowatt-year to deploy and operate. A gas peaker plant costs $99 per kilowatt-year. The distributed solution is less than half the cost, deploys in months instead of years, and gets more valuable every time a new battery or EV is added to the system.
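Scaling those DOE per-kilowatt figures to the 400-megawatt example makes the gap concrete:

```python
# Annualized cost comparison for 400 MW of peak capacity,
# using the DOE $/kW-year figures cited above.
CAPACITY_KW = 400_000         # 400 MW expressed in kilowatts
VPP_COST_PER_KW_YR = 43       # distributed VPP
PEAKER_COST_PER_KW_YR = 99    # gas peaker plant

vpp_annual = CAPACITY_KW * VPP_COST_PER_KW_YR
peaker_annual = CAPACITY_KW * PEAKER_COST_PER_KW_YR
savings = peaker_annual - vpp_annual

print(f"VPP:    ${vpp_annual / 1e6:.1f}M per year")
print(f"Peaker: ${peaker_annual / 1e6:.1f}M per year")
print(f"Saved:  ${savings / 1e6:.1f}M per year ({savings / peaker_annual:.0%})")
```

At this scale the distributed option avoids more than $22 million per year for the same dispatchable capacity.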
The Data Centers Are Starting to Understand This
In February 2026, President Trump announced the Rate Payer Protection Pledge at the State of the Union. Amazon, Google, Meta, Microsoft, xAI, Oracle, and OpenAI all signed on. The pledge requires tech companies to cover the full cost of electricity price increases their data centers cause to consumers, funding on-site generation and reducing strain on the public grid.
That’s a political acknowledgment of what the engineering has been saying for years: the AI industry cannot continue to externalize its energy costs onto homeowners and ratepayers.
But the pledge is missing something: the cheapest, fastest, most scalable solution to the AI energy problem isn’t on-site generation at the data center. It’s distributed storage at the edge, in homes, in driveways, orchestrated by software into dispatchable capacity.
Zach Dell at Base Power said it directly when they raised their $1 billion Series C last year: batteries install capacity faster and cheaper than anyone else in the market. He’s right. What Base Power has proven in Texas, Emporia is building nationally, with one critical difference. We’re not just deploying batteries. We’re deploying the full Home Energy Management System: the monitor that sees everything, the EV charger that knows when to draw and when to give back, the battery that stores and dispatches on command, and the software that orchestrates it all.
The AI grid crisis isn’t a power generation problem. It’s a peak management problem. And peak management is exactly what distributed HEMS is designed to solve.
What This Means for Your Home and Your Bill
When a neighborhood adopts distributed home energy management at scale, four layers of infrastructure spending get avoided or deferred simultaneously.
At the home level: smart energy management can handle 300+ amps of appliances within a standard 200-amp service, eliminating the $4,000 to $15,000 panel upgrade that many EV-owning homes currently face. That’s thousands of dollars per household that never need to be spent.
At the distribution level: local transformers that were headed toward overload get relief when homes shift their load. Con Edison’s Brooklyn-Queens Demand Management program is the proof case: they spent $200 million on distributed resources instead of $1.2 billion on a substation upgrade. That’s an 80% cost reduction, verified and operational.
At the transmission level: managing demand at the edge reduces peak congestion on high-voltage lines. The DOE estimates this creates $69 per kilowatt-year in avoided transmission costs, real savings that either accrue to ratepayers or fund more distributed deployment.
At the generation level: every kilowatt of peak demand shaved by a home battery is a kilowatt that doesn’t require a new gas peaker. At the national scale, the DOE estimates that deploying 60 gigawatts of VPP capacity could save $15 to $35 billion in infrastructure costs over the next decade.
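The home-level claim above, fitting 300+ amps of appliances behind a 200-amp service, comes down to load coordination: deferrable loads pause when the panel approaches its limit. A toy sketch of that idea (the load names, amperages, and shedding rule here are all illustrative assumptions, not Emporia's actual algorithm):

```python
# Toy panel guard: pause deferrable loads (largest first) until the
# total draw fits within the service limit. Illustrative only.
SERVICE_LIMIT_A = 200
loads = [                      # (name, amps, deferrable?)
    ("HVAC",           50, False),
    ("Range",          50, False),
    ("Dryer",          30, True),
    ("Water heater",   30, True),
    ("EV charger",     48, True),
    ("Well pump",      20, True),
    ("Lighting/plugs", 30, False),
]

def shed_until_safe(loads, limit):
    """Pause deferrable loads, biggest first, until draw <= limit."""
    total = sum(amps for _, amps, _ in loads)
    paused = []
    for name, amps, deferrable in sorted(loads, key=lambda l: -l[1]):
        if total <= limit:
            break
        if deferrable:
            total -= amps
            paused.append(name)
    return total, paused

total, paused = shed_until_safe(loads, SERVICE_LIMIT_A)
print(f"Draw after shedding: {total} A; paused: {paused}")
```

Here 258 amps of connected load momentarily exceeds the panel, and briefly pausing the EV charger and dryer brings the draw back under the 200-amp limit with no panel upgrade.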
For homeowners who participate in utility VPP programs today, these benefits are already real and measurable. Massachusetts’s Connected Solutions program pays $275 per kilowatt-year to enrolled batteries: a 16 kW system earns $4,400 annually. Xcel Colorado’s Renewable Battery Connect pays $193 to $265 per kilowatt-year. Efficiency Maine runs a 10-year contracted program at $200 per kilowatt-year. These are operational programs paying homeowners right now.
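The payouts quoted above are simple multiplications of program rate by enrolled capacity; a quick sanity check using the rates as cited in the text:

```python
# Annual earnings check: program rate ($/kW-year) x enrolled kW.
# Rates are the figures cited in the text above.
programs = {
    "MA ConnectedSolutions":     275,
    "Xcel CO (low end)":         193,
    "Efficiency Maine":          200,
}
SYSTEM_KW = 16  # the example battery size from the text

earnings = {name: rate * SYSTEM_KW for name, rate in programs.items()}
for name, amount in earnings.items():
    print(f"{name}: ${amount:,}/yr")
```

The Massachusetts rate on a 16 kW system reproduces the $4,400 annual figure.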
Eight Years of Building the Invisible Infrastructure
Emporia started with an energy monitor because you cannot manage what you cannot measure. We wanted to see the home the way a grid operator sees the grid: every circuit, every load, every moment of the day. 400,000+ homes later, we have that visibility across the country.
We added smart EV chargers because the car in your driveway carries 75 to 125 kilowatt-hours of battery capacity that sits idle 95% of the time. That’s an enormous untapped grid resource. Our PowerSmart technology already manages that asset in real time, shifting charging to off-peak hours and preventing grid overload without the homeowner thinking about it.
We’re launching home batteries designed as appliances: functional, durable, installed on a concrete pad next to your AC unit. The goal isn’t to impress your houseguests. The goal is to get this technology into millions of homes, not thousands.
And we’re building bidirectional EV charging capability because the endgame isn’t a home battery. The endgame is every EV in America as a dispatchable grid asset, drawing power when it’s cheap and clean, pushing it back when the grid is stressed and prices spike. At full scale, the distributed battery capacity sitting in American driveways today dwarfs everything being planned in utility-scale storage.
All of this has one purpose: to turn homes into the smart, two-way nodes the 21st-century grid actually needs.
The Urgency Is Now
The AI buildout is not slowing down. Investment in U.S. data center infrastructure is projected to reach $3 trillion by 2030. That demand is coming regardless of whether we’re ready for it. The grid is already facing this pressure.
The old solution, building conventional generation, upgrading wires, passing the cost to ratepayers, requires timelines that don’t fit this problem. Gas turbines are backordered through 2028. Transmission projects take a decade. The capacity won’t arrive when it’s needed.
The distributed solution, deploying Home Energy Management Systems, home batteries, bidirectional EV charging, and aggregating into VPPs, can be executed on a timeline measured in months. Each home battery installed is immediate capacity. Each bidirectional charger enrolled is a dispatchable asset available the next time the grid hits its peak.
The economics work. The technology is proven. The regulatory framework is in place. The DOE’s VPP Liftoff report laid out the case clearly. FERC Order 2222 opened wholesale markets to distributed resources. States from Massachusetts to Colorado to Maine are running operational programs that pay homeowners for exactly this capability.
What’s required now is scale. Millions of homes, not thousands. And that requires the appliance-form-factor, zero-upfront-cost deployment model, financed through third-party ownership (TPO), that makes distributed energy accessible to any homeowner, not just early adopters with $20,000 to spare.
The Grid of the Future Is Already in Your Neighborhood
The AI industry has created the biggest electricity demand crisis in a generation. The instinct is to solve it the same way every past crisis was solved: build something big, centralize it, run transmission lines to it, and wait.
That instinct is wrong this time. The timeline, the cost, and the operational logic all point to the same conclusion. The fastest, cheapest, most resilient path to serving AI’s power appetite runs through American homes.
Your house already has more grid potential than most people realize. The car in your driveway. The solar panels on your roof, or your neighbor’s roof. A home battery on a concrete pad. A smart management system that coordinates it all and responds in seconds to whatever the grid needs.
That’s what Emporia is building, deploying, and connecting to the grid today.
The AI companies signed a pledge to protect ratepayers from rising electricity bills. The policy is right. But the solution isn’t just on-site generation at the data center. The real solution is putting intelligence and storage at the other end of the wire, in every home, in every neighborhood, orchestrated as the distributed power plant America actually needs.
We’ve been building this for eight years. The urgency of the moment has finally caught up to the vision.