Across the United States, regional grid operators are projecting steep increases in electricity demand, with AI as the fastest-rising factor. PJM, which serves 65 million people from Washington, D.C., to Chicago, expects peak demand to rise by about 70,000 megawatts over the next 15 years, a 42 percent increase. In Texas, ERCOT projects peak demand will nearly double from 2024 levels by 2030. Nationwide estimates vary. One analysis projects the U.S. will need 128 gigawatts more by 2029, roughly a 16 percent increase. Another expects approximately a 25 percent higher demand by 2030 and 78 percent more by 2050, using 2023 as the baseline. Data centers already consumed more than 4 percent of U.S. electricity in 2023. Government analysts estimate that could reach as much as 12 percent within three years.
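As a sanity check, the baselines implied by these projections can be backed out with simple arithmetic. The sketch below uses only the figures quoted above, so small rounding differences are expected; it is an illustration, not an independent estimate.

```python
def implied_baseline(increase, pct_increase):
    """Baseline load implied by an absolute increase and its stated percentage."""
    return increase / pct_increase

# PJM: +70,000 MW is described as a 42 percent increase,
# implying a current peak of roughly 167,000 MW.
pjm_baseline_mw = implied_baseline(70_000, 0.42)

# Nationwide: +128 GW by 2029 is described as roughly 16 percent more,
# implying a baseline of about 800 GW.
us_baseline_gw = implied_baseline(128, 0.16)

print(round(pjm_baseline_mw), "MW (PJM),", round(us_baseline_gw), "GW (U.S.)")
```

Both implied baselines are in the right neighborhood of published system peaks, which suggests the headline percentages and absolute figures are at least internally consistent.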
What is driving the load
Modern AI models have billions of parameters and run across thousands of chips. The power draw is not only from compute. Memory and cooling add heavily to consumption. As these systems scale up, heat removal becomes a central challenge. Liquid cooling is expanding, but it requires careful water and thermal management. Even with advanced gear, data centers often run below potential because software, hardware, and networks are not fully coordinated. The result is wasted energy and underused resources.
Industry pivots to supply power quickly
Oil-field service companies are shifting from fracking support to data center power because they already deploy rugged on-site generators. Solaris, Liberty Energy, Atlas, ProPetro, and ProFrac are entering this business. The draw is clear. “Microsoft plans to spend $80 billion on data centers supporting AI this year,” a figure that is “80 percent more than what major oil companies Exxon and Chevron combined are expected to spend on capital expenditures.” Solaris has an agreement to co-own and operate 900 megawatts of gas turbines for xAI, with units tied to the Colossus 2 supercomputer in Memphis, according to city and SEC documents. Liberty’s chief executive, Ron Gusek, said it can take “less than 18 months” from ordering equipment to power up. Solaris estimates “about a year to 2½ years” to reach operations. By contrast, “some utility-scale natural-gas turbines have five- to seven-year wait lists.”
Off-grid speed versus long-term cost
Modular, off-grid gas units can plug directly into a data center, bypassing interconnection queues and local debates over who pays for lines and substations. They are well suited to workloads that demand near-100 percent uptime. Texas now requires large power-demand customers to disclose whether they have on-site backup before connecting to the grid, which can keep these units in place even after grid service arrives. The tradeoff is efficiency: smaller packages burn more fuel per megawatt-hour and need replacement more often. Liberty argues these units can give customers price certainty compared with the grid. Still, the modular approach is unlikely to be the cheapest option over the long run.
Who pays for the grid
Electric rates have already climbed. National average residential prices are up more than 30 percent since 2020. A June analysis from Carnegie Mellon and North Carolina State estimates average bills will rise about 8 percent nationwide by 2030, and as much as 25 percent in Virginia, because of data centers. In Ohio, a typical household’s bill rose at least $15 a month starting in June, in part due to data centers.
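To put those percentages in dollar terms, the sketch below applies them to a hypothetical baseline bill of $140 a month. The baseline is an assumption for illustration, not a figure from the study; only the percentages come from the analysis cited above.

```python
# Baseline monthly bill is an assumed illustrative figure, not from the study.
ASSUMED_BASELINE_BILL = 140.0  # dollars per month

def monthly_increase(baseline_bill, pct):
    """Dollar increase implied by a percentage rise in the bill."""
    return baseline_bill * pct

national_2030 = monthly_increase(ASSUMED_BASELINE_BILL, 0.08)  # ~8 percent nationwide
virginia_2030 = monthly_increase(ASSUMED_BASELINE_BILL, 0.25)  # up to 25 percent in Virginia

print(f"~${national_2030:.0f}/month nationwide, ~${virginia_2030:.0f}/month in Virginia")
```

Under that assumed baseline, the national estimate works out to roughly $11 a month and the Virginia worst case to roughly $35, which makes Ohio’s reported $15 increase look mid-range rather than extreme.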
But these figures are a bit misleading. Cumulative inflation since 2020 has run above 20 percent, and the fuel costs underlying electricity generation have been volatile. Energy providers may be short of generation equipment at the moment, but a power-hungry data center offers a clear path to building greater capacity, perhaps even cleaner energy with greater efficiency.
Utilities want large users to commit to paying for the capacity they request so that residents are not left with the bill if projects stall. In a first-of-its-kind showdown in Ohio, regulators voted five to zero against a proposal from major tech companies and approved an approach that protects non-data-center customers. “Today’s order represents a well-balanced package that safeguards non-data-center customers,” the commission chair said. Utility leaders are blunt. “My No. 1 priority in all of this is to keep the lights on,” said the chief executive of a large utility group. Microsoft’s energy executive added, “We don’t want to see other customers bearing the cost of us trying to grow.”
Many protests against data centers rest on an incomplete picture of how these facilities operate and how energy generation is financed. With clear cost-allocation rules and enforceable capacity commitments, the main risks to residents can be mitigated.
When projects went sideways
Virginia approved a $42 million substation and line to serve a data center project near Manassas. The campus lagged for years. During delays, local ratepayers covered costs for infrastructure that sat underused until a customer finally signed years later. In Ohio, Microsoft announced three new campuses, then paused the projects months later. The shift left utilities and communities uncertain about build-out timelines. Memphis offers a different example. xAI deployed dozens of methane-fired turbines to power an AI supercomputer. Reporting highlighted that many turbines initially lacked common pollution controls and permits. Local officials later issued permits for a portion of the units with emission standards after strong community pushback.
Environmental and community risks
As electricity demand jumps, some regions are meeting new load with gas and, at times, more coal. Communities near fast-tracked fossil units face higher local air pollution and additional noise. Data centers also consume water for cooling. Even when companies pledge to pay their way, assigning exact upgrade costs is complex. Without clear rules, residents and small businesses can end up covering part of the cost through higher rates. Consumer comments in Ohio captured the mood. One wrote, “Our wallets cannot be strained anymore. Make them pay their own bills like we do!”
The efficiency wildcard
There is real progress on efficiency. Google reports that the energy per typical AI text request dropped by a factor of 33 in one year. “We estimate the median Gemini Apps text prompt uses 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent, and consumes 0.26 milliliters of water,” which the company compares to “about nine seconds of TV viewing.” The largest share of energy during a request is spent on custom AI accelerators, so Google focused on model and system design. The company credits mixture-of-experts routing, compact models, improved data center orchestration, and hardware-software co-design for the gains. It also arranged more renewable power, which further cut emissions per unit of energy. The volume of requests still matters, but rapid efficiency gains could soften the long-term curve.
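To see what the per-prompt figures mean at scale, the sketch below multiplies them out for a hypothetical volume of one billion prompts a day. The daily volume is an assumption for illustration; only the per-prompt numbers come from Google’s report.

```python
# Per-prompt figures are from Google's report; the daily volume below
# is an assumption for illustration.
WH_PER_PROMPT = 0.24        # watt-hours of energy
G_CO2E_PER_PROMPT = 0.03    # grams of CO2-equivalent
ML_WATER_PER_PROMPT = 0.26  # milliliters of water

def daily_totals(prompts_per_day):
    """Scale per-prompt figures to fleet-level daily totals."""
    energy_mwh = prompts_per_day * WH_PER_PROMPT / 1e6      # Wh -> MWh
    co2_tonnes = prompts_per_day * G_CO2E_PER_PROMPT / 1e6  # g -> metric tons
    water_m3 = prompts_per_day * ML_WATER_PER_PROMPT / 1e6  # mL -> cubic meters
    return energy_mwh, co2_tonnes, water_m3

# At an assumed one billion prompts per day:
energy, co2, water = daily_totals(1_000_000_000)
print(f"{energy:.0f} MWh, {co2:.0f} t CO2e, {water:.0f} m^3 of water per day")
```

At that assumed volume, a billion prompts would draw about 240 MWh a day, a rounding error next to a single large data center campus, which is why the volume of requests, not the per-prompt figure, dominates the total.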
Are today’s forecasts overhyped?
Veteran researchers warn that the industry could repeat the dot-com era’s mistakes. Jonathan Koomey recalls late-1990s claims that computers would soon use half of all U.S. electricity. “It turned out that across the board, these claims were vast exaggerations.” He argues utilities and tech firms both benefit from “eye-popping numbers.” Experts point to “phantom” data centers that file multiple interconnection requests in different regions, inflating projections even though many sites will never be built. Alex de Vries adds a hard constraint. The supply chain for advanced AI chips is “fully utilized,” which limits how far anyone can project. “You can’t really predict much further than like one to two years into the future.”
Tech leaders admit that power is holding them back. “The single biggest constraint is power,” Amazon’s chief executive told investors, adding the company could have had higher sales with more data centers. Utility leaders emphasize cost fairness. “Just pay your fair share of the grid,” one said. Grid economists counter that since data centers account for most expected demand growth in some regions, separate rate classes are justified. Skeptics also question the business side. Some frontier AI firms reported large losses in 2024. As one analyst put it, forecasts reflect “what the tech industry wants to happen.” Economist Paul Krugman warns that the bottleneck itself could end the boom. “This is the way the bubble ends… Not with a pop, but with smog and brownouts.” He argues that firms may “hit the brakes” once they see they cannot power new campuses, creating broader economic risk if AI investment has been propping up growth.
Physical constraints that slow buildouts
Developers face long lead times for utility-scale turbines, a shortage of qualified construction crews, and backlogs for transformers and high-voltage equipment. Transmission expansion is slow, and interconnection queues are crowded. These constraints are why off-grid modular units are attractive in the near term, even if they cost more to run. They can arrive in a year or two rather than five to seven.
Utilities recover grid investments over decades through general rates. When data centers ask for rapid service, utilities must spend sooner. Without special terms, households and small businesses share the cost of upgrades. Ohio’s decision and similar proposals in Virginia push large users to commit to a high percentage of requested capacity and make upfront payments to reduce the risk of stranded costs if a project is delayed or canceled.
Fast power is becoming a product category. Oil-field service firms with proven mobile generation and real data center contracts could be a lower risk way to participate in AI growth. On-site backup and microgrids can become long-term revenue streams as states require resilience disclosures. Chip-level efficiency, liquid cooling hardware, and smarter orchestration software are growth markets. Utilities can earn regulated returns on prudent upgrades if they set clear guardrails that protect other customers.
One path features rapid efficiency gains, strong cost allocation rules, and modular power as a bridge to cleaner, large-scale sources. The other path sees delays, local backlash, and cancellations that leave stranded assets and higher bills. The outcome may blend both. Policymakers can push for transparent rate design, require meaningful commitments from large users, and accelerate siting for lines and plants where the grid can absorb them. Industry can prioritize models and systems that cut energy per request while building backup that communities can live with.
AI is real and hungry. The expected demand is large enough to stress grids, raise rates, and ignite local fights. It also opens opportunities for companies that can deliver megawatts quickly and responsibly, for utilities that build wisely, and for engineers who keep driving down the energy it takes to answer every prompt. The winners will lower the watt-hours per request, pay a fair share for upgrades, and power growth without leaving families to shoulder the bill.
FAM Editor: These data centers are coming. Communities can either embrace them and enjoy the benefits of the increased tax revenue, or they can exclude them and get nothing. So many opportunities, so many yapping fools squandering those opportunities.