The debate over whether artificial intelligence will devour the U.S. power grid has moved out of academic journals and into boardrooms, utility commission hearings, and consumers’ monthly bills—and the conversation is finally demanding honest numbers, not hype. Recent reporting has made two things frighteningly clear: grid operators and consultants now forecast large, rapid increases in peak electricity demand that they attribute largely to AI-ready data centers, and utilities are already acting on those forecasts by planning new generation and charging ratepayers for long-term infrastructure. At the same time, a growing corps of energy researchers and consumer advocates warn that some of those forecasts may be inflated by duplicate and speculative interconnection requests and by optimistic extrapolations from the tech industry. This article synthesizes the current state of the debate, verifies core claims against multiple independent sources, and lays out the technical, financial, and policy trade-offs that will shape whether AI becomes an energy-driven utility crisis or an engineering-managed transition.
Background / Overview
Grid operators across the U.S. — the organizations that balance supply and demand second-by-second — are now forecasting material increases in peak electricity demand over the next decade, driven in large part by hyperscale data centers provisioning AI workloads. The PJM Interconnection, the nation’s largest regional grid, reports its summer peak could climb by roughly 70,000 megawatts over the next 15 years. ERCOT, the Texas grid operator, has publicly warned peak demand might nearly double from recent records by the end of the decade. Independent consultancies and industry analysts produce similar headline-grabbing numbers: Grid Strategies projects nationwide peak demand could rise by about 128 gigawatts by 2029, while ICF models have suggested ~25% higher national demand by 2030 and up to 78% by 2050 under some scenarios. Each claim has been widely reported in the trade press and mainstream outlets. (publicpower.org)
These projections matter. Utilities and regulators use them to justify investments in new generation, transmission upgrades, and “firming” capacity that are recovered through rate cases, which ultimately influence consumer electricity bills. Tech companies and some utilities, for their part, have economic incentives to emphasize long-term, high-growth scenarios: utilities can earn returns on new assets, and hyperscalers seek preferential grid access and site options for latency and redundancy. Critics counter that many interconnection requests are speculative — sometimes the same project is submitted to several utilities while developers shop for the best terms — producing an inflated picture of actual future demand. Journalistic investigations and grid-operator disclosures have documented large volumes of interconnection requests that will never materialize. (wsj.com)
This is the terrain of the current dispute: the difference between “requests in the queue” and “committed load that will be built and powered,” and the policy and financial choices that follow from how the difference is treated.
The current numbers — what planners are saying
PJM: the 70,000 MW headline
PJM’s 2025 long-term load forecast updates are among the most-cited pieces of evidence for dramatic future growth. PJM estimates its summer peak could climb by about 70,000 MW over the next 15 years — a step change from historical flat-to-moderate load growth. That would lift PJM’s summer peak to roughly 220,000 MW over the planning horizon the operator published. PJM itself points to concentrated new loads — notably data centers in northern Virginia and related regions — as a major driver. Multiple independent outlets have published these numbers and the underlying PJM forecast. (publicpower.org)
Why this matters: 70,000 MW is 70 gigawatts of incremental firm capacity, roughly the output of dozens of large gas-fired plants; powering that load requires transmission upgrades, fuel assurances, and often new dispatchable generation. PJM’s auction signals and capacity price spikes reflect that mismatch in the short run. (reuters.com)
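As a sanity check on the scale, here is a minimal back-of-envelope sketch. The 1,200 MW plant size is an illustrative assumption, not a figure from PJM’s filing.

```python
# Back-of-envelope scale check for PJM's projected load growth.
# Assumption (illustrative, not from PJM's filing): a large combined-cycle
# gas station is roughly 1,200 MW.
incremental_peak_mw = 70_000      # projected 15-year growth in PJM's summer peak
typical_gas_plant_mw = 1_200      # assumed size of a large combined-cycle plant

equivalent_plants = incremental_peak_mw / typical_gas_plant_mw
print(f"{incremental_peak_mw:,} MW = {incremental_peak_mw / 1_000:.0f} GW")
print(f"~ {equivalent_plants:.0f} large gas-fired plants at {typical_gas_plant_mw:,} MW each")
```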
ERCOT: a near-term doubling scenario
In Texas, statements from ERCOT leadership and state fiscal analysts warn of equally rapid growth. ERCOT’s CEO has described a path in which peak summer demand could approach the mid-100s of gigawatts by 2030 — nearly double the 2023 record. ERCOT has also disclosed enormous volumes of large-load interconnection requests in its queue. This emerging picture is a mix of firm commitments and speculative proposals, but its net effect is the same: ERCOT and its utilities are planning for a very different demand trajectory than the one that held for decades. (comptroller.texas.gov)
Grid Strategies and ICF: national-scale forecasts
- Grid Strategies’ assessment of utility filings found that U.S. utilities’ five-year peak-load forecasts grew by 128 GW through 2029, concentrated in a handful of regions (PJM, ERCOT, Georgia Power territory, Pacific Northwest, SPP/MISO footprints). The firm attributes much of the shift to data center load and near-term industrial builds. (utilitydive.com)
- Consulting firm ICF produced a national scenario in 2025 estimating ~25% higher electricity demand by 2030 than the 2023 baseline, and ~78% higher by 2050 in one of its modeled pathways. These national scenarios treat AI data centers as one of several accelerating drivers — alongside manufacturing reshoring and electrification. (utilitydive.com)
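For context, a minimal sketch converting those headline percentages into implied average annual growth rates, assuming growth is measured from the 2023 baseline to the stated year (an assumption about how the scenario is framed, not a figure from ICF itself):

```python
# Implied average annual growth from ICF's headline percentages, assuming the
# growth is measured from a 2023 baseline to the stated year.
def implied_cagr(total_growth: float, years: int) -> float:
    """Compound annual growth rate implied by `total_growth` over `years`."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"~25% higher by 2030: {implied_cagr(0.25, 2030 - 2023):.1%} per year")
print(f"~78% higher by 2050: {implied_cagr(0.78, 2050 - 2023):.1%} per year")
# Roughly 3.2% and 2.2% per year -- a sharp break from the near-flat load
# growth many U.S. regions saw in the 2010s.
```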
Why skeptics say the forecasts could be wrong
Phantom and duplicate interconnection requests
One of the clearest practical issues in planning is the presence of “phantom” data centers — utility interconnection requests that never materialize. Companies commonly submit multiple requests in parallel while evaluating sites and negotiating tax and municipal incentives. Utilities often report far more megawatts in their queues than they expect to see online, and they do not routinely reconcile duplicate requests across service territories. That inflates the raw “requests” number and can mislead planners and the public if treated as committed demand. Investigations and earnings calls at multiple utilities have illustrated the scale of this challenge. (wsj.com)
Supply-chain and chip constraints
Beyond speculative siting, physical constraints may cap short-term growth. Advanced AI workloads require accelerators — primarily GPUs from a tiny set of suppliers (for example Nvidia’s H100 and H800 and their successors) — and at times the industry has been operating near supply limits. Several analysts caution that chip manufacturing lead times, availability of AI-optimized hardware, and the capital intensity of hyperscale builds make it unrealistic to assume every data-center request will convert to a fully operational facility in the timeframe implied by some forecasts. When hardware or other supply bottlenecks exist, load growth can be delayed or restructured. (spglobal.com)
Financial viability and profitability of AI companies
Another skeptical strand points to the business case for massive new data-center footprints. Some pure-play AI firms reported large operating losses in 2024 and 2025 even as investors funded rapid expansion, which raises questions about how sustained revenue will materialize to support ten-year growth scenarios. OpenAI’s reported losses in 2024 and similar reports about other startups have been cited as reasons to doubt that the biggest speculative multi-gigawatt builds will all occur. That said, major cloud providers (Microsoft, Google, Amazon) retain deep pockets and strategic incentives to build capacity for enterprise AI. (cnbc.com)
What utilities are doing — and why it raises consumer risk
Utilities are responding to queue volumes and interconnection requests in three main ways: (1) they are requesting new generation (often gas-fired) and transmission upgrades; (2) they are seeking regulatory approvals that allow cost recovery and a guaranteed return; (3) they are negotiating long-term power purchases and plant investments with hyperscalers and third parties.
- New gas-fired capacity: Global trackers and Reuters reporting show that planned natural-gas capacity in the U.S. has surged — more than 114,000 MW of proposed gas-fired capacity was in the development pipeline as of mid-2025. These projects are frequently pitched as “firm” capacity to serve AI loads, and they can be sited adjacent to data centers to bypass some transmission constraints. (reuters.com)
- Rate recovery and guaranteed returns: When utilities build new generation or transmission, they typically seek regulatory approval to recover capital costs through rates and earn a regulated return. If planners overestimate future demand and build excess capacity, ratepayers can be left paying for unnecessary assets (a stylized illustration follows this list). Critics warn that this creates a conflict of interest: utilities benefit from larger rate bases, and the tech industry benefits from a market that projects massive future capacity needs.
- Direct contractual strategies: Hyperscalers are also pursuing direct energy solutions: long-term PPAs, investments in nuclear and hydropower refurbishments, and even behind-the-meter generation aligned with data center clusters. These deals can mitigate grid stress but can also concentrate bargaining power with large corporate actors. (cnbc.com)
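To make the ratepayer exposure concrete, here is a stylized revenue-requirement sketch. The plant cost, allowed return, depreciation life, and customer count are illustrative assumptions; actual ratemaking adds taxes, riders, and allocation across residential, commercial, and industrial classes.

```python
# Stylized sketch of how a new plant flows into rates via the utility's
# "revenue requirement". Every number here is an illustrative assumption.
plant_cost = 1_000_000_000   # assumed capital cost of a new gas plant ($)
allowed_return = 0.10        # assumed allowed rate of return on the rate base
book_life_years = 40         # assumed straight-line depreciation period
customers = 1_000_000        # assumed customers sharing the cost equally

first_year_return = plant_cost * allowed_return          # return on rate base
first_year_depreciation = plant_cost / book_life_years   # capital recovery
revenue_requirement = first_year_return + first_year_depreciation

print(f"First-year revenue requirement: ${revenue_requirement / 1e6:.0f}M")
print(f"~ ${revenue_requirement / customers:.0f} per customer in year one")
# If the forecast load never shows up, that cost stays in rates unless
# regulators disallow it or shift it to the developer.
```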
Physical and engineering constraints: how the grid responds in reality
The theoretical ability to add megawatts is one thing; the practical pace is another. A few technical realities matter:
- Transmission lead times are long. Building new high-voltage lines, substations, or interconnects often takes years and faces siting, permitting, and supply-chain hurdles. PJM and other operators have acknowledged long interconnection backlogs that delay both renewable and firm projects. (reuters.com)
- Dispatchable capacity requirements: AI data centers are continuous, high-capacity loads and therefore need reliable, dispatchable sources (or contracts that guarantee availability). Wind and solar paired with short-duration storage cannot always substitute for that firm capacity without substantial long-duration storage or flexible gas or nuclear capacity; a rough sizing sketch follows this list. (spglobal.com)
- Cooling and water: large AI facilities have non-electrical infrastructure demands — cooling systems, water for evaporative cooling in some designs, and building cooling distribution — that add complexity to permitting and local environmental concerns.
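The sizing sketch referenced above, in Python. The campus size, load factor, and solar capacity factor are illustrative assumptions, not figures from any operator.

```python
# Rough sizing sketch: serving a continuous AI-campus load with solar alone.
# Campus size, load factor, and capacity factor are illustrative assumptions.
HOURS_PER_YEAR = 8_760

campus_mw = 500               # assumed data-center campus draw
load_factor = 0.90            # assumed near-continuous utilization
solar_capacity_factor = 0.25  # assumed annual solar capacity factor

annual_energy_mwh = campus_mw * load_factor * HOURS_PER_YEAR
solar_nameplate_mw = annual_energy_mwh / (solar_capacity_factor * HOURS_PER_YEAR)

print(f"Annual energy: {annual_energy_mwh / 1e6:.1f} TWh")
print(f"Solar nameplate to match that energy: ~{solar_nameplate_mw:,.0f} MW")
# Matching annual energy is not matching power: nights and multi-day weather
# events still require long-duration storage or dispatchable backup to hold
# the campus at full draw around the clock.
```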
The efficiency wildcard: can models and systems cut demand growth?
There is a parallel story that undercuts worst-case projections: software and algorithmic efficiency advances. A prominent example is DeepSeek, a Chinese AI lab whose 2024–2025 models reportedly achieved similar task performance while claiming orders-of-magnitude lower GPU-hours and lower energy use in training and inference. If such efficiency gains are broadly adopted, the energy intensity of AI workloads could be materially lower than initial industry extrapolations — reducing the need for large-scale expansion of grid capacity. However, efficiency gains introduce an economic paradox: lower operational cost per query can increase demand and offset energy savings (the Jevons paradox). Thus, higher efficiency could embed more AI into daily life and still increase absolute energy consumption. Multiple independent outlets documented DeepSeek’s claims and the industry reaction; most emphasize that the DeepSeek result is uncertain and emerging rather than a proven counterfactual to the grid stress story. (reuters.com)
Key takeaways on efficiency:
- Efficiency improvements are plausible and are already occurring in research labs and chip design. (spglobal.com)
- Efficiency does not guarantee lower total energy demand: cheaper compute usually spurs more consumption. Efficiency is necessary but not sufficient to cap overall load growth.
- Forecast sensitivity: many of the high-growth projections depend on assumptions about per-query energy intensity and the mix of large training runs versus cheap inference workloads. Small changes in those assumptions lead to very different load forecasts.
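A minimal sensitivity sketch makes the efficiency and Jevons-paradox points concrete. All inputs are illustrative assumptions, not measured energy intensities.

```python
# Sensitivity sketch: total AI demand = usage x energy per query.
# A 10x efficiency gain can still mean higher total demand if usage grows
# faster than efficiency improves (the Jevons-paradox concern above).
def annual_twh(queries_per_day: float, wh_per_query: float) -> float:
    """Annual electricity use in terawatt-hours."""
    return queries_per_day * wh_per_query * 365 / 1e12   # Wh -> TWh

print(f"Baseline (1B queries/day at 3 Wh):  {annual_twh(1e9, 3.0):.2f} TWh/yr")
print(f"10x more efficient, same usage:     {annual_twh(1e9, 0.3):.2f} TWh/yr")
print(f"10x more efficient, 30x more usage: {annual_twh(30e9, 0.3):.2f} TWh/yr")
```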
What this means for consumers and IT professionals
- Expect near-term upward pressure on electricity prices in regions with concentrated data-center demand unless planners can match resources to actual committed loads. Capacity-auction price spikes in PJM and localized rate increases already signal this effect, and households in high-growth regions should anticipate further rate pressure as utilities recover investment costs. (reuters.com)
- For IT decision-makers and Windows-based organizations relying on cloud services, latency, availability, and cost arbitrage will shift. Hyperscalers may increasingly favor regions with firm long-term energy deals, or they may price AI services to reflect the underlying cost of guaranteeing power. Localized outages or constrained capacity could influence where you host critical workloads.
- For community advocates and local governments, the practical trade-off is clear: data centers bring jobs, tax revenues, and infrastructure investment, but they also bring demand impacts, water use concerns, and potential rate impacts if planners overbuild to meet speculative requests. Local regulatory oversight and enforceable development commitments (letters of credit, milestones, site plans) are critical to avoid paying for infrastructure that will never be used. (utilitydive.com)
Policy and market fixes that can reduce downside risk
- Improve interconnection discipline and data transparency. Require meaningful deposits, enforceable milestones, and cross-utility reconciliation of duplicate requests so that queue numbers reflect realistic commitments rather than exploratory shopping lists. This is already being discussed at FERC and other forums. (reuters.com)
- Shift from “requests-based” planning to risk-weighted, staged commitments. Treat queue megawatts probabilistically and design transmission and generation procurement to match risk-adjusted demand curves, avoiding premature overbuild; a minimal sketch follows this list. (utilitydive.com)
- Expand demand-response and automated curtailment agreements with hyperscalers. Big cloud providers already participate in demand-response pilots; formalizing these into contract language can reduce the programmatic need for immediate new dispatchable capacity. (reuters.com)
- Accelerate grid modernization tools and planning automation. PJM and others are piloting AI-assisted grid studies to clear backlogs faster and produce more accurate plans — ironically, AI tools may help solve the planning bottleneck. (reuters.com)
- Regulatory protections for ratepayers. Ensure that utilities must demonstrate prudence and avoid automatic pass-throughs for speculative assets; regulators can require that costs associated with unbuilt projects be borne by developers or deferred until completion. This mitigates the incentive to build on ‘hype.’
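The risk-weighting idea referenced above, as a minimal sketch. The projects, megawatts, and completion probabilities are illustrative assumptions, not data from any queue.

```python
# Risk-weighted queue planning: weight each interconnection request by an
# estimated probability of completion instead of summing raw requests.
queue = [
    # (description, requested MW, estimated completion probability)
    ("campus with signed PPA and interconnection deposit", 900, 0.85),
    ("announced build, land optioned, no contract",        600, 0.40),
    ("exploratory request duplicated in two states",       750, 0.10),
]

raw_mw = sum(mw for _, mw, _ in queue)
risk_weighted_mw = sum(mw * p for _, mw, p in queue)

print(f"Raw queue total:     {raw_mw:,} MW")
print(f"Risk-weighted total: {risk_weighted_mw:,.0f} MW")
# 2,250 MW of requests shrinks to ~1,080 MW once completion risk is applied --
# the demand curve that staged procurement should be sized against.
```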
Strengths and weaknesses in the current debate
Strengths (what’s been done well)
- Grid operators and consultants are transparent about queue volumes and are publishing long-term forecasts, which gives planners and the public an information baseline they did not have a few years ago. PJM’s long-term load forecast and Grid Strategies’ cross-region synthesis are examples of improved visibility. (publicpower.org)
- Industry is experimenting with demand-response measures and direct corporate power contracts (nuclear refurbishments, hydropower purchases, behind-the-meter generation) that can offer pragmatic firming strategies while renewable capacity builds out. (cnbc.com)
Weaknesses and risks
- Counting raw interconnection requests as committed demand is misleading and policy-harmful; it increases the chance of over-investment paid for by captive ratepayers. Empirical evidence of speculative, duplicate requests is strong and should temper headline figures. (wsj.com)
- Many public forecasts rely on uncertain assumptions about AI energy intensity and business viability of new large-scale AI-only operators. The DeepSeek example shows how quickly assumptions about energy intensity can change; it also demonstrates the fragility of long-run extrapolations. (reuters.com)
- The market and permitting environment — particularly for long-lead items like gas turbines, transmission rights-of-way, and nuclear projects — is not agile enough to deliver capacity on the aggressive timelines assumed in some scenarios. That mismatch elevates short-term price volatility and political stress. (gasprocessingnews.com)
Bottom line and conclusion
The energy demands associated with AI data centers present a real and non-trivial planning challenge. Grid operators, consultants, and utilities are right to flag potential capacity shortfalls and to begin planning for large, concentrated new loads. Those signals have value: they prompt investment, transmission planning, and market reforms that otherwise would lag.
But the current conversation is unsettled by two simultaneous risks: (1) the upside risk that AI and reshoring truly do produce the high-load scenarios utilities are planning for, creating an urgent need for rapid investment in generation and transmission; and (2) the overbuild risk that utilities and regulators base long-lived investments on speculative demand that will not appear, leaving ratepayers to cover the bill. Both outcomes are plausible — and both require deliberate policy choices to manage.
The pragmatic path forward is neither alarmism nor complacency. It is disciplined planning: require credible developer commitments, improve cross-jurisdictional transparency, accelerate interconnection reform, expand demand-response and curtailment contracts, and pursue energy-efficiency and algorithmic improvements that reduce per-query energy needs while preparing backup firm capacity where and when it’s demonstrably required. For IT professionals, Windows admins, and consumers, this means paying attention to the evolving economics of cloud services and the potential for regionally variable energy costs and reliability. The future of AI’s grid impact will be shaped not by prophecy but by procurement rules, regulatory frameworks, and the details of contracts signed in the next 24 months.
This debate is far from settled. Readers who follow the thread on local data-center proposals — including community-level filings that show where some of the proposed capacity would land — should keep an eye on development milestones, binding power purchase agreements, and utility filings that specify which megawatts are supported by deposits and contractual commitments versus exploratory inquiries. Those markers will separate the speculative numbers from the ones utilities and ratepayers will actually have to live with.
Source: Straight Arrow News, “Researchers question AI data centers' 'eye-popping' energy demands”