The boom in generative AI — and the data-center buildout that powers it — is colliding with an electricity system built for a different era, and regulators, utilities and tech companies are scrambling to decide who ultimately pays for the upgrades. A landmark ruling by the Public Utilities Commission of Ohio (PUCO) requiring large new data-center customers to shoulder a significant share of contracted energy costs has sharpened a debate that stretches from Northern Virginia to the PJM grid and into the boardrooms of Google, Microsoft and Amazon. Federal and state forecasts show data-center electricity demand rising from a small slice of U.S. consumption in 2023 to a potentially material share within a few years, while hyperscale cloud providers pursue novel workarounds — from behind-the-meter power purchases and nuclear contracts to demand-response agreements — that reshape the market and raise new questions about fairness, reliability and the public interest.
Background: what changed — and why it matters
Data centers have always been heavy power users, but the arrival of large-scale AI training and near-constant inference workloads changed the math. The U.S. Department of Energy and Lawrence Berkeley National Laboratory estimate that data centers consumed roughly 4.4% of U.S. electricity in 2023, and forecast that the share could rise substantially — to between roughly 6.7% and 12% within a few years under current trajectories. Those projections reflect both explosive investment in AI-optimized facilities and higher energy intensity per server for training and inference workloads.

Buildouts are concentrated: a handful of regions — notably Northern Virginia, parts of Ohio, Texas and segments of the Midwest — are seeing hundreds of megawatts of new demand aggregated in dense, short timeframes. That clustering stresses local transmission systems, changes reserve and capacity calculations, and turns typical utility planning cycles — measured in years and based on modest growth — into urgent, high-stakes battles over who pays for new wires, substations and generation. (datacenterdynamics.com, utilitydive.com)
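For a rough sense of scale, the sketch below converts the DOE/LBNL share estimates above into absolute annual consumption. The 4,000 TWh total is a round assumed figure for overall U.S. electricity use, chosen for illustration rather than taken from the DOE/LBNL work:

```python
# Back-of-the-envelope conversion of the DOE/LBNL share estimates into
# absolute terms. US_TOTAL_TWH is an assumed round figure for annual
# U.S. electricity consumption, used only for illustration.
US_TOTAL_TWH = 4_000

for label, share in [("2023 estimate", 0.044),
                     ("low-end projection", 0.067),
                     ("high-end projection", 0.12)]:
    print(f"{label}: {share:.1%} of {US_TOTAL_TWH:,} TWh ~ {share * US_TOTAL_TWH:,.0f} TWh/yr")
```

At those assumptions, the 2023 estimate works out to roughly 176 TWh per year, and the high-end projection to roughly 480 TWh, which is why even a few percentage points of share carry so much planning weight.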
The result: regulators are rethinking rate design and contractual rules; utilities are using moratoria and new tariffs to manage pipeline risk; and tech firms are using every tool at their disposal — long-term power purchase agreements, direct plant deals, private generation subsidiaries and, more recently, demand-response contracts — to secure the energy they need without taking on unlimited cost exposure.
Ohio’s ruling: a new model for allocating risk and cost
The decision and its specifics
On July 9, 2025, the Public Utilities Commission of Ohio adopted a settlement that implements a data-center-specific tariff for AEP Ohio. The core elements require very large new data-center customers (those taking new load above certain thresholds) to commit to paying for a minimum percentage of their subscribed energy — effectively requiring them to underwrite much of the infrastructure that serves them. The approved framework obligates these customers to pay for at least 85% of their subscribed energy each month for up to 12 years (with a four-year ramp-up), and adds exit fees for project cancellations and financial-assurance requirements intended to reduce stranded-investment risk for the utility and its ratepayers. AEP framed the change as necessary to prevent cost-shifting to non-data-center customers as the utility invests to meet sudden load growth. (aep.com, content.govdelivery.com)
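To make the mechanics concrete, here is a minimal sketch of how a minimum-take floor of this kind changes a customer's bill. Every figure below is hypothetical; the real billing determinants, ramp schedule and rates are set in the PUCO order itself:

```python
# Illustrative sketch of an 85% minimum-take ("take-or-pay") obligation,
# loosely modeled on the AEP Ohio framework described above. The actual
# billing determinants, ramp schedule and rates are set in the PUCO
# order; every number below is hypothetical.
SUBSCRIBED_MWH = 70_000   # energy the customer has subscribed per month
MIN_TAKE = 0.85           # minimum share that must be paid for
RATE_PER_MWH = 60.0       # hypothetical all-in rate, $/MWh

def monthly_bill(actual_mwh: float) -> float:
    """Bill on the greater of actual usage and the 85% floor."""
    billed_mwh = max(actual_mwh, MIN_TAKE * SUBSCRIBED_MWH)
    return billed_mwh * RATE_PER_MWH

# A half-built campus using 30,000 MWh still pays for 59,500 MWh:
print(monthly_bill(30_000))  # 3570000.0 -- the floor binds
print(monthly_bill(68_000))  # 4080000.0 -- actual usage governs
```

The floor is what shifts stranded-investment risk: a project that never ramps to its subscribed load keeps paying for most of it anyway, instead of leaving the gap to other ratepayers.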
Why Ohio matters nationally
Ohio’s ruling is one of the first comprehensive, regulator-approved attempts to formalize who bears the cost of transmission and distribution investments driven by data-center load. It changes the default allocation of risk away from general ratepayers and toward the large load customer that caused the need for upgrades. That matters because it sets an observable precedent for other jurisdictions where utilities face similar surges: if regulators elsewhere adopt similar tariffs, the capital economics of data-center deals change, potentially slowing speculative projects but also altering how companies price their services and site their builds. (powermag.com, aepohio.com)
The pushback the ruling prompted
Major cloud providers and data-center developers opposed the AEP proposal in various forums, arguing that the tariff reduces flexibility, raises project costs and could discourage the investment, jobs and tax revenue data centers bring. Tech firms argued for more market-consistent approaches and flexible contracting rather than a rigid 85% take-or-pay-style obligation. The dispute — and the companies’ subsequent petitions for reconsideration — signals that the legal and policy fights over cost allocation are far from settled. Meanwhile, utilities and consumer advocates argue that without stronger protections, residential and small-business customers will pay for infrastructure used only intermittently or not at all. (vorys.com, powermag.com)
On the ground: regional flashpoints and the Virginia case
Northern Virginia: ground zero for demand growth
Northern Virginia is the world’s largest data-center market and illustrates the dual dynamics of local economic benefit and system stress. A state-commissioned study warned that unconstrained growth could force the state to add large amounts of generation and transmission capacity and that residential customers could see higher bills if utilities must build to meet peak contracted loads. The Joint Legislative Audit and Review Commission (JLARC) and independent analyses point to a need for rapid additions of generation and transmission — or for policies that change who pays. Some estimates in Virginia’s analyses translate to meaningful per-resident bill increases over time if costs are recovered broadly. (rga.lis.virginia.gov, datacenterdynamics.com)
Why localized builds create system-wide effects
Data centers operate close to 24/7 and often require firm, dispatchable capacity. When dozens of such customers aggregate in a service territory, utilities must plan transmission reinforcements and, in some cases, additional local generation to meet coincident peaks. Transmission projects are expensive and long-lead; building them to match speculative demand that may not materialize creates stranded-cost risk for existing customers. Regulators are now asking whether those risks should be absorbed by ratepayers, utilities, developers, or the tech customers that directly caused them. (datacenterdynamics.com, utilitydive.com)
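A toy calculation shows why clustering is the crux: the system must be built for the coincident peak, and flat, always-on AI loads leave planners little diversity to net out. The campus sizes and diversity factors below are illustrative assumptions, not data from any utility study:

```python
# Simplified model of why clustering matters: the system is sized for
# the coincident peak, and near-constant data-center loads offer little
# diversity. All figures below are illustrative assumptions.
def coincident_peak_mw(individual_peaks_mw, diversity_factor):
    """Coincident peak = sum of individual peaks scaled by how much
    the loads overlap in time (1.0 = all peak simultaneously)."""
    return sum(individual_peaks_mw) * diversity_factor

campuses = [150, 200, 120, 180]  # hypothetical data-center peaks, MW

# Diversified load: peaks are staggered across the day.
print(coincident_peak_mw(campuses, 0.6))   # 390.0 MW
# 24/7 AI load: nearly everything peaks at once.
print(coincident_peak_mw(campuses, 0.95))  # 617.5 MW
```

Under these assumptions the same four campuses demand more than 200 MW of extra deliverable capacity simply because their loads coincide, which is the kind of gap that forces new wires and substations.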
The tech-industry response: buying power, building plants, and demand management
Buying the juice: PPAs and direct generation deals
Hyperscalers have become the largest corporate buyers of clean energy and are now moving beyond conventional PPAs toward direct contracts with generators — including nuclear plants, advanced reactors, and merchant gas — to secure firm, dispatchable baseload-like power. These contracts, and at times full acquisitions of generation assets or exclusive purchase agreements, let companies control part of their supply stack and reduce exposure to hourly market price spikes and capacity auctions. Major corporate procurement programs have pushed the market forward, but they’ve also reshaped wholesale markets and raised questions about fairness and the visibility of long-term commitments. (spglobal.com, theverge.com)
- Examples include agreements tied to small modular reactors or deals to resurrect or partner with existing nuclear plants.
- Utilities and independent power producers have responded with bespoke offers designed to capture the lucrative, long-term revenue streams these large loads offer. (cnbc.com, reuters.com)
Owning or operating generation — and the regulatory gray area
Some large tech firms have structured subsidiaries that develop, finance and in some cases operate generation assets. Those subsidiaries may sell power into wholesale markets or under contract. That activity blurs the lines that historically separated regulated utilities (which face constraints on asset ownership and market behavior) from unregulated corporate buyers. Policymakers and market monitors are now evaluating whether these arrangements produce public benefits commensurate with their market impacts. Critics warn that privately negotiated deals could advantage big buyers and leave smaller customers to shoulder system costs. Supporters say these contracts bring new capital and life to underused plants and speed delivery of low-carbon power into the system. (reuters.com, timesunion.com)
Demand response: a pragmatic compromise
Not all solutions involve new wires or new plants. In August 2025, Google announced formal demand-response agreements with Indiana Michigan Power and the Tennessee Valley Authority that allow it to reschedule or pause non-urgent ML workloads to ease grid stress during peak hours. The company framed this as a scalable tool to help grids manage near-term strain while enabling faster interconnection of new loads. Demand-response programs pay or incentivize large users to shift or reduce consumption during events, and rolling out those programs for machine-learning workloads opens a new set of grid-flexibility tools. Google’s move — and similar approaches by other cloud providers — signals an industry willingness to participate in operational load management as a complement to long-term supply investments. (blog.google, reuters.com)
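Operationally, the idea is simple: when the grid calls an event, deferrable batch work pauses while latency-sensitive serving keeps running. The sketch below is a hypothetical illustration of that scheduling logic; the job model, fields and thresholds are assumptions, not Google's actual mechanism:

```python
# Minimal sketch of demand-response-aware scheduling for ML jobs, in
# the spirit of the programs described above. The Job fields and the
# shedding policy are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    megawatts: float
    deferrable: bool  # e.g., batch training vs. latency-sensitive inference

def shed_during_event(jobs: list[Job], target_reduction_mw: float) -> list[Job]:
    """Pause deferrable jobs (largest first) until the requested
    load reduction is met; urgent workloads keep running."""
    paused, shed = [], 0.0
    for job in sorted(jobs, key=lambda j: -j.megawatts):
        if shed >= target_reduction_mw:
            break
        if job.deferrable:
            paused.append(job)
            shed += job.megawatts
    return paused

jobs = [Job("training-run-a", 30, True),
        Job("search-inference", 25, False),
        Job("batch-embeddings", 15, True)]
print([j.name for j in shed_during_event(jobs, 40)])
# ['training-run-a', 'batch-embeddings'] -- 45 MW shed, inference untouched
```

The design choice worth noting is that flexibility comes almost entirely from training and other batch pipelines; user-facing inference is the load that still needs firm capacity.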
The economics: who pays — and how the math shifts
Traditional utility cost allocation vs. take-or-pay models
Historically, utilities recover investments through rate structures that spread costs across broad customer classes. When a utility invests in capacity or transmission for general system needs, the cost is socialized. The data-center surge flips that script: massive, discrete loads prompt utilities to propose bespoke commercial terms that force large customers to underwrite specific assets serving them. The practical upshot is a form of cost-causation pricing: if a customer requires dedicated transmission, the customer should pay. That principle is reasonable in theory but difficult in practice because it changes project financing, real estate economics and the incentives for both developers and utilities. (aep.com, aepohio.com)
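A toy comparison makes the stakes plain. The sketch below contrasts socializing a hypothetical upgrade across all ratepayers with assigning it to the large load that triggered it; every number is an assumption chosen for illustration, not drawn from any filing:

```python
# Hypothetical $600M transmission upgrade triggered by one large load:
# socialized recovery vs. cost-causation assignment. All figures are
# assumed for illustration.
UPGRADE_COST = 600e6           # total capital cost, $
RESIDENTIAL_CUSTOMERS = 1_500_000
YEARS = 30                     # assumed recovery period
DATA_CENTER_MW = 300           # the load that caused the upgrade

# Socialized: every ratepayer chips in over the asset's life.
per_customer_year = UPGRADE_COST / RESIDENTIAL_CUSTOMERS / YEARS
print(f"Socialized: ~${per_customer_year:.2f} per customer per year")

# Cost-causation: the data center underwrites the asset directly.
per_kw_year = UPGRADE_COST / YEARS / (DATA_CENTER_MW * 1_000)
print(f"Assigned: ~${per_kw_year:.2f} per kW of contracted load per year")
```

Under these assumptions the same asset costs each household about $13 a year when socialized, or about $67 per contracted kilowatt per year when assigned to the causing customer, which is the trade-off regulators are arbitrating.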
The consequences for tech firms and ratepayers
If more jurisdictions require data-center customers to commit to long-term minimum payments or build and pay for connections, some projects will become uneconomic — especially speculative builds that rely on selling capacity later. That could reduce land grabs and speculative campus builds but also raise the per-kW cost of hosting AI workloads. For residential customers, the alternative — broadly socializing the cost of upgrades — risks meaningfully higher monthly bills if utilities must expand generation and transmission to meet contracted but underutilized capacity. Regulators are thus balancing economic development goals against consumer protection and grid reliability imperatives. (powermag.com, rga.lis.virginia.gov)
Market dynamics and public-interest concerns
Are tech companies distorting wholesale markets?
Large, long-term off-take agreements and behind-the-meter arrangements can reshape wholesale prices and capacity outcomes. Some critics claim that tech firms’ deals — especially exclusive, long-dated arrangements with merchant generators — can edge out other buyers and compress capacity margins, all while shifting costs to others through regional price-setting mechanisms. Others counter that these contracts provide financing that keeps generators operating and can accelerate clean-energy investment. Independent oversight and transparent market rules are now central to ensuring competitive fairness and protecting ratepayers. (reuters.com, apnews.com)
The nuclear pivot — practical, political and environmental questions
Tech firms’ increasing interest in nuclear power — from advanced reactors to refurbished plants — reflects the problem of securing firm, low-carbon capacity. Nuclear offers continuous baseload output that pairs well with AI’s round-the-clock compute needs, but it raises regulatory, permitting and public-acceptance hurdles. Reviving or contracting for entire plants attracts political attention and scrutiny over whether such arrangements serve broader public needs or prioritize corporate customers. The nuclear option also pushes policymakers to confront long-term energy planning choices: invest in dispatchable clean capacity, rely on gas peakers and storage, or sign long-term firm off-take contracts with private generators. (theverge.com, reuters.com)
Where claims need more verification — and what’s still unsettled
Several high-profile claims circulating in coverage deserve caution. For instance, some reporting has suggested that tech-affiliated generation sold more than $2.7 billion into wholesale markets over the past decade; that precise figure and attribution to individual corporate subsidiaries could not be corroborated in public filings or regulatory summaries reviewed during research for this article. Similarly, competing forecasts differ on how rapidly data-center demand will grow (estimates vary by projection year and scenario), so absolute numbers — especially those expressed for a single year like 2026 or 2030 — should be treated as model-dependent. These uncertainties make policy design difficult and underscore the need for transparent data: publicly available, disaggregated load forecasts, contractual filings and consistent market monitoring. (businessinsider.com, energy.gov)
Practical implications for IT planners, sysadmins and enterprises
- Site selection will increasingly factor in utility tariff structure and local regulatory posture. Expect developers to prefer jurisdictions with clear, stable interconnection processes and predictable cost-allocation rules.
- Contract negotiations for leased colocation or hyperscale arrangements will include more granular clauses on energy commitments, demand-response participation and potential pass-throughs for grid upgrades.
- Enterprises buying cloud AI services should prepare for higher marginal costs or new surcharges tied to the supplier’s local energy economics — and for potential variability if major clouds relocate compute or throttle non-urgent workloads for grid reasons. (blog.google, aepohio.com)
Policy takeaways and recommended guardrails
- Make cost causation explicit: regulators should craft rules that align investment risk with the party causing it, while protecting smaller customers from speculative buildouts through financial assurance, transparent contract filings and proportional cost-sharing when public benefit is demonstrable.
- Require transparency: mandate public disclosure of long-term energy contracts and material generation ownership so regulators and market monitors can assess systemic implications.
- Encourage demand-side flexibility: expand and standardize demand-response programs for ML/AI loads, paired with fair compensation structures that recognize the value of flexibility to the grid.
- Plan for capacity and transmission: federal and state agencies should accelerate transmission permitting and targeted generation procurement (including firm clean resources) in high-growth corridors to lower overall system costs.
- Use pilot programs and conditional approvals: permit data-center projects to proceed under staged commitments that scale with demonstrated load, reducing the risk that utilities build for phantom demand. (content.govdelivery.com, blog.google)
Strengths, risks and the road ahead
The industry’s aggressive approach to securing energy — from corporate clean-energy procurement to demand response — demonstrates a pragmatic recognition that the grid must evolve to support AI-era loads. Tech capital has accelerated new generation and catalyzed investment in renewables, storage and even advanced nuclear R&D. That said, risks loom: market concentration in a few hyperscale buyers may distort wholesale dynamics; opaque contracts risk cost-shifting; and the pace of data-center construction could outstrip the slower, regulated world of transmission and generation planning.

The Ohio model shows one credible path to force the question: if a company demands dedicated, firm capacity, it should accept a commensurate contract that underwrites the physical assets required. But a patchwork of state-level solutions risks uneven outcomes and could push projects to jurisdictions with weaker consumer protections. A coordinated federal-state approach — combined with clearer market rules at FERC and regional transmission organizations — would better reconcile economic development, reliability and fairness.
Finally, the industry’s nascent use of demand response for machine-learning workloads is a constructive innovation: it lowers near-term reliability risk and can reduce the need for immediate capital-intensive transmission upgrades. Scaled and standardized, demand-side flexibility can be a powerful tool in the toolbox — but it cannot fully substitute for the firm capacity and long-lead transmission that many regions will need if data-center growth continues unabated. (reuters.com, energy.gov)
Conclusion
The AI-driven data-center expansion has exposed a fault line between fast-moving private investment and deliberate, public-interest-oriented grid planning. Ohio’s recent tariff order crystallizes one policy direction: make large load customers pay for the infrastructure they trigger. The tech sector’s countermeasures — buying generation, signing bespoke PPAs, experimenting with demand response, and even owning generation subsidiaries — are legitimate market responses but raise serious questions about transparency, market power and the equitable distribution of costs.

Resolving who pays for a future increasingly defined by AI will require clearer rules, better data, and cross-jurisdictional coordination that balances innovation with consumer protection. Without that, communities and households risk footing bills for infrastructure sized for a speculative future, while tech firms pursue speed and scale that reshape power markets in real time. The path forward should aim for a pragmatic compromise: incentivize investment, preserve grid reliability, and ensure that the costs and benefits of the AI era are allocated transparently and fairly. (energy.gov, blog.google)
Source: TechRepublic AI Data Centers' Soaring Energy Use: Who Pays Those Costs?