What’s at Stake in FERC’s Large Load Proposal?
A new Federal Energy Regulatory Commission (FERC) order could determine whether the U.S. grid can scale quickly and responsibly enough to meet AI-driven demand.
In an assertive use of its authority, the Department of Energy directed FERC on October 23 to initiate a rulemaking (Docket No. RM26-4-000) to standardize how large loads—those over 20 megawatts (MW)—and colocation facilities interconnect to the transmission system. In its Advance Notice of Proposed Rulemaking (ANOPR), FERC grounded its proposal in its authority over transmission facilities, wholesale markets, and practices affecting wholesale rates under Section 201(b) of the Federal Power Act.
Responding to the ANOPR, utilities, data center developers, power producers, and state regulators submitted comments to FERC. This analysis draws upon aggregated comments and issue tracking provided by Halcyon, as well as a review of representative filings. These materials highlight the key questions that FERC must resolve before its April 30, 2026, deadline for a final rulemaking. Hyperlinks in this brief point to sample comments.
FERC’s decision will set an important precedent for how regulators handle hybrid interconnections and, more broadly, how they integrate new load onto an already constrained grid.
Q1: What counts as a “large load”?
A1: Responses broadly reflect two distinct viewpoints: stakeholders comfortable with the proposed 20 MW floor, and those arguing for a higher threshold—often 50, 75, or 100 MW, and in some cases 200–300 MW. For context, 20 MW is roughly enough to power 16,500 homes or a large university campus such as Louisiana State University. Supporters argue that the 20 MW threshold, aligned with FERC’s existing generator interconnection procedures, would create a coherent regulatory framework for any customer large enough to materially affect the transmission system.
What once qualified as a “large load” at 20–30 MW now appears modest in a landscape where developments increasingly push into the hundreds of megawatts. Critics warn that 20 MW would sweep in midsized manufacturers and facilities traditionally served under state tariffs, effectively federalizing retail matters. Many stakeholders instead converge around 50–100 MW as a more practical threshold for federal involvement, noting that today’s system-shaping data centers routinely exceed 200–300 MW.
Q2: How far does FERC’s authority extend over large loads?
A2: The commission argues that very large loads resemble generators in how they affect reliability and system costs, and should fall under federal oversight when they interconnect to the bulk power system. Adopting this interpretation would establish a more uniform national framework, but it has drawn legal and political resistance from states.
Stakeholder comments split into two groups: those who want to accelerate development and those who want to preserve existing regulatory authority. Developers and technology companies favor a stronger federal role to reduce friction, shorten timelines, and manage transmission system impacts that extend beyond state boundaries. In contrast, states and many utilities oppose the proposal, viewing it as a threat to their traditional authority over retail service, distribution, and resource planning.
Q3: Who pays to upgrade the grid to serve large loads?
A3: Electricity demand from U.S. data centers grew 22 percent in 2025 and could triple by 2030, making data centers one of the largest sources of load growth. Yet national transmission studies show that the United States is building only a fraction of the high-capacity lines it needs each year. Because the grid cannot scale as fast as demand, policymakers now treat cost allocation not as a technical ratemaking issue but as a central question of who pays for grid expansion—and how fast it happens.
FERC’s proposal adopts a 100 percent participant funding model that would require large load customers to pay the full cost of the network upgrades their projects trigger. Supporters argue this follows the logic of cost causation: when a data center or industrial campus connects in a constrained pocket of the grid, it should pay for the transmission it requires.
This policy marks a sharp departure from the traditional socialized model, in which transmission upgrades are treated as shared infrastructure and recovered through regional transmission rates. Defenders of the socialized model argue it reflects the physical and economic reality: Once built, transmission improves reliability, reduces congestion, and enables power flows for everyone. Industrial manufacturers such as steelmakers also note that while large technology firms may be able to absorb major interconnection costs, capital-intensive facilities such as electric arc furnace steel mills often cannot. At the same time, socializing costs raises the political risk that ratepayers will view higher electricity bills as subsidies for corporate expansions, especially in regions already struggling with affordability or reliability.
FERC itself acknowledged this tension in the ANOPR, asking whether a later crediting mechanism could offset network upgrade costs. In response, state commissions, utilities, and technology firms proposed an intermediate model: Large loads would fund upgrades upfront but receive partial refunds or credits if those facilities later delivered systemwide benefits.
Q4: What does “bring your own power” mean for hybrid projects?
A4: Stakeholders generally agree that hybrid projects—large loads paired with colocated generation or storage—should move through a single, coordinated interconnection process rather than be treated as separate facilities. The dispute centers on how to model them. Developers and technology companies argue that when a data center “brings its own power” (BYOP), studies should measure its net use of the grid based on injection and withdrawal rights. From their perspective, colocated generation reduces system impacts and should translate into lower upgrade costs and faster processing.
Utilities and grid operators reject that logic, arguing planners cannot assume on-site generation will always perform as promised. They therefore insist hybrid facilities be studied on a gross basis—using full load and generation assumptions—so the grid can withstand contingencies when on-site power fails. Regardless of how hybrids are modeled, there is broad support for requiring physical protections—such as relays and control schemes—that prevent injections or withdrawals beyond approved limits.
Q5: Is load flexibility a real grid resource or regulatory fiction?
A5: Stakeholders broadly agree that flexible loads can help integrate new demand. The question is whether FERC treats that flexibility as a core resource or a secondary accommodation.
Recent studies suggest flexibility can materially reduce system stress and cost exposure—but only under strict conditions. Research from Camus, encoord, and Princeton University’s ZERO Lab finds that pairing flexible interconnection with BYOP can speed interconnection while limiting both system expansion and ratepayer exposure. Similarly, Duke University’s Nicholas Institute reports that enforcing load flexibility during a small number of critical hours can reduce both system stress and capacity requirements. Together, these studies frame load flexibility as a valuable planning tool, but not a substitute for transmission investment or reliability oversight. Its value depends on whether operators can observe it, enforce it, and incorporate it into planning.
Commenters disagree on how FERC should weigh that evidence. Some developers want FERC to offer expedited interconnection as an incentive to loads that agree to curtail, arguing controllable demand can ease congestion and defer network upgrades. Others—especially state commissions, consumer advocates, and utilities—warn that treating loads as dispatchable could increase reliability risks for ratepayers without strong safeguards. They note that most hyperscale data centers still operate as firm load, with limited willingness to curtail in practice—a point some technology firms, including Meta, acknowledge. The North American Electric Reliability Corporation likewise warns that rapidly changing large loads can destabilize frequency and voltage and raise fault risks unless planners model them conservatively.
Aaron Yang is a research intern with the Energy Security and Climate Change Program at the Center for Strategic and International Studies (CSIS) in Washington, D.C. Ray Cai is an associate fellow in the Energy Security and Climate Change Program at CSIS. Joseph Majkut is director of the Energy Security and Climate Change Program at CSIS. Mathias Zacarias is an associate fellow and energy transitions fellow in the Energy Security and Climate Change Program at CSIS.