March 17, 2026

SEIA, WoodMac Chart Whiplash in U.S. Solar Industry

Nearly 40% fewer U.S. solar power projects reached completion in the fourth quarter than in the third quarter as developers pivoted to start new projects in time to qualify for tax credits.

But while the 43.2 GW of solar capacity installed in 2025 was 14% less than in 2024, it nevertheless made up 54% of all new capacity added to the U.S. grid in 2025, the Solar Energy Industries Association (SEIA) and Wood Mackenzie said in a new report March 10.

2025 was the fifth straight year solar was the top source of new U.S. power generation capacity.

Looking ahead, the analysis predicts nearly 500 GW of additional photovoltaic capacity will be installed nationwide through 2036, even with the headwinds created by a president hostile to renewable energy. The costs of alternatives are high enough that solar remains a value proposition even without the lucrative investment and production tax credits that are being sunsetted sooner than originally planned.

“It’s clear that solar will continue to be the dominant source of new power capacity in the United States, even as gas generation continues to grow,” said Michelle Davis, head of solar at Wood Mackenzie and lead author of the report. “Strong demand growth combined with escalating costs of new gas plants will allow solar to remain competitive, even without tax credits.”

The “U.S. Solar Market Insight 2025 Year in Review” acknowledges the many uncertainties facing solar power. The baseline prediction of 490 GW more solar capacity by 2036 could be 11% higher or lower due to a series of factors, but the projected variation for utility-scale solar (6-7%) is much less than for distributed solar (23-28%).

Distributed solar is more sensitive to changes in retail rates and cost-impacting policies such as tariffs and import guidance, the authors state, while utility-scale projects are more likely to be affected by interconnection bottlenecks, supply chain constraints and power demand growth.

Another unknown factor is President Donald Trump.

Trump’s signature on the One Big Beautiful Bill Act in July 2025 moved forward the expiration of tax credits in the Inflation Reduction Act.

Projects now must start construction before July 4, 2026, or be placed into service by Dec. 31, 2027, to qualify for full tax credits. This led to significantly fewer completed projects in 2025 than expected — the value of completing them was outweighed by the imperative of beginning work on the next projects in the pipeline in order to safe harbor their tax credits.

While the president relentlessly boosts fossil fuels over renewables, he does not show the same level of hostility to solar panels as to wind turbines. There even have been a few hints in early 2026 of solar opposition softening among some MAGA influencers.

Solar power’s growth as a U.S. power generation technology is shown. | Wood Mackenzie

Whatever the motive for this turnaround, solar has some effective selling points in 2026: It is faster and less expensive to deploy than gas or nuclear; U.S. solar component manufacturing has expanded greatly; and battery energy storage systems to smooth out its intermittent performance are proliferating in number while decreasing in price.

SEIA interim President Darren Van’t Hof indicated that while the federal uncertainty has not gone unnoticed, it is not insurmountable: “Solar and storage continue to dominate new capacity additions to the grid despite policy headwinds. American households and businesses of all sizes are demanding solar + storage because they deliver fast, affordable power to help meet rapidly rising demand,” he said.

“Washington must deliver policy certainty for the market to work and to keep pace with growing energy demands. Without this certainty, less solar will get built and Americans will pay the price with higher energy bills.”

Even with this churn, all of Wood Mackenzie’s U.S. power projections show solar constituting nearly half of all U.S. capacity additions each year through 2060.

The 2025 outlook calls for an average annual addition of 44 GW through 2036, an increase over previous projections that reflects growth in the near-term utility-scale pipeline and rising energy demand expectations.

SERC Speakers Warn of Rapidly Evolving Security Threats

Speakers at a SERC Reliability-hosted webinar praised the commitment by utilities and other stakeholders to address the grid’s ongoing and emerging risks but warned that their adversaries also remain inventive and committed.

“There’s this saying, ‘A rising tide lifts all boats,’” Chad Kitchens, senior lead analyst at Entergy, said at the regional entity’s 2026 Regional Risk Webinar on March 10. “It seems every year the tide is coming in higher, and we have to keep our head above water, [which] requires a greater investment in newer technologies [and] newer control measures.”

The webinar focused on risks identified in SERC’s biennial Regional Risk Report, last released in 2025 and covering the years 2024 to 2026, with a focus on extreme physical events including sabotage and attacks, and exploitation of cybersecurity vulnerabilities. (See Weather, Supply Chain Top SERC Risk Rankings.) Participants included security specialists from electric utilities, law enforcement and consulting firms.

In the first panel, Kitchens joined Jon Carstensen, utilities vertical manager at security firm IQSIGHT, and Mike Hazell, private sector coordinator for the Florida Fusion Center, an information-sharing program managed by the Florida Department of Law Enforcement, to discuss the industry’s response to physical threats. Hazell warned listeners that recent news about attempts by violent extremists to target electric infrastructure showed “the age of security through obscurity is over for us.”

“As the industry adapts and starts to invest in [security], our adversaries are also adapting, and they’re adapting to the point where they’re doing deliberate targeting,” Hazell said. “They are doing the research to understand what … a critical asset is for, not only the energy sector, but the sectors that have interdependencies with the energy sector.”

Hazell pointed to a worrying trend of extremists borrowing tactics and even motivations from each other in what he called “a salad bar approach to rhetoric and ideologies.” He said the attack in February by a New York man against a substation in Boulder City, Nev., was an example of this development.

After Dawson Maloney of Albany, N.Y., drove his rented car, loaded with weapons, through the substation fence and killed himself with a shotgun, authorities found several documents in a hotel room he rented, including military pamphlets on improvised weapons, books on magic from the 17th century and novels promoting white supremacy and terrorism. (See Police, FBI Seeking Motive in Nevada Grid Attack.)

Hazell said the incident shows the need for collaboration among utilities, law enforcement and the cybersecurity community “to outpace our adversaries” by sharing threat intelligence while protecting sensitive information that adversaries could use for their attacks.

Kitchens agreed that “we have to up our game,” suggesting that utilities consider machine learning and artificial intelligence technology to help keep up with threats. As an example, he mentioned video monitoring systems with analytic functions that can classify observed objects and alert operators to anything out of the ordinary.

“I think the biggest thing that I’m excited about is just being able to [use] those analytic and AI capabilities to filter out the noise that traditionally you get when you do that,” Kitchens said. “It’s making more efficient use of the people you have, which allows you the potential to ingest more sites into your [security operations center] and provide a larger sense of protection.”

ISO-NE Proposes Cut to Performance Payment Rate

ISO-NE has proposed to reduce its performance payment rate (PPR) by more than 60% in response to concerns that excessive penalties will have unintended consequences for the capacity market.

Capacity resources in New England have incurred significant performance penalties during scarcity events over the past two years. These penalties have been particularly consequential for slower-start fossil units. Over two events in 2024, net penalties for combined cycle gas and oil generators totaled $44.3 million, while penalties for steam turbine residual-oil units totaled $25.8 million.

Some participants have argued the risk of these penalties could drive up capacity prices in future auctions and push resources out of the market.

The performance rate determines penalties and credits during scarcity events. The RTO’s Pay-for-Performance (PFP) construct is designed to insulate ratepayers, with underperforming resources paying for the incentives for overperforming resources.

The RTO’s per-megawatt-hour performance rate has grown in recent years, increasing from $2,000 to $3,500 in 2021, to $5,455 in 2024, and to $9,337 in 2025.

ISO-NE announced at the NEPOOL Markets Committee meeting March 10 that it plans to cut the rate back to $3,500. It also plans to move forward on an expedited schedule to implement the changes as quickly as possible, targeting a technical committee vote in May.

“Some resources may find the increased PPR, and the volatility associated with it, makes the risks and potential costs of selling capacity too high,” said Chris Geissler, director of economic analysis at ISO-NE. “This could result in retirements from resources that can still make meaningful contributions to system reliability.”

He added that a high performance rate increases the risk that individual resources hit their stop-loss limits, which cap the total penalties each resource can accrue per month. When resources hit this limit, ISO-NE charges unrecovered penalties to all capacity resources that have not hit the stop-loss limit.

The reduced PPR still should provide adequate incentives for performance, Geissler said, estimating that incentives from PFP and elevated energy market prices likely would total around $6,000/MWh.

“History suggests that resources make investments and perform strongly at this rate,” he said.

Stakeholders generally reacted favorably to the proposal, while some expressed concern that a $3,500 rate may be too low to adequately incent performance during scarcity conditions.

Treatment of Exports

Also at the MC meeting, ISO-NE detailed its plans to subject certain exports to the performance rate.

This change, recommended by both of the RTO’s market monitors, is intended to prevent a market loophole that could allow participants to earn performance credits without sending any power.

Under the current rules, during a capacity scarcity event, if a participant schedules exports that equal imports scheduled by a different participant, the export would not be charged performance penalties, but the import would earn performance credits.

“These two transactions collectively result in no power flowing but do not net in settlement because they are submitted by different market participants,” said Enrico De Magistris, economist at ISO-NE. “The market participants could transact outside the ISO-NE system to share the PFP credits.”
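The arithmetic of the loophole can be sketched in a few lines. This is an illustrative simplification, not ISO-NE's actual settlement math; the $3,500/MWh rate and the 100-MW figures are assumptions chosen for the example.

```python
# Illustrative sketch of the export/import netting gap described above.
# The settlement formula is a simplification, not ISO-NE's tariff math;
# the rate and megawatt figures are assumptions for the example.

PPR = 3500  # $/MWh performance payment rate (the proposed level)

def settle(deviation_mwh, ppr=PPR):
    """Overperformance (positive deviation) earns credits;
    underperformance (negative deviation) pays penalties."""
    return deviation_mwh * ppr

# During a one-hour scarcity event, participant A schedules a 100-MW
# export while participant B schedules a 100-MW import: net flow is zero.
export_charge = 0             # current rules: the export sees no charge
import_credit = settle(100)   # the import earns credits as overperformance

print(import_credit)  # 350000 -- credits earned with no power delivered

# Proposed fix: exports not backed by a specific generator are charged
# the performance rate, cancelling out the paired import's credit.
print(import_credit + settle(-100))  # 0
```

Because the two positions cancel under the proposed rule, the paired transactions no longer produce net credits to share outside the market.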

He noted that ISO-NE is not aware of any instances in which a participant has exploited this loophole.

To fix the issue, the RTO proposes to charge the performance rate during scarcity conditions to all exports “not associated with a specific generator in the ISO-NE system.”

Unlike “system-backed exports,” exports associated with a specific generator would not be charged the performance rate. These exports would reduce the amount of performance revenues the associated generator could earn or subject it to performance penalties for not meeting its capacity supply obligation (CSO).

De Magistris said ISO-NE likely will remove system-backed exports from the calculation of its balancing ratio, which it uses to determine capacity resources’ obligations during scarcity events.

ISO-NE calculates the systemwide balancing ratio by dividing load and reserve requirements by total CSO. System-backed exports are currently included in the calculation as load, while generator-backed exports are excluded.
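The balancing-ratio arithmetic can be illustrated with a toy example. All figures here are invented, and the real tariff calculation includes terms (such as stop-loss limits) not modeled in this sketch.

```python
# Simplified illustration of the balancing ratio mechanics described
# above. All figures are invented; the real tariff math includes terms
# (e.g., stop-loss limits) not modeled here.

ppr = 3500.0  # $/MWh performance payment rate

# Systemwide balancing ratio = (load + reserve requirements) / total CSO
load_plus_reserves = 22_500.0  # MW during the scarcity interval
total_cso = 30_000.0           # MW of total capacity supply obligations
balancing_ratio = load_plus_reserves / total_cso  # 0.75

# Each resource's obligation during the event is balancing_ratio * CSO;
# deviations from that obligation settle at the performance rate.
resources = {
    "gas_cc": (1000.0, 700.0),   # (CSO MW, actual MW): underperforms
    "battery": (200.0, 200.0),   # exceeds its 150-MW share: overperforms
}

for name, (cso, actual) in resources.items():
    obligation = balancing_ratio * cso
    settlement = (actual - obligation) * ppr
    print(name, settlement)  # gas_cc -175000.0, battery 175000.0
```

The two settlements net to zero, reflecting the construct's design of funding overperformance credits with underperformance penalties.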

Balancing Ratio Cap

ISO-NE also discussed its proposal to cap the PPR balancing ratio in compliance with an order issued by FERC in January.

The ruling stemmed from a complaint by the New England Power Generators Association after the balancing ratio exceeded 1.0 for the first time ever during an event in June (EL25-106). (See FERC Directs ISO-NE to Cap Pay-for-Performance Balancing Ratio at 1.0.)

In designing the tariff changes, ISO-NE has tried to “keep the ‘effective’ payment rate for overperformance as close to the tariff-specified [PPR] as possible,” said Megan Sweitzer, lead analyst at ISO-NE.

Under the proposal, if the cap on the balancing ratio leads to the under-collection of performance charges, this deficit would cut into the performance credits allocated to overperforming resources.

“This change ensures resources performing at their CSO megawattage are not charged” and “lowers the ‘effective’ PPR for overperformance when a deficiency exists,” Sweitzer said.

Notably, the treatment of deficits caused by the balancing ratio cap would differ from the treatment of deficiencies caused by the stop-loss mechanism, which will still be charged to all capacity resources.

While NEPGA argued against ISO-NE’s allocation of stop-loss costs in its complaint, FERC sided with ISO-NE’s argument that the stop-loss mechanism benefits all capacity resources and that it therefore is fair to charge them for the costs of its implementation.

Duke Files Settlements in Carolinas on Proposed Utility Combination

Duke Energy has entered a pair of settlements in North and South Carolina on its proposal to combine Duke Energy Carolinas and Duke Energy Progress, which still needs approval from both states’ regulators.

Duke said combining its Carolina subsidiaries would help it meet the states’ growing energy needs at a lower cost. (See Duke Energy Says Combining Carolina Utilities Would Save Billions.)

The deal before the North Carolina Utilities Commission was filed in late February and signed by North Carolina Public Staff, the North Carolina Attorney General’s Office, Google, Nucor, Walmart and others.

“We’re pleased that public staff and the attorney general’s office agree our customers will see significant future cost savings and other meaningful benefits from combining our two utilities,” Duke Energy North Carolina President Kendal Bowman said in a statement on March 10. “It reduces customer costs, simplifies operations, promotes regulatory efficiencies and supports economic growth across the Carolinas.”

The deal pending before the Public Service Commission of South Carolina was filed on March 6 and was endorsed by the state’s Office of Regulatory Staff, Nucor, Walmart, Vote Solar, the Sierra Club and others.

“Our engagement has been laser-focused on consumer protections and affordability for South Carolina families and small businesses, and one of the best ways to do that is by investing in alternatives to building new costly and polluting resources,” Sierra Club’s Paul Black said in a statement. “Duke’s regulators at the Public Service Commission must turn their attention to establishing strong consumer protections that require tech companies, not families, to pay for all of the energy and infrastructure costs for new data centers, and the Sierra Club has laid the groundwork to make that happen.”

Duke Energy Carolinas owns 20.8 GW of generation and serves 2.9 million customers across a 24,000-square-mile territory, while Duke Energy Progress owns 13.8 GW to supply 1.8 million customers across a 28,000-square-mile territory.

The filings with both states include commitments from the utility to save hundreds of millions of dollars through lower production costs from more efficient operations and lower capital costs from more efficient planning.

The proposal already has been approved by FERC. Assuming the two states approve the settlements, Duke expects to combine the subsidiaries effective Jan. 1, 2027. (See FERC Approves Duke Proposal to Combine Carolinas Subsidiaries.)

EIA Expects No Impact on Domestic Natural Gas Prices from Iran Conflict

The war in Iran is not expected to lead to higher domestic natural gas prices in part because higher oil prices tied to the closure of the Strait of Hormuz mean more oil production and related gas supply from the Permian Basin, the U.S. Energy Information Administration said.

In its monthly Short-Term Energy Outlook, released March 10, EIA explained how Iran’s closure of the strait in response to a bombing campaign by the U.S. and Israel raised global oil and LNG prices.

The Brent crude oil spot price was up sharply since the start of military action in the Middle East, settling at $94/barrel March 9, a 50% boost since the start of the year and the highest since September 2023.

“We make the assumption in our modeling that the effective closure of the Strait of Hormuz will cause oil production in the Middle East to fall further in the coming weeks,” EIA said. “We assume this shut-in production will gradually ease as transit through the strait resumes.”

Nearly 20% of global oil trade flows through the strait, which is between Iran and the Arabian Peninsula, along with about 20% of global LNG, mainly from Qatar to East Asia. Global LNG prices have shot up, but U.S. export capability was already operating near capacity before the bombing began.

In the short term, EIA predicts national average natural gas prices of $3.80/MMBtu, 13% lower than last month’s figure, as more of the fuel was left in storage than it had expected.

“The Henry Hub spot price averages nearly $3.90/MMBtu in 2027, 12% lower than our forecast last month,” EIA said. “Lower prices in 2027 mostly reflect more associated natural gas production as a result of the recent increase in oil prices and the related increase in production later in the forecast.”

Higher crude production results in more associated natural gas production, and EIA expects the latter to rise 2% from 2025 to 121 Bcfd this year and an additional 3% in 2027 to 124 Bcfd. The 2027 figure is 2 Bcfd higher than EIA forecast a month ago.

“Elevated oil prices will drive more oil-directed drilling in the Permian, which will contribute to greater volumes of associated natural gas production,” EIA said.

National average residential electricity prices are expected to rise slightly this year and next, going from 16.5 cents/kWh in 2025 to 17.3 cents/kWh in 2026 and 18 cents/kWh in 2027.

“We expect U.S. electricity generation will grow by 1.2% in 2026 and by 3.1% in 2027, which follows recent upward trends in generation to meet growing electricity demand,” EIA said. “Between 2010 and 2019, electricity generation was essentially unchanged, as electricity demand from a growing population was offset by the use of more efficient appliances and heating and cooling equipment. But since 2021, U.S. generation has been growing [at] an average of about 2% per year.”

The biggest growth is in ERCOT, where EIA said generation is expected to grow by 7.3%, leading to increases across all technology types. The rest of the country is expected to see less generation from natural gas plants, as the delivered price of the fuel for generators is up 8%.

Higher gas prices tend to favor generation from coal plants as a substitute, but with operators currently planning to retire 4% of coal capacity and the growth of renewables, EIA forecasts coal generation will drop 7% this year, mostly in the Midwest and Southeast. Plans to retire coal plants are subject to change, the agency noted.

Utilize the Grid Better to Save $100B+, New Coalition Urges

A new industry coalition calling itself Utilize has begun a campaign to make electricity less expensive and quicker to connect by unlocking underused grid capacity.

Utilize announced its launch March 10 and said it soon will release a Brattle Group research report showing that better use of existing grid infrastructure could save more than $100 billion over 10 years.

The coalition’s charter members are a cross-section of energy providers and users including Carrier, Google, Renew Home, SPAN, Sparkfund, Tesla and Verrus.

Utilize is designed as a nonpartisan campaign focused on influencing state-level regulators, elected officials, utilities and stakeholders.

In its announcement, Utilize emphasized one of the salient themes of the 2026 campaign season: consumer costs. It said the $100 billion in savings would accrue to consumers on their electric bills and allow consumers to connect to the grid more quickly. But it also said better utilization would help states meet the growing power demands created by data centers, manufacturing and electrification without delay or excess cost.

The power grid is built for peak demand, and its excess capacity in off-peak periods has been cited repeatedly as a potential resource for meeting new demand outside peak hours, particularly if more users were willing to be flexible about their consumption during peaks.

Utilize cited an influential 2025 Duke University study showing the existing grid could handle 126 GW of new demand with no additional generation if data centers curtailed their power use as little as 1% of the time during peak periods. (See US Grid Has Flexible ‘Headroom’ for Data Center Demand Growth.)

In the 13 months since the study was released, many other people have reached the same conclusion, and Duke issued a follow-up report that drilled down on the benefits. (See Duke University Study Quantifies Benefits of Data Center Flexibility.)

The Federal Reserve Bank of St. Louis recently charted U.S. grid capacity utilization dropping from just over 100% in July 1999 to just over 68% in August 2025. It averaged 71.27% in January 2026, the most recent month charted.

Now Utilize wants to translate research into action.

“For decades, we’ve built the grid to meet peak demand, even though large portions of it sit unused for most hours of the year,” Executive Director Ian Magruder said.

“It’s like building an airplane that only flies with full passengers a few times a year. That excess capacity is hiding in plain sight, and new technologies give us the opportunity to unlock it. Better grid utilization is one of the fastest, most practical levers states can pull to reduce power bills while supporting economic growth.”

Utilize said it will support technology-neutral policies that align planning, incentives and regulatory frameworks to meet the objectives of affordability, reliability and speed.

The goal is to make better grid utilization a core principle of U.S. grid planning.

Utilize cited the 2025 Duke study’s finding that the 22 regional power systems examined operated at just 53% of capacity on average.

Utilize also pointed to a 2025 Stanford University study showing that even during peaks, most Western U.S. transmission lines were carrying only 18% to 52% of their available capacity, with most clustered around 30%. But the excess capacity is not consistently accessible due to operational and planning constraints; Utilize said better utilization would allow more demand to be served and would spread the grid’s fixed costs across more electricity sales.

The new Utilize coalition adds some prominent names to the push to better utilize the existing grid. Some other recent efforts:

A new partnership announced the same day as Utilize’s launch will design flexible data centers. (See Emerald AI, InfraPartners Team up to Deploy Flexible Data Centers.)

Google has funded analyses on flexible data center models and signed some flexibility agreements of its own. (See Analysis Offers Blueprint for Faster Data Center Interconnection and Google Strikes Demand Response Deals with I&M, TVA.)

A blueprint is being created for placing smaller data centers near stranded power to speed their deployment. Research has shown flexibility would be part of a suite of tools that could limit the financial impacts of data center buildout. Other research has highlighted the value of demand management and energy efficiency.

If “flexibility” seems like a buzzword lately, that’s because it is. RTO Insider columnist K Kaufmann recently explained the phenomenon. (See Why 2026 will be the Year of Flexibility.)

MISO’s 3rd Expedited Queue: 8 GW of Gas and Batteries

MISO announced a third, 8-GW cycle of generation projects to enter its fast-tracked interconnection process, its largest cluster yet.

MISO’s expedited interconnection queue continued its theme of a high proportion of thermal resources, with gas plants outweighing storage facilities on a capacity basis, 5.8 GW to 2.2 GW. Storage accounted for eight entries, while gas submittals took the remaining seven slots.

In its second 15-count queue class, gas also tipped the scales and accounted for 4.3 GW of the 6-GW group. (See MISO Accepts 6 GW of Mostly Gas Gen in 2nd Queue Fast Lane Class.)

The grid operator said it expects this collection of projects to be in service no later than 2031.

This batch of expedited interconnections includes Northern Indiana Public Service Co.’s coal-to-gas transition at its R.M. Schahfer Generating Station. NIPSCO submitted two combined-cycle plants totaling 2,639 MW. To back up the need for the plants, NIPSCO cited its 2024 integrated resource plan, which was developed before the U.S. Department of Energy stepped in to prevent the Schahfer plant from retiring as planned at the end of 2025.

The Schahfer plant is on emergency stay-open orders through March 23. So far, DOE hasn’t let any of its 90-day operating extensions lapse, issuing a chain of orders before the last has a chance to expire. Schahfer also needs expensive, time-consuming repairs before the plant’s Unit 18 can function. (See Enviros Warn NIPSCO Against Rebuilding Coal Unit on DOE Emergency Order.)

NIPSCO also put forward 500 MW of battery storage at its Mitchell site in Gary, Ind. NIPSCO said both the Schahfer gas plants and Mitchell battery storage “are necessary for resource adequacy to serve growing data center, advanced manufacturing and other economic development project load requirements.”

Xcel Energy, DTE Electric, NextEra Energy, Swift Energy Storage, Hackett Energy Storage and Brickyard Energy Storage also submitted battery facility plans ranging from 100 to 300 MW in Michigan, Minnesota and Indiana.

Xcel Energy’s plans included its Sherco South BESS project, part of the utility’s planned Sherco Energy Hub in central Minnesota, which reimagines the site around the coal-fired Sherburne County Generating Plant (Sherco) into a solar and storage format. Xcel closed Sherco Unit 2 in 2023 and plans to idle units 1 and 3 in 2026 and 2030, respectively.

Gas plans, on the other hand, involve one of Entergy Louisiana’s three plants to serve the sprawling Meta data center in Richland Parish, a 478-MW plan from Entergy Texas, and Basin Electric Power Cooperative’s 250-MW turbine in South Dakota.

Gas submittals from Alliant Energy subsidiaries Interstate Power and Light Co. for a 750-MW plant in north-central Iowa and Wisconsin Power and Light Co. for a 150-MW turbine addition in eastern Wisconsin also made the cut.

“The interest we continue to see reflects both the urgency and the opportunity to develop a clear, timely path to interconnection, and [the Expedited Resource Addition Study] is helping us provide that in the near term,” Vice President of System Planning Aubrey Johnson said of the batch of applicants.

MISO said that, to date, its queue fast lane has attracted 53 applicants representing almost 27 GW of nameplate capacity in projects the RTO either has agreed to study or that still await its approval.

The RTO said it has completed studies on more than 11 GW of proposed capacity and the developers behind the first 10 projects already have struck generator interconnection agreements.

MISO’s temporary queue express lane is capped at 68 projects, and MISO said it will entertain the last project submissions through mid-2027 before the process winds down on Aug. 31, 2027, if not sooner.

Johnson said the queue fast lane is part of MISO’s larger work to get its regular interconnection queue unstuck and pick up the pace on achieving a one-year processing timeline.

Express Lane Dropouts

Developers have withdrawn eight projects since the fast-tracked interconnection lane opened in 2025.

The most recent projects to drop off are two NextEra battery storage projects in Hoosier Energy’s territory. NextEra withdrew its 275-MW Sandcut and 400-MW Merom four-hour storage projects in mid-February. They were meant to serve a Solvenz data center.

NextEra also shelved its expedited request for its restart of the Duane Arnold nuclear plant in Iowa. The plant is set to rumble back to life by early 2029, and Google signed a 25-year deal to buy power from it in October 2025. NextEra withdrew the fast-track request the following month, though plans to restart the plant remain.

Alliant Energy’s Interstate Power and Light Co. also scrapped its request for expedited processing on a planned, 950-MW natural gas plant near Duane Arnold in November.

MISO confirmed to RTO Insider that any projects it lists as “withdrawn” were withdrawn by their respective developers.

The RTO also said it allows other developers to take the place of withdrawn projects only if it can be done quickly. The grid operator said it doesn’t backfill fast lane spots when a project is withdrawn after its round of studies has begun.

Emerald AI, InfraPartners Team up to Deploy Flexible Data Centers

Digital infrastructure firm InfraPartners and Emerald AI announced a new partnership to construct data centers designed to be flexible grid assets.

The Flex-Ready Data Centers combine Emerald’s energy management software solutions with InfraPartners’ off-site manufacturing approach to constructing and upgrading data centers, the companies announced March 10.

“The innovation here is to put together the data center design with the needs of the software from the beginning, so that the data center is delivered as a flex-ready data center, so there is no retrofitting later,” Emerald’s chief scientist, Ayse Coskun, said. “There are no additional components needed later.”

The software needs telemetry from all aspects of data centers, which includes computing, cooling, any behind-the-meter generation or storage, and other uses of electricity at the facility, she said.

The main attraction for data centers to become flexible grid assets is speed-to-power, but flexibility offers clear benefits to the operation of the grid, InfraPartners Chief Technology Officer Harqs Singh said.

The Electric Power Research Institute has “a data center flex program with all the utilities in it, and so they’re very interested in being able to have data centers become assets, rather than just consumers,” Singh said. (See EPRI Launches DCFlex Initiative to Help Integrate Data Centers on the Grid.)

With Emerald’s management services, data centers can respond to energy availability, match up with intermittent renewables or just respond to prices, Coskun said.

“So, this interface enables not just speed to power, but more broadly a more amicable relationship between the large data center loads and the grid,” she added.

Compared to “traditional” data centers — those used for cloud computing and data storage — AI data centers have a very high “power density,” which is why they have made headlines for massive loads ranging from hundreds of megawatts to gigawatts.

“The power density of a rack — a cabinet of servers — is increasing like 10 times compared to a typical cloud rack,” Coskun said. “The AI data centers are running a mix of training, inference and other AI loads, and there are differences. For instance, training loads tend to be more spiky, changing the power up and down more rapidly compared to cloud loads.”

Cloud computing data centers must respond to consumer requests, such as when someone accesses a database or streams video, while AI data centers do more batch processing and long-running training and make heavy use of their computing hardware, she added. Energy management techniques can help smooth out their highly variable demand.

“I consider this a welcome side effect of controlling power that the spikes are reduced,” Coskun said. “Because essentially, it’s not only necessarily just reducing the power during a high demand time, but also you can set up overall power limits to gently curb the power without adversely impacting performance, at least beyond the performance constraints, and then reduce these spikes.”

The grid does not respond well to major, fast fluctuations in demand or supply, so flexibility can make AI data centers much easier to handle on the bulk power system, she said. Energy management can also smooth out ramps, both from power spikes during training and as data centers respond to signals from the grid itself.

“In our work so far with power grid operators and utilities, we received both requests — ‘can you reduce the power over a gradual window of 10 to 15 minutes? We don’t want to see this sharp drop,’” Coskun said, and “‘can you respond within seconds in an emergency, if needed?’ And we demonstrated both. So, there’s flexibility on how quickly we can tune power as needed, depending on the needs of the grid.”
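The gradual ramp-down Coskun describes can be sketched as a simple setpoint scheduler that spreads a power reduction over a window rather than dropping it all at once. This is an illustrative sketch only, not Emerald's software; the function name, step size and megawatt figures are all assumptions.

```python
def ramp_limited_setpoints(current_mw, target_mw, window_min, step_min=1):
    """Spread a power change over a window so the grid sees a gradual
    ramp instead of a sharp step. Purely illustrative."""
    steps = max(1, window_min // step_min)
    delta = (target_mw - current_mw) / steps
    return [round(current_mw + delta * (i + 1), 2) for i in range(steps)]

# Reduce a hypothetical 500-MW campus to 350 MW over a 15-minute window.
setpoints = ramp_limited_setpoints(500.0, 350.0, window_min=15)
print(setpoints[0], setpoints[-1])  # 490.0 350.0
```

An emergency response of the kind Coskun mentions would collapse the window to a single step, cutting to the target immediately.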

InfraPartners can build that flexibility in from the start with its approach of constructing data center infrastructure at a central manufacturing site and then deploying it where needed, Singh said. That can help with initial construction, but also as new chips constantly improve and existing chips wear out and need to be replaced.

“We are going to have to be a lot more agile,” Singh said. “We’re going to have to adapt a lot more.”

The biggest constraint the industry faces now is power supply, and one way of handling that will be to install more efficient chips as they become available, he said.

“That means that the data center needs to evolve to deploy the latest chips all the time,” Singh said, “and being really good grid partners, working with the grid, showcasing to them how are the loads changing. How do we manage our assets on the data center side with grid assets, such that we’re good partners and be able to power the performance improvements that are coming? … It’s what we call ‘the upgradeable data center’: having a data center that upgrades with different chip technologies that are coming.”

A lot of the contracts for chips last about five years, but how often the chips are going to be swapped out is somewhat uncertain at this point, he added.

PJM Plans to Release Reliability Backstop Design in April

VALLEY FORGE, Pa. — PJM has updated its thinking on the design of its reliability backstop procurement to meet rising data center load, gravitating toward a model in which the RTO would determine the amount of capacity to be purchased and act as the administrator and counterparty to the resulting agreements.

Rebecca Carroll, executive director of market design and economics, repeatedly stressed during a workshop March 4 that PJM does not have a proposal yet and will be working on its final design through at least April 10. The RTO aims to file a proposal with FERC by late May.

PJM is considering allowing data centers that procure capacity through the backstop to avoid enrollment in its proposed Connect and Manage system, which would require them to curtail ahead of demand response resources during strained system conditions. While the amount of capacity purchased in the backstop would be determined by PJM, Carroll said buyers may be able to submit their own preferred purchase amounts.

The procurement is intended to be a one-time measure that awards 15-year capacity commitments, possibly starting in 2030.

Core questions Carroll said PJM’s package must answer include how the RTO should balance reliability and over-procurement risks; whether changes to credit and collateral rules would be required to account for the greater risks associated with 15-year commitments; and when resources would need to be capable of coming online to participate in the backstop — with 2029 or 2030 being possible requirements.

A model in which PJM is the counterparty could pose substantial risks if either the data center or the generator defaults, which could force the RTO to pick up the remainder of the multiyear commitment or to suddenly procure capacity for the data center. PJM presented examples of how securitization could shift that risk to investors, in a model similar to the bonds issued in the wake of February 2021’s Winter Storm Uri.

To submit a commentary on this topic, email forum@rtoinsider.com.

Gwen Kelly, PJM senior director of credit risk and collateral management, said that if the current credit policy were applied to a 15-year, 1-GW unforced capacity commitment at $400/MW-day, there would be a $662 million pre-auction credit requirement, $224 million of which would be returnable. PJM Chief Risk Officer Carl Coscia said this accounts for deficiency charges over the 15 years.

Coscia said such a high collateral requirement would likely make any project unviable; however, the threshold should be high enough to prevent backstop participants from walking away from their commitments. The RTO’s risk provisions were designed around a three-year advance capacity auction awarding one-year commitments and are not well positioned to account for the uncertainty of 15-year obligations, he said.
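For scale, the figures Kelly cited can be set against the face value of such a commitment. The arithmetic below is a back-of-the-envelope illustration, not PJM's actual credit formula, which the presentation did not spell out.

```python
mw = 1_000             # 1 GW of unforced capacity
price = 400            # $/MW-day capacity price
years = 15
total_value = mw * price * 365 * years  # face value of the commitment
credit_req = 662_000_000                # pre-auction credit requirement Kelly cited
print(f"face value ${total_value / 1e9:.2f}B, "
      f"credit requirement {credit_req / total_value:.0%} of it")
```

The $662 million requirement works out to roughly 30% of a commitment whose face value approaches $2.2 billion, which puts Coscia's concern about collateral thresholds in context.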

PJM CFO Lisa Drauschak repeated that staff still do not have a proposal, and the presentation only illustrates possible pathways and outcomes.

Several stakeholders have presented their own perspectives and proposals during a series of workshop meetings over the past month. (See PJM Stakeholders Begin Discussions on Reliability Backstop Design.) The workshops will be on hiatus over the next month, until PJM has a complete package to present.

Many of the same sticking points dominated the discussions: how to define the amount of capacity to be procured; whether the procurement should be one-time; which resources are eligible to offer; and whether PJM, utilities or the data centers should be the counterparties to backstop commitments.

Independent Market Monitor Proposal

The Independent Market Monitor proposed a separate backstop procurement auction awarding 15-year commitments to new resources that reach agreements to serve data centers in the same locational deliverability area (LDA).

IMM Joseph Bowring said the key goal of any acceptable backstop auction design is to ensure the data centers pay for the costs they impose on the system. Bowring said the Monitor’s proposal is the only one that does not attempt to shift costs and risks to other customers and therefore the only proposal consistent with the statement of principles from the White House Office of Energy Dominance and the PJM governors.

The backstop auction would use the basic Base Residual Auction (BRA) design, modeling capacity transfer capability and limits between LDAs and providing a single clearing price up to a maximum based on the net cost of new entry (CONE) for the reference resource. Unlike the BRA, the backstop maximum price would be based on an assumed 15-year lifespan for the reference resource to match the commitment term. The contracts would cover the full cost of energy, ancillary services and capacity.

The Monitor’s backstop would not be a one-time measure; it would be run after each BRA to procure capacity for data center load, which would be excluded from the standard capacity auctions. Bowring said that while limiting the number of backstop auctions is desirable, artificially limiting them to a single auction would counter the goal of the White House and governors of ensuring that data centers pay for the costs they impose rather than shifting those costs to others. In the absence of future backstop auctions, the costs of increased demand from additional data centers would still be borne by other customers, just delayed by a year.

Seller eligibility would be limited to new generation, with no allowance for uprates, demand response or resources that canceled deactivations or did not clear in the capacity market. Bowring said this is the only way to counter the strong incentives to game the process and attempt to be designated as new. Consumers could offer varying bids into the auction, with the highest bids winning if insufficient supply is offered.

GQS New Energy Strategies Principal Pamela Quinlan, representing the Data Center Coalition, said it would be difficult to tie LDAs to specific customers, and that seeking to allocate costs to a class of consumers based on how the electricity is used would be undue discrimination.

Bowring responded that the capacity shortfall PJM is facing is due to data center load growth, so the costs of mitigating those reliability risks should be assigned to those customers. Costs must be directly assigned to data centers, he said, to prevent them from being shifted to other customers.

“This is not an allocation to a class of customers. It is a direct assignment via a bilateral contract between willing participants that covers all the mutual risks in the contract. The concept of allocation means that the costs could be reallocated to other customers if a data center fails,” he said in an email to RTO Insider. “The idea that it is difficult for large corporations to reach a bilateral deal is far fetched.”

Quinlan also argued the Monitor’s analysis assumes available capacity would remain the same in the absence of data center load growth, ignoring the likelihood of resources deactivating without that growth.

Quinlan said using a 15-year net CONE to set the maximum price, on the grounds that the commitment term would be 15 years, misses that resources could participate in PJM’s other markets after the commitment has expired.

Bowring responded that the basic BRA would have higher prices and would continue to attract the capacity needed to meet the needs of other customers. “As our analysis has shown, the current crisis in the capacity market is a result of data centers and not the organic growth of other customers,” he said, adding that it is fine for data centers to participate in PJM markets at the end of the 15-year contracts.

“The proposal from the Data Center Coalition would have PJM serve as the counterparty. This means that the data center risk would be imposed on other PJM members. That is inappropriate and inconsistent with the basic goal of a backstop auction to address the costs imposed by data centers. It is not a general concern to be put off to the uncertain future but must be a core element of any backstop proposal,” he said in an email.

Data Center Coalition

Quinlan presented a set of priorities the Data Center Coalition believes should be incorporated into PJM’s design, centered on the position that the RTO should not make substantial changes to the capacity market while designing a one-time procurement structure.

The coalition recommended a backstop design in which PJM would be the counterparty and limited to participants that could be in service for the 2028/29 delivery year, with some allowance for the following year. The RTO’s design should not seek to determine resource adequacy for specific load-serving entities or use “uncertain” long-term forecasts to determine the need for capacity.

Concurrent with the procurement, the RTO should initiate a comprehensive review of the capacity market design, including improvements to load forecasting and consideration of “LSE-based frameworks,” Quinlan said.

Responding to questions on how the risk of a data center default could be managed, Quinlan said risk allocation is an important question to consider, but one that should be part of a long-term discussion. The ideal way to manage the risk associated with multiyear commitments is to ensure that the backstop is a short-term measure that buys time for more substantial market changes, she said.

Quinlan said the coalition considered ways of allocating costs that did not fall to LSEs, but there are practical questions on implementation and whether that can be accomplished in time for a May filing.

Google

Google recommended PJM adopt a backstop in which it procures capacity on behalf of load and allocates the costs across the region, leaving it to states to develop end-user rates. While the company shared several design components it prefers, it stated it does not have a complete proposal.

The company expressed support for a one-time solution targeting a specific delivery year with well-defined needs, leaving long-term capacity commitments as a separate issue. The backstop should focus on a fuel-neutral framework for incentivizing high-accreditation resources, with the amount of capacity to be purchased defined by the deficiency in a particular auction — rather than targeting individual customers or a class of end users.

Joint Stakeholders

A cohort of generation owners presented a backstop focused on meeting the shortfall expected for the 2028/29 BRA, scheduled to open in June 2026. The proposal was signed onto by Constellation Energy, Vistra, AlphaGen and Earthrise.

The one-time auction would be conducted in September and mirror the 2028/29 BRA clearing price for commitments up to 15 years. Resource offers would clear first based on the delivery year in which the project can come online, then by the length of the commitment term the offer requested. Procurement would be capped at the reliability requirement for the 2028/29 delivery year.
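The clearing order the generation owners describe can be sketched as a sort followed by a greedy fill up to the procurement cap. This is a sketch under stated assumptions: the proposal does not say whether longer or shorter requested commitment terms would clear first within a delivery year (longer-first is assumed here), and all offer data is hypothetical.

```python
# Hypothetical offers: (name, online_year, requested_term_years, mw)
offers = [("A", 2029, 10, 500), ("B", 2028, 5, 700), ("C", 2028, 15, 400)]

# Rank by earliest online year, then by requested term (longer first is
# an assumption; the proposal only says term length is the tiebreaker).
ranked = sorted(offers, key=lambda o: (o[1], -o[2]))

cap = 1_000  # procurement capped at the reliability requirement (MW)
cleared = []
for name, year, term, mw in ranked:
    take = min(mw, cap - sum(m for *_, m in cleared))
    if take > 0:
        cleared.append((name, year, term, take))

print(cleared)  # C clears its full 400 MW, B the remaining 600 MW
```

Under the proposal, everything that clears would then be paid the 2028/29 BRA clearing price for its committed term.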

Seller eligibility includes new resources, uprates, DR, reactivated resources and existing resources that cleared above the maximum price in the 2028/29 auction.

Constellation’s Erik Heinle said the proposal is agnostic on how costs would be allocated, though it specifies that it would respect bilateral contracts. The risk of over-procurement and large loads not coming online would be managed by restricting procurement to load that is already accounted for in the capacity auction through capping the amount purchased at the reliability requirement for the 2028/29 delivery year.

Voltus

Voltus advocated for PJM allowing behind-the-meter capacity to participate in the backstop, arguing behind-the-meter resources have some of the quickest development times — making them well suited to a process intended to rapidly bring on new capacity.

Senior Manager of Regulatory Affairs Kimaya Abreu said PJM should be focused on procuring new capacity from resources not receiving a sufficient price signal from BRAs. That effort would be best served by a technology-agnostic approach that allows behind-the-meter capacity to participate. Excluding DR, behind-the-meter storage and other DERs would run afoul of requirements, outlined by FERC in Orders 719, 745, 841 and 2222, that BTM resources be treated comparably to generation.

Voltus argued including behind-the-meter resources in the backstop is consistent with proposals stakeholders made throughout the 2025 Critical Issue Fast Path process focused on meeting large load growth, as well as the statement PJM’s Board of Managers released at the conclusion of the process. (See PJM Board of Managers Selects CIFP Proposal to Address Large Load Growth.)

The company also endorsed a proposal by the Natural Resources Defense Council to define new capacity, which would allow resources that have completed the third phase of the interconnection process, or are in the surplus interconnection service process, to qualify so long as they are not already subject to the capacity must-offer requirement. For DR, resources that did not offer into the capacity market between the 2025/26 and 2027/28 auctions would be permitted, as well as those seeking to increase the amount of capacity offered.

NRDC

The NRDC’s proposal included an auction design in which capacity would be procured for a pool of buyers that would share the costs and risks, while sellers would receive 10- to 15-year commitments. If participating consumers default or do not come into service, either the capacity payments would be reduced, or the remaining load would pay more. The auction would be a permanent addition to the capacity market, conducted during each queue cycle’s final agreement phase. For Transition Cycle 2, this would be December 2026 or the following month.

Participating resources would be required to offer into BRAs during their commitment terms, with the revenues flowing to load with long-term commitments, which would also be responsible for capacity deficiency penalties. The auction would be open to large loads as well as LSEs seeking to offer long-term firm service to new customers.

The maximum procurement would be set at the amount bid into the auction, and any load that does not receive a commitment would be required to go through PJM’s proposed Connect and Manage system.

Eligibility would be limited to projects that have already cleared the interconnection queue but not yet entered service, as well as DR. The NRDC said the backstop should not be allowed to become another expedited interconnection queue following the example of the Reliability Resource Initiative, which allowed 51 projects to be inserted into TC2. Several of those projects have dropped out of the queue after running into high network upgrade costs.

Questions Raised over Ratepayer Protection Pledge

DALLAS — Figures in the energy industry are casting doubt on the White House’s proposal to shield ratepayers from the costs of interconnecting large loads, saying it ignores the division of jurisdictional responsibility among regulatory authorities.

The Ratepayer Protection Pledge secured commitments from developers to pay for the full cost of power plants and any required delivery infrastructure upgrades, whether the data centers use the power or not. The pledge asks the data centers to strengthen the grid’s resilience by making their backup generation resources available during times of scarcity to prevent blackouts and power shortages in their communities.

Leaders of seven Big Tech companies signed the nonbinding pledge during a March 4 ceremony in Washington. (See Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.)

Rob Gramlich, president of the D.C.-based consulting firm Grid Strategies, said during the Federal Reserve Bank of Dallas’ Powering AI conference March 4-5 that there was a “deal to be had between the richest corporations the world has ever known” and the power sector and its end-use customers.

He said gathering regulators in the same room with the tech companies and getting the companies to agree on a “political level” to paying their “fair share” was “quite impressive.”

“That’s an important step,” said Gramlich, who served as an economic adviser to Pat Wood during the latter’s FERC chairmanship. “I know from my eight years being in a regulatory agency that having policymakers agree to, ‘Here’s the deal. Here’s kind of what we’re trying to achieve,’ and then go work out the details … that’s an important step.

“Those first two components are important and check two boxes,” he said. The third phase, implementation, is “really complicated … with a whole new set of complications,” Gramlich said, pointing to jurisdictional issues between the federal government and the states.

“Every market structure is different. Every state has a different arrangement of who’s responsible for transmission, generation, the planning and the cost allocation in ‘FERC land’ outside of Texas,” he said. “The retail-wholesale split is extremely complicated. FERC can’t right now just go and say, ‘Oh, data centers, you pay for this thing.’ Those are retail customers. FERC can’t tell what one retail customer versus another retail customer can do without the state saying that’s the way.”

Gramlich said FERC could assert jurisdiction over the states and go through seven years of litigation. “But there’s not seven years to go through that process,” he warned.

Andrew Schaap, CEO of developer Aligned Data Centers, said he is a firm believer in the necessity of the U.S. winning the artificial intelligence race.

“A lot of our adversaries are not having [these discussions]. They’re just doing it. They’re just going to go as fast as possible,” he said, noting China is building 1.5 TW of solar power a year.

“The fair rate pledge is a way to give latitude to operators like ourselves and hyperscalers to do behind-the-meter generation. I build my own power plant; I build my own systems,” Schaap said. “Is that the most efficient way to do it? Probably not. The most efficient way is to do it with the grid.”

Schaap did allow that the pledge is a “good incentive to get there faster.”

“One of the things that we’re all struggling with is there’s just not enough capacity fast enough,” he said.

Nick Elliot, who recently left the Department of Energy’s Grid Deployment Office to join the White House’s National Energy Dominance Council as a senior policy adviser, was in the room where the pledge was signed before he took a red-eye flight through thunderstorms to Dallas.

The transmission piece of the pledge was the hardest component “to get right,” he said, holding a cup of coffee. The large load will have to cover 100% of the direct cost of a tie line, he explained, but connecting large generation to the facility will affect the entire system’s upgrade requirements.

“They will benefit you, but they’re going to benefit everyone else,” Elliot said. The hyperscalers are “willing to engage” in innovative regulatory structures, paying “full freight” upfront or over time.

“We’ll build the highway,” Elliot said, speaking for the large loads. “And to the extent you end up building a whole bunch of other hotels on the highway, allocated to those other people later, we will backstop, and we’ll take the risk.”

“I certainly understand why Nick says that transmission is harder, because it’s harder,” said Stu Bresler, PJM’s executive vice president of market services. “The benefits of transmission do flow to many customers once it’s built. I think the challenges with allocating the cost of new generation on the system are equally difficult to transmission when there’s so much uncertainty about how much load you’re actually building for and who the customers will actually be. We have to get through all these cost allocation issues. They’re extremely foreign.”

Texas Addresses Rising Costs

Google was one of the seven companies that signed the pledge. Doug Lewin, who recently left his consultancy to join the company as the Texas lead for energy market development, made it clear that Google wants to be connected to the grid.

“You have several advantages to that, both from the data center side and from the public side,” he said. “We just have to have a historical perspective here and remember that for the entire life of the grid, over 100 years back to the earliest days, system use matters.

“It’s a simple division problem, right? Whatever your fixed costs are, if you can spread them across as many users as possible, that lowers the unit cost,” Lewin added. “That’s the basic economics of the grid as it has existed since the 1910s, and that principle still holds. So, we think it’s not only good for us to be connected, and this would go for any data center, but also for all customers.”

Google and Lancium, an energy technology and infrastructure firm, have filed joint comments on the Texas Public Utility Commission’s proposal to set interconnection standards for large loads (58481). They argued the proposal requires “large, upfront and nonrefundable financial commitments without providing clear study outcomes, defined interconnection timelines or a predictable path to energization.”

“This sequencing shifts significant risk onto customers before system feasibility and deliverability are known,” the companies said, referencing a flat $100,000/MW nonrefundable interconnection fee they said may result in overcollection beyond true costs.

Google’s Doug Lewin and Emerald AI’s Arushi Sharma Frank share a laugh during their panel discussion. | © RTO Insider 

Instead, they have suggested a five-year, 50% minimum demand charge to fund infrastructure builds and share in costs. Lewin said that is a “very tangible way” large loads can shift around costs based on ERCOT’s Four Coincident Peak (4CP) program. Under the program, industrial customers are charged a fee for 4CP based on the amount of electricity consumed during a defined period in the previous year when demand on the grid was at its highest.
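Mechanically, a 4CP-style allocation charges each customer in proportion to its demand during the four summer coincident peak intervals, which is why a load that curtails during those intervals can avoid most of its transmission charge. The sketch below is a generic illustration with hypothetical numbers, not ERCOT's actual tariff math.

```python
def four_cp_share(load_mw_at_peaks, system_mw_at_peaks, annual_trans_cost):
    """Illustrative 4CP allocation: a customer's share of transmission
    cost is its average demand during the four system peaks divided by
    the average system demand at those peaks."""
    cust_avg = sum(load_mw_at_peaks) / 4
    sys_avg = sum(system_mw_at_peaks) / 4
    return annual_trans_cost * cust_avg / sys_avg

# A large load that curtails to 5 MW at each peak pays a tiny share
# of a hypothetical $5 billion annual transmission cost.
share = four_cp_share([5, 5, 5, 5], [80_000, 82_000, 85_000, 83_000],
                      5_000_000_000)
print(f"${share:,.0f}")
```

A 50% minimum demand charge of the kind Google and Lancium propose would put a floor under that figure regardless of peak-hour curtailment.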

“Large loads can get away from paying a transmission charge,” Lewin said. “We have come forward with other partners and said, ‘We want a minimum transmission charge.’”

Emerald AI’s Arushi Sharma Frank, Lewin’s partner on the panel, applauded the Google-Lancium proposal.

“As the load comes in, it pays for the transmission upgrades, and if the load comes before that, great,” she said. “But if upgrades come first, then the loads need to still be there to foot the bill because they are going to eventually use it.”

ERCOT General Counsel Chad Seely said these questions are part of broader policy debates underway in Texas.

“We’re trying to figure out what the best process is to study [large loads] reliably and make sure that we’re making the best decisions as far as building out the transmission infrastructure and making sure that they have enough skin in the game,” he said. “This is really a pivotal year for ERCOT and our stakeholders to kind of put forward these policy frameworks that will have long-lasting implications as we move forward to manage this tremendous amount [of load].”