February 24, 2026

FERC Commissioners: Solutions Emerging to Large Load Conundrum

WASHINGTON — Data centers’ demand and speed-to-power prerogatives continue to dominate discussions in the electric industry, but some commonsense policy answers are starting to emerge, two FERC commissioners said recently.

More state regulators are adopting large load tariffs that address the challenges brought up by the new customer classes, FERC Commissioner Judy Chang said during the American Clean Power Association’s Interconnection Summit on Feb. 11.

“The states and the regulatory commissions are at the forefront of dealing with how to connect new load and how to serve them, and how to ensure that those services also protect other customers,” Chang said. “So, I think again, back to alignment: I feel like we are very much aligned in the goal of meeting the need of this growth while protecting customers.”

Utilities are starting to integrate the new large loads in a way that protects residential customers and other businesses from higher power bills, Commissioner David Rosner said.

“My perspective is, we can walk and chew gum,” Rosner said. “There are some hard questions to ask, but when you have 13 governors from PJM all getting on the same page with the White House, when you have the questions that we got from Congress, I think [that] indicates support for the kind of commonsense arrangements that we’re starting to see more and more of. I think that’s good.”

The two were speaking just after NARUC’s Winter Policy Summit had ended and about a week after FERC commissioners appeared in front of the House Energy and Commerce Subcommittee on Energy. (See FERC Oversight Hearing Focuses on Affordability and Reliability.)

When it comes to FERC’s role, both commissioners have highlighted the need to improve grid planning, with Rosner bringing up PJM’s Reliability Resource Initiative, which sought to fast-track dispatchable resources, including a natural gas plant in Ohio.

“It had an upgrade cost of $1.2 billion, which included something like over 150 miles of either new or reconductored or rebuilt 345-kV transmission,” Rosner said. “And then if you look down the list, that’s the most extreme example, but there’s, I think, about a half a dozen examples of $400 million upgrade costs for things like batteries.”

FERC has spent plenty of time implementing changes to the generator interconnection queues, but it has work left to do on the transmission side, Chang said.

“Even if we can do the studies faster — even if we can use AI and simultaneously do multiple scenarios and then filter out which resources need to connect where and how much upgrades — we still need to upgrade the system,” she said.

The example of the Ohio natural gas plant shows that the grid is not ready to integrate the new loads and the generation they require, Chang said, but FERC Order 1920 (which addresses long-term transmission planning and cost allocation) should improve things.

“I think the 1920-style of forward-looking transmission planning — figuring out and getting the states into the room to agree on ex ante cost allocation — those are all the planning steps that are necessary, and it’s even more urgent now than it was when we just started on this commission,” Chang said. “So, I think again, this is all a package deal. You can’t just solve the interconnection issue without figuring out the transmission issue.”

SPP CEO Lanny Nickell has been in the industry for 34 years, and even before large loads took over the conversation, he said, he was seeing changes unlike anything he had witnessed. A couple of years ago, when he spoke on another industry panel focused on the transition of the generation fleet to clean power, the moderator asked participants to be more positive.

“When they turned to me, I said, ‘Well, I’m positive, absolutely positive, that I’m more nervous now than I’ve ever been in my career,’” Nickell said.

Now with large loads exacerbating pre-existing resource adequacy issues, the industry must figure out how to serve the new customers in a way that does not deteriorate reliability even more. SPP’s answer is its high impact large load (HILL) service, which was approved recently by FERC. (See FERC Approves SPP Large Load Interconnection Process.)

“If the large loads are willing to bring the generation with them, either co-located or no more than two buses away, they can be studied together in 90 days or less, so you don’t have to go through the generator interconnection queue,” Nickell said. “And I know that that could create issues in the queue.”

The requirement that paired generation be no more than two buses away means any large load customers that use the process will be less likely to impact the queue, he said. Now SPP has a pending proposal at FERC for conditional HILL service, where flexible large loads paired with generation can get connected quickly.

“We think that’s also a very innovative speed-to-power solution,” Nickell said. “You don’t have to wait for transmission, which we all know takes a long time to get built.”

Nowhere has the rising demand from large loads had a greater impact on the wholesale market than in PJM, which led to the aforementioned meeting at the White House with 13 of the region’s governors. (See White House and PJM Governors Call for Backstop Capacity Auction.)

Their calls to help large loads integrate are in line with ideas endorsed by PJM’s Board of Managers, said Asim Haque, the RTO’s senior vice president of governmental and member services. When supply and demand are out of whack, either new load needs to be stopped from coming online, or more generation needs to be added. Policymakers in PJM prefer the latter, he said.

“We have to say to ourselves, ‘How do we get a lot more supply on the system to meet this incredible influx in demand?’” Haque said.

In vertically integrated states like Virginia, the integrated resource plans should keep up with demand, but the rapid growth there has left it increasingly reliant on imports, Haque said. “In our restructured jurisdictions, that is a more dynamic issue to try and solve.”

The RTO is working on a backstop capacity auction to close the 8-GW gap from its last base residual auction, but stakeholders still are working out the details. That is the short-term plan. Over the long term, PJM will take a holistic look at its markets to ensure they are aligned with the industry’s investment needs.

“We need to look at this and determine whether or not our markets are actually collectively signaling the right incentive opportunities for new supply,” Haque said.

Electricity Rates are the Political Livewire Threatening the Industry

Dej Knuckey

Families struggling with painfully high power bills are making headlines across the country. An Oakland resident racked up $1,400 in power bills while taking unpaid leave to deal with her high-risk pregnancy. A New York parent’s power was shut off for six months after her power bill tripled at the same time she lost her job. A disabled Florida senior was forced to choose between paying for his medication or his power bill.

Residential electricity bills have moved from being background noise in discussions about resource adequacy, decarbonization and transmission expansion to being the loudest political and business risk. Households do not experience the grid through integrated resource plans or RTO stakeholder processes. They experience it once a month, in a single number.

If that number rises too far, too fast, the system’s social license begins to erode.

Policy Costs Embedded in Retail Rates

Retail bills now carry policy objectives that extend well beyond the marginal cost of electricity.

Clean energy mandates, public purpose programs, low-income assistance mechanisms and legacy regulatory decisions are embedded in rate design. Many jurisdictions rely on non-bypassable charges and fixed cost recovery mechanisms that have grown as a share of total bills.

Retail rates rose in all states in the 5 years up to October 2025 | Charles River Associates

For customers, these line items are opaque. When they see an increase in their monthly bill, they do not parse which portion reflects transmission upgrades, wildfire mitigation, renewable integration or administrative overhead. They see only a cost-of-living increase.

And they react accordingly.

If the issue already is in the headlines, imagine the backlash utilities, regulators and politicians will face if prices continue to outpace inflation: ICF estimates residential electricity rates could rise 15 to 40% over the next five years and double by 2050.

National Trends Versus Regional Realities

While rates are up across the board, the realities are more subtle. For every headline-grabbing story about families struggling with power price increases, there are counter stories that never make the news.

“Prevailing narratives that there is a broad national trend of rapidly rising electricity rates are inaccurate or incomplete,” Charles River Associates warned, in its Retail Rate Trend analysis prepared for Edison Electric Institute and published in February.

In the Northeast, for example, retail rates are more susceptible to increases in wholesale electricity prices because utilities there do not own generation, whereas in California, wildfire costs and related mitigation have been the key drivers.

Electric bills have risen disproportionately in areas where power prices have risen most, or extreme weather events have forced larger-than-usual heating or cooling loads. Energy Information Administration data showed sharp increases in parts of California and New England since 2021.

What Goes Up Occasionally Comes Down

When I received a chipper email boasting that my electric rates had been cut, it was a pleasant surprise. The email from Pacific Gas and Electric, addressed “Dear PG&E Customer,” said that, starting Jan. 1, “PG&E lowered electric prices — our fourth decrease in two years — to help make energy more affordable for the communities we serve.”

The near-term decline is counter to the national trend, as the company noted in a LinkedIn post: “The U.S. Energy Information Administration forecasts national electric prices to increase by nearly 10% between 2024 and 2026.” Small comfort, given that California residents face the second-highest power prices in the nation.

Less than a year earlier, a study by KQED found the average PG&E bill was up 70% since 2020. The customers’ pain didn’t stop the company, along with other California utilities, from trying to increase their authorized return to shareholders to 11.3%.

The request to increase investors’ ROE — the percentage of profit utilities may earn on shareholder-funded investments — was ill timed. It wasn’t just rejected; it came with a slap on the wrist: The CPUC cut the rate, restricting PG&E, Southern California Gas Co. and San Diego Gas & Electric to a little under 10%, and Southern California Edison to a little over 10%, the lowest rate in 20 years.

The returns are not guaranteed. “The utilities earn the full authorized ROE only when they effectively manage costs, maintain safe operations, and deliver projects on time and on budget,” the CPUC said. A cost of capital mechanism enables automatic ROE adjustments when bond markets change substantially in either direction.

The rising prices aren’t about corporate and shareholder greed: They are a result of an aging system that’s being asked to meet conflicting demands. We need a larger grid that’s also more resilient. And if we are to deliver on a climate future that will let future generations flourish, it needs to be cleaner too, regardless of short-term political winds.

For grid operators, utilities and regulators, affordability is the challenge that complicates all those other demands.

Climate and Modernization Drive Capital Needs

The drivers behind higher residential bills are layered and cumulative. Capital expenditures are surging, with utilities investing heavily in wildfire mitigation, storm hardening, undergrounding, modernizing the grid and replacing aging infrastructure.

Climate stress is making those resilience investments unavoidable: NERC has repeatedly highlighted growing reliability risks tied to extreme weather and climate stress in its Long-Term Reliability Assessment.

The burden is compounded by the rising cost of money. Higher interest rates raise the cost of financing infrastructure, and those costs ultimately are reflected in rate base recovery.

Data Centers Drive Rising Demand

After nearly two decades of flat demand in many regions, load growth has returned. And of the causes — electrification, EV adoption, building heating conversion and data center expansion — one bears the bulk of the perceived blame: data centers.

Total U.S. electricity use is forecast to grow by 1% in 2026 and 3% in 2027, according to the EIA. “This increase would mark the first time since 2007 that power demand has risen for four years in a row. The driving factor behind this surge is increasing demand from large computing centers.”

Greater load can support system use and spread fixed costs. But in the near term, it often requires incremental distribution upgrades and capacity investments that raise costs before efficiency gains materialize. (See DTE Treads Carefully as Michigan Becomes Flashpoint in Data Center Debate.)

EPRI’s Win-Win Watts position paper argued the increased demand from data centers doesn’t have to increase retail rates. “Proactive planning, robust safeguards in large-load tariffs and explicit incentives for demand flexibility, including initiatives like EPRI’s DCFlex, can help turn high-load-factor growth into a tool for moderating average prices, improving asset utilization and accelerating clean energy deployment.”

Addressing the Power Price Risk

The political harm caused by rising retail prices will be addressed, whether through political means, regulatory actions or utility efforts. Legislators are scrutinizing rate cases more aggressively, and if regulators and the industry won’t act, they will step in to fill the void. Some recent examples: Maryland Gov. Wes Moore (D) recently introduced the Lower Bills and Local Power Act, and Pennsylvania Gov. Josh Shapiro (D) urged PJM to protect consumers by extending the existing price floor and ceiling, the price collar, for the 2028/29 and 2029/30 capacity auctions.

For some politicians, a commitment to addressing constituents’ concerns about electricity bills is coming at a cost. Critics accuse New York Gov. Kathy Hochul (D) of ditching the state’s climate commitments to focus on the more politically urgent issue of energy costs.

Politicians do not always have to get involved. Some utilities and regulators are keeping price increases caused by rising data center demand from filtering down to household energy bills, Charles River’s study said. “Going forward, utilities and their state regulators have committed to protecting retail customers from rate increases caused by new data centers and are approving new tariffs and ratemaking measures that embed those protections.”

The Business Risk of Doing Nothing

Failure to address escalating energy costs will strain utilities’ balance sheets if bad debt rises as customers default or disconnect. Investor earnings calls reflect growing sensitivity to bill impacts and the need to balance capital deployment with moderation strategies.

Ratings agencies increasingly factor regulatory and political stability into utility outlooks. Moody’s and S&P regularly assess regulatory environments and affordability pressures in their utility sector commentary. The Center on Budget and Policy Priorities has outlined the risks of rising utility debt and the policy implications for states.

States can address rising energy bills with several levers. | Center on Budget and Policy Priorities

An Integrated Approach to Managing Energy Costs

For grid operators and utilities, the risk is that affordability becomes the weak link in the energy transition, not because resiliency, decarbonization and grid growth are not technically feasible but because those upgrades are not politically durable. There are four takeaways:

    1. Treat rate design as infrastructure policy: Providing transparency into use-based charges and fixed system recovery costs could improve public understanding. However, rates that better reflect system costs must be paired with consumer protections.
    2. Make affordability an important system metric: Incorporating explicit bill impact assessments into major capital approvals may help gain buy-in. Income-based assistance mechanisms may prove more durable than broad subsidies that dilute price signals.
    3. Revisit incentive structures: Implementing performance-based models that link utility revenues to outcomes rather than capital deployment will build political capital over time.
    4. Acknowledge tradeoffs honestly: Building the grid of the future carries near-term costs, and customers deserve clear communication about why investments are necessary and how costs will be managed. It may not help them pay bills in the near term — something that’s also essential — but it may stabilize public trust.

Investing in Not-too-unhappy Customers

No one loves paying their power bill, especially in strained economic times, but rising prices are a reality as the grid transforms. The nation cannot afford to slow investment in transmission expansion, decarbonization or resilience, but public tolerance is a constraint as real as engineering requirements or capital access.

For utilities, grid operators and regulators, the message is straightforward: Affordability cannot be put on the back burner. The future of the grid depends on not only what gets built but also whether monthly bills make sense to the people paying them.

Duke University Study Quantifies Benefits of Data Center Flexibility

Demand flexibility among data centers could reduce the need for new gas-fired generation to supply their energy consumption while driving development of additional renewables and cutting electricity prices, according to a report by Duke University’s Nicholas Institute for Energy, Environment and Sustainability.

The institute in 2025 released a major paper showing that small amounts of data center flexibility could unlock 100 GW of grid capacity to serve more of the large loads. (See US Grid Has Flexible ‘Headroom’ for Data Center Demand Growth.)

The new paper — “Data Centers and Generation Capacity over the Next Decade: Potential Benefits of Flexibility” — seeks to expand on that theme by looking at how data center flexibility could change the power system, co-author Martin Ross, a senior research economist, said.

“What I normally do is the longer-term capacity planning modeling around policy analysis,” Ross said in an interview. “And I was curious, in that longer-term context, how would flexibility sort of alter the capacity mix going forward.”

Investments in new combined cycle gas-fired plants are cut by 10-50% in the flexibility scenarios modeled, with the lower end of the range assuming data center demand flexibility that avoids consumption during just 1% of peak hours, the report finds.

The report studies both temporal flexibility and spatial flexibility (shifting compute load to data centers in other regions), which together can further reduce the need for new gas plants.

The flexibility scenarios include ones in which data centers curtail their entire load during 1% or 2% of net peak hours, and others in which they curtail by 20% or 50% during those hours. Still other scenarios combine the two, modeling 20% or 50% demand flexibility with the ability, but not the requirement, to fully curtail during the 1% or 2% of net peak hours. A final scenario uses spatial flexibility to shift 50% of the data center demand in a state to other regions.

Ross focused on combined cycle rather than combustion turbine plants because the former’s higher capacity factors are better suited for steady data center loads.

“Normally you would think if you were reducing stress on the grid, you would first be reducing the need for peaking units,” he said. “But given sort of the overall demand growth that’s being expected in the system, that leads you a bit more towards the combined cycle units and makes the gas peakers a bit less useful, since it’s not efficient to run them all the time as a base load for the data centers.”


A graph from the report showing the difference in capacity additions based on flexibility scenarios | Duke University

Projected savings in capital investment, operational spending and fuel costs over the next decade range from $40 billion to $150 billion. Data center flexibility could cut average electricity prices by $2.50 to $8/MWh, lowering retail customers’ bills by 0.5% to 2.8%.

The report generally found that greater demand response translated into a need for fewer gas plants, with the variable demand matching up better with renewable output.

“Combining the 20% hourly flexibility case with the ability to shift up to 50% of each region’s initial data center demands into other regions allows the national grid to avoid up to two-thirds of the new [combined cycle natural gas] units that would have been built by 2035 as data centers move to regions with both less stress on the grid and higher renewable resources,” the report said.

While flexibility offers clear benefits to the grid, it is unclear how much data centers can offer, Ross said. Customer-facing demand for cloud services generally is inflexible, though the compute load for training AI can be shifted around.

“When I think of flexibility, I normally think of flexibility driven by cost savings as the motivation for the flexibility, and I’m not sure that is a major factor in the thinking right at the moment, versus a race to beat all the other models,” Ross said.

Still, if data centers are motivated to offer flexibility for other benefits such as speed-to-market, their flexibility would benefit all customers.

“I was surprised that the flexibility was having fairly significant effects on what data centers might end up paying because you’re avoiding those high-cost hours, and that sort of not only benefits the data centers, but benefits consumers more broadly,” Ross said.

CTC Global Partners with Google to Launch GridVista System

Advanced conductor manufacturer CTC Global is working with Google Cloud and Tapestry to launch the GridVista System, which combines conductors with fiber-optic cables to offer operators visibility along the entire transmission line.

GridVista’s line awareness, paired with Google Cloud and Tapestry’s artificial intelligence-powered tools, turns line data into actionable intelligence that can optimize grid capacity, prevent outages, cut wildfire risk and lower operational costs, the companies say.

CTC’s advanced conductor technology can double the capacity of an existing transmission line; the company has worked on 1,400 projects with more than 300 utilities in 65 countries, CEO J.D. Sitton said in an interview.

“Now with the GridVista System, we’re adding data-grade fiber optics to the product that enables the measurement of strain and temperature and capacity and events along the entire length of the power line; not at discrete points, but literally the entire length between substations,” Sitton said. “And so, GridVista really is providing an entirely new level of detail and insight into the operating status of the transmission lines.”

Unlike traditional dynamic line rating products that add sensors at discrete points along a transmission line, the fiber optics in GridVista give utilities full knowledge of what is happening.

“It’s a much higher degree of resolution and a much faster feedback loop to the utilities about an event: a line down, a hot spot, a lightning strike, a rifle shot, a tree branch falling on the power line. These sorts of things,” Sitton said. “We know about it immediately. We know exactly where it is.”

Combining that visibility with advanced conductors puts utilities in a position to operate more cheaply, he said.

“We save them money on the capacity upgrade, and we’re saving them money from an operations perspective, because they’re much smarter about how they dispatch their operating resources and their lines,” Sitton said. “We enable them to operate with a higher degree of reliability because they no longer have these blind spots in their operation between the substations where they’re guessing what’s going on or not going on with their power line.”

Data centers want to connect to the grid much faster than the industry historically has been able to add new power plants or transmission, and that growing demand is driving interest in products like GridVista.

“I think probably for the first time, we’re seeing utilities in the United States and Western Europe realize that they are, in fact, capital constrained, and so they need to get more out of their existing systems faster than historically they’ve had to,” Sitton said. “So, all of these things are, I would say, accelerated by the dramatic pickup of demand from the data centers that’s creating an environment where the utilities are very open to much more capital and operating cost-efficient solutions.”

Utility customers have rising expectations for a safe and reliable grid, which can benefit from the awareness the new product unlocks, he added.

CTC had a pre-existing relationship with Google around speed-to-power for its data centers. With GridVista, Google Cloud helped with the user-interface system, and Tapestry is working on reforming grid operations.

“Google is using GridVista as the source for what they call ‘ground-truth data,’ so the fundamental operating data that will feed the capabilities of their software platforms,” Sitton said. “So, it’s basically a two-way street, and it’s really quite exciting to see the early reactions of some of the utilities that we’ve been engaging with about the combined capabilities.”

AI applications are starting to transform how the grid is operated as the industry adopts the technology.

“We’re starting to see utilities rethink how they dispatch their grids, how they respond to operating challenges within their operating assets, and how they think about the kind of the planning aspects of their system,” Sitton said. “So, the next round of interconnections, and the next retirements of generators along their transmission lines and … the way they plan for these things is fundamentally changing.”

Utilities move at different speeds, but the “thought leaders” in the industry are starting to roll out AI applications that improve their operations and planning, he added.

“Utilities are utilities,” Sitton said. “They are, by definition, some of the most conservative organizations on the planet, I think, for good reason. But they’re not all cut from the kind of the absolute conservative cloth, and so we are seeing many utilities moving much more quickly.”

PJM, Monitor Float Reserve Market Changes

PJM and the Independent Market Monitor are drafting proposals to rework the RTO’s reserve market.

Reserve performance has been a focus since PJM implemented a market overhaul in 2022, which was followed by a drop in performance. That was counterbalanced with a 30% adder to the reserve requirement in May 2023, a change PJM’s Emily Barrett said has allowed the RTO to maintain adequate reserves at increased costs to load. The adder was scaled back recently to 20% as average performance increased above 85%.

The RTO’s proposal would increase the penalties for synchronized reserves that fail to respond, replace primary reserves with a handful of more targeted products based on duration and how quickly the resource can respond, and shift procurement to nodal rather than sub-zonal. Barrett presented the package to the Reserve Certainty Senior Task Force (RCSTF) on Feb. 11.

The RCSTF’s work was one of several areas the PJM Board of Managers identified as integral to efforts to address rising data center load.

Barrett said the current penalty rate is based on the credits reserves received between events, which can result in widely varying penalty rates for resources in the same event. The logic driving the RTO’s proposal instead would use the amount paid to reserves between events. The rate would be set at the greater of:

    • the mean synchronized reserve clearing price over the past delivery year, broken into intervals set at the average number of days between deployments exceeding 10 minutes (there was an average of 18 days between 10-minute synchronized reserve events in 2025, with an average clearing price of $1,910/MWh); or
    • the maximum system marginal price in the 30 minutes after a resource underperformed.
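The greater-of rule above is simple arithmetic; a minimal sketch, assuming a hypothetical function name and illustrative dollar figures (this is not PJM's actual implementation):

```python
# Sketch of the proposed penalty-rate logic described above. The function
# name and the example inputs are illustrative assumptions, not PJM's code.

def proposed_penalty_rate(mean_sync_reserve_price: float,
                          max_smp_after_event: float) -> float:
    """Return the proposed penalty rate in $/MWh.

    mean_sync_reserve_price: mean synchronized reserve clearing price over
        the past delivery year, computed over intervals sized to the average
        number of days between deployments exceeding 10 minutes.
    max_smp_after_event: maximum system marginal price in the 30 minutes
        after the resource underperformed.
    """
    # The rate is set at the greater of the two quantities.
    return max(mean_sync_reserve_price, max_smp_after_event)

# Hypothetical example: an interval-average clearing price of $1,910/MWh
# vs. a $2,400/MWh post-event system marginal price.
print(proposed_penalty_rate(1910.0, 2400.0))  # 2400.0
```

Because the rate no longer depends on what the individual resource was paid, a resource that cleared at a low or zero price could still face the higher of these two system-wide values, which is the concern stakeholders raised.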

Stakeholders said there has not been enough focus on why performance has been low, which should be addressed alongside discussions on how underperforming resources should be penalized. Untying the penalty rate from what a resource is paid, and setting it so high, could result in resources that receive a low or zero clearing price facing penalties in the thousands, they said. Resources also could be held responsible for PJM inaccurately modeling parameters.

Joel Romero Luna, a market analyst with the Independent Market Monitor, said it has been doing outreach since mid-2024 and found a lot of issues related to communications and personnel. Performance has improved since generation owners ironed those issues out, leaving inaccurate parameters as a primary driver of the low response rate. In particular, the ramp rate and economic maximum parameters tend to be based on averages rather than how a resource expects to respond.

Monitor Joe Bowring told RTO Insider resource performance had not dropped, but rather the response rate was low because of communications issues.

“There was no actual drop in performance, and PJM’s arbitrary increase in reserves was not justified and continues to be unsupported. The measured performance of some reserves was low because PJM was using antiquated communications technology. The technology issue has been significantly, but not completely, addressed,” he said.

Luna noted that the Monitor has recommended that PJM count overperformance when calculating the fleet response rate. When capturing both sides of reserve performance, he said the response rate is closer to 100%.

PJM’s Kevin Hatch added that while the outreach to the owners of underperforming resources has been led by the Monitor, RTO staff have been involved as well.

New Reserve Products

The RTO’s proposal would add a ramp/uncertainty reserves (RUR) product capable of responding in 10 minutes, which would come with its own reserve requirement, and an energy gap requirement met by a combination of reserves.

Barrett said primary reserves backfill needs not met by other products, but resources lack clear performance obligations and penalties for not meeting commitments.

The energy gap requirement would be tailored toward meeting operational needs identified on medium- and high-risk winter days, and the 30-minute secondary reserve requirement would serve as a backfill to ensure the largest system contingency is met.

The 30-minute RUR and 30-minute secondary reserve products both come with a four-hour minimum availability. Barrett said event duration is expected to become more important as battery storage becomes more common.

Vitol’s Jason Barker said the transparency of the new market design will be crucial to avoiding “black box pricing” with unexplainable variations in pricing.

IMM Proposal

The Monitor proposed to retain most of the reserve market structure, while changing the procurement requirements for 30-minute synchronized and primary reserves. Bowring said PJM should eliminate the adder on the grounds that it is not required for reliability and there is no demonstrated need for it.

Under the Monitor's proposal:

    • The 30-minute reserve requirement would be double the single largest contingency plus real-time uncertainty, defined as the two-hour forecast for wind, solar and load minus the forecasts used in real-time security-constrained economic dispatch 10 minutes in advance.
    • The synchronized reserve requirement would be the single largest contingency plus the extended reserve requirement of 190 MW.
    • The primary reserve requirement would be the larger of 150% of the largest contingency or real-time uncertainty.
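As a rough illustration of the arithmetic, the Monitor's three requirements can be sketched as follows. This is a paraphrase of the proposal as described above, not PJM or IMM code; the function name and the example megawatt values are hypothetical.

```python
def monitor_reserve_requirements(largest_contingency_mw: float,
                                 real_time_uncertainty_mw: float,
                                 extended_reserve_mw: float = 190.0) -> dict:
    """Sketch of the IMM's proposed reserve requirements, in MW.

    real_time_uncertainty_mw stands in for the two-hour wind/solar/load
    forecast minus the forecast used in real-time SCED 10 minutes ahead.
    """
    return {
        # 30-minute: double the single largest contingency plus uncertainty
        "thirty_minute": 2 * largest_contingency_mw + real_time_uncertainty_mw,
        # Synchronized: largest contingency plus the 190-MW extended requirement
        "synchronized": largest_contingency_mw + extended_reserve_mw,
        # Primary: larger of 150% of the largest contingency or uncertainty
        "primary": max(1.5 * largest_contingency_mw, real_time_uncertainty_mw),
    }

# Hypothetical inputs: a 1,400-MW largest contingency and
# 500 MW of real-time uncertainty.
reqs = monitor_reserve_requirements(1400, 500)
```

With those illustrative inputs, the 30-minute requirement dominates, which reflects the proposal's emphasis on covering both a double contingency and forecast error.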

Performance evaluation and penalties would remain the same for synchronized reserves. For non-synchronized reserves, they would be pegged to the evaluation and penalty rules for secondary reserves. Reserve resources would be required to be capable of operating for four hours or longer.

Bowring told RTO Insider the proposal would capture the uncertainty of wind and solar generation in the reserve requirement based on analysis of actual resource behavior. PJM has not shown there is a need for larger market design changes, he argued, and its proposal appears designed to increase energy market revenues while failing to fully reflect energy and ancillary market revenues in the capacity market.

“We do not believe that PJM has supported its proposals with analysis, and we do not agree that it’s appropriate to use the demand curve for reserves to increase energy market revenues,” he said. “PJM has not demonstrated the existence of an ‘energy gap’ despite multiple different approaches, and PJM has not demonstrated the need for making the reserve markets more complicated.”

Devendra Canchi, a senior analyst with the Monitor, said PJM’s proposal would go too far and increase costs with no corresponding benefit. He presented part of the Monitor’s proposal during the RCSTF’s meeting Jan. 28.

Some stakeholders argued that the Monitor's position is not backed by analysis and that PJM could save costs by modeling reserves in SCED and accounting for them in transmission constraints.

Bowring responded that the proposal is fully supported and that any nodal distribution scheme would be arbitrary.

“No one knows where the next generation trip or forecast error will occur. PJM’s proposal would increase market costs by arbitrarily redispatching expensive resources with no defined benefit,” Bowring told RTO Insider.

Deputy Monitor Catherine Tyler said adding constraints would increase ratepayer costs, and any assumptions PJM makes about when supply is going to be lost run the risk of being inaccurate. While it’s important to ensure that reserves are deliverable, PJM’s proposal would not accomplish that, she argued.

NERC Staff Outline Growing LTRA Challenges

SAVANNAH, Ga. — NERC Director of Reliability Assessment and System Analysis John Moura acknowledged recent criticism of NERC’s Long-Term Reliability Assessment but emphasized the report is “not a prediction of outages [or] a forecast of impending blackouts.”

Speaking at the ERO’s quarterly technical session Feb. 12, he also discussed some of the ways the ERO’s approach to the assessment is changing in light of ongoing changes to the electric grid.

Moura’s remarks came the same week as his appearance, along with NERC CEO Jim Robb, at the Feb. 9 board meeting of the Organization of MISO States to discuss objections by OMS members to the ERO’s 2025 LTRA, which rated five assessment regions, including MISO, as having the potential to develop into high risk between 2026 and 2030.

The designation means that the resources planned as of July 2025 would leave those regions with energy shortfalls exceeding resource adequacy targets or baseline criteria for unserved energy or loss of load.

OMS members disputed this label in a letter arguing that NERC should have counted resources in MISO’s fast-track interconnection queue. The organization argued that 11 GW of natural gas generation and battery storage projects proposed to come online by mid-2028 should make up for the 7-GW shortfall projected in the LTRA. (See MISO States Dispute ‘High Risk’ Designation from NERC.)

It was the second time in as many years that MISO objected to NERC’s assessment of the region’s risk. In 2025 MISO’s Independent Market Monitor pointed out that NERC had committed an error in rating the region as high-risk in the previous year’s LTRA. After a back-and-forth, NERC agreed to downgrade MISO to “elevated” risk.

At the technical session, Moura explained — as he did at the OMS board meeting — that NERC’s deadline for including energy projects passed in mid-July 2025 and MISO’s fast-track projects “were only approved in September, [so] were unable to be included in the energy analysis.” (See NERC to OMS: Long-term Assessment not a Predictor of Risk.) He added that the resources might not have been included in the assessment anyway because of changes to how NERC assesses the impact of generation additions.

“Counting capacity is really a thing of the past. Energy adequacy cannot be determined by counting capacity,” Moura said. “These projects weren’t left out because we missed them. The issue is really modelability, deliverability and certainty.”

Moura explained that modelability refers to whether any constraints could apply to a resource — for example, if a gas generator is coming online without firm gas arrangements — that could make it “difficult or impossible for that new generator to get gas on peak winter days.”

Deliverability means the ability of electricity generated in one area to reach load centers in other areas, which Moura observed could be an issue for the 7 GW of fast-track resources “south of the well-known MISO South to MISO North interconnection constraint.”

Finally, Moura defined certainty as the assurance that projected resources will be available as expected, an important question “given the supply chain challenges … in our assessments showing delays of up to 50% of planned generation not meeting their in-service dates.”

Avoiding Panic

Along with the difficulty assessing new generation, NERC Trustee Ken DeFontes observed that in previous LTRAs, NERC staff have also had problems predicting generation retirements because they “tended to only count retirements that had been announced [when] the reality was … it was likely more were coming.” He asked how the team behind the 2025 assessment had accounted for the issue.

Mark Olson, NERC’s manager of reliability assessments, replied that the ERO aims to account for both the “certain level of retirements in terms of being announced and in the process to be deactivated, versus a more speculative industry projection of potential retirement.” Moura added that NERC also must consider retirements as “a dial that can be turned … where states and other policy makers can take action to keep units online [or] have the ability to throttle those resources.”

Karen Onaran, CEO of the Electricity Consumers Resource Council, reminded attendees of the growing influence of NERC’s reliability assessments on lawmakers and policy makers, and warned that without proper context the reports could be “used as ammunition [to] saddle consumers with costs that may not be necessary.”

“At least within the last couple of years, when these reports come out, [they raise] some panic, and [hurt] some feelings. And we take it, and we try to take the lessons learned and figure it out,” Onaran said. She suggested that NERC’s Engagement and Outreach Committee, launched in December 2025 to handle external communications, work on a plan for conveying “what this report really says … and what risks it’s identifying … to all levels of government and to the industry.”

Navigating Extreme Winter Storms: A System-of-systems Perspective

In late January, the mass of cold air typically held at bay by the high-level winds 10 to 30 miles above the North Pole staged a jailbreak. The polar vortex breaks down every so often when sudden stratospheric warming occurs (that’s what happened during 2021’s devastating Winter Storm Uri), allowing fugitive cold air to spill southward and bring extremely cold temperatures in its wake. Collision with low-pressure systems can create a volatile cocktail of ice and snow.

That’s exactly what happened when the much-anticipated Winter Storm Fern swept across the country at the end of January. The southeastern U.S. suffered the brunt of physical damage, as high-level warmth brought rain that froze as it encountered cold air close to the ground, coating power lines with ice, breaking equipment and leaving approximately 1 million people without power.

Peter Kelly-Detwiler

Most damage came from ice on distribution networks, with Entergy estimating 860 poles and 60 substations out of service. Some larger transmission lines also were affected. Entergy reported 30 transmission lines out, and the Tennessee Valley Authority also saw as many as two dozen high-voltage transmission lines affected.

While thousands of customers went without power for many days, and the damage to the distribution system was serious, the bulk power system in the southeastern U.S. generally held its ground. The same was true in other regions of the country, which got mostly snow but saw extreme cold prevail from Texas to New England.

This stands in stark contrast with winter storms Uri (February 2021 — with its extended system-wide outages in Texas) and Elliott (December 2022 — with outages in TVA and Duke service territories, while PJM barely squeaked by).

Thermal Plants, the Winter Workhorse

While prices spiked across multiple markets, the grid remained intact. There were several major reasons for that, with the underlying factors varying by region, but thermal plant reliability was a key theme. Gas generation was a critical player, but coal and oil also filled the gaps to meet surging demand. A review of several grids illustrates the point.

    • ISO-NE: New England saw dual-fuel plants switch over from gas to oil, burning through about half of the region’s stored oil reserves in late January and early February, with oil-fired generation surpassing gas for a couple of days. The grid operator requested a Department of Energy waiver to avoid emissions penalties.
    • PJM: The Mid-Atlantic grid operator commented that during the “strongest sustained cold period that the PJM system has experienced since the 1990s” it saw an average 18 to 19 GW of outages (compared with an expected 15.9 GW), with plant equipment failure as the greatest cause. Tight gas supplies also were a concern. In response, PJM called upon 5.2 GW of oil-fired capacity “that would otherwise have been restricted,” aided by a DOE order waiving emissions restrictions.
    • MISO: MISO ran coal generation more heavily than normal, including three of the five coal plants whose retirements were delayed by the Trump administration; those units delivered a total of 965 MW during much of the period from Jan. 21 to Feb. 1.
    • ERCOT: ERCOT also made it through, with its improved weatherization and inspection programs of power plants and transmission facilities reducing generation outages. Increased reserves, more flexible operations (including increased dual-fuel capabilities) and a growing deployment of batteries also helped.

We learned, once again, that our nation’s power grids rely on a significant fossil mix when the weather turns nasty. Coal-fired generation soared across the lower 48 states during the week ending Jan. 25, up 31% from the prior week and representing 21% of power generation, while gas stood at 38% and nuclear at 18%.


We also learned that events on the grid are increasingly ripe for being politicized. Less than two weeks after the storm had passed, DOE issued a fact sheet declaring that “Beautiful, clean coal was the MVP of the huge cold snap we’re in right now,” and decrying the actions of the previous administration’s “energy subtraction policies which threatened America’s grid reliability and affordability.”

It seems that such politics are sadly unavoidable these days. But let’s try and remove politics from the conversation and focus on the facts. It’s not controversial to state that during the coldest days that occur every so often, fossil thermal generation — whether oil, coal or gas — is extremely valuable in keeping the lights on. Renewables and storage can pitch in, but they are a long way away from being able to handle that task.

As an example, when this article was drafted (Feb. 13), renewables made up 10% of the generation mix in ISO-NE at 3:30 p.m. In addition, on this clear and sunny day, rooftop solar cut peak demand by about 5,000 MW, with the duck curve exerting its influence on net demand. So, a combination of utility-scale and on-site renewables can generate energy and cut the use of fossil fuels.

However, that solar doesn’t address the evening peak, and it doesn’t help after heavy snow. The day after Fern departed New England, nearly all the panels in the region were blanketed with snow and the duck was hibernating, its influence on net demand nowhere to be seen. As a dependable resource that can provide both capacity and needed energy, neither variable wind nor solar checks the box.

Batteries can help meet peaks and address this issue of renewable energy droughts, if those storage assets can be fed by renewables, and renewable energy shortfalls are of relatively limited duration. That equation may change if we eventually get the long-duration, 100-hour batteries promised by start-up companies such as Noon and Form Energy, and those storage resources are deployed in enormous quantities at affordable prices. But we’re not there yet.

Addressing the Demand-side Thermal Issue

At the same time, much of the peak demand that occurs during extreme cold or hot spells could be greatly mitigated if we started to more accurately frame those peaks as a thermal problem, stemming from the need to heat or cool our built spaces. The better we insulate those spaces, the less volatility we would see in resulting energy demand.

EPA reports that homes can save an average of 15% on heating and cooling costs by employing a variety of insulation technologies. This need not be a herculean task, and insulation is effective. For example, upgrading U.S. homes to a 2009 building code could keep residences above 40 degrees Fahrenheit for nearly two days in sub-zero temperatures.

Maintaining a reliable and cost-effective grid is not, and never has been, a strictly supply-side issue. Rather, the power grid’s various supply and transmission technologies, combined with demand-side technologies, comprise a massive system of systems that can best be made economical, reliable and resilient if it is viewed and addressed as such. But such an approach requires sophisticated thinking that defies the simplistic and easy answers that many politicians and some analysts proffer.

Climate is an Even More Complex System of Systems

Perhaps counterintuitively, the polar vortex breaks down when it experiences spikes in the stratospheric temperatures, known as sudden stratospheric warmings. When those breakdowns occur, some areas of warmer air pour into the Arctic while lobes of polar air flee southwards. Nobody fully understands the dynamics behind this, but models suggest that climate change may be a driving force. If so, then we can add that to the lengthy list of other climate-related issues that justify cutting carbon emissions from our energy systems.


To pretend that we can oversimplify either the power grid or the impacts of human activity on the earth’s climate is a mistake. Each of these complex systems — and their interactions with each other — deserves far more scrutiny and understanding than most of us are willing to devote. We need more data and information, not less, and to entertain more nuanced conversations as well.

Around the Corner columnist Peter Kelly-Detwiler of NorthBridge Energy Partners is an industry expert in the complex interaction between power markets and evolving technologies on both sides of the meter.

N.Y. Cancels Solicitation but Remains Committed to OSW

After 19 months, New York has abandoned its most recent attempt to procure offshore wind power, saying it would not be prudent to proceed amid federal policy uncertainty.

The decision is only the latest setback for a state that, despite multiple cancellations and cost escalations, has the largest offshore wind pipeline in the nation.

New York presented it as a delay to, not the end of, its offshore wind ambitions.

In the same week that it canceled the procurement, New York issued a request for information on ways to keep the industry from further atrophy; told an industry conference that offshore wind remains an important part of its energy strategy; and approved the structure of the offshore wind renewable energy credit (OREC) system that subsidizes increasingly expensive construction.

New York launched its fifth offshore wind solicitation July 17, 2024. It attracted 25 proposals totaling 6,870 MW from four bidders — Attentive Energy, Community Offshore Wind, Ørsted and Vineyard Offshore.

Attentive withdrew its proposals in October 2024, and Ørsted in August 2025.

Attentive partner TotalEnergies and Community partner RWE said in November 2024 and April 2025, respectively, that they would put their wind energy development efforts in U.S. waters on hiatus because of the political uncertainty surrounding offshore wind.

And of course, Donald Trump was re-elected president in November 2024 with a promise to block offshore wind development and has been trying to follow through for 13 months.

Given all this, the New York State Energy Research and Development Authority (NYSERDA) announced Feb. 13 that it was canceling the fifth solicitation.

A spokesperson said: “Federal actions disrupted the market and instilled significant uncertainty into offshore wind project development. Given the current level of uncertainty, it would not be prudent to enter into new long-term OREC purchase and sale agreements at this time, and as such, NYSERDA has concluded ORECRFP24-1 without award.”

With that, four out of five of New York’s solicitations are now dead ends.

The two contracts awarded in the 2018 solicitation (to Empire Wind 1 and Sunrise Wind) were canceled because cost escalations rendered the contracts unprofitable. The two contracts totaling 2.6 GW awarded in the 2020 solicitation were canceled for the same reason. And the three projects totaling 4 GW chosen in the 2022 solicitation were rendered untenable when General Electric halted development of the specified turbine.

Only the 2023 solicitation — a rush effort to salvage the state’s imploding offshore wind portfolio — has yielded steel in the water: Empire Wind 1 and Sunrise Wind (1.73 GW combined) are now under construction, at much greater cost than first agreed on.

But that is more than other states with offshore wind aspirations can say.

With the 132-MW South Fork Wind (which was completed in 2024 outside the NYSERDA procurement structure), New York now has three wind farms spinning or being built off its coast.

No other state has more than one, and most have none.

Speaking to Oceantic Network’s IPF 2026 on Feb. 10 in New York City, NYSERDA President Doreen Harris reiterated the state’s commitment to offshore wind despite Trump’s persistent efforts to destroy the sector, including one stop-work order against Sunrise and two against Empire. (See U.S. Offshore Wind Supporters Map Path Forward.)

It is an important part of New York’s strategy to meet rising power demand, she said: “To be clear, offshore wind remains a central part of how we get from here to there on the order of 7 GW of incremental capacity between now and 2040.”

That is a telling detail.

New York’s official offshore wind goal, established by its landmark 2019 climate law and specified on NYSERDA’s own website, has been 9 GW by 2035.

‘Meaningful Step’

So New York is pushing the timeline back and potentially changing the path but not abandoning the effort.

NYSERDA on Feb. 10 issued a request for information (RFI) seeking industry input on a potential predevelopment support program by which the state would enable the private sector to “continue investing responsibly in their lease areas to advance project development during a period of federal uncertainty, so that projects are well positioned to move forward efficiently when federal conditions become more favorable.”

One potential approach for this could be co-investment by the state, Harris said.

Also looking forward, NYSERDA on Oct. 2, 2025, proposed an offshore wind implementation plan that among other things structures the OREC system to reduce impacts on electric utility ratepayers, including through sales to voluntary third parties.

The Public Service Commission approved the plan at its Feb. 12 meeting (case 15-E-0302). PSC Chair Rory Christian said in a news release: “The commission acted to ensure the orderly management of the OSW program and corresponding sale of OSW Renewable Energy Certificates (ORECs) by NYSERDA when the program becomes operational. Our decision today will benefit residential and commercial customers by ensuring that ratepayer costs related to offshore wind development are reduced.”

The New York Offshore Wind Alliance issued a statement supportive of the state’s three policy moves.

“We understand NYSERDA’s decision to close the 2024 offshore wind solicitation without awards was because the original proposals were based on a completely different federal landscape,” said Alicia Gene Artessa, director of the industry group.

“We are strongly supportive of NYSERDA’s recent RFI exploring a predevelopment model for offshore wind solicitation. We believe that fundamentally changing how New York procures offshore wind energy is the right path forward while we adapt to the current federal instability.”

And she said: “We are also encouraged to see that the PSC approved NYSERDA’s Offshore Wind Implementation Plan yesterday. This is a meaningful step from the PSC to allow for more flexibility in the sale of ORECs and ensure our current under-construction projects continue to be managed effectively.”

N.J. Targets Data Centers in New Source Push

New Jersey legislators have advanced a bill that would protect ratepayers from rate hikes triggered by data center development as the state looks for ways to add generation capacity, boost its infrastructure and curb energy use.

A Senate and an Assembly committee each backed S731. It would require the New Jersey Board of Public Utilities to develop a tariff to set special rates for “large load data centers” with a maximum monthly demand of at least 100 MW.

Also moving ahead were unrelated bills to require data centers to report their water and electricity use, to boost solar development and to study how advanced transmission technologies can help the state. The BPU also announced rates would be flat in the next period, which starts June 1.

The tariff set up by the BPU under S731 would shelter ratepayers from rate increases stemming from “increased electricity demand caused by large load data centers.” The tariff also should “incentivize large load data centers to develop and utilize methods to increase energy efficiency, including through the use of technologies that capture and utilize the heat produced by the large load data center.”


To make sure investments in the state yield their full potential benefits over time, the legislation requires the state’s four utilities to ensure data centers provide financial guarantees they will “take at least 85% of service they request for a period of not less than 10 years.” Data centers must show the proposed project is “unique and not duplicative of any other large load data center project” in or out of state.

The Senate Economic Growth Committee backed the bill 3-1. The Assembly Telecommunications and Utilities Committee backed a version 9-0 after testimony that showed vigorous support for the legislation.

Data Center Reporting Requirements

Preparing for the predicted dramatic increase in data centers and ensuring they pay for themselves without overly burdening ratepayers is central to the state’s efforts.

Analysts say one cause for the predicted energy shortfall is that the state, like others in PJM, has closed aging, fossil-fueled resources more rapidly than new, mainly clean energy sources have come online.

In a separate vote that also focused on data centers, the Senate Environment and Energy Committee backed S3379, which would require data center owners or operators to submit a water and energy use report to the BPU every six months. The BPU would publish the information, according to the bill, which passed without comment.

The data center tariff, while drawing mainly supportive testimony, demonstrated the complexity of the issue. Some legislators cautioned that the state should avoid creating obstacles that could deter data centers from locating there, but most speakers focused on ratepayer benefits.

“Ultimately, this is about cost fairness for the people of New Jersey,” said Assemblyman David Bailey Jr. (D), a bill sponsor and committee member. “This is about us looking out for our constituents and their best interests and the overall health of New Jersey electrical grid.”

Zach Landesini, a resident of Vineland, N.J., said he sees the need for the legislation in his community’s experience with the development of a 700,000-square-foot data center by Data One, whose website says it will use 350 MW. Landesini said he envisions scenarios that could emerge and that the bill could address.

He said the project will draw 15% of its power from the local grid, and added: “What would happen if Data One needed to draw a larger portion of their power from the local grid?”

“How would this affect local ratepayers?” he asked. “This could happen for a variety of reasons, including technical deficiencies in power generation, and site expansion.”

Supply Side Pressure

Brian O. Lipman, director of the New Jersey Division of Rate Counsel, also backed the bill but acknowledged that it couldn’t shield ratepayers from all the cost increases associated with a data center. One committee member asked him how the state could calculate the increased cost of power when rates rise because a data center pulls a large volume of power from the grid, effectively reducing the supply for everyone else.

“On the supply side, there’s not a lot we here can do, other than if we want to build generation somehow to add more supply,” he said. “What we can do is, and what we are doing is, we’re pressuring PJM: First of all, don’t sign up a data center if you don’t have the power to serve them.”

He noted that an alternative, outlined in a separate bill, would require new data centers to “bring their own generation.” But if New Jersey passed such a law and other states in PJM do not, data center developers would simply build outside of New Jersey, which nevertheless would bear that cost through its participation in the RTO, he said.

Harnessing New Technology

The debate over how the state addresses the looming energy shortfall, and the added pressure on generation and grid systems from data centers, stepped up in earnest in June 2025, when a 20% rate hike on the average electricity bill took effect.

Gov. Mikie Sherrill (D), who took office in January, pledged in her election campaign to freeze rates. Her first executive orders upon taking office laid out a range of measures designed to do so and boost state generating capacity, in part by accelerating solar development. (See New N.J. Governor Rapidly Confronts Electricity Crisis.)

In that vein, the Assembly Telecommunications and Utilities Committee moved ahead a bill, A3969, that would extend the state’s current goal of incentivizing 3,750 MW of solar power by 2026 to one of incentivizing 750 MW per year through 2035.

The Senate Economic Growth Committee backed a bill that seeks to prepare the state’s infrastructure for future stress by using advanced transmission technologies (ATT). S2189 would require the BPU to evaluate the “attributes, functions, costs and benefits of ATT” and look at whether it could “enable an electric public utility to provide safe, reliable and affordable electricity to its customers, considering existing and planned transmission infrastructure and projected demand growth.”

The bill defines ATT as any software or hardware technology that increases the capacity, efficiency, reliability or safety of an existing or new electric transmission facility, including grid-enhancing technology and advanced or high-performance conductors.

Rate Hikes Temporarily Avoided

BPU President Christine Guhl-Sadovy announced the results of the state Basic Generation Service auction conducted in early February. The results, which largely are shaped by the PJM capacity auction, will mean the average electricity bill stays roughly the same when the new rates take effect June 1.

The minimal increase, or slight decline for some ratepayers depending on their utility, is due largely to a “collar” that PJM, facing lawsuits filed by New Jersey and other states, agreed to place on its prices to limit their increase. Sherrill has advocated for an extension of the collar, Guhl-Sadovy said.

“I think we would anticipate there to be a higher price if we don’t have a collar,” she said. “Because we have significant load growth and so we need to get more generation and more capacity through things like demand response in order to meet that kind of load growth.”

She acknowledged the collar is a temporary measure and the trajectory of future rates is primarily in the hands of PJM. And she noted that Sherrill outlined a series of initiatives to hold down rates and increase generation capacity in her first two executive orders. They included boosting solar and battery storage power and creating a virtual power plant strategy.

“Those things will not have an overnight impact on capacity prices, but they will certainly put downward pressure on capacity prices, because we will have more generation bidding into the capacity market,” she said.

ERCOT Taps the Brakes on Batch Study Process

With the 232 GW of large loads seeking to interconnect with the ERCOT system having “clearly broken the process that we had,” CEO Pablo Vegas said the grid operator’s proposed batch process — or cluster studies, in most managed grids — will provide clarity and transparency to data center developers.

He told his Board of Directors during its Feb. 9-10 meeting that the batch studies will change a process that is “very different than what we’ve had before” by reserving capacity on the transmission system for the large loads’ future use.

“Today, that is not the way it works. It doesn’t work here, and it doesn’t work that way anywhere in any grid in the U.S.,” Vegas told the board. “What we found is that the processes that we had set up were really designed for a system where we were seeing interconnections in eight to 15 a quarter, to where we’re now seeing 80 to 100 in that same time period.”

ERCOT staff have proposed grouping together large load requests to be evaluated, rather than relying on the current individual studies that transmission service providers conduct. The studies will determine the amount of requested load that can be reliably served each year over a six-year period and the transmission upgrades needed to accommodate the full load requested. (See ERCOT Finds Stakeholder Support for Batch Process for Large Loads.)

The grid operator plans to begin with a “Batch Zero” study to transition from the current process. It will file a nodal protocol revision request (NPRR) codifying the new process and bring it to the board for approval in June, followed by the Texas Public Utility Commission. If all goes as planned, Batch Zero will begin by late summer.

ERCOT originally planned to request a good-cause exemption from the PUC to begin the Batch Zero study in February. However, after a Feb. 3 workshop with stakeholders left many “open questions to be decided,” PUC Chair Thomas Gleeson said, the commission directed the grid operator to tap the brakes on its effort.

“A top-down, ERCOT-driven process where there isn’t a lot of input from stakeholders is really not the way to do this,” Gleeson said Feb. 11 during an industry conference in Round Rock, Texas. “I do believe that we’ll end up with a better outcome from getting all that information on the front end rather than being kind of centrally controlled by ERCOT and the PUC.”

The grid operator will continue to gather stakeholder input through the Large Load Working Group, stakeholder workshops and the stakeholder-led Technical Advisory Committee.

Batch Zero will set the foundation for the subsequent studies, which will take place every six months. Another NPRR codifying the ongoing batch process will be filed and brought to the board in September.

“We heard the [PUC’s] message loud and clear. We need to keep the pace going on,” Vegas said. “This work continues to be an important part of supporting the economic growth that’s coming. We need to ensure that we have a more robust and a very scrutable process that’s going to benefit all stakeholders, but we can’t do that at the expense of expediency.”

Board Approves Tier 1 Project

The board approved a Tier 1 project previously endorsed by TAC, a $117.4 million, 138-kV South Texas Electric Cooperative transmission build. The project will accommodate a 300-MW ammonia plant near Victoria on the Texas Gulf Coast.

The project is expected to be in service in June 2028.

The board also elected Vegas as CEO and ratified ERCOT’s officers; confirmed TAC’s 2026 leadership; and signed off on the updated Market Credit Risk and Corporate Standard, which removes references to the Reliability and Markets Committee after it was dissolved in 2025.

The consent agenda included three protocol revisions and two changes to the Planning Guide:

    • NPRR1304: incorporate the Other Binding Document “Procedure for Identifying Resource Nodes” into the protocols to standardize the approval process.
    • NPRR1305: add the Other Binding Document “Counter-Party Credit Application Form” into the protocols to standardize the approval process.
    • NPRR1311: correct an error in the calculation of real-time reliability deployment price adders for ancillary services when ERCOT is directing firm load shed during a Level 3 energy emergency alert under the RTC+B protocols, ensuring final ancillary services prices cannot exceed $5,000/MWh.
    • PGRR127: outline the additional generators that may be added to the planning models to address the generation shortfall introduced by the implementation of House Bill 5066’s requirements and increased load growth. The PGRR would also add a supplemental generation sensitivity analysis for Tier 1 Regional Planning Group project evaluations to minimize the effects of the additional generation on transmission project evaluations.
    • PGRR132: clarify that new resources must interconnect to ERCOT through a new standard generation interconnection agreement.