March 29, 2026

IESO Chief Seeks Improved ‘Alignment’ with Electric Distributors

TORONTO — Local distribution companies and bulk transmission system operators need to improve their alignment as LDCs transition from passive roles overseeing poles and wires, IESO CEO Lesley Gallinger told attendees of the Ontario Electricity Distributors Association’s ENERCOM 2026 conference March 23.

“I think the role that LDCs are taking on is becoming much more pivotal to future reliability and affordability conversations,” Gallinger said during a Q&A session with Elexicon CEO Amanda Klein. “The LDCs are front-running the adoption of emerging technologies like electric vehicles and heat pumps, and that’s useful information [for] the bulk system. And the work that LDCs have done to integrate [distributed energy resources] through their distribution systems is also … helpful from a technical perspective.”

LDCs and IESO need to improve their alignment on regional planning, forecast assumptions and operational practices, Gallinger said, citing a need for real-time distribution data.

“You have never been asked to do more than today,” Minister of Energy and Mines Stephen Lecce told the distributors in a lunchtime speech. “There has never been more constraints and pressure on the electricity system. You’re doing something right. You guys work together. You’re thinking ahead. You’re de-risking. You’re collaborating. You’re trying something new. You’re being bold. You’re challenging the status quo.”

Ontario and the federal government are making big bets on nuclear power, pledging to build 16,000 MW of new generation, including four small modular nuclear reactors and up to 4,800 MW of additional capacity at the Bruce Nuclear Generating Station. (See Ontario Integrated Energy Plan Boosts Gas, Nukes.)

Energy ‘Quadrilemma’

The passage of Bill 40, which made economic development part of the mission for IESO and the Ontario Energy Board (OEB), has turned the traditional energy trilemma — the balance of reliability, affordability and sustainability — into a “quadrilemma,” Gallinger said.

“The challenge lies in the fact that all four dimensions are interconnected. If you prioritize [an] outcome for one of the dimensions, you lose perhaps something on one of the other dimensions,” she said.

“Economic policy and energy policy are now inextricably linked. And so that leaves us kind of a narrow band for error,” she said. “We’re moving now to continuously and proactively plan. So rather than the five-year ‘set it and forget it’ model, we’re continuing to intake new information and … iterate on those plans. And that will allow us to stay adaptable and responsive to those evolving circumstances and hopefully help us get the quadrilemma equation right.”

Chris Benedetti (left) and Mark Olsheski, both with Sussex Strategy | © RTO Insider

Mark Olsheski, vice president of energy at Sussex Strategy, said increasing distribution-based generation will be crucial to navigating the next decade, before new nuclear capacity goes into service.

“We have well over 2,000 MW of embedded solar in Ontario, most of which is coming to the end of its contract. … There’s right now not … a clear plan for how that solar gets renewed,” he said during a panel discussion.

“This is just what’s already on rooftops and in fields at the distribution level. But I think that we need to deploy significant resources currently not [planned] within this window, that don’t involve big procurements for large gas or storage assets. The greatest opportunity to do that … is going to be at the distribution level.”

He cited the importance of the OEB’s Centralized Capacity Information Map, released in January, which provides data for both load and DER connections. “There are big swaths of the province that are pretty red — like you can’t really plug in a toaster oven without something blowing up. So certainly, there’s a lot of work that needs to be” done.

U.S.-Canada Trade Relations Dominate Distributors’ Conference

TORONTO — Canada should expect turbulent relations with the U.S. to continue under the Trump administration, speakers said at the Ontario Electricity Distributors Association’s ENERCOM 2026 conference.

In 2020, President Donald Trump hailed the United States-Mexico-Canada Agreement, which replaced the North American Free Trade Agreement, as the “largest, fairest, most balanced” trade deal in history. But as the agreement comes up for review in July, its renewal is anything but certain.

With “President Trump, anything is possible,” former diplomat Gitane De Silva said at the conference March 23. “I think we have to prepare for any outcome.”

Gitane De Silva, former CEO of the Canada Energy Regulator | © RTO Insider

The most likely outcome is that Canada and the U.S. will fail to extend the agreement, triggering annual reviews until a new pact can be reached, said De Silva, former CEO of the Canada Energy Regulator, which oversees international and interprovincial pipelines and electric transmission.

“It’s not to President Trump’s perceived advantage to get to ‘yes,’ because he likes the chaos. He likes uncertainty. He feels that makes him more powerful,” De Silva said. “So, I think we just have to become accustomed to the fact that that trading relationship is not going to be as stable as it was” before Trump.

Jeff Rubin, former chief economist of CIBC World Markets, was more blunt about prospects for the agreement, known as CUSMA in Canada.

“Prime Minister [Mark] Carney pretends that other than a few little tweaks, CUSMA will be renewed. CUSMA is a dead man walking. There is a 0% chance of President Trump renewing CUSMA,” Rubin said. “Whatever bilateral agreement [that is developed] will involve U.S. tariffs on Canadian exports.”

Already, Rubin said, car manufacturers are relocating operations to the U.S.

Rubin said it was President Joe Biden who began undermining the World Trade Organization and the global trading system with his “pervasive use of sanctions and tariffs.”

“But whereas Biden’s policy of friendshoring targeted America’s enemies — China and Russia — Trump’s policy of reciprocal tariffs makes no distinction between friend and foe. And, as Canadians have discovered to their horror, Canada is as much a target of economic warfare as is America’s enemies … perhaps in some sense even more so.”

De Silva said Canada will have some leverage in the negotiations because of the U.S. need for Canada’s heavy oil and fertilizers.

Jeff Rubin, former chief economist of CIBC World Markets | © RTO Insider

But Rubin said Canada — with the third-largest oil reserves and fifth-largest natural gas reserves in the world — has failed to maximize its resources.

“Prime Minister Carney often refers to Canada as an energy superpower. … But what Prime Minister Carney does not seem to recognize is it takes more than natural endowments to be an energy superpower,” he said. “Real energy superpowers like the United States and Russia do whatever is necessary to get the energy out of the ground and get it to markets that value it the most.

“While Canada’s geology has bestowed upon it considerable resources, the country has consistently lacked the political will to develop. That is why Canada’s oil production is less than half of Russia and Saudi Arabia and a third of America, and what Canada does produce is way below prices that other oil exporters get for their product.”

Alberta Tie Line, Secession Vote

De Silva said that in addition to considering dairy, poultry, eggs, automobiles and lumber trade, U.S. negotiators may also seek to resolve a dispute over the Montana-Alberta Tie Line.

Stephen Lecce, Ontario minister of Energy and Mines | © RTO Insider 

In 2024, Berkshire Hathaway Energy (BHE) Canada, owner of the intertie, filed a complaint with the Alberta Utilities Commission alleging that the Alberta Electric System Operator’s restriction of imports was discriminatory and jeopardizing renewable power investment in Montana. U.S. Trade Representative Jamieson Greer raised the issue with the Senate Finance Committee during a presentation on CUSMA in December.

Alberta has denied discriminating against the U.S., saying it is merely managing grid congestion and protecting reliability. It says BHE’s complaint is an effort to increase its earnings from the merchant intertie facility.

De Silva noted that Montana Gov. Greg Gianforte is a close ally of Trump. “Given that, I think the potential is that it rises higher on the list of irritants than something of this magnitude normally would,” she said.

Rubin said he expects Trump to attempt to influence an October referendum on whether Alberta should leave Canada. “I’m sure he’s prepared to offer Alberta statehood and throw in the Keystone XL pipeline as a sweetener,” Rubin said.

Benefits of Gridlock

Democrats could seek to restrain Trump if they win back at least one house of Congress in this year’s midterm elections, De Silva said.

“It’s actually an advantage for Canada to see that power be split,” she said. “So, it will become more dysfunctional for Americans … but in a way, that gridlock — dragging the puck — might be beneficial at this point in time.”

The long-term fate of U.S.-Canada relations will depend on who succeeds Trump as president, she said.

“The studies will show you that when you break trust, it takes at least twice as long to build it back than it did the first time,” she said. “We don’t have to like the Americans, but we’re going to be neighbors forever.”

‘Island of Stability’

Stephen Lecce, Ontario’s minister of Energy and Mines, highlighted Canadian leaders’ cooperative response to U.S. pressures, saying he wants Carney to succeed even though they are in different parties.

“This country is an island of stability in a sea of chaos,” Lecce said. “We’re working with the federal government in good faith on these matters, because in this moment, frankly, we’re on the same team. … You don’t hear [that] when I’m traveling the world. I will tell you, many of these subnational and national governments of different parties, they’re not on the same page. … That’s a Canadian value and something I’m proud of.”

As Public Data Shrinks, Private Climate Models Will Shape the Grid’s Future

Not long ago, the electric grid ran on a shared set of facts: weather data, flood maps and long-term climate projections from federal agencies. While imperfect, the data was broadly accessible and widely understood. Utilities, RTOs, regulators and developers might interpret the data differently, but they were at least starting from the same baseline.

At the very moment grid operators are being asked to plan for unprecedented complexity — explosive load growth from data centers, electrification of buildings and transport, and a rising cadence of climate-driven extreme events — the public data infrastructure that underpins those decisions is becoming less reliable, less complete and in some cases less available.

If this had happened a decade or two ago, it could have blinded the whole industry. Fortunately, a rapidly expanding ecosystem of private data platforms, proprietary climate models and AI-driven simulation tools is ready, willing and more than eager to fill that gap. And if government data integrity is threatened, the future of grid planning increasingly will be built not on shared public datasets but on licensed, and probably opaque, models.

This is a shift in governance as much as it is a shift in technology. And for a system as interconnected and reliability sensitive as the power grid, it raises a question: What happens when the “ground truth” of grid planning no longer is public?

Public Data Infrastructure Is Eroding

Years ago, I toured the National Center for Atmospheric Research (NCAR), a brutalist I.M. Pei structure in Boulder, Colo., and an architecture-and-climate-science nerd’s dream day trip. On display was the room-sized Cray-1, one of the first supercomputers, highlighting the data-intensive nature of weather prediction.

Dej Knuckey

Funded by the National Science Foundation, NCAR was one of the leading scientific organizations creating the analytical methodologies to track the changing planet. It, along with agencies like NOAA, FEMA, USGS and EPA, has provided the baseline datasets that inform many aspects of the grid. These datasets no longer are produced by the Cray-1 (the phone in your pocket is now orders of magnitude faster), and they are not perfect — but they are standardized and transparent.

Budget uncertainty, shifting political priorities and institutional constraints have all contributed to growing concern about the durability of federal climate and environmental data programs. In one of many moves to cut government research groups, the administration announced in December 2025 it would dismantle NCAR, citing it as a source of climate alarmism. While the nonprofit that manages NCAR is challenging the action, the U.S. electric system has to prepare for a day when it cannot rely on federal weather data for grid operations.

Even where data remains available, update cycles are slowing, and agencies struggle to keep pace with rapidly changing conditions. FEMA flood maps, for example, lag actual risk, while wildfire and heat risk datasets often are fragmented across agencies and jurisdictions.

For grid operators, this matters in very practical ways. Transmission planning needs consistent weather baselines, resource adequacy assessments require shared assumptions about temperature extremes and demand patterns, and emergency planning should draw from common risk maps.

The grid has always relied on a shared view of reality. As those baselines degrade — or diverge — the system risks losing coherence.

The Rise of Private Climate and Infrastructure Intelligence

Into this gap has stepped a new class of private-sector players, offering not just data but fully integrated predictive intelligence.

SPP’s Felek Abbas | SPP

One example is NVIDIA’s Earth-2 platform, a high-resolution digital twin of the entire planet. Using AI, Earth-2 aims to simulate weather and climate with a level of granularity far beyond that of traditional models. The promise is transformative: hyper-local forecasts of extreme weather, infrastructure-level risk assessments and scenario modeling that could reshape how utilities plan and operate.

Felek Abbas, senior vice president, chief technology and security officer at SPP, told RTO Insider that SPP is looking at the potential the Earth-2 platform offers.

SPP expects the improved resolution and an additional week of detailed forecasting will help it in the day-ahead markets and improve its outage planning.

“If you’re expecting the load to be at a manageable place, you’re able to take an outage, but if you’re expecting the load to be really high, that’s not a time that you can afford to have an outage,” Abbas said. “More accuracy in that space means more reliability, and that’s what we’re after.”
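Abbas’ point reduces to a simple screening rule. The sketch below is purely illustrative (the function, threshold and reserve margin are invented for this column, not SPP practice): an outage is schedulable only if forecast peak load, plus a safety margin, still fits under the transfer capability that remains with the element out of service.

```python
# Illustrative sketch, not SPP's actual method: can a transmission
# outage be scheduled, given a forecast of peak load? All numbers
# below are hypothetical.

def can_schedule_outage(forecast_peak_mw: float,
                        transfer_capability_mw: float,
                        reserve_margin: float = 0.15) -> bool:
    """Allow the outage only if forecast peak load, inflated by a
    reserve margin, stays within the transfer capability remaining
    while the element is out of service."""
    return forecast_peak_mw * (1 + reserve_margin) <= transfer_capability_mw

# A mild-weather week leaves headroom for maintenance...
print(can_schedule_outage(forecast_peak_mw=40_000, transfer_capability_mw=50_000))
# ...while a forecast heat wave does not.
print(can_schedule_outage(forecast_peak_mw=48_000, transfer_capability_mw=50_000))
```

A longer, more accurate forecast horizon widens the window in which such checks can be run with confidence, which is the value Abbas describes.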

Starting Broad or Narrow

While NVIDIA attempts to model the entire planet for any purpose — a one-planet-fits-all approach — another way to approach grid-meets-weather questions is to start with a specific industry query: How do we make the power system operate most efficiently? From there, we gather data sources, build models and test against decades of load, power pricing and weather data to develop a solution tailored for a complex industry’s needs.

Yes Energy is a prime example of this (and as of 2025, it happens to be RTO Insider’s parent company, so it clearly has great taste in industry intelligence). It has released an all-new module in its Power Signals product for “deeper forecast analysis” that isolates different weather effects to understand how they impacted historical load. In addition, it provides a view of the probable distribution of future demand based on observed weather outcomes up to a year ahead.
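The idea of isolating a weather effect from historical load can be shown with a toy regression. Everything here is hypothetical: the data, the single cooling-degree-day predictor and the `fit_line` helper are invented for illustration and say nothing about Power Signals’ actual methodology.

```python
# Toy sketch of weather-effect isolation: regress historical load on
# cooling degree days (CDD) so the weather-driven component separates
# from baseline demand. Data are invented and perfectly linear to keep
# the arithmetic transparent.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

cdd = [0, 2, 5, 9, 12]                # cooling degree days
load = [900, 960, 1050, 1170, 1260]   # MW; rises with CDD

a, b = fit_line(cdd, load)
# The intercept is weather-normalized baseline load; the slope is the
# weather effect in MW per cooling degree day.
print(round(a), round(b, 1))  # → 900 30.0
```

Real models use many more drivers (humidity, calendar effects, DER adoption), but the decomposition principle is the same.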

Models, Models Everywhere

Other industries have spawned additional weather intelligence companies with some applicability to the grid. Insurance companies and asset owners in particular seek digital crystal balls to assess the risks they face.

Organizations like First Street Foundation are redefining how climate risk is measured and monetized. Its extreme weather, wildfire and macroeconomic models translate complex environmental risks into address-level scores. For example, PVcase has integrated First Street data into its platform rather than relying on outdated FEMA flood maps, helping renewable energy developers and operators better understand flood risks.

Google DeepMind has both WeatherNext and Weather Lab. Jupiter Intelligence provides asset-level climate risk analytics for utilities and infrastructure investors, ClimateAi offers predictive climate insights tailored to operational decision-making, and Descartes Labs leverages geospatial intelligence to model environmental and economic systems at scale.

In a recent op-ed in Nature, two Oxford academics warned that rigorous standards need to be applied before trusting AI weather models. “Before weather agencies adopt AI models, the predictive skill of such models on a range of hazardous events — from heatwaves and heavy rainfall to major storms — must pass a defined minimum standard.” They proposed a protocol for training future AI systems that reserves a designated set of “iconic” extreme events solely for testing.
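The held-out-events protocol can be sketched in a few lines. The event names, skill scores and threshold below are placeholders of my own, not the authors’ actual benchmark.

```python
# Sketch of the proposed idea: reserve a fixed set of "iconic" extreme
# events for testing only, so an AI weather model is never trained on
# the storms it will be judged against. All names and numbers are
# placeholders.

ICONIC_TEST_EVENTS = {"2021_pnw_heat_dome", "2012_hurricane_sandy"}

def split_events(all_events):
    """Keep iconic events out of training; use them only for testing."""
    train = [e for e in all_events if e not in ICONIC_TEST_EVENTS]
    test = [e for e in all_events if e in ICONIC_TEST_EVENTS]
    return train, test

def passes_minimum_standard(skill_scores, threshold=0.6):
    """Adopt the model only if it meets the minimum skill on every
    held-out hazardous event, not merely on average."""
    return all(score >= threshold for score in skill_scores.values())

train, test = split_events(
    ["2021_pnw_heat_dome", "2019_midwest_floods", "2012_hurricane_sandy"])
print(train)  # only non-iconic events reach training
print(passes_minimum_standard({"2021_pnw_heat_dome": 0.71,
                               "2012_hurricane_sandy": 0.55}))
```

The per-event test (rather than an averaged score) matters because a model can look skillful overall while failing on exactly the heat waves and storms grid operators most need to anticipate.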

With or without those guardrails, we are seeing a shift from public data interpreted by utilities and regulators to private models that generate proprietary outputs embedded directly into planning and operational systems.

Implications for Utilities, RTOs and Grid Operators

For grid operators, this shift may become a source of friction and risk.

First, planning fragmentation will increase as utilities, developers and system operators rely on different datasets and models. One entity may base its load forecast on a proprietary climate-adjusted model; another may rely on historical NOAA data; and yet another may incorporate vendor-specific DER adoption projections.

The NCAR Mesa Laboratory in Boulder, Colo., conducts research on atmospheric chemistry, solar physics and forecasting models. | UCAR

The result could be a gradual erosion of alignment. Incompatible assumptions may underpin transmission plans, and divergent expectations of future conditions may emerge in interconnection studies. Regional coordination will be more difficult when participants no longer work from the same baseline.

Second, grid planning may become a “black box.” Many of the new models entering the space are proprietary and continuously evolving. Their assumptions are not fully transparent, their methodologies are not easily replicable, and their outputs may change as algorithms are updated.

For regulators and stakeholders, this creates a challenge: How do you evaluate a transmission investment justified by a model you cannot fully interrogate? How do you compare competing proposals built on different proprietary datasets?

We may move from engineering-led planning to model-led planning, with privately owned models.

Third, cost and access asymmetries are emerging. Large investor-owned utilities and well-funded developers can afford to license advanced datasets and modeling tools. Smaller utilities, municipal systems and co-ops often cannot. This creates the risk of a two-tier system of grid intelligence, where some actors operate with far more sophisticated — and expensive — insights than others.

Finally, operational dependencies are deepening.

Real-time grid operations increasingly rely on high-quality forecasts: weather-driven load, renewable generation output, wildfire risk and outage probabilities. These inputs are now being integrated into outage management systems, DERMS platforms and advanced forecasting tools — many of which depend on third-party data providers.

That creates a new form of vendor lock-in. If critical operational decisions depend on proprietary data streams, switching providers or validating outputs becomes significantly more difficult.

Regulatory and Market Implications

Regulators need to ask a seemingly simple question: Who validates the data?

If a utility files a transmission plan based on outputs from a proprietary climate model, what standard should regulators apply? Transparency? Historical accuracy? Peer review? And how should regulators enforce those standards when intellectual property protects the underlying models?

There also is a market power dimension.

Control over datasets increasingly means control over forecasts, risk perception and, ultimately, investment decisions. In that sense, private data providers may occupy a role analogous to credit rating agencies in financial markets: entities whose assessments shape outcomes but are not always fully visible or accountable.

For the grid, the stakes are particularly high because reliability depends on coordination, and coordination depends on shared assumptions. If different actors are operating on different versions of reality, the risk of misalignment — and failure — increases.

Managing the Public Grid with Private Data

I’m not arguing that private innovation is a problem. Far from it. The advances being driven by companies such as Yes Energy, NVIDIA, First Street and others are essential for managing a more complex, climate-exposed grid.

In a perfect world, society would treat public data infrastructure as critical infrastructure. However, the current administration has shown it will not consistently fund, maintain and update federal datasets at a pace that reflects their importance to national energy systems. And if it perceives climate data as a political tool rather than a neutral truth, it is unlikely to continue improving the open, standardized baselines that the industry has relied on until now.

Given the irreversible move from public to private data, transparency requirements need to evolve.

If proprietary models are used in regulated planning processes, the assumptions, validation methodologies and sensitivity analyses should be disclosed. Regulators do not need to see every line of code, but they do need confidence in the outputs.

Hybrid approaches should be explored. Public-private partnerships may combine the strengths of open data and private innovation, creating shared validation frameworks that preserve comparability while enabling advancement.

Data itself should be recognized as a core component of grid infrastructure. We regulate power plants and transmission lines. We set standards for reliability and interconnection. But we do not yet have a coherent framework for governing the data that determines where those assets are built and how they are operated.

The electric grid is becoming more digital, more dynamic and more exposed to climate risk. The question no longer is whether we have enough data to run the grid; it is whether the data we rely on is shared, trusted and governed in the public interest.

Reliability is not just a function of power plants and wires; it is a function of whether everyone is working from the same map.

Power Play columnist Dej Knuckey is a climate and energy writer with decades of industry experience.

New Group Questions IMM Findings that $22B MISO Spend Uneconomic

Don’t shoot the messenger.

Bill Malcolm

The MISO Independent Market Monitor was looking out for retail electricity consumers when he found that the $22B Tranche 2 transmission spend was uneconomic and that local solutions weren’t adequately considered. The board approved it anyway. (See MISO Board Endorses $21.8B Long-range Transmission Plan.)

With a full-blown affordability crisis in utility rates, there is no justification for questionable spending — much less spending that primarily benefits other states, some of which have expensive policies that require such transmission. The IMM also pegged the cost of the spending at $7,500 per household.

There’s something fishy about a new “consumer group” Manifest being formed to challenge this via the five-state complaint discussion. (See Group Raises Questions over MISO IMM Involvement in $22B Tx Plan Complaint.)

An independent market monitor is just that — independent.

He should not be shown the door but should be allowed to continue to do his job and also to talk to state regulators who seek his advice.

Instead, those who put the plan together and approved it without full vetting of local generation and non-wires solutions should be scrutinized. We owe it to the ratepayers.

And we need to see who is really behind this new group.

Bill Malcolm is a former MISO employee. His opinions are his own.

SPP RTO Expands into Western Interconnection April 1

SPP will complete the third major expansion of its RTO footprint when it begins administering the regional transmission grid under its tariff for several Western organizations overnight March 31 into April 1.

C.J. Brown, the grid operator’s vice president of operations, said staff have been encouraged by the status of their system and readiness activities, and they are expecting a successful cutover.

“After years of planning and testing, it’s exciting to be close enough to April 1 that SPP’s forward‑looking studies now include data from the western part of our expanded territory,” he said in a statement to RTO Insider. “This is obviously a major milestone, but it’s just the beginning of something bigger. Our operators and support staff are already looking ahead to April 2 and every day that follows, when we’ll be just as focused on our ongoing mission to keep the lights on.”

The RTO Expansion (RTOE) members affirmed their support to go live April 1 with a unanimous vote of support in March. (See SPP RTO Expansion Members Affirm April 1 Go-live.)

The expansion will add three states to the RTO’s 14-state footprint: Arizona, Colorado and Utah. It follows the previous additions of the Integrated System (IS) in 2015 and Nebraska’s public utilities in 2009. Those expansions added the Dakotas and parts of Iowa, Minnesota, Montana and Wyoming to the RTO’s footprint. (See Integrated System to Join SPP Market Oct. 1.)

The key organizations joining RTOE are:

    • Basin Electric Power Cooperative;
    • Colorado Springs Utilities;
    • Deseret Power Electric Cooperative;
    • Municipal Energy Agency of Nebraska (MEAN);
    • Platte River Power Authority;
    • Tri-State Generation and Transmission Association; and
    • Several Western Area Power Administration (WAPA) regions: Upper Great Plains (UGP)-West, Colorado River Storage Project and Rocky Mountain.

Basin Electric, MEAN, Tri-State and WAPA’s UGP-East Region already are members of SPP, having placed their respective facilities in the Eastern Interconnection under SPP’s tariff as part of the IS.

Several other load-serving and embedded entities that are part of WAPA’s Colorado-Missouri balancing authority also will become part of the SPP RTO on April 1.

The expansion began in 2020 when several utilities decided to explore RTO membership. A Brattle Group study found the move would be mutually beneficial and save $49 million annually.

SPP’s members will have access to seven of the eight back-to-back DC ties, with a combined capacity of 1,320 MW, that connect the asynchronous Eastern and Western interconnections.

Report: Poor Voltage Control, Lack of Regs Drove Iberian Grid Collapse

The European Network of Transmission System Operators’ (ENTSO-E) final report on the Iberian Peninsula blackout of April 2025 lays out the root causes and chain of events that led to the collapse of the grid, providing critiques of the numerous points of failure.

Much of the 472-page document, released March 20, consists of the organization’s factual report, released in October 2025, but it also includes a detailed root-cause analysis and recommendations for preventing future outages, as well as additional data that were not previously available. (See European Regulator Issues ‘Factual Report’ on Iberian Outages.)

The report’s findings largely align with those of the Spanish government and grid operator, which concluded the blackout occurred because traditional synchronous generation could not provide adequate control of high voltage resulting from frequency oscillations, exacerbated by a faulty power plant controller.

To submit a commentary on this topic, email forum@rtoinsider.com.

ENTSO-E stresses throughout the report that no single factor led to the collapse and that any one of the factors by itself would not have been a problem. Instead, it created a large root-cause tree displaying the multiple factors that led to a very fast voltage increase (page 332).

The roots of the tree include:

    • no explicit criteria for the dynamic behavior of conventional generators’ reactive power;
    • no economic consequences for generators that fail to meet their voltage-control requirements;
    • renewable resources operating in fixed power factor mode; and
    • voltage control of local generation networks designed without alignment to system needs.

The report separates its recommendations based on whether they are related to the root causes and gives each a priority label. ENTSO-E said generators should operate in voltage-control mode whenever possible and that transmission system operators (TSOs) should ensure there is enough voltage-control and monitoring equipment on the grid.

It also said TSOs need to enforce Europe’s harmonized voltage operating range: Spain allows its grid to operate up to 435 kV, while the rest of the continent allows up to 420 kV.

Finally, ENTSO-E said a common procedure should be established to create a snapshot after a significant event, allowing for accurate simulations of the system under similar conditions to those of the event. It noted that it had to rely on incomplete data, particularly from Spanish TSOs.

“This blackout highlights how developments at the local level can have systemwide implications and underlines the importance of maintaining strong links between local and European system behavior and coordination, while ensuring that market mechanisms, regulatory frameworks and energy policies remain aligned with the physical limits of the system,” ENTSO-E said.

Swett Affirms FERC’s Jurisdiction in Connecting Large Loads

HOUSTON — FERC Chair Laura Swett treaded carefully when discussing the commission’s Advance Notice of Proposed Rulemaking meant to accelerate interconnection of the massive wave of large loads that virtually everyone in the industry agrees is coming, whether the grid is ready or not.

Energy Secretary Chris Wright directed the commission in October 2025 to open a rulemaking that accelerates large loads’ interconnection by asserting for the first time FERC’s jurisdiction over end-use customers’ grid connections. (See Energy Secretary Asks FERC to Assert Jurisdiction over Large Load Interconnections.)

The commission faces an April 30 deadline to release the rules. Given that the docket is still open, Swett is simply trying to avoid ex parte rule violations.

Speaking at CERAWeek 2026 by S&P Global on March 26, Swett was asked by conference chair Daniel Yergin about the commission’s role in matching supply with, in Swett’s own words, the “exponential, explosive demand.”

“We have to ensure that there are clear and efficient rules for large loads and generation to get online as quickly as possible, and I’m being a little bit careful here,” she said, “because we have a live docket from the secretary of energy that gets to this matter.”

When she was again asked to comment on the ANOPR during a press briefing after her onstage appearance, Swett said, “The question you asked me is squarely within that docket, so I won’t speak to any specifics.”


Speaking generally, she said that federal jurisdiction is very clear.

Federal and state jurisdiction “has been that way for hundreds of years, and as a regulator, I take those lines very seriously,” Swett said. “That is part of the way that the commission is thinking, but we still have quite a bit of time before we have to opine on that docket.”

Speaking elsewhere during the conference, Commissioner Judy Chang echoed Swett by saying she has “convictions” that FERC has “some amount of jurisdiction” over large loads interconnecting to the grid.

Wright’s directive included a proposed rulemaking designed to ensure the timely and orderly interconnection of large loads, and laid out four legal justifications:

    • Large load interconnections are a critical component of open-access transmission service that requires minimum terms and conditions to ensure non-discriminatory service.
    • Interconnecting large loads is a practice that directly affects FERC-jurisdictional rates, and the Federal Power Act has vested the agency with exclusive authority to ensure wholesale rates are just and reasonable.
    • The ANOPR will not impinge on state authority over retail sales. FERC will not exert jurisdiction over any retail sales to large loads, and the states retain authority over expansion or siting of generation facilities.
    • Any contrary view of the proposed changes conflicts with the FPA’s core purpose of granting FERC exclusive jurisdiction over transmission in interstate commerce and interconnecting large loads to the grid to obtain service benefits.

Wright recommended the ANOPR apply to loads 20 MW or greater and that it include standardized financial and readiness requirements. Loads that agreed to be curtailed during tight grid conditions would be expedited and also responsible for 100% of network upgrade costs.

FERC may be building on several interconnection proposals that the country’s grid operators have made. MISO and SPP have instituted expedited studies, approved by the commission in 2025, to interconnect “shovel-ready” generation projects. SPP also received approval for 90-day study processes that review interconnection requests from “high-impact” large loads seeking to connect to its system. (See MISO, SPP Collaborate on Their ERAS Proposals and FERC Approves SPP Large Load Interconnection Process.)

“There are several markets across the country that have come to FERC with proposals,” Swett said. “Under our authorizing statutes, we are in a receptive posture, but we also have tools to be a little bit more aggressive in directing markets to do something.”

She said the country’s electric markets have “very different” characteristics and differing transmission systems. Market members and state policies also have different “progressions” for solving the problem, Swett said.

“As we have received proposals across the country, we look at them and try to accept them or refine them as quickly as we can to get the markets the feedback and the approvals that they need to then solve the problem themselves,” she said.

Peter Lake, National Energy Dominance Council | © RTO Insider 

Peter Lake, senior director of power for the White House’s National Energy Dominance Council, said FERC has done a “tremendous job” on the rulemaking and that if implemented, it would change the interconnection and energizing timeline to 60 days.

“I really hope that they get where we hope they’re going, but to give you a sense of context, a lot of these hyperscalers who are writing $100 million checks are frustrated with a five-, six-, seven-year timeline,” he said. “It would be an extraordinary game changer in both accelerating interconnected data centers and accelerating this country’s ability to compute.

“It should be an extraordinary change, an extraordinary paradigm shift, in how this country develops and how this country competes in the global AI arms race and how our generation fleet operates,” Lake added.

“That is remarkable,” said moderator Douglas Giuffre, with S&P Global. “Sixty days. It would be amazing.”

It may also be unlikely. The industry requires stability models and assessments when interconnecting and energizing loads, and those stability studies, the critical-path item for maintaining reliability, take time.

“There’s a lot of details,” Lake said, “and FERC is doing a great job of working through that, as they should.”

Comanche 3 Repair Delay Raises RA Concerns in Colorado

A delay in the repair of Unit 3 of Xcel Energy’s coal-fired Comanche Generating Station has sparked questions among Colorado regulators about the utility’s ability to meet summer peak demand in 2026.

Public Service Company of Colorado, an Xcel subsidiary, told the Colorado Public Utilities Commission in a March 2 filing that Comanche Unit 3 would return to service “around August 2026.” In previous filings, PSCo listed the unit’s expected return-to-service date as June 15, 2026.

The Unit 3 outage began Aug. 12, 2025, prompting a petition from PSCo and Gov. Jared Polis’ administration to postpone the retirement of Comanche Unit 2 to help make up for the outage. The PUC granted the petition in December, delaying Unit 2’s retirement one year to the end of 2026. (See Colorado PUC Approves Extension for Comanche Coal Plant.)

A PSCo forecast that assumed Unit 3 would be back online in June showed a summer peak capacity need of 7,534 MW and a total accredited capacity of 7,457 MW, for a 77-MW deficit. But without a return to service of Unit 3, which has an accredited capacity of 415 MW to the PSCo system, the shortfall potentially grows to 492 MW.
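The filing’s figures can be sanity-checked with simple arithmetic; a minimal sketch using only the numbers reported above:

```python
# Summer 2026 figures from PSCo's filing, as reported above (all MW).
peak_need = 7_534    # forecast summer peak capacity need
accredited = 7_457   # total accredited capacity, assuming Unit 3 returns in June
comanche3 = 415      # Comanche Unit 3's accredited capacity

# Deficit even with Unit 3 online, and the larger deficit without it.
deficit_with_unit3 = peak_need - accredited
deficit_without_unit3 = peak_need - (accredited - comanche3)

print(deficit_with_unit3)     # 77
print(deficit_without_unit3)  # 492
```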

The delayed return of Unit 3, coupled with an early spring heat wave throughout the West, elicited concern from commissioners during a March 25 PUC meeting.

“We have a real problem with Comanche 3,” Commissioner Tom Plant said. “I’m not convinced they’re going to be able to meet summer peak, particularly when we’ve got 85-degree temperatures in March.”

As part of its approval of the Unit 2 extension in December, the commission directed PSCo to file by March 1 a status report on Comanche 3 and the company’s plans to address resource needs. A second report is due by June 1.

But during their March 25 meeting, commissioners said summer 2026 resource needs should be addressed quickly. They asked the company to file a report by April 15.

Commission Chair Eric Blank wants the company to run a test of its demand response programs in April or May to see how much demand is reduced. Commissioners said they’re prepared to fast-track increased incentives for demand response programs to increase participation this summer.

Commissioners asked PSCo to work with its wholesale customers on demand response. Another idea was to focus on customer-side energy storage programs, which could help during peak demand and public safety power shutoffs.

Blank told the company to coordinate with the state on a text messaging system that can alert residents to an energy emergency and urge them to conserve electricity. Such messaging helped avert blackouts in California during a September 2022 heat wave and in Alberta, Canada, during a January 2024 cold snap. (See CAISO Reports on Summer Heat Wave Performance and Consumer Response Saved Alberta Grid During Jan. 2024 Cold Snap.)

More Coal Plant Extensions?

The delay in Comanche Unit 3 repairs was one of “two very bad pieces of news” from PSCo’s March 2 filing, Blank said. The second disappointment is that the utility is now “wavering on its longstanding commitment” to coal plant retirements, he said.

“Near-term, the most likely capacity solutions are continued extensions of existing units — namely, Comanche Unit 2 and, to a lesser extent, the Hayden units,” PSCo said in the filing.

Hayden Station, whose two units have a combined capacity of 446 MW, is scheduled to retire by 2028.

The company acknowledged that additional investments — and potentially regulatory changes — would be needed to keep the coal-fired units running beyond their retirement dates. Coal extensions also “may create challenges with continuing the pace of the company’s clean energy transition,” PSCo said. Xcel plans to exit from coal by 2030.

In addition to the summer peak capacity shortfall in 2026, PSCo forecasts a 445-MW shortfall in summer 2027, followed by summer peak capacity surpluses from 2028 through 2030.

Capacity shortfalls are also forecast for winter peaks: 96 MW in 2027 and 424 MW in 2028, both assuming Comanche 3 is back in service.

PSCo said it has been looking at other ways to plug its capacity shortfalls. Market purchases will help improve the capacity position in 2026. But the availability of short-term market purchases “is increasingly constrained,” the company said, “due to high demand across the West, limited transmission availability and the impending implementation of SPP’s Western expansion.”

Other potential strategies for increased capacity are extending power purchase agreements, implementing demand response programs and acquiring more resources through a commission-approved near-term procurement process. (See Colo. PUC Approves 3.2-GW PSCo Resource Package.)

Comanche 3 Repairs

Comanche Unit 3 is undergoing both offsite and onsite repairs, PSCo said in its filing. Mitsubishi is conducting the offsite work but has been hindered by supply chain delays, the filing said. General Electric is performing onsite work and will restore the unit to service.

Preliminary results of a root cause analysis point to fabrication and design issues with Unit 3, the filing said, noting that Public Service ran the unit “within the relevant thermal parameters.”

The public version of the filing redacts the name of the Unit 3 component that was shipped out for repair, as well as the cost of repairs.

Comanche Unit 3 has been plagued with outages since it first went online in 2010. PUC commissioners have questioned whether it makes sense to keep fixing it.

But for now, PSCo said, “there are no reasonable alternatives to returning Comanche Unit 3 to service.”

“The timelines to develop and in-service new fixed generation are, at best, in 2029 and will cost billions of dollars to address the 415 MW accredited capacity of Comanche Unit 3,” the company said.

‘Lumpy’ Data Center Load Concerns Emerge in California

Data center load growth in California could turn out to be “lumpy,” with sudden, large increases in specific regions of the state, rather than smooth growth over time.

Standard forecasting models assume smooth and diversified load growth, such as a 1.5% increase per year, GridLab Senior Program Manager Casey Baker said in a letter to the stakeholder working group managing CAISO’s large loads initiative. But data center load growth often jumps 50 to 100 MW almost instantly in specific locations, he said.

California’s data center demand is expected to increase by 1.8 GW by 2030 and 4.9 GW by 2040, but utility interconnection queues in the Silicon Valley area suggest the 4.9-GW demand could occur by 2030, Baker added.

To address sudden demand growth, CAISO should complete a high load-growth sensitivity case as part of its 2026/27 transmission planning process (TPP), he said. This study would show the risks of underbuilding the CAISO system compared to the risks and costs of overbuilding and would be particularly critical for the South Bay Area and other load pockets where growth is concentrated.


Major transmission infrastructure requires eight to 10 years to plan, permit and construct, but modern data centers can be energized in two or three years. If CAISO’s TPP relies on the base case forecast but a high load-growth future occurs, the grid faces a physical deficit in the early 2030s that “cannot be remedied in time,” Baker said.
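Baker’s timing argument reduces to simple arithmetic; a sketch in which only the lead times come from the letter, while the 2026 start year is an assumption for illustration:

```python
# Lead times as described in the GridLab letter; the 2026 start year is
# an illustrative assumption, not from the letter.
start_year = 2026
transmission_years = (8, 10)   # years to plan, permit and construct major lines
data_center_years = (2, 3)     # years for a modern data center to energize

earliest_wires = start_year + transmission_years[0]   # best-case transmission
latest_load = start_year + data_center_years[1]       # worst-case data center

# Even in the best case, new load energizes years before new transmission
# can arrive: the "physical deficit" of the early 2030s.
gap = earliest_wires - latest_load
print(latest_load, earliest_wires, gap)  # 2029 2034 5
```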

“The transmission owners simply will not be able to build wires fast enough to catch up to the demand,” he said.

A high load sensitivity case would include specific hot zones where loads develop at a higher rate than is forecast, Baker said.

“This is the only way to reveal where voltage instability and thermal overloads might occur that the base case forecast averages away,” Baker said.

GridLab cited a Silicon Valley Power (SVP) study showing that SVP’s capital construction plan far exceeds the state’s base assumptions. Load in Santa Clara is expected to nearly double, from about 720 MW to about 1,300 MW, by 2035, with most of the growth happening in the next five years.

It is important for CAISO to clarify how it will review large load interconnection requests, particularly when significant network upgrades are identified as needed, SVP representatives said in comments to CAISO on the initiative. These network upgrades could occur in utility-led load interconnection studies that are not fully reflected in CAISO’s annual TPP, the representatives said.

SVP found that several 230- and 115-kV facilities could become overloaded based on proposed large load additions in SVP’s region, according to the letter.

CAISO does not have a threshold for what constitutes a large load, Danielle Mills, CAISO infrastructure policy development principal, said during a February CAISO workshop. (See CAISO Examines ‘Pulsating’ Data Center Loads.) The ISO is taking comments on this threshold and may develop a definition over the next several months.

Utilities are responsible for large load interconnections, but CAISO is monitoring developments at the federal level regarding whether RTOs and ISOs can or should be more heavily involved in the process, the ISO said in a Jan. 30 Large Load Considerations issue paper.

Large loads include more than data centers: Demand from EV charging stations and from electric agricultural and industrial equipment also falls into the category and is expected to increase significantly over the coming years.

SPP Model Should be Considered, Some Stakeholders Say

NextEra Energy representatives said CAISO should model SPP’s recent proposal for large loads that was approved by FERC. SPP’s proposal includes a 90-day study process for interconnecting large loads that will be paired with new local generation.

SPP’s model is “just and reasonable and not unduly discriminatory” and allows large load customers “to interconnect to the transmission system in a timely manner and increase the speed of interconnection queue processing,” NextEra said. CAISO could develop a similar model to meet the California Energy Commission’s large load forecast in a timely manner, it said.

CAISO could develop a new way to study large load proposals in its interconnection queue, specifically those that include co-located generation, NextEra said.

Currently, large loads with co-located generators or energy storage facilities are part of two interconnection processes — the transmission owner’s load interconnection process and CAISO’s generator interconnection process. A more closely coordinated process is needed to ensure that large loads are quickly interconnected while recognizing the effects at the point of interconnection, NextEra said.

Generation Industry Calls for Repowering at IPPNY Conference

ALBANY, N.Y. — Generation industry representatives and their allies united behind a call to loosen New York’s climate law to allow the repowering of old fossil fuel plants with new natural gas turbines at the Independent Power Producers of New York’s 40th annual Spring Conference on March 24.

“I personally love Gov. [Kathy] Hochul’s all-of-the-above energy strategy,” Richard Barlette, IPPNY chair and director of state government affairs for Constellation Energy, said during his opening remarks. Intermittent resources need “firm” support as they continue to grow. “We must remain laser-focused on reliability while continuing to scale clean technologies. Wind, solar, nuclear, storage, natural gas and hydrogen must all be a part of the conversation.”

NYISO CEO Rich Dewey reinforced these points in his keynote address. While he didn’t express specific policy positions, he painted an all-too-familiar picture of New York’s aged generation fleet, thinning margins and the intense balancing act the ISO had to run during June 2025’s heat wave and the late January winter storm. Keeping the grid going in the winter, Dewey said, was harder than in a summer heat wave primarily because of fuel constraints and reliance on older units. (See NYISO Details Late June Heat Wave for Reliability Council and NYISO Provides System Data During Winter Storm Fern.)

“[What] keeps me up most at night [is] the aging generation fleet. Twenty-five percent of capacity is more than 50 years old,” Dewey said. “But on hot days, on cold days, we can’t maintain a reliable system without these resources and increasingly depend on their participation more and more.”

NYISO planning studies have typically assumed that these generators would be online for the foreseeable future, but this is becoming “less and less responsible” when examining the grid, he said.

Since the enactment of the New York Climate Leadership and Community Protection Act, “we’ve deactivated 4,200 MW of primarily dispatchable resources,” Dewey said. Of the 2,274 MW interconnected since then, “almost all of that is renewable intermittent resources. When you think about that, that’s a significant margin.”

Large loads come online much faster than generation as well, he said. A data center has an average build time of about 18 months; generation of all types takes years to get on the grid under the best circumstances.

Ruben Diaz Jr., Natural Allies | IPPNY / Tim Raab

“There is a recognition through the State Energy Plan that we are going to need to have progress made on some of these repowering proposals,” Dewey said in response to an audience question about lowering energy prices. “Those repowering proposals will yield a generation source that is more efficient, cleaner and more cost-effective in the long run.”

In another keynote address that ended the conference, former Bronx Borough President Ruben Diaz Jr. spoke on behalf of Natural Allies, an industry group trying to make the case for natural gas generation as compatible with environmental justice.

“Energy policy is not theoretical. It shows up on the kitchen table. It shows up in utility bills. It shows up in the cost of groceries. It shows up in the cost of rent,” Diaz said. He said his past environmental justice positions were not at odds with wanting to upgrade old fossil plants. “New York is among the highest electric bills in America, and that is … experienced by working families every single month. When energy policies undermine reliability, it is not the wealthy who feel the pinch.”

Diaz said environmental justice means reducing emissions and that the current law makes it impossible to replace old, dirty plants with cleaner ones. This has knock-on effects, forcing more emissions in poor neighborhoods as reliability margins thin.

‘How to Keep the Damn Lights on’

After Dewey’s keynote, a panel convened to discuss “How to Keep the Damn Lights on.” The panel was moderated by longtime industry analyst John Reese of Morningsidenergy and included Matt Schwall of Alpha Generation; Pallas LeeVanSchaick of Potomac Economics, the NYISO Market Monitoring Unit; Derek Hagaman of Gabel Associates; and Bryan Sixberry of GE Vernova.

Reese opened with a breakdown of federal Energy Information Administration data showing that 50% of generation in New York City was over 50 years old. Roughly 6% of units in the city were more than 70 years old, compared with a national average of 1.3%. He said the mechanical wear on a fossil fuel unit made living to 80 almost impossible. The oldest units will be offline in the next few years, either because they cannot be maintained affordably or because they “decide not to wake up one day.”

From left: Pallas LeeVanSchaick, Potomac Economics; Bryan Sixberry, GE Vernova; Matt Schwall, AlphaGen; Derek Hagaman, Gabel Associates; and moderator John Reese, Morningsidenergy | IPPNY / Tim Raab

“The data is scary, and we need to do something,” Reese said.

Hagaman said running the old peaker plants was “not the most affordable option.” He said it was hard to sell the idea of repowering because any investment in the system is going to cost money.

“Frankly it’s a matter of making it more affordable than the alternative,” Hagaman said. He pointed to Arizona and Colorado as states that were embracing repowering as elements of an all-of-the-above strategy.

Schwall said replacing old, inefficient natural gas plants with new units that could also burn alternative fuels was compatible with an environmental justice message. He pointed to his childhood growing up on Staten Island, where it took over 25 years for the Fresh Kills Landfill to be remediated.

Matt Schwall, AlphaGen | IPPNY / Tim Raab

“Environmental progress does not happen overnight. … Environmental progress is not always perfect,” he said. “Using natural gas generators is not a perfect solution in the context of our environmental goals, but in the near and medium term, especially in New York City, it is a necessary solution for the environment and for reliability.”

AlphaGen, which owns and operates the Gowanus and Narrows floating power plants in New York City, has proposed replacing the six peaking units with three lower-emitting ones. (See AlphaGen Proposes Repowering Peakers to Meet NYC Reliability Need.)

Sixberry said the main stumbling blocks for getting a generator on the grid were transformer and breaker backlogs. Repowering projects could take advantage of transformers and breakers that are already in place, effectively swapping an old generator for a new one.

In response to a question about local communities not wanting repowering projects because of emissions, Schwall said that while repowering is “not a perfect solution,” it is very difficult to get enough renewables on the grid to maintain reliability.

“Installing technology that is cleaner, that is capable of running on zero-emissions fuel … it’s not a zero-sum game,” he said. “It does not mean that the state should not be pursuing solar energy or storage.”

Fighting the Headwinds

In another panel, renewable energy advocates discussed how local government could help in the face of unprecedented federal interference in state climate policy.

Most of the panelists said New York needed to reduce permitting time and expedite construction. Longer lag times for permits create uncertainty for project financing, which can lead to canceled projects.

“The biggest issue we’ve been facing prior to this administration, and writ large, has been that it takes too long to develop. It takes way too long to do anything, not just renewable energy,” said Alicia Gené Artessa, director of the New York Offshore Wind Alliance. “When you have a set contract price and it takes 10 years to build a project, the price doesn’t match up anymore.”

From left: Kristina Persaud, Advanced Energy United; Alicia Gené Artessa, director of the New York Offshore Wind Alliance; Ryan Stanton, executive director of the Long Island Federation of Labor; and Jeffrey Escobar, Sheppard Mullin, discuss how to develop renewables in an uncertain climate. | © RTO Insider 

Jeffrey Escobar, a partner at Sheppard Mullin, concurred, saying that he has had clients drop out of the development cycle in New York after buying up build sites because it was too difficult to build.

Other panelists said the state needs to make contracts with renewable energy developers more flexible so they could withstand supply issues or federal policy shocks while also making development smoother.

Ryan Stanton, executive director of the Long Island Federation of Labor, said more effort has to be put into communicating with local communities and stakeholders when developing renewables. He cited Citizens United as a major contributor to inflaming local discourse.

“They have a ton of money to back doing nothing,” Stanton said. “It’s a lot harder to communicate effectively, to have an honest conversation about making policy decisions for the long term.”

He pointed to the success of renewable development in the Republican-controlled town of Brookhaven, on Long Island. He said the town is on track to connect 900 MW of offshore wind because the labor movement, local government and community leaders had coordinated and communicated effectively.

Artessa agreed, saying communication is the “backbone” of the industry. She said she hoped the state would work with local governments to make clear to developers where they were welcome and to coordinate planning across municipalities.