February 17, 2026

CTC Global Partners with Google to Launch GridVista System

Advanced conductor manufacturer CTC Global is working with Google Cloud and Tapestry to launch the GridVista System, which combines conductors with fiber-optic cables to offer operators visibility along the entire transmission line.

GridVista’s line awareness, paired with Google Cloud and Tapestry’s artificial intelligence-powered tools, turns line data into actionable intelligence that can optimize grid capacity, prevent outages, cut wildfire risk and lower operational costs, the companies say.

CTC’s advanced conductor technology can double the capacity of an existing transmission line, and the company has worked on 1,400 projects with more than 300 utilities in 65 countries, CEO J.D. Sitton said in an interview.

“Now with the GridVista System, we’re adding data-grade fiber optics to the product that enables the measurement of strain and temperature and capacity and events along the entire length of the power line; not at discrete points, but literally the entire length between substations,” Sitton said. “And so, GridVista really is providing an entirely new level of detail and insight into the operating status of the transmission lines.”

Unlike traditional dynamic line rating products that add sensors at discrete points along a transmission line, the fiber optics in GridVista give utilities continuous visibility into what is happening along the line’s entire length.

“It’s a much higher degree of resolution and a much faster feedback loop to the utilities about an event: a line down, a hot spot, a lightning strike, a rifle shot, a tree branch falling on the power line. These sorts of things,” Sitton said. “We know about it immediately. We know exactly where it is.”

Combining that visibility with advanced conductors puts utilities in a position to operate more cheaply, he said.

“We save them money on the capacity upgrade, and we’re saving them money from an operations perspective, because they’re much smarter about how they dispatch their operating resources and their lines,” Sitton said. “We enable them to operate with a higher degree of reliability because they no longer have these blind spots in their operation between the substations where they’re guessing what’s going on or not going on with their power line.”

Data centers want to connect to the grid far faster than the industry historically has been able to add new power plants or transmission, and that growing demand is driving interest in products like GridVista.

“I think probably for the first time, we’re seeing utilities in the United States and Western Europe realize that they are, in fact, capital constrained, and so they need to get more out of their existing systems faster than historically they’ve had to,” Sitton said. “So, all of these things are, I would say, accelerated by the dramatic pickup of demand from the data centers that’s creating an environment where the utilities are very open to much more capital and operating cost-efficient solutions.”

Utility customers increasingly expect a safe and reliable grid, and the awareness the new product unlocks can help utilities meet those expectations, he added.

CTC had a pre-existing relationship with Google around speed-to-power for its data centers. With GridVista, Google Cloud helped with the user-interface system, and Tapestry is working on reforming grid operations.

“Google is using GridVista as the source for what they call ‘ground-truth data,’ so the fundamental operating data that will feed the capabilities of their software platforms,” Sitton said. “So, it’s basically a two-way street, and it’s really quite exciting to see the early reactions of some of the utilities that we’ve been engaging with about the combined capabilities.”

AI applications are starting to transform how the grid is operated as the industry adopts the technology.

“We’re starting to see utilities rethink how they dispatch their grids, how they respond to operating challenges within their operating assets, and how they think about the kind of the planning aspects of their system,” Sitton said. “So, the next round of interconnections, and the next retirements of generators along their transmission lines and … the way they plan for these things is fundamentally changing.”

Utilities move at different speeds, but the “thought leaders” in the industry are starting to roll out AI applications that improve their operations and planning, he added.

“Utilities are utilities,” Sitton said. “They are, by definition, some of the most conservative organizations on the planet, I think, for good reason. But they’re not all cut from the kind of the absolute conservative cloth, and so we are seeing many utilities moving much more quickly.”

PJM, Monitor Float Reserve Market Changes

PJM and the Independent Market Monitor are drafting proposals to rework the RTO’s reserve market.

Reserve performance has been a focus since PJM implemented a market overhaul in 2022, which was followed by a drop in performance. That was counterbalanced with a 30% adder to the reserve requirement in May 2023, a change PJM’s Emily Barrett said has allowed the RTO to maintain adequate reserves at increased costs to load. The adder was scaled back recently to 20% as average performance increased above 85%.

The RTO’s proposal would increase the penalties for synchronized reserves that fail to respond, replace primary reserves with a handful of more targeted products based on duration and how quickly the resource can respond, and shift procurement to nodal rather than sub-zonal. Barrett presented the package to the Reserve Certainty Senior Task Force (RCSTF) on Feb. 11.

The RCSTF’s work is one of several areas the PJM Board of Managers has said is integral to efforts to address rising data center load.

Barrett said the current penalty rate is based on the credits reserves received between events, which can result in widely varying penalty rates for resources in the same event. The logic driving the RTO’s proposal instead would use the amount paid to reserves between events. The rate would be set at the greater of:

    • the mean synchronized reserve clearing price over the past delivery year, broken into intervals set at the average number of days between deployments exceeding 10 minutes (there was an average of 18 days between 10-minute synchronized reserve events in 2025, with an average clearing price of $1,910/MWh); or
    • the maximum system marginal price in the 30 minutes after a resource underperformed.
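The "greater of" logic PJM described can be sketched in a few lines. This is purely illustrative: the function name, inputs and structure are assumptions for exposition, not PJM's actual implementation, and the post-event prices in the example are hypothetical.

```python
# Illustrative sketch of the proposed penalty-rate logic described above.
# Names and structure are assumptions, not PJM's implementation.

def proposed_penalty_rate(mean_sr_clearing_price: float,
                          smp_after_event: list[float]) -> float:
    """Return the greater of the mean synchronized reserve clearing price
    over the past delivery year and the maximum system marginal price in
    the 30 minutes after the resource underperformed."""
    return max(mean_sr_clearing_price, max(smp_after_event))

# Example using the $1,910/MWh average clearing price PJM cited, compared
# against hypothetical post-event system marginal prices ($/MWh).
rate = proposed_penalty_rate(1910.0, [850.0, 2400.0, 1200.0])
print(rate)  # 2400.0 -- the post-event price spike sets the rate
```

Because the rate is a floor set by a yearly average, even a resource that cleared at a low or zero price in a given interval would face the same per-MWh penalty, which is the stakeholders' objection noted below.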

Stakeholders said there has not been enough focus on why performance has been low, which should be addressed alongside discussions on how underperforming resources should be penalized. Untying the penalty rate from what a resource is paid, and setting it so high, could result in resources that receive a low or zero clearing price facing penalties in the thousands, they said. Resources also could be held responsible for PJM inaccurately modeling parameters.

Joel Romero Luna, a market analyst with the Independent Market Monitor, said it has been doing outreach since mid-2024 and found a lot of issues related to communications and personnel. Performance has improved since generation owners ironed those issues out, leaving inaccurate parameters as a primary driver of the low response rate. In particular, the ramp rate and economic maximum parameters tend to be based on averages rather than how a resource expects to respond.

Monitor Joe Bowring told RTO Insider resource performance had not dropped, but rather the response rate was low because of communications issues.

“There was no actual drop in performance, and PJM’s arbitrary increase in reserves was not justified and continues to be unsupported. The measured performance of some reserves was low because PJM was using antiquated communications technology. The technology issue has been significantly, but not completely, addressed,” he said.

Luna noted that the Monitor has recommended that PJM count overperformance when calculating the fleet response rate. When capturing both sides of reserve performance, he said the response rate is closer to 100%.
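The effect of counting overperformance can be illustrated with a toy calculation (not the Monitor's actual methodology): each tuple is a hypothetical resource's (MW assigned, MW delivered) during an event.

```python
# Toy illustration of how netting overperformance changes a fleet
# response rate; data and method are hypothetical, not the IMM's.

def response_rate(resources, net_overperformance: bool) -> float:
    assigned = sum(a for a, _ in resources)
    if net_overperformance:
        # Credit MW delivered above assignment against shortfalls elsewhere
        delivered = sum(d for _, d in resources)
    else:
        # Cap each resource's credit at its assignment
        delivered = sum(min(a, d) for a, d in resources)
    return delivered / assigned

fleet = [(100, 60), (100, 140), (100, 100)]  # one short, one over, one exact
print(round(response_rate(fleet, net_overperformance=False), 2))  # 0.87
print(round(response_rate(fleet, net_overperformance=True), 2))   # 1.0
```

In this sketch, one resource's 40-MW overdelivery exactly offsets another's 40-MW shortfall, moving the measured fleet rate from 87% to 100%.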

PJM’s Kevin Hatch added that while the outreach to the owners of underperforming resources has been led by the Monitor, RTO staff have been involved as well.

New Reserve Products

The RTO’s proposal would add a ramp/uncertainty reserves (RUR) product capable of responding in 10 minutes, which would come with its own reserve requirement, and an energy gap requirement met by a combination of reserves.

Barrett said primary reserves backfill needs not met by other products, but resources lack clear performance obligations and penalties for not meeting commitments.

The energy gap requirement would be tailored toward meeting operational needs identified on medium- and high-risk winter days, and the 30-minute secondary reserve requirement would serve as a backfill to ensure the largest system contingency is met.

The 30-minute RUR and 30-minute secondary reserve products both come with a four-hour minimum availability. Barrett said event duration is expected to become more important as battery storage becomes more common.

Vitol’s Jason Barker said the transparency of the new market design will be crucial to avoiding “black box pricing” with unexplainable variations in pricing.

IMM Proposal

The Monitor proposed to retain most of the reserve market structure, while changing the procurement requirements for 30-minute synchronized and primary reserves. Bowring said PJM should eliminate the adder on the grounds that it is not required for reliability and there is no demonstrated need for it.

The 30-minute reserve requirement would be defined as double the single largest contingency plus real-time uncertainty, defined as the two-hour forecast for wind, solar and load minus the forecasts used in real-time security-constrained economic dispatch 10 minutes in advance. The synchronized reserve requirement would be the single largest contingency plus the extended reserve requirement of 190 MW. The primary reserve requirement would be the larger of 150% of the largest contingency or real-time uncertainty.
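The Monitor's three requirements, as described above, reduce to simple formulas. The sketch below restates them in code; function names and the example inputs are illustrative assumptions, not the IMM's.

```python
# Sketch of the Monitor's proposed reserve requirements as described in
# the text; names and example figures are illustrative, not the IMM's.

EXTENDED_RESERVE_MW = 190.0  # extended reserve requirement cited by the IMM

def thirty_minute_requirement(largest_contingency_mw: float,
                              rt_uncertainty_mw: float) -> float:
    # Double the single largest contingency plus real-time uncertainty
    return 2 * largest_contingency_mw + rt_uncertainty_mw

def synchronized_requirement(largest_contingency_mw: float) -> float:
    # Single largest contingency plus the 190-MW extended requirement
    return largest_contingency_mw + EXTENDED_RESERVE_MW

def primary_requirement(largest_contingency_mw: float,
                        rt_uncertainty_mw: float) -> float:
    # Larger of 150% of the largest contingency or real-time uncertainty
    return max(1.5 * largest_contingency_mw, rt_uncertainty_mw)

# Hypothetical example: a 1,400-MW largest contingency and 900 MW of
# real-time wind/solar/load forecast uncertainty.
print(thirty_minute_requirement(1400, 900))  # 3700
print(synchronized_requirement(1400))        # 1590.0
print(primary_requirement(1400, 900))        # 2100.0
```

Here real-time uncertainty stands for the two-hour wind, solar and load forecast minus the 10-minute-ahead forecasts used in real-time security-constrained economic dispatch.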

Performance evaluation and penalties would remain the same for synchronized reserves. For non-synchronized reserves, they would be pegged to the evaluation and penalty rules for secondary reserves. Reserve resources would be required to be capable of operating for four hours or longer.

Bowring told RTO Insider the proposal would capture the uncertainty of wind and solar generation in the reserve requirement based on analysis of actual resource behavior. PJM has not shown there is a need for larger market design changes, he argued, and its proposal appears designed to increase energy market revenues while failing to fully reflect energy and ancillary market revenues in the capacity market.

“We do not believe that PJM has supported its proposals with analysis, and we do not agree that it’s appropriate to use the demand curve for reserves to increase energy market revenues,” he said. “PJM has not demonstrated the existence of an ‘energy gap’ despite multiple different approaches, and PJM has not demonstrated the need for making the reserve markets more complicated.”

Devendra Canchi, a senior analyst with the Monitor, said PJM’s proposal would go too far and increase costs with no corresponding benefit. He presented part of the Monitor’s proposal during the RCSTF’s meeting Jan. 28.

Some stakeholders argued that the Monitor’s position is not backed with analysis and PJM could save costs by modeling reserves in SCED and accounting for them in transmission constraints.

Bowring responded that the proposal is fully supported and that any nodal distribution scheme would be arbitrary.

“No one knows where the next generation trip or forecast error will occur. PJM’s proposal would increase market costs by arbitrarily redispatching expensive resources with no defined benefit,” Bowring told RTO Insider.

Deputy Monitor Catherine Tyler said adding constraints would increase ratepayer costs, and any assumptions PJM makes about when supply is going to be lost run the risk of being inaccurate. While it’s important to ensure that reserves are deliverable, PJM’s proposal would not accomplish that, she argued.

NERC Staff Outline Growing LTRA Challenges

SAVANNAH, Ga. — NERC Director of Reliability Assessment and System Analysis John Moura acknowledged recent criticism of NERC’s Long-Term Reliability Assessment but emphasized the report is “not a prediction of outages [or] a forecast of impending blackouts.”

Speaking at the ERO’s quarterly technical session Feb. 12, he also discussed some of the ways the ERO’s approach to the assessment is changing in light of ongoing changes to the electric grid.

Moura’s remarks came the same week as his appearance, along with NERC CEO Jim Robb, at the Feb. 9 board meeting of the Organization of MISO States to discuss objections by OMS members to the ERO’s 2025 LTRA, which rated five assessment regions, including MISO, as having the potential to develop into high risk between 2026 and 2030.

The designation means that resources planned as of July 2025 would leave those regions with energy shortfalls exceeding resource adequacy targets or baseline criteria for unserved energy or loss of load.

OMS members disputed this label in a letter that argued NERC should have counted resources in MISO’s fast-track interconnection queue. The organization argued that the 11 GW of natural gas generation and battery storage proposals to come online by mid-2028 should make up for the 7-GW shortfall projected in the LTRA. (See MISO States Dispute ‘High Risk’ Designation from NERC.)

It was the second time in as many years that MISO objected to NERC’s assessment of the region’s risk. In 2025 MISO’s Independent Market Monitor pointed out that NERC had committed an error in rating the region as high-risk in the previous year’s LTRA. After a back-and-forth, NERC agreed to downgrade MISO to “elevated” risk.

At the technical session, Moura explained — as he did at the OMS board meeting — that NERC’s deadline for including energy projects passed in mid-July 2025 and MISO’s fast-track projects “were only approved in September, [so] were unable to be included in the energy analysis.” (See NERC to OMS: Long-term Assessment not a Predictor of Risk.) He added that the resources might not have been included in the assessment anyway because of changes to how NERC assesses the impact of generation additions.

“Counting capacity is really a thing of the past. Energy adequacy cannot be determined by counting capacity,” Moura said. “These projects weren’t left out because we missed them. The issue is really modelability, deliverability and certainty.”

Moura explained that modelability refers to whether any constraints could apply to a resource — for example, if a gas generator is coming online without firm gas arrangements — that could make it “difficult or impossible for that new generator to get gas on peak winter days.”

Deliverability means the ability of electricity generated in one area to reach load centers in other areas, which Moura observed could be an issue for the 7 GW of fast-track resources “south of the well-known MISO South to MISO North interconnection constraint.”

Finally, Moura defined certainty as the assurance that projected resources will be available as expected, an important question “given the supply chain challenges … in our assessments showing delays of up to 50% of planned generation not meeting their in-service dates.”

Avoiding Panic

Along with the difficulty assessing new generation, NERC Trustee Ken DeFontes observed that in previous LTRAs, NERC staff have also had problems predicting generation retirements because they “tended to only count retirements that had been announced [when] the reality was … it was likely more were coming.” He asked how the team behind the 2025 assessment had accounted for the issue.

Mark Olson, NERC’s manager of reliability assessments, replied that the ERO aims to account for both the “certain level of retirements in terms of being announced and in the process to be deactivated, versus a more speculative industry projection of potential retirement.” Moura added that NERC also must consider retirements as “a dial that can be turned … where states and other policy makers can take action to keep units online [or] have the ability to throttle those resources.”

Karen Onaran, CEO of the Electricity Consumers Resource Council, reminded attendees of the growing influence of NERC’s reliability assessments on lawmakers and policy makers, and warned that without proper context the reports could be “used as ammunition [to] saddle consumers with costs that may not be necessary.”

“At least within the last couple of years, when these reports come out, [they raise] some panic, and [hurt] some feelings. And we take it, and we try to take the lessons learned and figure it out,” Onaran said. She suggested that NERC’s Engagement and Outreach Committee, launched in December 2025 to handle external communications, work on a plan for conveying “what this report really says … and what risks it’s identifying … to all levels of government and to the industry.”

Navigating Extreme Winter Storms: A System-of-systems Perspective

In late January, the mass of cold air — typically held at bay by the high-level upper atmosphere winds that occur 10-30 miles above the North Pole — staged a jailbreak. A breakdown of the polar vortex occurs every so often when sudden stratospheric warming occurs (that’s what occurred during 2021’s devastating Winter Storm Uri), disrupting the vortex and allowing fugitive cold air to spill southwards, bringing extremely cold temperatures in its path. Collision with low-pressure systems can create a volatile cocktail of ice and snow.

That’s exactly what happened when the much-anticipated Winter Storm Fern swept across the country at the end of January. The southeastern U.S. suffered the brunt of physical damage, as high-level warmth brought rain that froze as it encountered cold air close to the ground, coating power lines with ice, breaking equipment and leaving approximately 1 million people without power.

Peter Kelly-Detwiler

Most damage came from ice on distribution networks, with Entergy estimating 860 poles and 60 substations out of service. Some larger transmission lines also were affected. Entergy reported 30 transmission lines out, and the Tennessee Valley Authority also saw as many as two dozen high-voltage transmission lines affected.

While thousands of customers went without power for many days, and the damage to the distribution system was serious, the bulk power system in the southeastern U.S. generally held its ground. The same was true for other regions of the country that got mostly snow but saw extreme cold prevail from Texas to New England.

This stands in stark contrast with winter storms Uri (February 2021 — with its extended system-wide outages in Texas) and Elliott (December 2022 — with outages in TVA and Duke service territories, while PJM barely squeaked by).

Thermal Plants, the Winter Workhorse

While prices spiked across multiple markets, the grid remained intact. There were several major reasons for that, with the underlying factors varying by region, but thermal plant reliability was a key theme. Gas generation was a critical player, but coal and oil also filled the gaps to meet surging demand. A review of several grids illustrates the point.

    • ISO-NE: New England saw dual-fuel plants switch over from gas to oil, burning through about half of the region’s stored oil reserves in late January and early February, with oil-fired generation surpassing gas for a couple of days. The grid operator requested a Department of Energy waiver to avoid emissions penalties.
    • PJM: The Mid-Atlantic grid operator commented that during the “strongest sustained cold period that the PJM system has experienced since the 1990s” it saw an average 18 to 19 GW of outages (compared with an expected 15.9 GW), with plant equipment failure as the greatest cause. Tight gas supplies also were a concern. In response, PJM called upon 5.2 GW of oil-fired capacity “that would otherwise have been restricted,” aided by a DOE order waiving emissions restrictions.
    • MISO: MISO ran coal generation more heavily than normal, including three of the five coal plants whose retirement was delayed by the Trump administration, delivering a total of 965 MW during much of the period from Jan. 21 to Feb. 1.
    • ERCOT: ERCOT also made it through, with its improved weatherization and inspection programs of power plants and transmission facilities reducing generation outages. Increased reserves, more flexible operations (including increased dual-fuel capabilities) and a growing deployment of batteries also helped.

We learned, once again, that our nation’s power grids rely on a significant fossil mix when the weather turns nasty. Coal-fired generation soared across the lower 48 states during the week ending Jan. 25, up 31% from the prior week and representing 21% of power generation, while gas stood at 38% and nuclear at 18%.


We also learned that events on the grid are increasingly ripe for being politicized. Less than two weeks after the storm had passed, DOE issued a fact sheet declaring that “Beautiful, clean coal was the MVP of the huge cold snap we’re in right now,” and decrying the actions of the previous administration’s “energy subtraction policies which threatened America’s grid reliability and affordability.”

It seems that such politics are sadly unavoidable these days. But let’s try and remove politics from the conversation and focus on the facts. It’s not controversial to state that during the coldest days that occur every so often, fossil thermal generation — whether oil, coal or gas — is extremely valuable in keeping the lights on. Renewables and storage can pitch in, but they are a long way away from being able to handle that task.

As an example, when this article was drafted (Feb. 13), renewables made up 10% of the generation mix in ISO-NE at 3:30 p.m. In addition, on this clear and sunny day, rooftop solar cut peak demand by about 5,000 MW, with the duck curve exerting its influence on net demand. So, a combination of utility-scale and on-site renewables can generate energy and cut the use of fossil fuels.

However, that solar doesn’t address the evening peak, and it doesn’t help after heavy snow. The day after Fern departed New England, nearly all the panels in the region were blanketed with snow and the duck was hibernating, its influence on net demand nowhere to be seen. As a dependable resource that can provide both capacity and needed energy, neither variable wind nor solar checks the box.

Batteries can help meet peaks and address this issue of renewable energy droughts, if those storage assets can be fed by renewables, and renewable energy shortfalls are of relatively limited duration. That equation may change if we eventually get the long-duration, 100-hour batteries promised by start-up companies such as Noon and Form Energy, and those storage resources are deployed in enormous quantities at affordable prices. But we’re not there yet.

Addressing the Demand-side Thermal Issue

At the same time, much of the peak demand that occurs during extreme cold or hot spells could be greatly mitigated if we started to more accurately frame those peaks as a thermal problem, stemming from the need to heat or cool our built spaces. The better we insulate those spaces, the less volatility we would see in resulting energy demand.

EPA reports that homes can save an average of 15% on heating and cooling costs by employing a variety of insulation technologies. This need not be a herculean task, and insulation is effective. For example, upgrading U.S. homes to a 2009 building code could keep residences above 40 degrees Fahrenheit for nearly two days in sub-zero temperatures.

Maintaining a reliable and cost-effective grid is not, and never has been, a strictly supply-side issue. Rather, the power grid’s various supply and transmission technologies, combined with demand-side technologies, comprise a massive system of systems that can best be made economical, reliable and resilient if it is viewed and addressed as such. But such an approach requires sophisticated thinking that defies the simplistic and easy answers that many politicians and some analysts proffer.

Climate is an Even More Complex System of Systems

Perhaps counterintuitively, the polar vortex breaks down when it experiences spikes in the stratospheric temperatures, known as sudden stratospheric warmings. When those breakdowns occur, some areas of warmer air pour into the Arctic while lobes of polar air flee southwards. Nobody fully understands the dynamics behind this, but models suggest that climate change may be a driving force. If so, then we can add that to the lengthy list of other climate-related issues that justify cutting carbon emissions from our energy systems.


To pretend that we can oversimplify either the power grid or the impacts of human activity on the earth’s climate is a mistake. Each of these complex systems — and their interactions with each other — deserves far more scrutiny and understanding than most of us are willing to devote. We need more data and information, not less, and to entertain more nuanced conversations as well.

Around the Corner columnist Peter Kelly-Detwiler of NorthBridge Energy Partners is an industry expert in the complex interaction between power markets and evolving technologies on both sides of the meter.

N.Y. Cancels Solicitation but Remains Committed to OSW

After 19 months, New York has abandoned its most recent attempt to procure offshore wind power, saying it would not be prudent to proceed amid federal policy uncertainty.

The decision is only the latest setback for a state that, despite multiple cancellations and cost escalations, has the largest offshore wind pipeline in the nation.

New York presented it as a delay in, not an end to, its offshore wind ambitions.

In the same week that it canceled the procurement, New York issued a request for information on ways to keep the industry from further atrophy; told an industry conference that offshore wind remains an important part of its energy strategy; and approved the structure of the offshore wind renewable energy credit (OREC) system that subsidizes increasingly expensive construction.

New York launched its fifth offshore wind solicitation July 17, 2024. It attracted 25 proposals totaling 6,870 MW from four bidders — Attentive Energy, Community Offshore Wind, Ørsted and Vineyard Offshore.

Attentive withdrew its proposals in October 2024, and Ørsted in August 2025.

Attentive partner TotalEnergies and Community partner RWE said in November 2024 and April 2025, respectively, that they would put their wind energy development efforts in U.S. waters on hiatus because of the political uncertainty surrounding offshore wind.

And of course, Donald Trump was re-elected president in November 2024 with a promise to block offshore wind development and has been trying to follow through for 13 months.

Given all this, the New York State Energy Research and Development Authority (NYSERDA) announced Feb. 13 that it was canceling the fifth solicitation.

A spokesperson said: “Federal actions disrupted the market and instilled significant uncertainty into offshore wind project development. Given the current level of uncertainty, it would not be prudent to enter into new long-term OREC purchase and sale agreements at this time, and as such, NYSERDA has concluded ORECRFP24-1 without award.”

With that, four out of five of New York’s solicitations are now dead ends.

The two contracts awarded in the 2018 solicitation (to Empire Wind 1 and Sunrise Wind) were canceled because cost escalations rendered the contracts unprofitable. The two contracts totaling 2.6 GW awarded in the 2020 solicitation were canceled for the same reason. And the three projects totaling 4 GW chosen in the 2022 solicitation were rendered untenable when General Electric halted development of the specified turbine.

Only the 2023 solicitation — a rush effort to salvage the state’s imploding offshore wind portfolio — has yielded steel in the water: Empire Wind 1 and Sunrise Wind (1.73 GW combined) are now under construction, at much greater cost than first agreed on.

But that is more than other states with offshore wind aspirations can say.

With the 132-MW South Fork Wind (which was completed in 2024 outside the NYSERDA procurement structure), New York now has three wind farms spinning or being built off its coast.

No other state has more than one, and most have none.

Speaking to Oceantic Network’s IPF 2026 on Feb. 10 in New York City, NYSERDA President Doreen Harris reiterated the state’s commitment to offshore wind despite Trump’s persistent efforts to destroy the sector, including one stop-work order against Sunrise and two against Empire. (See U.S. Offshore Wind Supporters Map Path Forward.)

It is an important part of New York’s strategy to meet rising power demand, she said: “To be clear, offshore wind remains a central part of how we get from here to there on the order of 7 GW of incremental capacity between now and 2040.”

That is a telling detail.

New York’s official offshore wind goal, established by its landmark 2019 climate law and specified on NYSERDA’s own website, has been 9 GW by 2035.

‘Meaningful Step’

So New York is pushing the timeline back and potentially changing the path but not abandoning the effort.

NYSERDA on Feb. 10 issued a request for information (RFI) seeking industry input on a potential predevelopment support program by which the state would enable the private sector to “continue investing responsibly in their lease areas to advance project development during a period of federal uncertainty, so that projects are well positioned to move forward efficiently when federal conditions become more favorable.”

One potential approach for this could be co-investment by the state, Harris said.

Also looking forward, NYSERDA on Oct. 2, 2025, proposed an offshore wind implementation plan that among other things structures the OREC system to reduce impacts on electric utility ratepayers, including through sales to voluntary third parties.

The Public Service Commission approved the plan at its Feb. 12 meeting (case 15-E-0302). PSC Chair Rory Christian said in a news release: “The commission acted to ensure the orderly management of the OSW program and corresponding sale of OSW Renewable Energy Certificates (ORECs) by NYSERDA when the program becomes operational. Our decision today will benefit residential and commercial customers by ensuring that ratepayer costs related to offshore wind development are reduced.”

The New York Offshore Wind Alliance issued a statement supportive of the state’s three policy moves.

“We understand NYSERDA’s decision to close the 2024 offshore wind solicitation without awards was because the original proposals were based on a completely different federal landscape,” said Alicia Gene Artessa, director of the industry group.

“We are strongly supportive of NYSERDA’s recent RFI exploring a predevelopment model for offshore wind solicitation. We believe that fundamentally changing how New York procures offshore wind energy is the right path forward while we adapt to the current federal instability.”

And she said: “We are also encouraged to see that the PSC approved NYSERDA’s Offshore Wind Implementation Plan yesterday. This is a meaningful step from the PSC to allow for more flexibility in the sale of ORECs and ensure our current under-construction projects continue to be managed effectively.”

N.J. Targets Data Centers in New Source Push

New Jersey legislators have advanced a bill that would protect ratepayers from rate hikes triggered by data center development as the state looks for ways to add generation capacity, boost its infrastructure and curb energy use.

A Senate and an Assembly committee each backed S731. It would require the New Jersey Board of Public Utilities to develop a tariff to set special rates for “large load data centers” with a maximum monthly demand of at least 100 MW.

Also moving ahead were unrelated bills to require data centers to report their water and electricity use, to boost solar development and to study how advanced transmission technologies can help the state. The BPU also announced rates would be flat in the next period, which starts June 1.

The tariff set up by the BPU under S731 would shelter ratepayers from rate increases stemming from “increased electricity demand caused by large load data centers.” The tariff also should “incentivize large load data centers to develop and utilize methods to increase energy efficiency, including through the use of technologies that capture and utilize the heat produced by the large load data center.”

To make sure investments in the state yield their full potential benefits over time, the legislation requires the state’s four utilities to ensure data centers provide financial guarantees they will “take at least 85% of service they request for a period of not less than 10 years.” Data centers must show the proposed project is “unique and not duplicative of any other large load data center project” in or out of state.

The Senate Economic Growth Committee backed the bill 3-1. The Assembly Telecommunications and Utilities Committee backed a version 9-0 after testimony that showed vigorous support for the legislation.

Data Center Reporting Requirements

Preparing for the predicted dramatic increase in data centers and ensuring they pay for themselves without overly burdening ratepayers is central to the state’s efforts.

Analysts say one cause for the predicted energy shortfall is that the state, like others in PJM, has closed aging, fossil-fueled resources more rapidly than new, mainly clean energy sources have come online.

In a separate vote that also focused on data centers, the Senate Environment and Energy Committee backed S3379, which would require data center owners or operators to submit a water and energy use report to the BPU every six months. The BPU would publish the information, according to the bill, which passed without comment.

The data center tariff, while drawing mainly supportive testimony, demonstrated the complexity of the issue. Some legislators expressed concern that the state should avoid creating obstacles that could deter data centers from coming to the state, but most speakers focused on ratepayer benefits.

“Ultimately, this is about cost fairness for the people of New Jersey,” said Assemblyman David Bailey Jr. (D), a bill sponsor and committee member. “This is about us looking out for our constituents and their best interests and the overall health of New Jersey electrical grid.”

Zach Landesini, a resident of Vineland, N.J., said he sees the need for the legislation in his community’s experience with the development of a 700,000-square-foot data center by Data One, whose website says it will use 350 MW. Landesini said he envisions scenarios that could emerge and that the bill could address.

He said the project will draw 15% of its power from the local grid, and added: “What would happen if Data One needed to draw a larger portion of their power from the local grid?”

“How would this affect local ratepayers?” he asked. “This could happen for a variety of reasons, including technical deficiencies in power generation, and site expansion.”

Supply Side Pressure

Brian O. Lipman, director of the New Jersey Division of Rate Counsel, also backed the bill, but acknowledged that it couldn’t shield ratepayers from all the cost hikes associated with a data center. One committee member asked him how the state could calculate the increased cost of power when rates increased due to a data center pulling a large volume of power from the grid, effectively reducing the supply for everyone else.

“On the supply side, there’s not a lot we here can do, other than if we want to build generation somehow to add more supply,” he said. “What we can do is, and what we are doing is, we’re pressuring PJM: First of all, don’t sign up a data center if you don’t have the power to serve them.”

He noted that an alternative, outlined in a separate bill, would require new data centers to “bring their own generation.” But if New Jersey passed such a law and other states in PJM do not, data center developers would simply build outside of New Jersey, which nevertheless would bear that cost through its participation in the RTO, he said.

Harnessing New Technology

The debate over how the state addresses the looming energy shortfall, and the added pressure on generation and grid systems from data centers, stepped up in earnest in June 2025, when a 20% rate hike on the average electricity bill took effect.

Gov. Mikie Sherrill (D), who took office in January, pledged in her election campaign to freeze rates. Her first executive orders upon taking office laid out a range of measures designed to do so and boost state generating capacity, in part by accelerating solar development. (See New N.J. Governor Rapidly Confronts Electricity Crisis.)

In that vein, the Assembly Telecommunications and Utilities Committee moved ahead a bill, A3969, that would extend the state’s current goal of incentivizing 3,750 MW of solar power by 2026 to one of incentivizing 750 MW per year through 2035.

The Senate Economic Development Committee backed a bill that seeks to prepare the state’s infrastructure for future stress by using advanced transmission technologies (ATT). S2189 would require the BPU to evaluate the “attributes, functions, costs and benefits of ATT” and look at whether it could “enable an electric public utility to provide safe, reliable and affordable electricity to its customers, considering existing and planned transmission infrastructure and projected demand growth.”

The bill defines ATT as any software or hardware technology that increases the capacity, efficiency, reliability or safety of an existing or new electric transmission facility, including grid-enhancing technology and advanced or high-performance conductors.

Rate Hikes Temporarily Avoided

BPU President Christine Guhl-Sadovy announced the results of the state Basic Generation Service auction conducted in early February. The results, which largely are shaped by the PJM capacity auction, will mean the average electricity bill stays roughly the same when the new rates take effect June 1.

The minimal increase, or slight decline for some ratepayers depending on their utility, is due largely to a “collar” PJM agreed to place on its prices, limiting their increase, because of lawsuits filed by New Jersey and other states. Sherrill has advocated for an extension of the collar, Guhl-Sadovy said.

“I think we would anticipate there to be a higher price if we don’t have a collar,” she said. “Because we have significant load growth and so we need to get more generation and more capacity through things like demand response in order to meet that kind of load growth.”

She acknowledged the collar is a temporary measure and the trajectory of future rates is primarily in the hands of PJM. And she noted that Sherrill outlined a series of initiatives to hold down rates and increase generation capacity in her first two executive orders. They included boosting solar and battery storage power and creating a virtual power plant strategy.

“Those things will not have an overnight impact on capacity prices, but they will certainly put downward pressure on capacity prices, because we will have more generation bidding into the capacity market,” she said.

ERCOT Taps the Brakes on Batch Study Process

With the 232 GW of large loads seeking to interconnect with the ERCOT system having “clearly broken the process that we had,” CEO Pablo Vegas said the grid operator’s proposed batch process — or cluster studies, in most managed grids — will provide clarity and transparency to data center developers.

He told his Board of Directors during its Feb. 9-10 meeting that the batch studies will change a process that is “very different than what we’ve had before” by reserving capacity on the transmission system for the large loads’ future use.

“Today, that is not the way it works. It doesn’t work here, and it doesn’t work that way anywhere in any grid in the U.S.,” Vegas told the board. “What we found is that the processes that we had set up were really designed for a system where we were seeing interconnections in eight to 15 a quarter, to where we’re now seeing 80 to 100 in that same time period.”

ERCOT staff have proposed grouping together large load requests to be evaluated, rather than relying on the current individual studies that transmission service providers conduct. The studies will determine the amount of requested load that can be reliably served each year over a six-year period and the transmission upgrades needed to accommodate the full load requested. (See ERCOT Finds Stakeholder Support for Batch Process for Large Loads.)

The grid operator plans to begin the process with a “Batch Zero” study to transition from the current process. It will file a nodal protocol revision request (NPRR) codifying the process and bring it to the board for its approval in June, followed by the Texas Public Utility Commission. If all goes as planned, Batch Zero will begin by late summer.

ERCOT originally planned to request a good-cause exemption from the PUC to begin the Batch Zero study in February. However, after a Feb. 3 workshop with stakeholders left many “open questions to be decided,” PUC Chair Thomas Gleeson said, the commission directed the grid operator to tap the brakes on its effort.

“A top-down, ERCOT-driven process where there isn’t a lot of input from stakeholders is really not the way to do this,” Gleeson said Feb. 11 during an industry conference in Round Rock, Texas. “I do believe that we’ll end up with a better outcome from getting all that information on the front end rather than being kind of centrally controlled by ERCOT and the PUC.”

The grid operator will continue to gather stakeholder input through the Large Load Working Group, stakeholder workshops and the stakeholder-led Technical Advisory Committee.

Batch Zero will set the foundation for the subsequent studies, which are to take place every six months. Another NPRR will be filed to codify the ongoing batch process and brought to the board in September.

“We heard the [PUC’s] message loud and clear. We need to keep the pace going on,” Vegas said. “This work continues to be an important part of supporting the economic growth that’s coming. We need to ensure that we have a more robust and a very scrutable process that’s going to benefit all stakeholders, but we can’t do that at the expense of expediency.”

Board Approves Tier 1 Project

The board approved a Tier 1 project previously endorsed by TAC, a $117.4 million, 138-kV South Texas Electric Cooperative transmission build. The project will accommodate a 300-MW ammonia plant near Victoria on the Texas Gulf Coast.

The project is expected to be in service in June 2028.

The board also elected Vegas as CEO and ratified ERCOT’s officers; confirmed TAC’s 2026 leadership; and signed off on the updated Market Credit Risk and Corporate Standard, which removes references to the Reliability and Markets Committee after it was dissolved in 2025.

The consent agenda included three protocol revisions and two changes to the Planning Guide:

    • NPRR1304: incorporate the Other Binding Document “Procedure for Identifying Resource Nodes” into the protocols to standardize the approval process.
    • NPRR1305: add the Other Binding Document “Counter-Party Credit Application Form” into the protocols to standardize the approval process.
    • NPRR1311: correct an error in the calculation of real-time reliability deployment price adders for ancillary services when ERCOT is directing firm load shed during a Level 3 energy emergency alert under the RTC+B protocols, ensuring final ancillary services prices cannot exceed $5,000/MWh.
    • PGRR127: outline the additional generators that may be added to the planning models to address the generation shortfall introduced by the implementation of House Bill 5066’s requirements and increased load growth. The PGRR would also add a supplemental generation sensitivity analysis for Tier 1 Regional Planning Group project evaluations to minimize the effects of the additional generation on transmission project evaluations.
    • PGRR132: clarify that new resources must interconnect to ERCOT through a new standard generation interconnection agreement.

CAISO WEIM Surpasses $8B in Cumulative Benefits

CAISO’s Western Energy Imbalance Market has surpassed $8 billion in cumulative economic benefits since its 2014 launch after providing participants with $415.65 million in gross benefits in the fourth quarter of 2025, according to an ISO report.

In a news release accompanying the quarterly report, CAISO noted it has revised the methodology it uses to calculate WEIM benefits to reflect the market behavior of variable energy and battery storage resources.

NV Energy earned the largest share of Q4 benefits, at $83.10 million, followed by PacifiCorp ($66.45 million), Los Angeles Department of Water and Power (LADWP) ($40.71 million), Balancing Authority of Northern California (BANC) ($37.15 million), Public Service Company of New Mexico ($34.78 million) and NorthWestern Energy ($23.41 million).

PacifiCorp, with its East and West balancing authority areas, was the biggest net exporter of energy, at nearly 1.54 million MWh. The next largest exporters were CAISO (720,188 MWh), NV Energy (514,474 MWh), Salt River Project (427,248 MWh) and LADWP (250,431 MWh).

The biggest net importer during the quarter was CAISO, at over 1.02 million MWh, followed by BANC (507,535 MWh), Portland General Electric (433,229 MWh), Powerex (408,684 MWh) and PacifiCorp (391,588 MWh).

In the WEIM, a net export represents the difference between total exports and total imports for a BAA during a particular real-time interval, while a net import represents the inverse. A BAA therefore can be both a heavy exporter and a heavy importer over an extended period, based on its varying momentary needs and trading positions over that period.
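As a rough illustration of that accounting, the sketch below, using invented interval data rather than real WEIM figures, shows how a single BAA's exports and imports net out interval by interval, and why the same BAA can accumulate both a large net-export total and a large net-import total over a quarter:

```python
# Hypothetical WEIM net-transfer accounting for one BAA.
# Interval values are invented for illustration only.
intervals = [
    {"exports_mwh": 120.0, "imports_mwh": 30.0},  # net exporter this interval
    {"exports_mwh": 10.0,  "imports_mwh": 95.0},  # net importer later
    {"exports_mwh": 80.0,  "imports_mwh": 80.0},  # balanced interval
]

# Net export per interval = total exports - total imports
# (a negative value means the BAA was a net importer that interval).
net_by_interval = [iv["exports_mwh"] - iv["imports_mwh"] for iv in intervals]

# Summing positive and negative nets separately shows how the same BAA
# can appear high on both the exporter and importer lists for a period.
total_net_exports = sum(n for n in net_by_interval if n > 0)
total_net_imports = sum(-n for n in net_by_interval if n < 0)

print(net_by_interval)    # [90.0, -85.0, 0.0]
print(total_net_exports)  # 90.0
print(total_net_imports)  # 85.0
```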

CAISO’s BAA facilitated the greatest volume of wheel-through transfers, at 964,219 MWh, followed by PacifiCorp’s two BAAs (501,382 MWh), NV Energy (445,994 MWh) and Arizona Public Service (327,982 MWh).

‘Robust, Transparent, Reflective’

The Q4 report also came with some changes in how CAISO calculates WEIM benefits.

“With significant changes in market resources and operational dynamics across the West, maintaining an accurate picture of market performance is essential,” CAISO said in the release. “Additional time was needed to post this report so that the ISO — working closely with its WEIM partners — could refine the benefits methodology to reflect these evolving market resources and system conditions. This helps to ensure its logic remains robust, transparent and reflective of current conditions.”

The revisions are spelled out in the “Counterfactual Dispatch Cost” section of the updated “EIM Quarterly Benefit Report Methodology.”

In the case of variable energy resources, the revised methodology adjusts the market’s bid range logic for resource base schedules to reflect real-time dispatch (RTD) market data rather than the previous approach of relying on 15-minute market (FMM) data.

“This adjustment offers a more accurate reflection of actual market conditions in two key aspects. Dispatches and transfers from WEIM solution are based on the RTD markets and using bids from RTD market will better align,” CAISO explains in the methodology document.

“Second, the forecasted output for variable energy resources often differs between the FMM and RTD markets. By using the RTD forecast to estimate load imbalance in the benefit calculation, it more accurately reflects actual RTD conditions. It also eliminates imbalances that reflect forecast differences and focus on imbalances from actual market redispatches.”

In describing the impact of the change, CAISO cites the example of a wind resource having 73 MW of energy available based on the FMM forecast but getting reduced to 16 MW in the RTD forecast. Under the new logic, the resource’s bid range would be capped at 16 MW, putting both its base schedule and dispatch-adjusted base schedule at 16 MW heading into the real-time interval, leaving a load imbalance of 0 MW.

“This 0-MW imbalance reflects the scenario where the market is not redispatching the resource down. Instead, it simply accounts for the adjustment in the forecast available in RTD. Therefore, there is no WEIM cost associated with this resource,” CAISO wrote.
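The arithmetic in CAISO's wind example can be restated in a short sketch. This is an illustrative reconstruction under the description above, not CAISO's implementation; the function and variable names are assumptions:

```python
def load_imbalance(base_schedule_mw: float, dispatch_adjusted_mw: float) -> float:
    """Imbalance attributed to a resource in the benefit calculation."""
    return base_schedule_mw - dispatch_adjusted_mw

fmm_forecast_mw = 73.0  # 15-minute market (FMM) forecast from CAISO's example
rtd_forecast_mw = 16.0  # real-time dispatch (RTD) forecast

# Previous logic: the base schedule reflected the FMM forecast, so the
# forecast reduction appeared as a 57-MW "imbalance" with a WEIM cost,
# even though the market never redispatched the resource down.
old_imbalance = load_imbalance(fmm_forecast_mw,
                               min(fmm_forecast_mw, rtd_forecast_mw))

# Revised logic: the bid range is capped at the RTD forecast, so both the
# base schedule and the dispatch-adjusted base schedule enter the interval
# at 16 MW and no imbalance (or cost) is attributed to the resource.
capped_mw = min(fmm_forecast_mw, rtd_forecast_mw)
new_imbalance = load_imbalance(capped_mw, capped_mw)

print(old_imbalance)  # 57.0 (forecast difference misread as redispatch)
print(new_imbalance)  # 0.0  (no WEIM cost for this resource)
```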

Another revision to the methodology deals with the modeling of battery storage resources in the counterfactual dispatch — that is, a theoretical dispatch that would occur without the availability of WEIM transfers.

CAISO explains that, prior to Q4 2025, batteries were modeled like conventional resources, with the model estimating an available dispatch range and determining the counterfactual dispatch based on the resource’s price — an approach that ignored a battery’s limits based on its state of charge. To address that, the updated methodology:

    • Adjusts a battery resource’s maximum bid limit based on its state of charge;
    • Enforces a constraint that prevents a battery from being dispatched below a defined minimum state of charge; and
    • Recognizes the end-of-hour constraint defined by a battery operator.
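A minimal sketch of how those three constraints could jointly bound a battery's discharge in the counterfactual dispatch follows. All names, the unit handling and the interval length are assumptions for illustration; CAISO's methodology document defines the actual formulation:

```python
def max_discharge_mw(p_max_mw: float, soc_mwh: float, soc_min_mwh: float,
                     eoh_target_mwh: float, hours_left: float) -> float:
    """Upper bound on a battery's counterfactual discharge (illustrative).

    p_max_mw       -- nameplate maximum discharge rate
    soc_mwh        -- current state of charge
    soc_min_mwh    -- constraint 2: floor the SOC may not fall below
    eoh_target_mwh -- constraint 3: operator-defined end-of-hour SOC target
    hours_left     -- time remaining to the end of the hour
    """
    # Constraints 2 and 3: the binding floor is whichever is higher.
    floor_mwh = max(soc_min_mwh, eoh_target_mwh)
    usable_mwh = max(0.0, soc_mwh - floor_mwh)
    # Constraint 1: cap the bid limit by the energy actually available,
    # converted to an average MW rate over the remaining time.
    return min(p_max_mw, usable_mwh / hours_left)

# A 100-MW battery holding 40 MWh, with a 10-MWh minimum and a 25-MWh
# end-of-hour target, 15 minutes from the end of the hour:
print(max_discharge_mw(100.0, 40.0, 10.0, 25.0, 0.25))  # 60.0
```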

Former FERC Chair Jon Wellinghoff: A Career Focused on Consumers

Former FERC Chair Jon Wellinghoff is best known as a champion of the demand side, from shepherding through the landmark Order 745 to his prior work for consumers and his subsequent jobs working on demand response.

“Everything in my life that I have done and tried to promote and advocate always comes back to — how do you best help consumers, ultimately,” Wellinghoff said in an interview. “I mean, the whole focus needs to be on the consumer. If you don’t think back from the consumer perspective, you know, it’s not about the utilities, it’s not about Voltus, it’s not about the generator, it’s not about any of those things. It’s about the guy, or the woman, who pays the bill.”

Since leaving FERC, Wellinghoff had a stint at Tesla when it focused on the energy transition. Since 2017 he has been involved with Voltus, first on the board and later as an executive; the company seeks to pay consumers for leveraging their distributed energy resources to provide services to the grid.

His focus on consumers goes back to his education: He earned a master’s degree in mathematics from Howard University in Washington, D.C., and briefly taught at an inner-city school there.

“I quickly determined that teaching school is the hardest job in the world, and so I went to law school to become a lawyer,” Wellinghoff said.

He enrolled in the Antioch School of Law, which was in Washington despite being tied to Antioch College in Ohio. He was part of the first graduating class at the law school, which no longer exists, alongside legendary athlete Jim Thorpe’s daughter Grace Thorpe and “quite a few interesting folks.”

His early legal career was in line with his alma mater’s focus on consumers, as he became a staffer for a commissioner on the Nevada PUC. His stint there overlapped with the Arab Oil Embargo in a state where oil was a major source of electric generation.

“So, as a result, in the 18 to 24 months I was with the commission, we saw more utility rate cases being filed than that commission had seen in the previous 10 years,” Wellinghoff said. “So, in like a two-year period, I got this compressed experience with utilities and how they did utility rate cases and their impact on consumers.”

After that experience, he went to work at the District Attorney’s Office in Washoe County in Reno, Nev., where Wellinghoff represented consumers in utility rate cases.

“That was the first time that was ever done in Nevada, and probably in any state, by a state district attorney’s office where they actually represented consumers before the commission, and I then translated that into a statewide office where I actually helped draft some legislation that created a consumer advocate’s office in Nevada,” Wellinghoff said.

He effectively wrote the legislation that led to his next job as state consumer advocate in Nevada, where he continued advocating for consumers, arguing utilities had to control their costs.

“It was a battle, and it continues to be a battle, because the utilities are not financially incentivized to do that,” Wellinghoff said. “And I often did find that there were third parties like solar firms and energy efficiency providers and HVAC providers and others that were more attuned to working with consumers to try to control consumers’ costs because they had some financial interest in doing that.”

He was one of the early consumer advocates, though around 10 other states had a version of that office when he started the job in 1981, and he kept working there until 1989.

In the 1990s he entered private practice, working on lengthy litigation stemming from a massive industrial accident: the explosion of the PEPCON rocket-fuel plant in Henderson, Nev. The blast was equivalent to roughly a kiloton of TNT, and it caused $100 million in damage in the Las Vegas area.

Working on that case, Wellinghoff did 150 depositions and learned about mass-tort litigation, which would serve him well when he returned to energy law full-time in 1998 to become the general counsel at the Nevada PUC.

“I was sort of in the middle of the Enron debacle, and we were actually drafting legislation in Nevada during Enron to restructure state of Nevada to make it competitive — to allow entities like Enron, as a retail provider, to provide retail energy services to consumers throughout Nevada,” Wellinghoff said. “We did that up until the crisis happened in California, where the whole wholesale market flew apart.”

Enron’s manipulations and the California energy crisis killed similar legislation in other states, and Wellinghoff turned back to his skills deposing witnesses who were involved in the crisis, which involved bad actors from many other firms. He did that from outside the PUC, in private practice, representing MGM Resorts in Las Vegas, which included casinos like the Bellagio and had an aggregate power demand of 300 MW.

The utility for Las Vegas asked for $922 million to pay for inflated wholesale power prices at the time — a sum greater than every previous rate request it had ever filed, Wellinghoff said.

“I started taking depositions,” Wellinghoff said. “And I took about 20 depositions of utility executives and of expert witnesses that the utility had hired or had as consultants, and they had one consultant who was charged with developing a software program to assess the risk of their trading program to trade energy in the wholesale market during the Enron debacle. And I asked him if he ever assessed what the level of risk was to be short.”

The answer was no — the program kept crashing when trying to calculate the risk. The utility was unhedged and exposed to prices that were two to three times the norm. In a deposition, the consultant admitted “the risk of going short was very, very, very, very large,” Wellinghoff said.

The depositions were part of the evidence the PUC used to slash the request down to some $400 million, and the utility went bankrupt — its shareholders eating the risk it had tried to foist on consumers.

After that experience, President George W. Bush nominated Wellinghoff to FERC, where he served for seven years, five as chair after President Barack Obama elevated him.

“I’m still the longest serving chairman at FERC, which is kind of amazing to me, since this year is 20 years since I went into FERC,” Wellinghoff said.

While there have been other commissioners with long stints on the regulator in recent history, Wellinghoff said the job involves public service with little pay. He had 11 staffers reporting to him who made more money as long-tenured government workers. That pay issue is why most stop at one five-year term at best.

Before joining FERC, the only experience Wellinghoff had with markets was the “Enron debacle,” but there he learned about other wholesale markets, such as ISO-NE’s, where at the time both energy efficiency and demand response could be bid into the market.

“I saw that there was room for creativity,” Wellinghoff said. “And I also truly believe that whatever could help consumers we should try to do. Whatever can provide consumers with ways to control their costs and be more efficient and get energy services, more reliably and more effectively. And I realized that markets are probably the way to do that.”

During his time as chair, Wellinghoff was able to move that ball forward with Order 745, which required demand response programs in the energy market to pay consumers the same as generators. That case was appealed by opponents and eventually made its way to the Supreme Court, which upheld the order in the FERC v. EPSA decision.

“That was, I think, the most important case that has ever been decided on a FERC opinion,” he said. “And I believe that that was also one of the most important cases in energy for consumers, because it was a clear victory for consumers that gave FERC the authority to oversee consumers’ participation in wholesale markets and provided consumers with that opportunity to participate at a fair level of compensation.”

Wellinghoff’s keynote address “Grid Innovation at the Intersection of Policy and Markets” will be delivered Feb. 25 at Yes Energy’s EMPOWER 2026 conference in Boulder, Colo.

MISO Load Forecasting Shows Up to 82 GW in Data Center Load by 2044

MISO’s inaugural long-term load forecasting survey among its membership shows the possibility of 82 GW in data center load by 2044.

The RTO said responses to its pilot survey place data center demand at the top of the list. However, 55 of the 82 GW have been categorized as “low confidence.”

“There’s no surprise here that data centers make up the bulk,” Dominique Davis, manager of strategic insights, said Feb. 12 during a webinar hosted by MISO.

MISO plans to use the survey results to publish a finalized long-term load forecast in April 2026. The RTO said it will use complementary research and third-party analysis to supplement incomplete data to produce a final, nearly 20-year load forecast.

Beyond data center load, members reported the potential for an additional 4 GW in manufacturing load and 3 GW of other, miscellaneous load.

By MISO’s count, public announcements of large loads coming online by 2030 have more than doubled in the span of a year; however, the RTO warned that public announcements of large load do not necessarily reflect firm commitments. In 2024, MISO counted 14 GW of large load announcements. In 2025, it recorded 43 GW.

MISO Central — one of the RTO’s three reliability regions, containing Wisconsin, Michigan, Indiana, Illinois, and parts of Missouri and Kentucky — contains the most potential for large loads, at nearly 40 GW by 2030. However, more than half of that is what MISO considers “low confidence.”

“While this surge is notable, we’ve also observed cancellations for various reasons,” Davis said.

MISO divided load additions into high-, medium- and low-confidence categories:

    • High-confidence additions represent those that have associated interconnection agreements in place with regulators’ knowledge and construction underway;
    • Medium-confidence projects are those that have been submitted through a MISO planning process or have been publicly announced, but construction has not yet begun; and
    • Low-confidence projects are those in the early stage that are not in MISO’s planning process but appear in integrated resource plans or remain conceptual.
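MISO's three tiers can be restated as a simple decision rule. The function below is an illustrative sketch only; the flag names are assumptions, and MISO's actual categorization may weigh additional factors:

```python
def confidence_tier(has_interconnection_agreement: bool,
                    construction_underway: bool,
                    in_miso_planning: bool,
                    publicly_announced: bool) -> str:
    """Classify a load addition per the tiers MISO described (illustrative)."""
    # High confidence: interconnection agreement in place, regulators aware,
    # construction underway.
    if has_interconnection_agreement and construction_underway:
        return "high"
    # Medium confidence: in a MISO planning process or publicly announced,
    # but construction has not begun.
    if in_miso_planning or publicly_announced:
        return "medium"
    # Low confidence: early-stage, conceptual, or only in an IRP.
    return "low"

print(confidence_tier(True, True, False, False))   # high
print(confidence_tier(False, False, True, False))  # medium
print(confidence_tier(False, False, False, False)) # low
```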

MISO’s data collection turned up eight spot loads that would require more than 1 GW, “which pose a significant reliability risk in the future,” Davis said.

The RTO said it cannot share its load forecasts on a local resource zone level because of utilities’ insistence on confidentiality and their nondisclosure agreements with developers.

To collect intel for its long-range pilot, MISO introduced confidentiality provisions that allowed it to receive greater insights from utilities, even when it could not share specific breakdowns. By contrast, the RTO’s 2024 load forecast relied on internally culled data from public sources.

MISO said it received submissions in response to its pilot survey from 44 entities, which are responsible for about 80% of the footprint’s load. “That’s a huge step towards transparency,” Davis said.

The RTO found that 31 responses on large load additions match public announcements, she said. “So, we did get a good representation in that area.”

MISO is already home to about 17 GW in large loads.

Waning After 2035?

Davis said the survey results show that large load planning tapers off after the first 10 years of the survey and stagnates beyond 2035. MISO said most entities did not provide data on large loads beyond 2035.

Additionally, MISO said only 60% of its respondents even filled out the section on large load. “Confidentiality limits reduced data sharing and response rates, complicating double-counting checks and mapping large load submissions,” it said.

According to data from Yes Energy, over the next decade, the MISO territory is due to host or be affected by 50 data centers either already under construction or in advanced development; 74% of them list in-service target dates in 2026 or 2027. All of them have a 60% or better chance of being built. More than a dozen data center projects would be near Chicago.

Davis said MISO does not believe the responses are an indication that data center load would stagnate. She said for the final forecast, the RTO would fill in some expectations rather than accept an actual leveling-off of large load expectations.

Mississippi Public Service Commission consultant Bill Booth asked why MISO would not take members at their word and stop forecasting more dramatic data center growth beyond 2035. He said there could be improvements in data center management and strides made in efficiency that lead to demand inertia.

Davis said the industry does not appear primed for a slowdown by 2035. “We do understand this comes in phases, but how much energy they’re going to need in phases is not well understood.”

Stakeholders asked how much double-counting of facilities might be in the pilot survey.

Davis said MISO was able to pinpoint a few likely cases of hyperscalers shopping two locations, but she said survey answers left out a lot of identifiable information, especially for potential large loads in the nascent, “low-confidence stages.”

Booth suggested that MISO strike all low-confidence load growth from its forecast. He said he did not want up-in-the-air figures to influence transmission planning.

“Why would you use any information that isn’t reliable? Building a base on shaky data ensures that ratepayers are going to be paying for more transmission than needed,” he said.

Davis said MISO would not include 100% of low-confidence projects in its forecast. It will provide more data on its process when it releases the results in April, she said.

The Union of Concerned Scientists’ Sam Gomberg pushed back on the notion that MISO should wait and act on only loads that are a sure thing. He said that’s not how the electricity industry works.

“If we waited for the load to arrive, it’d be sitting there in the dark while we built,” Gomberg said.

He said MISO’s far-from-perfect effort is nevertheless a good start and shows the need to “drive forward on this low-certainty chunk of load” to figure out what could pan out. He said MISO should strive to provide more transparency.