March 10, 2026

Emerald AI, InfraPartners Team up to Deploy Flexible Data Centers

Digital infrastructure firm InfraPartners and Emerald AI announced a new partnership to construct data centers designed to be flexible grid assets.

The Flex-Ready Data Centers combine Emerald’s energy management software solutions with InfraPartners’ off-site manufacturing approach to constructing and upgrading data centers, the companies announced March 10.

“The innovation here is to put together the data center design with the needs of the software from the beginning, so that the data center is delivered as a flex-ready data center, so there is no retrofitting later,” Emerald’s chief scientist, Ayse Coskun, said. “There are no additional components needed later.”

The software needs telemetry from all aspects of data centers, which includes computing, cooling, any behind-the-meter generation or storage, and other uses of electricity at the facility, she said.

The main attraction for data centers to become flexible grid assets is speed-to-power, but flexibility offers clear benefits to the operation of the grid, InfraPartners Chief Technology Officer Harqs Singh said.

The Electric Power Research Institute has “a data center flex program with all the utilities in it, and so they’re very interested in being able to have data centers become assets, rather than just consumers,” Singh said. (See EPRI Launches DCFlex Initiative to Help Integrate Data Centers on the Grid.)

With Emerald’s management services, data centers can respond to energy availability, match up with intermittent renewables or just respond to prices, Coskun said.

“So, this interface enables not just speed to power, but more broadly a more amicable relationship between the large data center loads and the grid,” she added.

Compared to “traditional” data centers — those used for cloud computing and data storage — AI data centers have very high “power density,” which is why they have made headlines with massive loads ranging from hundreds of megawatts to gigawatts.

“The power density of a rack — a cabinet of servers — is increasing like 10 times compared to a typical cloud rack,” Coskun said. “The AI data centers are running a mix of training, inference and other AI loads, and there are differences. For instance, training loads tend to be more spiky, changing the power up and down more rapidly compared to cloud loads.”

Cloud computing data centers must respond to consumer requests, such as when someone accesses a database or streams video, while AI data centers do more batch processing and long-running training and use their computing hardware heavily, she added. Energy management techniques can help smooth out their highly variable demand.

“I consider this a welcome side effect of controlling power that the spikes are reduced,” Coskun said. “Because essentially, it’s not only necessarily just reducing the power during a high demand time, but also you can set up overall power limits to gently curb the power without adversely impacting performance, at least beyond the performance constraints, and then reduce these spikes.”

The grid does not respond well to major, fast fluctuations in demand or supply, so flexibility can make AI data centers much easier to handle on the bulk power system, she said. Energy management can also smooth out ramps caused by spiking power during training, as well as ramps that occur as data centers respond to signals from the grid itself.
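The power-capping technique Coskun describes can be sketched in a few lines. This is a toy illustration with hypothetical numbers, not Emerald’s actual algorithm: a facility-level cap clips the sharp spikes of a training workload, shrinking the swing the grid has to absorb.

```python
# Toy illustration (not Emerald's actual algorithm) of how a simple
# power cap smooths spiky AI training load. All values are hypothetical.
def apply_power_cap(load_mw, cap_mw):
    """Clip facility power draw at a cap. Work deferred past the cap
    would be rescheduled in practice, which this sketch does not model."""
    return [min(p, cap_mw) for p in load_mw]

# Hypothetical 10-interval training profile with sharp spikes (MW).
profile = [120, 480, 130, 500, 140, 490, 125, 510, 135, 495]
capped = apply_power_cap(profile, cap_mw=400)

swing = max(profile) - min(profile)        # 390 MW swing uncapped
capped_swing = max(capped) - min(capped)   # 280 MW swing with the cap
print(swing, capped_swing)
```

The cap here trades a small amount of peak throughput for a much gentler demand profile — the “welcome side effect” Coskun describes.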

“In our work so far with power grid operators and utilities, we received both requests — ‘can you reduce the power over a gradual window of 10 to 15 minutes? We don’t want to see this sharp drop,’” Coskun said, and “‘can you respond within seconds in an emergency, if needed?’ And we demonstrated both. So, there’s flexibility on how quickly we can tune power as needed, depending on the needs of the grid.”

InfraPartners can build in that flexibility from the start with its approach of constructing data center infrastructure at a central manufacturing site and then deploying it where needed, Singh said. That helps with initial construction, but also as new chips constantly improve and existing chips wear out and need to be replaced.

“We are going to have to be a lot more agile,” Singh said. “We’re going to have to adapt a lot more.”

The biggest constraint the industry faces now is power supply, and one way of handling that will be to install more efficient chips as they become available, he said.

“That means that the data center needs to evolve to deploy the latest chips all the time,” Singh said, “and being really good grid partners, working with the grid, showcasing to them how are the loads changing. How do we manage our assets on the data center side with grid assets, such that we’re good partners and be able to power the performance improvements that are coming? … It’s what we call ‘the upgradeable data center’: having a data center that upgrades with different chip technologies that are coming.”

A lot of the contracts for chips last about five years, but how often the chips are going to be swapped out is somewhat uncertain at this point, he added.

State Briefs

KENTUCKY

House Committee Approves Bill to Create Carbon Storage Framework

The House Natural Resources and Energy Committee unanimously approved a bill that would create a regulatory framework for carbon storage wells.

The bill would require permits from the Energy and Environment Cabinet before constructing or operating carbon sequestration injection wells, with the goal of having the state government receive permission from EPA to be the primary regulator of the wells.

The bill now heads to the full House.

More: Kentucky Lantern

MAINE

Bill Would Place Temporary Moratorium on Data Centers

The Energy, Utilities and Technology Committee advanced a measure that would place a moratorium on data centers.

The bill would pause the development and permitting of new data centers larger than 20 MW until at least October 2027. The measure would also require the Department of Energy Resources to create a new Data Center Coordination Council, which would submit policy recommendations by the winter of 2027.

The measure now goes to the full legislature.

More: Maine Public Radio

MARYLAND

DOA to Reduce Fee for EV Charger Inspections

The Department of Agriculture said it plans to reduce the inspection fee for EV chargers.

Secretary of Agriculture Kevin Atticks told the House Environment and Transportation Committee his agency will tap into the Strategic Energy Investment Fund to help pay for the charger inspection program, which should lower the fee for each charging port from $150 to $75.

More: Maryland Matters

MASSACHUSETTS

EFSB Approves Construction of Storage Facility

The Energy Facilities Siting Board approved Jupiter Power’s plan to construct a 700-MW battery storage facility, which will be the largest in New England. 

The facility, which will consist of lithium-ion batteries, will be built on a former Exxon Mobil oil tank farm and will occupy about 16.5 of 20.75 acres.

More: WCVB

MICHIGAN

House Bills Seek to Put PSC Seats up for Election

A group of 13 House Democrats introduced proposals that would require Public Service Commission members to be elected by voters.

Under the plan, the PSC would expand to five members. The state’s political parties would nominate candidates while voters would pick commissioners beginning in 2028. They would serve staggered four-year terms and be capped at 12 years. Currently, the three-member panel is appointed by the governor to six-year terms, and all three current members are appointees of Gov. Gretchen Whitmer.

The bills were sent to the Government Operations Committee.

More: The Detroit News

OKLAHOMA

Corporation Commission Rejects OG&E’s Charge Plan for Natural Gas Units

The Corporation Commission voted to reject Oklahoma Gas and Electric’s request to apply a cost recovery mechanism for two natural gas units at its Horseshoe Lake Power Plant.

The decision blocks OG&E from using Construction Work in Progress (CWIP) for units 13 and 14, a financing method authorized under a new state law that allows utilities to charge customers for infrastructure projects before they are completed. In November, the commission approved OG&E’s request to build the two units but didn’t grant CWIP treatment. 

OG&E said it plans to file an appeal with the state Supreme Court.

More: The Oklahoman; KGOU

SOUTH CAROLINA

Santee Cooper to Buy Nuclear Power from JEA

Santee Cooper has agreed to buy $83 million of electricity from JEA’s Plant Vogtle in Georgia. It will purchase 206 MW in 2027 and 103 MW in 2028.

Two of the plant’s four reactors were completed in 2023 and 2024, and the agreement requires all power to come from the new units.

JEA’s board was told it could expect $203 million in revenue from the agreement. JEA pays $250 million annually to buy electricity from Vogtle. The company is offsetting the energy it’s selling with new purchase agreements with Florida Power & Light.

More: The Post and Courier

SOUTH DAKOTA

Senate Nixes Carbon Enviro Studies, Eminent Domain Restrictions

The Senate voted against bills related to carbon dioxide pipelines and eminent domain restrictions.

The Senate Commerce and Energy Committee voted 5-3 to reject a bill passed by the House that would have required environmental impact studies for carbon dioxide pipelines, saying it was unnecessary. Rep. John Hughes (R-Sioux Falls) argued that an impact statement is a more accessible way to learn how a project might affect the environment than other public filings.

Elsewhere, the Senate voted 19-14 not to put a resolution on the fall ballot that would have asked voters to narrow the use of eminent domain and place those restrictions in the state constitution.

More: South Dakota Searchlight; South Dakota Searchlight

VIRGINIA

General Assembly Nixes Bills Requiring Data Centers to Get SCC Certificate

Two bills that would have given the State Corporation Commission the authority to review how proposed high-load users — mainly data centers and manufacturing facilities — would affect the environment and energy reliability were tabled for the year.

A bill in each house would have directed the SCC to review if a proposed project that would need more than 25 MW would have an impact on rates, affect grid reliability or hinder the utility’s ability to follow environmental regulations. The bills also would have allowed the SCC to examine the environmental and health impacts a project could have on communities.

Opponents said the bills would have unnecessarily extended the building process and lengthened generation queues.

More: Virginia Mercury

Company Briefs

Hope Gas Announces $250M Pipeline Expansion

Hope Gas said it is proceeding with a $250 million pipeline expansion in Mason County, W.Va.

The first phase of the project will be the construction of a 30-mile, 24-inch natural gas pipeline. Construction is slated to begin in April 2026, with a completion date slated for the end of 2026.

More: West Virginia Public Broadcasting

Qcells Resumes U.S. Solar Panel Production After Customs Furlough

Qcells, the U.S. solar manufacturing arm of South Korea’s Hanwha Solutions, said it has returned to normal solar panel production at its Georgia manufacturing facilities.

The increased production volume closes a chapter for the manufacturer’s U.S. operations. In November 2025, the company announced a furlough of 1,000 workers due to a temporary pause in production caused by lengthy customs clearance processes.

At full capacity, the two facilities will produce a combined 8.4 GW of solar panels and components annually.

More: pv magazine

SK Lays Off Nearly 1,000 Workers at Georgia Plant

Battery company SK Battery America laid off nearly 1,000 workers at a manufacturing plant in Georgia amid automakers’ changing electrification plans and uncertain consumer demand for EVs.

The company said March 6 marked the last working day for 958 employees, about 37% of its workforce, according to a Worker Adjustment and Retraining Notification.

More: The Associated Press

Federal Briefs

NRC Approves TerraPower Nuclear Reactor

The Nuclear Regulatory Commission unanimously voted to grant a construction permit to TerraPower to build its new small, advanced nuclear reactor.

The permit allows TerraPower to begin pouring concrete and building the components of its proposed nuclear plant in Kemmerer, Wyo. The plant is currently expected to come online in 2031.

The 345-MW reactor is the first new U.S. commercial reactor in nearly a decade to receive clearance to begin construction.

More: The New York Times

Lawmakers Introduce Bill to Keep Colorado Uranium Disposal Site Running

A group of lawmakers introduced a bill that would extend the operations of the Grand Junction uranium disposal site.

The bill would amend the Uranium Mill Tailings Radiation Control Act of 1978 to extend operations at the site until it reaches capacity. The 1978 legislation currently lists Sept. 30, 2031, as a deadline to shut down the site. The site currently has room for 200,000 more cubic yards of radioactive material.

More: The Daily Sentinel

Oil Prices Soar Past $100/Barrel

Oil prices on March 9 soared above $100/barrel as the conflict between the U.S. and Iran escalated.

Gasoline prices neared $3.50/gallon on average and were up nearly 50 cents from a week ago. Compared to a month ago, before the conflict began, gas prices were up about 58 cents on average. Diesel prices were $4.66/gallon on average, up about 23% over the course of a week.

More: The Hill; The New York Times

PJM Plans to Release Reliability Backstop Design in April

VALLEY FORGE, Pa. — PJM has updated its thinking on the design of its reliability backstop procurement to meet rising data center load, gravitating toward a model in which the RTO would determine the amount of capacity to be purchased and act as the administrator and counterparty to the resulting agreements.

Rebecca Carroll, executive director of market design and economics, repeatedly stressed during a workshop March 4 that PJM does not have a proposal yet and will be working on its final design through at least April 10. The RTO aims to file a proposal with FERC by late May.

PJM is considering allowing data centers that procure capacity through the backstop to avoid being enrolled in its proposed Connect and Manage system, which would require them to curtail ahead of demand response resources during strained system conditions. While the amount of capacity purchased in the backstop would be determined by PJM, Carroll said the buyers may be able to submit their own preferred amount to purchase.

The procurement is intended to be a one-time measure that awards 15-year capacity commitments, possibly starting in 2030.

Core questions Carroll said PJM’s package must answer include how the RTO should balance reliability and over-procurement risks; whether changes to credit and collateral rules would be required to account for the greater risks associated with 15-year commitments; and when resources would need to be capable of coming online to participate in the backstop — with 2029 or 2030 being possible requirements.

A model in which PJM is the counterparty could pose substantial risks if either the data center or the generator defaults, which could force the RTO to pick up the remainder of the multiyear commitment or to suddenly procure capacity for the data center. PJM presented examples of how securitization could be used to shift the risk to investors, in a model similar to the bonds issued in the wake of February 2021’s Winter Storm Uri.

PJM Chief Risk Officer Carl Coscia said such a high collateral requirement would likely make any project unviable; however, the threshold should be high enough to prevent backstop participants from walking away from their commitments. The RTO’s risk provisions were designed around a three-year advance capacity auction awarding one-year commitments and are not well positioned to account for the uncertainty with 15-year obligations, he said.

Gwen Kelly, PJM senior director of credit risk and collateral management, said if the current credit policy were applied to a 15-year, 1-GW unforced capacity commitment at $400/MW-day, there would be a $662 million pre-auction credit requirement, $224 million of which would be returnable. Coscia said this accounts for deficiency charges over the 15 years.
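As context for those magnitudes, a back-of-the-envelope calculation helps. This is a sketch only — PJM’s actual credit formula is more involved, and the flat clearing price and constant commitment assumed here are illustrative, not from PJM:

```python
# Back-of-the-envelope context for the quoted credit figures.
# Assumptions (not PJM's actual credit formula): a flat $400/MW-day
# clearing price and a constant 1,000-MW (1-GW) unforced capacity
# commitment over the full 15-year term.
MW = 1_000
PRICE_PER_MW_DAY = 400
DAYS_PER_YEAR = 365
YEARS = 15

annual_payment = MW * PRICE_PER_MW_DAY * DAYS_PER_YEAR  # $146,000,000/year
total_commitment = annual_payment * YEARS               # $2,190,000,000

# The $662 million pre-auction credit requirement quoted by Kelly
# would then equal roughly 30% of the 15-year revenue stream.
credit_requirement = 662_000_000
share = credit_requirement / total_commitment           # ~0.302

print(f"annual capacity payment: ${annual_payment:,}")
print(f"15-year commitment value: ${total_commitment:,}")
print(f"credit requirement share: {share:.1%}")
```

Under those assumptions, the credit requirement amounts to about 4.5 years’ worth of capacity payments posted up front — the scale of obligation Coscia warned could make projects unviable.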

PJM CFO Lisa Drauschak repeated that staff still do not have a proposal, and the presentation only illustrates possible pathways and outcomes.

Several stakeholders have presented their own perspectives and proposals during several workshop meetings held over the past month. (See PJM Stakeholders Begin Discussions on Reliability Backstop Design.) The workshops will be on hiatus over the next month until PJM has a complete package to present.

Many of the same sticking points dominated the discussions: how to define the amount of capacity to be procured; whether the procurement should be one-time; which resources are eligible to offer; and whether PJM, utilities or the data centers should be the counterparties to backstop commitments.

Independent Market Monitor Proposal

The Independent Market Monitor proposed a separate backstop procurement auction awarding 15-year commitments to new resources that reach agreement to serve data centers in the same locational deliverability area (LDA).

Independent Market Monitor Joe Bowring said the key goal of any acceptable backstop auction design is to ensure that data centers pay for the costs they impose on the system. He said the Monitor’s proposal is the only one that does not attempt to shift costs and risks to other customers, and therefore the only proposal consistent with the statement of principles from the White House Office of Energy Dominance and the PJM governors.

The backstop auction would use the basic Base Residual Auction (BRA) design, modeling capacity transfer capability and limits between LDAs and providing a single clearing price up to a maximum based on the net cost of new entry (CONE) for the reference resource. Unlike the BRA, the backstop maximum price would be based on an assumed 15-year lifespan for the reference resource to match the commitment term. The contracts would cover the full cost of energy, ancillary services and capacity.
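The single-clearing-price mechanism the Monitor borrows from the BRA can be illustrated with a toy single-zone clear. This sketch uses hypothetical numbers and ignores the LDA transfer modeling the proposal includes; it only shows how a uniform clearing price, capped at a maximum, emerges from stacked supply offers:

```python
# Toy single-zone illustration (hypothetical numbers, ignoring LDA
# transfer modeling) of a uniform-clearing-price auction capped at a
# maximum price, in the spirit of the Monitor's proposed backstop.
def clear_auction(offers, demand_mw, max_price):
    """offers: list of (mw, price) supply offers.
    Returns (cleared offers, clearing price), or None if supply
    under the price cap cannot meet demand."""
    cleared, met = [], 0
    for mw, price in sorted(offers, key=lambda o: o[1]):
        if met >= demand_mw or price > max_price:
            break
        take = min(mw, demand_mw - met)
        cleared.append((take, price))
        met += take
    if met < demand_mw:
        return None  # insufficient supply under the cap
    clearing_price = cleared[-1][1]  # all sellers paid the marginal offer
    return cleared, clearing_price

offers = [(300, 120.0), (400, 180.0), (500, 250.0)]  # MW, $/MW-day
result = clear_auction(offers, demand_mw=600, max_price=400.0)
# clears 300 MW @ $120 and 300 MW @ $180; clearing price $180/MW-day
```

All cleared sellers receive the marginal offer’s price, which is what gives cheaper resources an incentive to offer at cost.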

The Monitor’s backstop would not be a one-time measure and would be run after each BRA to procure capacity for data center load, which would be excluded from the standard capacity auctions. Bowring said that while limiting the number of backstop auctions is desirable, artificially limiting it to a single auction would counter the goal of the White House and governors to ensure that data centers pay for the costs they impose rather than shifting those costs to others. In the absence of future backstop auctions, the cost of increased demand from additional data centers would still be borne by other customers, just delayed by a year.

Seller eligibility would be limited to new generation, with no allowance for uprates, DR or resources that canceled deactivations or did not clear in the capacity market. Bowring said this is the only way to counter the strong incentives to game the process by attempting to be designated as new. Consumers could offer varying bids into the auction, with the highest winning if there is insufficient supply offered.

Monitor Joe Bowring said the core goal is to avoid shifting risk and costs to general load by using the auction structure to directly pair resources and data centers. Any model in which PJM or utilities would be the counterparty would risk requiring other customers to pick up the commitment if the data center defaults. He referenced the ratepayer protection pledge large tech companies signed on March 4, which states they will avoid shifting the costs of their interconnection and service onto other customers. (See Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.) Bowring stated that the pledge is a good model for all data centers.

Data centers larger than 5 MW would be required to participate in the backstop or be subject to curtailment, similar to PJM’s Connect and Manage proposal. PJM would work with electric distribution companies (EDCs) to identify the data center customers behind the large load adjustments (LLAs) the utilities submit for inclusion in PJM’s load forecast. The purpose of the comparatively low cutoff for data centers is to prevent gaming by adding incremental load to a data center in order to avoid the rules.

GQS New Energy Strategies Principal Pamela Quinlan, representing the Data Center Coalition, said it would be difficult to tie LLAs to specific customers and that seeking to allocate costs to a class of consumers based on how the electricity is used would be undue discrimination.

Bowring responded that the capacity shortfall PJM is facing is due to data center load growth, so the costs to mitigate those reliability risks should be assigned to those customers. Costs must be directly assigned to data centers to prevent them from being shifted to other customers, he said.

“This is not an allocation to a class of customers. It is a direct assignment via a bilateral contract between willing participants that covers all the mutual risks in the contract. The concept of allocation means that the costs could be reallocated to other customers if a data center fails,” he said in an email to RTO Insider. “The idea that it is difficult for large corporations to reach a bilateral deal is far fetched.”

Quinlan argued the Monitor’s analysis assumes available capacity would remain the same in the absence of data center load growth, ignoring the likelihood of resources deactivating without that growth.

Quinlan said using a 15-year net CONE to set the maximum price, on the grounds that the commitment term would be 15 years, misses that resources could participate in PJM’s other markets after the commitment has expired.

Bowring responded that the basic BRA would have higher prices and would continue to attract the capacity needed to meet the needs of other customers. “As our analysis has shown, the current crisis in the capacity market is a result of data centers and not the organic growth of other customers,” he said, adding that it is fine for data centers to participate in PJM markets at the end of the 15-year contracts.

“The proposal from the Data Center Coalition would have PJM serve as the counterparty. This means that the data center risk would be imposed on other PJM members. That is inappropriate and inconsistent with the basic goal of a backstop auction to address the costs imposed by data centers. It is not a general concern to be put off to the uncertain future but must be a core element of any backstop proposal,” he said in an email.

Data Center Coalition

Quinlan presented a set of priorities the Data Center Coalition believes should be incorporated into PJM’s design, centering around the position that the RTO should not make substantial changes to the capacity market while designing a one-time procurement structure.

The coalition recommended a backstop design in which PJM would be the counterparty, with participation limited to resources that could be in service for the 2028/29 delivery year, with some allowance for the following year. The RTO’s design should not seek to determine resource adequacy for specific load-serving entities or use “uncertain” long-term forecasts to determine the need for capacity.

Concurrent with the procurement, the RTO should initiate a comprehensive review of the capacity market design, including improvements to load forecasting and consideration of “LSE-based frameworks,” Quinlan said.

Responding to questions on how the risk of a data center default could be managed, Quinlan said risk allocation is an important question to consider, but one that should be part of a long-term discussion. The ideal way to manage the risk associated with multiyear commitments is to ensure that the backstop is a short-term measure that buys time for more substantial market changes, she said.

Quinlan said the coalition considered ways of allocating costs that did not fall to LSEs, but there are practical questions on implementation and whether that can be accomplished in time for a May filing.

Google

Google recommended PJM adopt a backstop in which it procures capacity on behalf of load and allocates the costs across the region, leaving it to states to develop end-user rates. While the company shared several design components it prefers, it stated it does not have a complete proposal.

The company expressed support for a one-time solution targeting a specific delivery year with well-defined needs, leaving long-term capacity commitments as a separate issue. The backstop should focus on a fuel-neutral framework for incentivizing high-accreditation resources, with the amount of capacity to be purchased defined by the deficiency in a particular auction — rather than targeting individual customers or a class of end users.

Joint Stakeholders

A cohort of generation owners presented a backstop focused on meeting the shortfall expected for the 2028/29 BRA, scheduled to open in June 2026. The proposal was signed onto by Constellation Energy, Vistra, AlphaGen and Earthrise.

The one-time auction would be conducted in September and mirror the 2028/29 BRA clearing price for commitments up to 15 years. Resource offers would clear first based on the delivery year in which the project can come online, then by the length of the commitment term the offer requested. Procurement would be capped at the reliability requirement for the 2028/29 delivery year.

Seller eligibility includes new resources, uprates, DR, reactivated resources and existing resources that cleared above the maximum price in the 2028/29 auction.

Constellation’s Erik Heinle said the proposal is agnostic on how costs would be allocated, though it specifies that bilateral contracts would be respected. The risk of over-procurement and of large loads not coming online would be managed by capping the amount purchased at the reliability requirement for the 2028/29 delivery year, restricting procurement to load already accounted for in the capacity auction.

Voltus

Voltus advocated for allowing behind-the-meter capacity to participate in the backstop, arguing those resources have some of the shortest development timelines — making them well suited to a process intended to rapidly bring on new capacity.

Senior Manager of Regulatory Affairs Kimaya Abreu said PJM should focus on procuring new capacity from resources not receiving a sufficient price signal from BRAs. That effort would be best served by a tech-agnostic approach that allows behind-the-meter capacity to participate. Not allowing DR, behind-the-meter storage and other DERs to participate would run afoul of FERC’s requirements, outlined in Orders 719, 745, 841 and 2222, that behind-the-meter resources be treated comparably to generation.

Voltus argued including behind-the-meter resources in the backstop is consistent with proposals stakeholders made throughout the 2025 Critical Issue Fast Path process focused on meeting large load growth, as well as the statement PJM’s Board of Managers released at the conclusion of the process. (See PJM Board of Managers Selects CIFP Proposal to Address Large Load Growth.)

The company also endorsed a proposal by the Natural Resources Defense Council to define new capacity, which would allow resources that have completed the third phase of the interconnection process, or are in the surplus interconnection service process, to qualify so long as they are not already subject to the capacity must-offer requirement. For DR, resources that did not offer into the capacity market between the 2025/26 and 2027/28 auctions would be permitted, as well as those seeking to increase the amount of capacity offered.

NRDC

The NRDC’s proposal included an auction design in which capacity would be procured for a pool of buyers that would share the costs and risks, while sellers would receive 10- to 15-year commitments. If participating consumers default or do not come into service, either the capacity payments would be reduced or the remaining load would pay more. The auction would be a permanent addition to the capacity market, conducted during each queue cycle’s final agreement phase. For Transition Cycle 2, that would be December 2026 or January 2027.

Participating resources would be required to offer into BRAs during their commitment terms, with the revenues flowing to load with long-term commitments, which would also be responsible for capacity deficiency penalties. The auction would be open to large loads as well as LSEs seeking to offer long-term firm service to new customers.

The maximum procurement would be set at the amount bid into the auction, and any load that does not receive a commitment would be required to go through PJM’s proposed Connect and Manage system.

Eligibility would be limited to projects that have already cleared the interconnection queue but not yet entered service, as well as DR. The NRDC said the backstop should not be allowed to become another expedited interconnection queue following the example of the Reliability Resource Initiative, which allowed 51 projects to be inserted into TC2. Several of those projects have dropped out of the queue after running into high network upgrade costs.

Questions Raised over Ratepayer Protection Pledge

DALLAS — Figures in the energy industry are casting doubt on the White House’s proposal to shield ratepayers from the costs of interconnecting large loads, saying it ignores the division of jurisdictional responsibility among regulatory authorities.

The Ratepayer Protection Pledge secured commitments from developers to pay for the full cost of power plants and any required delivery infrastructure upgrades, whether the data centers use the power or not. The pledge asks the data centers to strengthen the grid’s resilience by making their backup generation resources available during times of scarcity to prevent blackouts and power shortages in their communities.

Leaders of seven Big Tech companies signed the nonbinding pledge during a March 4 ceremony in Washington. (See Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.)

Rob Gramlich, president of the D.C.-based consulting firm Grid Strategies, said during the Federal Reserve Bank of Dallas’ Powering AI conference March 4-5 that there was a “deal to be had between the richest corporations the world has ever known” and the power sector and its end-use customers.

He found gathering regulators in the same room with the tech companies and getting the companies to agree on a “political level” to paying their “fair share” was “quite impressive.”

“That’s an important step,” said Gramlich, who served as an economic adviser to Pat Wood during the latter’s FERC chairmanship. “I know from my eight years being in a regulatory agency that having policymakers agree to, ‘Here’s the deal. Here’s kind of what we’re trying to achieve,’ and then go work out the details … that’s an important step.

“Those first two components are important and check two boxes,” he said. The third phase, implementation, is “really complicated … with a whole new set of complications,” Gramlich said, pointing to jurisdictional issues between the federal government and the states.

“Every market structure is different. Every state has a different arrangement of who’s responsible for transmission, generation, the planning and the cost allocation in ‘FERC land’ outside of Texas,” he said. “The retail-wholesale split is extremely complicated. FERC can’t right now just go and say, ‘Oh, data centers, you pay for this thing.’ Those are retail customers. FERC can’t tell what one retail customer versus another retail customer can do without the state saying that’s the way.”

Gramlich said FERC could assert jurisdiction over the states and go through seven years of litigation. “But there’s not seven years to go through that process,” he warned.

Andrew Schaap, CEO of developer Aligned Data Centers, said he is a firm believer in the necessity of the U.S. winning the artificial intelligence race.

“A lot of our adversaries are not having [these discussions]. They’re just doing it. They’re just going to go as fast as possible,” he said, noting China is building 1.5 TW of solar power a year.

“The fair rate pledge is a way to give latitude to operators like ourselves and hyperscalers to do behind-the-meter generation. I build my own power plant; I build my own systems,” Schaap said. “Is that the most efficient way to do it? Probably not. The most efficient way is to do it with the grid.”

Schaap did allow that the pledge is a “good incentive to get there faster.”

“One of the things that we’re all struggling with is there’s just not enough capacity fast enough,” he said.

Nick Elliot, who recently left the Department of Energy’s Grid Deployment Office to join the White House’s National Energy Dominance Council as a senior policy adviser, was in the room where the pledge was signed before he took a red-eye flight through thunderstorms to Dallas.

The transmission piece of the pledge was the hardest component “to get right,” he said, holding a cup of coffee. The large load will have to cover 100% of the direct cost for a tie line, he explained, but connecting large generation to the facility is going to affect the entire system’s upgrade requirements.

“They will benefit you, but they’re going to benefit everyone else,” Elliot said. The hyperscalers are “willing to engage” in innovative regulatory structures, paying “full freight” or over time.

“We’ll build the highway,” Elliot said, speaking for the large loads. “And to the extent you end up building a whole bunch of other hotels on the highway, allocated to those other people later, we will backstop, and we’ll take the risk.”

“I certainly understand why Nick says that transmission is harder, because it’s harder,” said Stu Bresler, PJM’s executive vice president of market services. “The benefits of transmission do flow to many customers once it’s built. I think the challenges with allocating the cost of new generation on the system are equally difficult to transmission when there’s so much uncertainty about how much load you’re actually building for and who the customers will actually be. We have to get through all these cost allocation issues. They’re extremely foreign.”

Texas Addresses Rising Costs

Google was one of the seven companies that signed the pledge. Doug Lewin, who recently left his consultancy to join the company as the Texas lead for energy market development, made it clear that Google wants to be connected to the grid.

“You have several advantages to that, both from the data center side and from the public side,” he said. “We just have to have a historical perspective here and remember that for the entire life of the grid, over 100 years back to the earliest days, system use matters.

“It’s a simple division problem, right? Whatever your fixed costs are, can you spread that across as many users as possible? That lowers the unit cost,” Lewin added. “That’s the basic economics of the grid as it has existed since the 1910s, and that principle still holds. So, we think it’s not only good for us to be connected, and this would go for any data center, but also for all customers.”
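Lewin’s “simple division” can be put in code. This is a minimal sketch of the economics he describes; the dollar and energy figures are entirely hypothetical and not from the article:

```python
# Illustrative only: spreading fixed grid costs over more delivered energy
# lowers the unit cost for everyone. All figures are hypothetical.
def unit_cost(fixed_cost_dollars: float, delivered_mwh: float) -> float:
    """Average fixed-cost recovery per MWh delivered."""
    return fixed_cost_dollars / delivered_mwh

# Same fixed transmission cost, spread over existing load vs. load plus
# grid-connected data centers.
base = unit_cost(1_000_000_000, 50_000_000)      # existing load only
with_dc = unit_cost(1_000_000_000, 60_000_000)   # plus new large loads
print(round(base, 2), round(with_dc, 2))  # 20.0 vs. 16.67 $/MWh
```

The point of the sketch is only that the denominator grows faster than the numerator when large loads share existing fixed costs, which is the argument for connecting data centers to the grid rather than islanding them.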

Google and Lancium, an energy technology and infrastructure firm, have filed joint comments on the Texas Public Utility Commission’s proposal to set interconnection standards for large loads (58481). They argued the proposal requires “large, upfront and nonrefundable financial commitments without providing clear study outcomes, defined interconnection timelines or a predictable path to energization.”

“This sequencing shifts significant risk onto customers before system feasibility and deliverability are known,” the companies said, referencing a flat $100,000/MW nonrefundable interconnection fee they said may result in overcollection beyond true costs.

Google’s Doug Lewin and Emerald AI’s Arushi Sharma Frank share a laugh during their panel discussion. | © RTO Insider 

Instead, they have suggested a five-year, 50% minimum demand charge to fund infrastructure builds and share in costs. Lewin said that is a “very tangible way” large loads can shift costs under ERCOT’s Four Coincident Peak (4CP) program. Under the program, industrial customers’ transmission charges are based on the amount of electricity they consumed during the four intervals of the previous year when demand on the grid was at its highest.

“Large loads can get away from paying a transmission charge,” Lewin said. “We have come forward with other partners and said, ‘We want a minimum transmission charge.’”
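The 4CP mechanic Lewin describes can be sketched as a simple proportional allocation: a load’s share of transmission costs scales with its average demand during the grid’s four coincident peaks. This is a simplified illustration, not ERCOT’s actual tariff math, and every number in it is hypothetical:

```python
# Hedged sketch of a 4CP-style allocation. A load's transmission charge is
# proportional to its average demand (MW) across the four coincident peak
# intervals. Rates and figures are hypothetical, not ERCOT tariff values.
def four_cp_charge(load_mw_at_peaks: list[float],
                   system_mw_at_peaks: list[float],
                   total_transmission_cost: float) -> float:
    assert len(load_mw_at_peaks) == len(system_mw_at_peaks) == 4
    load_avg = sum(load_mw_at_peaks) / 4
    system_avg = sum(system_mw_at_peaks) / 4
    return total_transmission_cost * (load_avg / system_avg)

# A flexible load that curtails to near zero during the four peaks vs. a
# steady load of the same size, against a $5B systemwide revenue requirement.
flexible = four_cp_charge([5, 0, 0, 5], [80_000] * 4, 5_000_000_000)
steady = four_cp_charge([500] * 4, [80_000] * 4, 5_000_000_000)
print(round(flexible), round(steady))  # flexible pays a small fraction
```

The comparison shows why a load that curtails during the four peak intervals can largely avoid transmission charges, which is the gap the minimum transmission charge Lewin describes is meant to close.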

Emerald AI’s Arushi Sharma Frank, Lewin’s partner on the panel, applauded the Google-Lancium proposal.

“As the load comes in, it pays for the transmission upgrades, and if the load comes before that, great,” she said. “But if upgrades come first, then the loads need to still be there to foot the bill because they are going to eventually use it.”

ERCOT General Counsel Chad Seely said these discussions are part of broader policy debates underway in Texas.

“We’re trying to figure out what the best process is to study [large loads] reliably and make sure that we’re making the best decisions as far as building out the transmission infrastructure and making sure that they have enough skin in the game,” he said. “This is really a pivotal year for ERCOT and our stakeholders to kind of put forward these policy frameworks that will have long-lasting implications as we move forward to manage this tremendous amount [of load].”

Grid Strategies Calls NERC LTRA Too Pessimistic

NERC’s 2025 Long-Term Reliability Assessment paints too gloomy a picture of the electric grid’s risks over the next 10 years, according to a report from Grid Strategies that purports to present “a more comprehensive picture of adequacy risks … given different … underlying assumptions.”

The LTRA, released Jan. 29, warned that the resource adequacy outlook of the North American grid was “worsening,” with 13 of 23 assessment areas potentially developing into high or elevated risk between 2026 and 2030.

High risk means that planned resources as of July 2025 would result in energy shortfalls exceeding RA targets or baseline criteria for unserved energy or loss of load, while elevated-risk areas meet RA targets but are likely to experience energy shortfalls in extreme weather conditions. (See NERC Warns of ‘Worsening’ Resource Adequacy Through 2035.)

MISO, PJM, Texas RE-ERCOT, WECC-Basin and WECC-Northwest fell into the former category, and the latter included MRO-Manitoba, MRO-SaskPower, MRO-SPP, NPCC-Maritimes, NPCC-New England, NPCC-New York, NPCC-Quebec and SERC-East.

NERC’s risk projections for most areas were driven by concerns about “demand growth … outpacing planned resource additions.” The ERO observed that new large loads such as data centers and industrial centers were projected to come online in nearly every assessment area, with the electrification of transportation and the spread of heat pumps further raising demand.

At the same time, many of the resources planned to replace existing thermal generation facilities are weather-dependent assets like wind and solar plants, or natural gas generation that will require investment in gas infrastructure to ensure the availability of fuel.

The LTRA did not arrive without skepticism: Members of the Organization of MISO States objected to the ERO labeling MISO as a high-risk area. State regulators said NERC should have counted resources in MISO’s fast-track interconnection queue among planned resources, which would have neutralized the 7-GW shortfall NERC expects to materialize by 2028. (See MISO States Dispute ‘High Risk’ Designation from NERC.)

Grid Strategies’ report similarly described the LTRA as “too pessimistic” because of overly strict assumptions about generation resource additions. The authors also claim NERC underestimated regions’ ability to import power and may have exaggerated data centers’ potential demand, further tightening reserve margins. However, they acknowledged the LTRA “highlights important reliability concerns” grid planners will need to address.

Some Tier 2 Exclusions Unwarranted

The authors’ first criticism of the LTRA concerned NERC’s practice of counting only generators under construction or with a signed interconnection agreement as “Tier 1” resources eligible to be treated as planned capacity additions. Tier 2 resources are those in earlier stages of development, such as an interconnection planning study, which NERC considers to “have more uncertainty in being realized.”

This understanding of Tier 1 resources is too limited, Grid Strategies argued, drawing on data from Lawrence Berkeley National Laboratory to show that more projects exist with signed IAs than NERC acknowledged. The authors suggested the “mismatch in data could be due to different processing methods, collection periods and … resource accreditation methods.”

Using the LBNL data, the authors claimed a potential shortfall NERC missed in the SERC-Central subregion, and two near-misses in WECC-Basin and MRO-SPP, could be resolved, though the projected shortfalls in MISO and PJM still would remain and another near-miss could develop in New York.

The authors also questioned the decision to exclude all Tier 2 resources from the LTRA. They acknowledged these additions might not complete the interconnection process but suggested treating them as nonexistent for planning purposes was too extreme. Applying what they called “historic region-specific and phase-specific withdrawal rates to the LBNL data,” they suggested that already-planned Tier 1 and Tier 2 resources could be enough to avert shortfalls for all regions except PJM.

The biggest concern about resource additions is “delays in the study, permitting and/or construction phases” that push resources already in the queues past their projected start dates, including planned wind and solar resources. Adjusting the previous projection to account for likely withdrawals introduced shortfalls in WECC-Basin, SERC-Central, MISO and PJM, the authors wrote. They recommended permitting reforms and more efficient interconnection queues for all resources to reduce this chance.
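The queue-discounting approach Grid Strategies describes, weighting each planned project by a phase-specific survival rate, can be sketched as follows. The rates and megawatt figures here are assumptions for illustration, not the report’s actual values:

```python
# Rough sketch of discounting queued capacity by development phase: weight
# each project's nameplate MW by an assumed survival rate (1 minus the
# historical withdrawal rate for that phase). All rates are hypothetical,
# not Grid Strategies' or LBNL's actual numbers.
SURVIVAL_BY_PHASE = {
    "construction": 0.95,
    "signed_ia": 0.80,     # Tier 1: signed interconnection agreement
    "system_study": 0.45,  # Tier 2: later-stage study
    "feasibility": 0.25,   # Tier 2: early-stage study
}

def expected_additions_mw(queue: list[tuple[str, float]]) -> float:
    """Expected capacity (MW) after applying phase-specific survival rates."""
    return sum(SURVIVAL_BY_PHASE[phase] * mw for phase, mw in queue)

queue = [("construction", 1_200), ("signed_ia", 3_000),
         ("system_study", 8_000), ("feasibility", 10_000)]
print(round(expected_additions_mw(queue)))  # well below the 22,200-MW nameplate
```

The output illustrates the report’s middle ground: queued Tier 2 capacity is neither counted at full nameplate nor treated as nonexistent, but discounted by how often projects at each phase historically complete.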

Imports and Demand

Energy imports also have a role to play in addressing energy shortfalls, but the LTRA did not fully include this option either, Grid Strategies observed. NERC counted only firm interregional capacity transfers, but considering “non-firm imports over interregional transmission lines that have been available during historical grid stress events” could alleviate additional strain, the authors wrote.

Going further, the authors suggested that implementing interregional transmission additions proposed in NERC’s Interregional Transfer Capability Study could provide “even more security to avoid shortfalls.” While they acknowledged this would be a harder lift, they also claimed that “legislative and regulatory action taken now could accelerate the construction of these prudent additions.”

Finally, Grid Strategies asserted that utility projections of load growth on which NERC relied to create the LTRA “are based on overstated assumptions.” Data centers, which were a significant part of many utilities’ projections, were a key example in Grid Strategies’ report; citing data from TD Cowen, the authors suggested that limits in the supply of key computer chips could restrict data center growth in the U.S. by as much as one-third.

In addition, the authors observed that data centers could manage their demand to reduce their need for on-peak grid power, such as by using local generation and storage resources, or by curtailing demand when energy prices are high. Such activities “are unlikely to appear in utility forecasts of expected load,” the authors claimed.

To reduce uncertainty, Grid Strategies suggested updating data center load forecasts to take the chip industry and other outside data sources into consideration. Reducing data center load might eliminate all shortfalls in the LTRA, even without factoring in the generation adjustments, the authors wrote.

Searchlight Report Calls for Infrastructure Fund for Data Center Development

The Searchlight Institute released a report arguing that the data center buildout should be leveraged to pay for the expansion of the grid.

The think tank was established in 2025 by a group of Democrats seeking policies most Americans can support, and it dives into the growth of data centers as their grid impact becomes a top political issue. (See EPSA Summit Held with ISO/RTOs in the Middle of the Political Debate.)

“Seizing the Data Center Buildout for Grid Modernization” is written by Searchlight Senior Fellow Jane Flegal and was released March 9. It notes that the grid is aging and the clean, firm capacity needed for a reliable system is nowhere near built.

“Meeting national goals, from powering economic growth to enhancing our industrial competitiveness to advancing our national security, requires building a dramatically larger, more capable electricity system,” the report said. “Fixing this problem was always going to be expensive and politically difficult.”

The U.S. is competing with China on artificial intelligence, and a key constraint is the grid’s ability to serve the data centers needed to train and deploy AI.

“Data center demand could be an opportunity to fix the underlying problem,” the report said. “Data center operators want fast access to reliable power, certainty and fair treatment from policymakers. Policymakers and grid advocates can benefit from what those data centers can provide: capital, load growth that justifies long-needed grid investments and tax revenue.”

There is a narrow window through which policymakers must steer the grid buildout for optimal data center development. The report warns that window will close soon.

“The response to data center demand thus far has been ad hoc and inadequate,” the report said. “The structural failures underlying this dynamic, from interregional planning deadlock to permitting barriers to fights over cost allocation, require major policy change.”

The report suggests setting up an “American Grid Infrastructure Fund” to ensure spending associated with data center growth also enables grid modernization and maximizes benefits such as increased local tax revenues and construction employment.

“Participation agreements that require true cost causation commitments would deliver ratepayer cost savings that no voluntary commitment currently produces,” the report said. “An insurance pool backstopping stranded cost risk would unlock proactive transmission investment that can’t get built today without exposing ratepayers to downside. Procurement aggregation would convert hyperscaler equipment purchasing into a domestic manufacturing demand signal that no company negotiating alone can generate.”

Such a fund could be set up to be voluntary at first, but the report calls for a new federal law eventually.

“A fund can convert data center capital and political weight into an organized force for grid reform,” the report said. “Even if a fund failed to solve the political economy problem, it would generate more public benefit than the current, ad hoc approach.”

The fund would offer data centers cheaper financing, a standard participation agreement to accelerate interconnection, procurement aggregation to address grid bottlenecks and access to clean firm power at scale.

“The fund would not solve all of the grid’s problems on its own, but it could serve as part of a framework in which regulatory reform at the federal level, financing through the fund, and incentives for state action reinforce each other,” the report said. “The fund’s participation agreements, governance architecture and deployment strategy would aim to maximize the public benefit of data center demand growth while helping developers secure the certainty and speed they require.”

‘With the Skill to Survive,’ SPP Faces ‘Massive Challenges’

DALLAS — SPP CEO Lanny Nickell took to the stage to Survivor’s “Eye of the Tiger” as he opened the grid operator’s Energy Synergy Summit.

“‘Eye of the Tiger’? That’s what you chose?” he asked the event’s organizers as the music faded into the background.

The 1982 rock anthem highlights perseverance, determination and regaining one’s competitive edge, traits Nickell said will come in handy for the “massive challenges that are ahead of us.”

“Massive change, massive challenges, massive opportunities,” Nickell said in kicking off the March 2-3 event.

He harkened back to last year’s summit, SPP’s first, when the conversation centered on resource adequacy, increasing extreme weather events and other challenges. The days of excess capacity and unlikely load sheds were numbered.

“Now, we are scrambling, doing everything we can just to maintain a one-day-in-10-year probability of having an event,” Nickell said. “We were having tremendous load growth. Even that’s changed over a year.”

He said SPP was projecting 50% load growth during last year’s summit, but that has increased to 100% over the next 10 years. The RTO has responded, Nickell said, listing the Expedited Resource Adequacy Study process and “industry leading” High-impact Large Load (HILL) study process, both approved by FERC in the past year.

“If you are willing to bring generation with you, either co-located or no more than two buses away, you can get that generator interconnection studied along with the high-impact large load in 90 days or less,” he said. “That’s fantastic speed.”

SPP has also proposed a conditional HILL process for interruptible loads that want to interconnect quickly and a Consolidated Planning Process (CPP) that gives generators more certainty of their interconnection costs and yields affordable solutions through the traditional planning process. It expects commission approval of both in the next few weeks.

“That’s a win, but we’ve got a lot of other things that we want to work on,” Nickell said. “What’s next for us? What’s our next project?”

Whatever the next projects are, Nickell said they will require the same creative, outside-the-box thinking that produced the HILL study in 85 days, from start to finish and through the stakeholder process. They will also require collaboration with and support from members, regulators and market participants.

“We need you to work with us to figure out what it is that’s most important, what it is that we need to solve right now. If you can bring your ideas to the table, I’m convinced that we will come up with the best solutions,” he said. “We have to work together to economically and reliably keep the lights on. That includes solving problems together. SPP [and] staff can’t solve these problems alone. We need your help. That’s why you’re here.”

LaCerte: 765-kV Backbone Necessary

Two days before his nomination for a full five-year term advanced in the U.S. Senate, FERC Commissioner David LaCerte said in a fireside chat with Nickell that the industry’s long-term planning still needs to improve. (See related story, FERC’s LaCerte Clears Committee Vote on Nomination for a Full Term.)

FERC Commissioner David LaCerte | © RTO Insider 

“What that long-term planning looks like now is very much different from 2024 long-term planning,” he said. “It’s difficult. It’s tough because you want to project, but those projections have such a large standard deviation that it’s almost impossible to get it right.”

Picking up on SPP’s approval of four 765-kV transmission projects in its 2025 transmission plan, LaCerte said any future transmission plans should include extra-high-voltage facilities.

“We can’t live without 765s or you’re going to be an invertebrate, right? You don’t want to live your life as an invertebrate. You want to have a backbone,” LaCerte said. “It’s really important that we do these things properly because they have the potential to drive up costs on the consumers even more than they” already are.

He said a “big plus in [his] book” was having the White House come to the table with a bipartisan group of governors and PJM to propose a reliability backstop procurement for the RTO’s capacity auction and begin identifying universal parameters to protect customers from rate increases related to large loads and data centers. President Donald Trump also gathered the leaders of seven large tech firms March 4 to sign a “ratepayer protection pledge.” (See related story, Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.)

“I think that was a great first step because it brought all those people to the table … together to talk about the problems and identify what [is] acceptable, what’s not acceptable and then just identifying the costs,” LaCerte said. “Even at FERC in our building, we even struggle with identifying which costs we are catching in these tariffs and which costs are we not. … If it’s a struggle for the career FERC staff, it’s a struggle for everyone because these are issues which are novel. We are moving so quickly that it’s imperative that we catch as many of those costs as possible so that there’s not a bunch of hidden costs that are passed along to consumers.”

Shielding Consumers from Costs

Members of a panel discussing pricing reform in these high-growth times agreed those costs need to be transparent.

“The public is now, especially in the post-inflation environment, very conscientious of cost, and I think SPP is rightly [placing] affordability as sort of a central tenet,” said Chris Matos, Google’s energy market development strategic negotiator. “The question is more on the commitment side, and with these load forecasts and the infrastructure expectations, if you plan correctly, costs can go down.”

Chris Matos, Google | © RTO Insider 

He said ERCOT’s 765-kV plan, if the expected load materializes, will reduce system transmission costs because essentially, “We’re leveraging a greater scale of megawatt-miles of transmission.”

A bill introduced in the Ohio legislature would require large data centers to enter contracts with utilities detailing their minimum billing demand, long-term service agreements, exit fees or liquidated damages for canceled projects, and potential collateral or guarantees before any construction. It would also bar utilities from shifting costs incurred to serve data centers onto other customers.

“Google’s answer to this has been in the form of the capacity commitment framework that we’ve instituted in Ohio,” Matos said, “where we’ve agreed to minimum terms and minimum charges that ensure there is equity for the existing system and [customers] are not left constrained in the cost of infrastructure.”

Mark Ahlstrom, vice president of renewable energy policy for NextEra Energy Resources, said the company’s approach is to partner with the developers on multi-gigawatt sites that have the land, infrastructure and accessibility to power.

“We think it has to be a close partnership between large infrastructure investors like NextEra and the hyperscalers to put together something like that and make sure it all works within the community under the right tariffs, working hand in hand with the utilities and co-ops and so forth,” he said. “You have to develop certainty that that project is not going to just go away; that we have the commitments, we have the contracts, and we would find a purpose for that.”

BTM Gen ‘Suboptimal’

Longtime regulator Andrew French, chair of the Kansas Corporation Commission, shared a topic that he said has been top of mind in recent weeks: the growing concern about underinvestment in the transmission system.

“And yes, it can have a bill impact,” he said. “If we don’t move fast enough to make the grid ready or have processes to allow load to get on, folks will talk about doing things like behind-the-meter generation or just totally going off-grid. In my mind, that is a very suboptimal use of capital. It’s something that the customers are going to pursue just because they’re looking for the speed.”

KCC Chair Andrew French (left) shares his concerns as ITC Great Plains President Patrick Woods listens. | © RTO Insider

French said he has heard recent discussion of a “ghost grid” being developed with BTM generation and microgrids.

“That really concerns me that you’re going to have this sort of shadow set … of resources that’s probably not sitting in optimal locations, but it was just pursued for expedience,” he added. “It’s not what we want. It’s another reason why I think we need to move quickly. We need to provide pathways. I think there probably are a lot of these loads that would make sense to integrate into the wider grid. Let them find resources that can contribute to the wider grid.”

PPL CEO Vince Sorgi echoed French as he offered his thoughts and said he doesn’t mind the BTM approach “for a period of time.”

“If a grid is not ready and a hyperscaler can contract with a generator to build generation and serve that data center until that grid is ready, have at it,” he said. “But when the grid is ready, you should connect all generation to the grid for a number of reasons, right? One, the hyperscalers don’t want behind-the-meter generation. Two, just having that generation connected to the grid makes the grid more reliable and more resilient. It will ultimately benefit all customers.

“If we just built a bunch of behind-the-meter generation, it would be the most suboptimized solution to this problem that we could have come up with,” Sorgi added.

NYISO Yields to Stakeholder Requests on Transmission Planning Changes

RENSSELAER, N.Y. — After receiving pan-sector feedback from stakeholders asking for more time to review NYISO’s proposed changes to the reliability planning process, the ISO told the Transmission Planning Advisory Subcommittee it would delay proposing tariff language.

“The last number of meetings have been productive. They have been long. They’ve been painful at times,” Zach Smith, NYISO vice president of system and resource planning, said March 3. “I believe given the volume of feedback we’ve gotten from stakeholders, it’s appropriate that we spend more time with [the Electric System Planning Working Group] to talk through some of the proposals.”

NYISO’s plan had been to introduce tariff language in March. At previous meetings, stakeholders balked at the breadth of changes and their potential system impacts. (See Stakeholders Ask for Boundaries on NYISO’s Reformed Reliability Process.) ISO staff said they will spend additional time in March and April hashing out the changes with stakeholders.

“I want to make sure that we get this right and try to come up with the best process possible,” Smith said. Even with the extra month of deliberation, the ISO should be able to get the new process in place before the next Reliability Needs Assessment, he said.

Environmental, transmission and generation interests were appreciative of the extra time. Several sectors stressed that they were concerned with how NYISO would develop planning scenarios and digest stakeholder feedback.

Tony Abate, representing the New York Power Authority, told stakeholders that transmission planners have been “super involved” with working with NYISO to figure out how best to help with the changes.

“Transmission owners have only been so-so vocal compared with other sectors,” Abate said. “I want other stakeholders to know that’s because we’ve been having intense conversations amongst transmission planners at each of the TOs to consider what we think is best, and how TOs can enhance participation.”

Anie Philip, senior director of planning for PSEG Long Island, commented that NYISO had to develop a process that included the Long Island Power Authority at every step. PSEG Long Island manages LIPA’s distribution system on its behalf.

“LIPA has a direct statutory obligation as a legislatively charted public utility to plan for and maintain reliability within its transmission district,” Philip said. “LIPA must have a full and meaningful role in all aspects of the Reliability Needs Assessment as it applies to LIPA’s district.”

The conversation then shifted to the actual proposed changes. These include the development of multiple forecast scenarios for reliability planning, rather than one base case. If a reliability issue is found in multiple scenarios, the ISO would start the solicitation process for a solution.

Howard Fromer of Bayonne Energy Center asked whether NYISO had considered what it would do if New York state did not agree with a reliability finding under the new process.

“I think it would be counterproductive to have us have a process where you go through with this and the state disagrees with the foundational assumptions that drove you to reach your conclusion,” Fromer said. He recommended that the ISO develop a formal process to work with state agencies to get buy-in upfront.

Adam Evans, chief of wholesale and clean energy markets at the New York Department of Public Service, said he shared Fromer’s concerns and that he would be happy to work with NYISO to come up with a process that “ensures there’s more potential alignment.”