February 6, 2026

DOE Touts Fossil Fuels’ Role in Meeting Peak Energy Demand this Winter

WASHINGTON, D.C. — The U.S. Department of Energy held a press conference to highlight fossil fuels’ role in maintaining reliability during the recent winter storm and to tout some of the actions the agency took to bolster the grid.

The U.S. Energy Information Administration reported the highest ever withdrawal of natural gas from storage the week that Winter Storm Fern affected the eastern half of the country, Secretary of Energy Chris Wright said at the Feb. 6 event.

“That’s a symbol of what increased energy demand came with this storm,” he added. “What is natural gas? It’s the largest source of home heating for Americans in the country. It’s the largest source of electricity generation in the United States.”

Fern was larger than Winter Storm Uri, which five years ago led to one of the biggest crises in the power industry’s history when much of Texas was without power for days and hundreds of people died. Fern’s effects on the energy system were far less severe, Wright said.

While Uri knocked out power to 4.5 million homes, largely because of generation failures and intertwined problems on the natural gas system, just over 1 million homes lost power this year, primarily because ice-laden tree limbs took down power lines.

“We wish that was zero,” Wright said. “We work and strategize and talk every day about how to reduce that number.”

Standing next to bar charts that highlighted how little renewable power contributed to the high demand set by the storm and related cold, Wright argued that the industry needed to focus on installing dispatchable capacity.

“If you want to add to the capacity of our electricity grid, enable data centers, enable us to reshore manufacturing — the only way you do that that’s helpful is you have to add to our peak generating dispatchability,” he said.

Wind output was down 40% during the storm’s peak demand compared with a more typical weather day last year. That was true overall, but intermittent output varies from region to region, and SPP reported that it had more wind than expected, enabling it to ship power east — a reversal of Uri, when imports from PJM and other points east minimized SPP’s own outages. (See Wind Output Enabled SPP Exports to Neighbors During Storm.)

Solar works better in regions with more sunshine like the deserts in the West, but even then, Wright said, the sun did not always shine, especially when overall energy demand was peaking.

“Peak demand for energy is always in the winter, by far,” Wright said. “Peak demand for electricity is sometimes and often in the summer. Because the biggest use of energy in people’s households by far is heating, like that winter storm we just went through.”

The natural gas distribution network was delivering four times more energy than the grid was at its maximum stress during the recent storm, he added.

But meeting peak household demand for natural gas at the same time electricity generators need more of the fuel is a dilemma the industry continues to face. (See Grid Weathers Latest Winter Storm, but Still Faces Gas Coordination Problems.)

The RTOs in the Northeast still are working to procure as much fuel as possible as the cold continues to drive demand, Wright said. As is typical, ISO-NE had to rely on burning oil to make it through; that fuel produced 35% of the region’s power at the height of the storm.

“Where it matters at peak demand time, oil was No. 1,” Wright said. “This is crazy. Oil was a huge source of electricity generation in the United States when my mom was in high school.”

DOE is working to improve gas-electric coordination, as it has over the past 15 to 20 years, said Assistant Secretary of Energy James Danly. DOE’s National Petroleum Council (NPC) recently released a report making recommendations. (See DOE’s National Petroleum Council Releases Report on Gas-electric Coordination.)

“The RTOs in the Northeast did their best to procure as much fuel as possible in advance and help their gas generators do that in advance of the weather,” Danly said. “It’s still ongoing. The temperatures are still cold, but we’re seeing, especially in PJM, efforts to get gas out as far as possible, and that’s in part with the encouragement of the department and talking with the stakeholders.”

NPC recommended making it easier to build more pipelines. A major focus of the Trump administration has been to get the Constitution Pipeline built, which needs regulatory approval from the state of New York. The project would bring up to 650,000 Dth per day of Marcellus Shale gas to New England and New York. The project won approval from FERC in 2014, but it was blocked by the state of New York.

Constitution has asked FERC to reauthorize the project. But unlike the vast majority of pipeline proposals, Constitution did not list any anchor customers, which the commissioners view as an indication of need in pipeline approvals. Wright argued that the lack of anchor customers in the petition does not mean the pipeline is not needed.

“If a pipeline has been blocked, you know, by the governor of New York, and she says she’s going to continue to block the pipeline, people wait for that politics to come out,” Wright said. “We will have customers coming out of the woodwork. Do you want to burn far cheaper natural gas versus oil?”

While New England’s generation fleet and power consumers would benefit from more natural gas, the region’s vastly different business models mean generators lack the incentive to invest in the firm gas contracts pipelines need to secure financing, recently retired ISO-NE CEO Gordon van Welie said during a webinar.

“I’ve also spent a lot of hours talking to the merchant generators,” van Welie said. “And you know, I’ve come to understand it does not make economic sense for merchant generators to invest in long-term contracts that would be required to ensure adequate pipeline and gas storage infrastructure for these intermittent peaky events, which are low probability. So, they would rather price the risk of non-performance of the gas system into their offers, or financially hedge their risk, or physically try and hedge their risk with dual fueling — if they can get the siting and the permits for dual fueling.”

The New England states came to FERC during the Obama administration asking to socialize the cost of new pipelines among electricity consumers, but the commission found the idea clashed with the Federal Power Act, and pipelines ran into issues with state politics as well.

“The work-around that was conceived in New England, but never put into effect, was to require the electric distribution companies essentially putting electric ratepayers on the hook for contracting for firm transportation from the pipelines, and then having the EDCs resell that capacity to merchant generators,” van Welie said.

Data Center Moratorium Bill Introduced in N.Y. Legislature

ALBANY — Democrats in the New York Legislature have introduced a bill that would institute a three-year moratorium on the siting and permitting of new data centers statewide.

“Let’s take a pause. We don’t even understand all the implications this can have for the climate, environment, energy costs and water for the state of New York,” said State Sen. Liz Krueger. Proposed data centers in the NYISO interconnection queue, she added, already represent 9.5 GW of load.

The legislators argue the pace of data center development has outstripped the existing planning, regulatory and environmental review frameworks. They say data centers are driving up the cost of electricity, creating more demand for fossil fuels and delaying New York state’s climate goals.

“The fact is that we should not allow individual companies to skyrocket ahead with their plans that will cost us huge amounts of money, cost us huge amounts of environmental impact and cost us lost opportunities to make other decisions with our future energy planning,” Krueger said.

Data centers are a hot topic across the country and make up the bulk of system impact studies discussed and approved by the NYISO transmission planning committee.

The new bill echoes New York’s cryptocurrency mine moratorium. (See NY Slaps Moratorium on Certain Crypto Mining Permits.) Gov. Kathy Hochul signed that moratorium into law in 2022. The Hochul administration, however, recently reached an air permit settlement with Greenidge Generation Holdings, allowing the cryptocurrency mine to operate a gas generator in Dresden, N.Y.

The chances of passing a data center moratorium are unclear. Krueger and Assemblymember Anna Kelles introduced the bill, which is co-sponsored by Sens. Kristen Gonzalez, Rachel May and Lea Webb. The Democratic legislators are backed by a coalition of environmental and consumer advocacy groups, including Food and Water Watch, the Alliance for a Green Economy and the New York Public Interest Research Group.

“The proliferation of data centers and their insatiable appetite for ratepayer subsidies, excessive water use, noise pollution and regulatory secrecy must stop,” said Blair Horner, senior policy advisor for NYPIRG. “New York state can show the nation how to regulate data centers in a way that protects consumers’ wallets, the public’s hearing and the environment’s most precious resource, water.”

The legislators and advocacy groups represent areas from New York City to rural Upstate. The New York City Democratic Socialists of America, fresh from their recent victory in the NYC mayoral election, also support the legislation.

The bill calls for the Department of Environmental Conservation to complete a comprehensive environmental impact statement on data centers, including the current and forecast effects on energy use, electricity rates, water resources, air quality and greenhouse gases. The Department of Public Service would be required to report the cost impacts of data centers on all other ratepayers and issue any new orders necessary to ensure those costs are paid by data center companies and developers.

“I want to emphasize the fact that this is simply a pragmatic decision to put a pause … and create common sense regulations,” said Kelles. “This industry has exploded very quickly, and we have not had the opportunity to create infrastructure in the government, both in law and regulations to ensure that … the industry does not have a significant negative impact on workers and our environment.”

The bill would not block projects retroactively. It would pause new permitting by any government body, agency or public benefit corporation for construction, siting or the start of operations. Projects that already have permits would be allowed to continue.

The bill is awaiting discussion at the Senate Environmental Conservation Committee.

West Needs $60B in Transmission Ahead of 2035, WestTEC Finds

The West must build or upgrade 12,600 miles of transmission at a cost of about $60 billion to meet the region’s forecast 30% increase in peak demand and other needs by 2035, according to the Western Transmission Expansion Coalition’s 10-year outlook.

The anticipated 30% increase in peak electric demand — from 168 GW in 2024 to 219 GW in 2035 — is more than three times greater than what the region has experienced over the past decade, according to WestTEC’s 10-year outlook for the Western transmission system released Feb. 4.

WestTEC, an initiative of the Western Power Pool (WPP), anticipates a 35% increase in energy consumption and a 71% increase in generation capacity over the same period.

Meanwhile, the region’s 230-kV transmission mileage is expected to grow from approximately 98,000 miles to about 111,400 miles in 2035, an increase of about 14%, according to the 10-year outlook.

“I’m not saying that transmission has to keep up one-to-one with load growth,” Keegan Moyer, a partner at Energy Strategies and consultant for WestTEC, said during a presentation in connection with the release of the report. “I don’t think that’s necessarily true. But we definitively know that it can’t grow by half. And it can’t grow at a third of the rate that we’re adding in generation. This study proves that.”
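
A quick back-of-envelope check (using only the figures quoted above, not the report itself) shows the gap Moyer is describing: transmission mileage grows at less than half the rate of peak demand over the study period.

```python
# Back-of-envelope check of the growth rates cited in the WestTEC 10-year
# outlook, using only the figures quoted in this article.

peak_2024_gw, peak_2035_gw = 168, 219      # forecast peak demand, GW
miles_2024, miles_2035 = 98_000, 111_400   # 230-kV transmission line-miles

peak_growth = (peak_2035_gw - peak_2024_gw) / peak_2024_gw
miles_growth = (miles_2035 - miles_2024) / miles_2024

print(f"peak demand growth:  {peak_growth:.0%}")   # ~30%
print(f"transmission growth: {miles_growth:.0%}")  # ~14%
```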

Data centers and “the electrification of everything in our lives” are driving the forecast increase in peak demand, according to Moyer.

WestTEC has put together a portfolio of planned and newly identified transmission expansion projects that would meet this forecast demand through 2035. The total portfolio is estimated to add or upgrade 12,600 miles of high-voltage transmission at a cost of about $60 billion.

The report notes that the $60 billion is manageable when considering that the annualized cost of the projects is “eight times less than the cost of generation that must be added over the same time horizon and represents only 2.5% of today’s average retail electric price in the West.”

The portfolio includes 73 planned projects totaling about 9,400 miles at a cost of $46.6 billion, about 20% of which are under construction or close to it.

“Reconductoring and rebuild projects represent about 10% of planned transmission in terms of both line miles and costs,” according to the report. “If these sponsors do not complete these in-flight projects, the total transmission gap will grow, and needs identified in this study will not be met.”

WestTEC identified an additional 3,300 miles of upgrades needed to address reliability, deliverability and efficiency concerns.

The group said the portfolio would enable the Western grid to address the 30% growth in electricity demand and reduce the risk of outages by addressing more than 75 “steady-state power flow violations on the high-voltage system that would occur but for the construction of upgrades identified by WestTEC.”

The portfolio would cut power production costs by $500 million/year, with grid congestion costs and generation curtailments falling by 20% and 17%, respectively.

“These metrics are inherently conservative and do not reflect the full extent of savings and efficiencies that could occur,” according to the report.

The identified projects could allow an additional 10 GW of power to move across key regional interfaces during critical periods, lowering shortage risks and reserve requirements, per the report.

‘Admirable Achievement’

Several projects under the Bonneville Power Administration’s $5 billion Grid Expansion and Reinforcement Portfolio are listed in the WestTEC study.

For example, the Lower Columbia to Nevada-Oregon Border project is on the list. The project is aimed at improving connectivity from the lower Columbia region to the Nevada-Oregon border with 500-kV transmission lines and a new substation near the border. (See BPA Provides More Details on $5B Tx Projects.)

The project has a preliminary cost estimate of $1.9 billion and is expected to be completed by 2035.

BPA has been a “proud partner working with the Western Power Pool to support the creation of WestTEC,” BPA spokesperson Kevin Wingert told RTO Insider. “We are one of many participants across the Western utility landscape to participate in WestTEC’s 10-year horizon report.”

“We believe this first-of-its kind, West-wide study identifies necessary transmission infrastructure additions needed to enhance grid reliability, increase efficiency and facilitate the integration of new resources,” Wingert added. “We appreciate the broad range of regional stakeholders in identifying emerging transmission needs across the Western Interconnection. This study provides actionable recommendations for the implementation of new transmission that will address congestion and unlock interregional transfer capabilities.”

Several CAISO projects are similarly included in the list of planned projects in the WestTEC study, such as the 260-mile Humboldt-to-Collinsville 500-kV line. The line is part of CAISO’s 2023/24 transmission planning process. (See FERC OKs Abandoned Plant Incentive for Calif. Offshore Wind Tx Developer.)

“The ISO is very supportive of a coordinated West-wide transmission plan for the next 10 and 20 years, and looks forward to future West-wide analysis and planning, including the 20-year West-wide transmission plan,” Jeffrey Billinton, director of transmission infrastructure planning at CAISO, said in a statement. “The 10-year plan affirms the planning and approvals that are already underway in California and the West, and sets the table for more interregional transmission development in the next two decades.”

Brian Turner, senior director at Advanced Energy United, called the WestTEC study an “admirable achievement.”

“These upgrades don’t just address the increasing reliability risks in the region, they address the very important need for transmission to connect and deliver the massive new power the West needs for economic development, demonstrating that transmission is a critical component of our nation’s need for speed to power,” Turner said in an email to RTO Insider.

The WestTEC effort, jointly facilitated by WPP and WECC, addresses long-term interregional transmission needs across the Western Interconnection. The release of the 10-year planning horizon report comes after 18 months of work, Moyer noted. (See WestTEC Targets Early 2026 for Release of 10-year Tx Outlook.)

A 20-year horizon report is slated for release later in 2026.

The main objective of WestTEC is to create an “actionable” transmission study by conducting integrated planning analysis across the Western Interconnection.

The study horizons focus on evaluating transmission requirements in 2035 and 2045, with the goal of prioritizing “flexible and scalable transmission solutions for nearer-term needs to help better position the system for efficient long-run expansion,” the study plan says.

Prolonged Cold Drove Record Monthly Energy Costs in New England

New England experienced record-high energy costs in January amid cold weather, high gas prices and heavy reliance on oil-fired generation, according to ISO-NE.

The energy market’s value totaled about $2.7 billion in January, the highest monthly total in the region’s history, the RTO told the NEPOOL Participants Committee on Feb. 5. The monthly costs surpassed the previous monthly record of nearly $2.2 billion in January 2014.

Much of the cost was concentrated during the extended stretch of cold weather at the end of the month. Temperatures averaged about 14 degrees Fahrenheit below normal over the last nine days of the month, ISO-NE noted. Energy market costs totaled $422 million on Jan. 27 alone, up nearly 150% over the previous daily total.

The grid experienced its highest peak load of the winter on Jan. 25 at 20,221 MW, falling short of ISO-NE’s high-range forecast of 21,125 MW.

Gas prices also broke records: The maximum day-ahead gas price in Massachusetts reached about $122/MMBtu on Jan. 27, the highest maximum since ISO-NE launched its standard market design in 2003, easily exceeding the previous record of about $82/MMBtu.

ISO-NE CEO Vamsi Chadalavada praised the performance of the region’s resource fleet throughout the cold stretch while acknowledging the region is not out of the woods yet, with more cold weather forecast for the coming weekend.

Oil-fired generation, which typically accounts for less than 1% of energy in the region on an annual basis, provided 28% of energy from Jan. 24 through Feb. 1. Gas-fired generation also accounted for about 28% of energy, followed by nuclear at 19%, imports at 11%, renewables at 9% and hydropower at 5%.

On Jan. 25, ISO-NE obtained a waiver from the Department of Energy allowing generators to operate in excess of emissions limits, intended to enable resources to provide as much power as possible throughout this event. The RTO has received an extension of this waiver until Feb. 14.

With the waiver in place, about 21 resources have exceeded some limit at some point during the event, said Stephen George, vice president of system and market operations at ISO-NE.

The region burned about 66 million gallons of oil between Jan. 24 and Feb. 1, significantly depleting generators’ stored fuel inventories, he said. Fuel oil inventories dropped from 43% of the region’s total storage capacity to about 20% as of Feb. 4, the lowest point in the past 10 years.
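
As a rough cross-check, the reported oil burn can be converted into average generation. The sketch below assumes a heat content of about 138,000 Btu/gallon and a heat rate of about 10,500 Btu/kWh; both are illustrative assumptions, not ISO-NE figures.

```python
# Rough conversion of the reported oil burn into implied generation.
# Heat content and heat rate below are assumptions, not ISO-NE data.

GALLONS_BURNED = 66e6            # reported, Jan. 24 through Feb. 1
BTU_PER_GALLON = 138_000         # assumed distillate heat content
HEAT_RATE_BTU_PER_KWH = 10_500   # assumed oil-unit heat rate
HOURS = 9 * 24                   # nine-day period

energy_mwh = GALLONS_BURNED * BTU_PER_GALLON / HEAT_RATE_BTU_PER_KWH / 1_000
avg_output_mw = energy_mwh / HOURS

# Roughly 870 GWh, or about 4,000 MW running around the clock -- consistent
# in scale with oil's reported ~28% share of energy over the period.
print(f"implied oil-fired output: ~{energy_mwh / 1e3:,.0f} GWh "
      f"(~{avg_output_mw:,.0f} MW average)")
```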

Heavy snowfall across the region on Jan. 24 and 25 hindered generators’ replenishment capabilities, he noted, adding that he expects to see a significant uptick in storage levels over the next couple weeks as oil consumption declines and generators continue to replenish their tanks.

He added that oil consumption by dual-fuel generators “contributed to a high demand for demineralized water trucks which were in short supply.”

While snowfall significantly limited the output of solar resources, wind resources generally performed well, averaging about 885 MW over the nine-day period.

Imports from neighboring regions averaged about 1,900 MW during the period, with about 52% coming from Québec and 41% from New York. However, flows reversed for about two days coinciding with the winter storm, with New England sending power to Québec amid tight conditions in the province. (See Hydro-Québec Halted NECEC Deliveries amid Reliability Concerns.)

George said these exports cleared in the day-ahead market and were not emergency exports.

ISO-NE also experienced by far its highest monthly costs in its new day-ahead ancillary services (DAAS) market, which the RTO launched in March. Some stakeholders already had expressed concerns about high costs in the new market, and monthly per-megawatt prices roughly doubled in January relative to December levels.

The ISO-NE Internal Market Monitor estimates that DAAS costs totaled $921 million between March and January, dwarfing the RTO’s projection of $140 million in annual costs.

In response to the spike in DAAS market prices, the Monitor recommended three “targeted market design adjustments,” with the support of ISO-NE.

They include upwardly adjusting how it formulates the strike price “to better align it with the short-run marginal costs of resources providing these ancillary services”; decreasing the forecast energy requirement “to reflect the expected contribution of renewable generation”; and considering decreasing the non-performance factor in the 10- and 30-minute operating reserve requirements in the day-ahead and real-time markets.

Taken together, the changes “represent narrow but meaningful refinements” that should “enhance cost effectiveness while remaining aligned with the core objectives of the DAAS design,” the Monitor wrote.

Also at the meeting, Chadalavada signaled an openness to considering changes to the region’s Pay-for-Performance rate, emphasizing the need to strike the right balance between setting strong performance incentives for capacity resources and avoiding excessive risk premiums in future capacity auctions.

He stressed the importance of both moving with agility to address potential market issues and building consensus among stakeholders to ensure durable solutions. He said ISO-NE aims to implement the proposed DAAS changes in time for next winter.

Several stakeholders expressed support for this sentiment and applauded ISO-NE for its performance throughout the cold weather event and for being open to market changes in response to cost concerns.

FBI Releases Critical Infrastructure Cyber Recommendations

The FBI has launched an initiative to help critical infrastructure operators and other entities strengthen the cybersecurity of their operational technology and information technology assets.

In a YouTube video posted Jan. 28, Brett Leatherman, assistant director of the FBI’s cyber division, described Operation Winter SHIELD (Securing Homeland Infrastructure by Enhancing Layered Defense) as a cyber counterpart to the winter preparations that infrastructure owners implement each year. Leatherman told listeners that even though winter storms “test our infrastructure … to their limits … the most critical threats to infrastructure don’t come from the weather, they come through our networks.”

The program’s launch came the same week that cybersecurity firm Dragos published a report blaming a group linked to Russia’s intelligence service for a cyberattack against Poland’s electric grid in December 2025 that targeted a system for managing renewable energy sources. (See Dragos Blames Electrum Group for Poland Grid Cyberattack.) In its report, Dragos wrote that attacking a power grid “in the depths of winter is potentially lethal to the civilian population dependent on it.”

The goal of the program is to position “industry not as passive victims or recipients of intelligence but as critical allies … in detecting, confronting and dismantling cyber threats,” the bureau wrote.

“In far too many cyber investigations, we see the same pattern,” Leatherman said. “Adversaries exploit known vulnerabilities [such as] stolen credentials, end-of-life systems [and] third-party access, and they take advantage of incident response plans that look great on paper but break down in practice.”

The Winter SHIELD campaign is built around 10 recommended actions developed by the FBI with input from domestic and international partners, based on adversary behavior and defensive gaps seen in recent cyber events. Each week during the campaign, the bureau will highlight a different action and its security benefits.

Among the FBI’s recommendations is adopting authentication measures that reduce the risk of phishing attacks, such as device-bound passkeys and security keys that comply with the FIDO2 standard developed by the FIDO Alliance, whose members include Microsoft and other technology companies, while phasing out riskier methods like text-based codes and authenticator apps with push-only approvals. The bureau also suggested adopting a risk-based vulnerability management program, including a complete asset inventory and aggressive timelines for remediating known risks.

More recommendations include tracking and retiring end-of-life technology, which no longer receives security updates and likely is targeted by cyberattackers, on a defined schedule; exercising tight control over data access by third parties; protecting security logs for detection, response and attribution; and maintaining offline backups and regularly testing restoration.

Finally, the FBI urged organizations to improve the speed and effectiveness of their incident response plans with regular testing.

“The goal is not to check boxes or push for perfection; we want to drive momentum,” Leatherman said. “Nation-state cyber operations are invisible until they aren’t. … Meanwhile, cybercriminals continue to steal our money and hold our data for ransom. But together, we can deny adversaries the digital real estate they need to operate and raise the cost of every attack.”

CAISO Examines ‘Pulsating’ Data Center Loads

CAISO wants to ensure grid reliability when artificial intelligence data centers “pulsate.”

A pulsating load can occur at AI training data centers after servers in the facility complete large computations. When these computations have finished, “all the load drops significantly, 80 to 90%, within seconds,” CAISO staff member Ebrahim Rahimi said at a Feb. 5 large load information session hosted by CAISO.

“That nature of the AI training comes with a number of potential issues,” Rahimi said. “The main thing is we don’t want a [grid] disturbance to happen and all the large load data centers drop off and stay off. They [could] have a large impact on the system frequency and voltage.”

As observed with AI training data centers, pulsating loads can excite critical system frequencies and drive forced oscillations, Rahimi said. These effects can cause resonance issues with nearby rotating machines, he said.
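
To illustrate why a sudden pulsation matters for frequency, the sketch below applies the textbook relation between a power imbalance and the resulting quasi-steady-state frequency deviation. The campus size and frequency-response value are hypothetical, not CAISO or WECC figures.

```python
# Minimal sketch: frequency rise after a sudden load drop, using
# delta_f = 0.1 Hz * delta_P / beta, where beta is the interconnection's
# frequency response characteristic in MW per 0.1 Hz. All inputs below
# are hypothetical illustrations.

def freq_rise_hz(load_drop_mw: float, response_mw_per_0p1hz: float) -> float:
    """Quasi-steady-state frequency rise from a sudden loss of load."""
    return 0.1 * load_drop_mw / response_mw_per_0p1hz

# Example: a 1,000-MW AI training campus shedding 85% of its load within
# seconds, on a system assumed to provide 900 MW of response per 0.1 Hz.
drop_mw = 1_000 * 0.85
print(f"approximate frequency rise: {freq_rise_hz(drop_mw, 900):.3f} Hz")
```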

The Feb. 5 information session was intended to help CAISO determine whether it needs to make certain policy changes for large loads. If policy changes are required, the ISO will begin a new stakeholder initiative.

“Currently, we don’t have a threshold for what constitutes a large load,” Danielle Mills, CAISO infrastructure policy development principal, said during the session. “So that is something that we’ll both take comments on and may develop over the next several months.”

Utilities are responsible for large load interconnections, but CAISO is monitoring developments at the federal level regarding whether RTOs and ISOs can or should be more heavily involved in the process, a Jan. 30 CAISO Large Load Considerations issue paper said.

Cost allocation rules for large loads could be particularly complex, especially rules for co-located load and generation facilities, the paper says. Often, it will be impossible to tell whether load or generation affected networked facilities, the paper says.

“As electric regulators are fond of saying, cost allocation ‘is not a matter for the slide rule,’” the paper says. “Enhancing large load cost allocation rules and responsibilities will thus require significant coordination across tariffs, including the ISO’s.”

Currently, CAISO includes large loads in its annual transmission planning process, and in recent years it approved projects to provide increased capacity in the Bay Area to support data center load growth, the paper says.

CAISO might develop new technical standards that govern large loads — an effort that resembles the NERC project to consider new technical standards for large loads, the paper says.

CAISO is looking for ways to allow large loads to participate in ISO energy and ancillary service markets more efficiently.

At the information session, one stakeholder asked if the focus of the paper is to see whether large loads can be made flexible.

“Flexible loads are a consideration, but we also have a parallel initiative going on right now — the demand and distributed energy market initiative — that is looking probably in more detail at flexible loads,” Mills said.

In July 2025, the California PUC partly approved a new rule to make it easier for AI data centers and other large customers to complete transmission connection projects in Pacific Gas and Electric’s territory. PG&E’s retail customer transmission interconnection demand has increased by more than 3,000% since 2023, utility representatives said. (See CPUC OKs New PG&E Rule to Speed Tx Connections for AI Data Centers, Others.)

Large Loads Forecast

California’s data center load could increase by 1.8 GW by 2030 and by 4.9 GW by 2040, according to the California Energy Commission’s demand forecast.

But large loads include more than data centers: Loads from EV charging stations and electrified agricultural and industrial equipment also are expected to increase significantly over the coming years.

EV charging load is forecast to be about 2.4 GW at the peak hour in 2030 and about 7.8 GW in 2040, the CEC forecasts. These amounts represent about 5% of the system’s load at peak in 2030 and about 13% of the system peak in 2040, the CEC told RTO Insider.
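
Those shares imply an overall system peak. The quick arithmetic below uses only the rounded percentages and gigawatt figures quoted above, so the results are approximate.

```python
# Implied system peak from the EV charging forecast and its share of peak.
ev_2030_gw, ev_share_2030 = 2.4, 0.05
ev_2040_gw, ev_share_2040 = 7.8, 0.13

print(f"implied system peak, 2030: ~{ev_2030_gw / ev_share_2030:.0f} GW")  # ~48 GW
print(f"implied system peak, 2040: ~{ev_2040_gw / ev_share_2040:.0f} GW")  # ~60 GW
```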

Charging for electrified agricultural equipment such as tractors, specialized mobile equipment and all-terrain vehicles at farms would constitute approximately 16 MW at the peak hour in 2030 and about 51 MW at peak in 2040, the CEC said.

The agency requests updated data from the state’s investor-owned utilities and other utilities three times per year. It uses the data from the first request to develop the draft energy demand forecast, the second to finalize the energy demand forecast, and the third for additional details that inform the dataset it develops for CAISO’s transmission planning process.

Xcel, NextEra to Partner on Generation for Data Centers

Xcel Energy’s leadership says a partnership with NextEra Energy will allow its operating companies to contract up to 6 GW of data center capacity by the end of 2027, with sales and generation investment ramping into the next decade.

“We think there’ll be increased clock speed as we think through combining the best sales teams, the best development teams and the best analytical teams in the country to deliver solutions for a very sophisticated customer set,” Xcel CEO Bob Frenzel told financial analysts during the company’s Feb. 5 year-end earnings call.

The day before, the companies had announced a memorandum of understanding to co-develop generation, storage and interconnections for data center projects. They said the agreement will support existing and new large load opportunities across Xcel’s service territories by better anticipating system needs, streamlining development timelines and advancing innovative grid technologies.

“It brings scale and the ability to put an inflection point in the curve of data center delivery and signed [energy services agreements] and contracts and, ultimately, investment opportunities in all three of our big regions,” Frenzel said.

He said conversations with data center developers have affirmed Xcel’s position that they don’t want to own and operate their own generation.

“We don’t want you to take existing supply out of the stack,” Frenzel said. “[Data centers] would rather have someone own and operate for them in a deregulated market. That means me working with the developer to build that generation, leave it through a regulated utility and sell it to the customer.”

Xcel has more than 2 GW of new contracted data center capacity and a 3-GW goal by the end of the year. The company has more than 20 GW of capacity in its large load pipeline.

The Minneapolis-based company reported 2025 diluted earnings of $2.02 billion ($3.42/share), compared with $1.94 billion ($3.44/share) for 2024. It attributed the results to increased recovery of infrastructure investments and sales growth, partially offset by higher interest, depreciation, and operating and maintenance expenses.

Xcel reaffirmed its 2026 guidance range of $4.04 to $4.16/share. It has met or exceeded its ongoing earnings guidance for 21 consecutive years.

The company’s share price Feb. 5 closed at $76.12, down 8 cents from its previous close.

In a nod to the violent immigration enforcement taking place in Xcel’s hometown, Frenzel said he was pleased to sign an open letter alongside more than 60 other CEOs urging a solution to the turmoil.

“It goes without saying that the tragic events across the Twin Cities have weighed heavily on our communities, our customers and our employees,” he said. “We have engaged extensively and proactively with senior federal, state, local and community officials with a goal to de-escalate and identify a sustainable path forward.”

The Xcel Energy Foundation has committed to help fund the Minneapolis Foundation and support local and small businesses affected by the events.

Hairston Poised to Leave BPA, Join EWEB

The Eugene Water & Electric Board has voted to select Bonneville Power Administration CEO John Hairston as its next general manager, though the utility noted that no final decision has been made pending further negotiations over a compensation package.

EWEB’s Board of Commissioners voted Feb. 3 to select Hairston following a nationwide search for a general manager that started in September, according to a Feb. 4 news release from the Oregon municipal utility.

“We are aware of the reports regarding BPA Administrator John Hairston,” BPA spokesperson Kevin Wingert told RTO Insider in an email. “While he has been identified as a candidate for another position, the process is ongoing and no final decision has been made. Until any decision is finalized and formally announced, Hairston remains fully engaged in his role as administrator and CEO.”

EWEB has yet to make a formal offer and is negotiating Hairston’s salary package. If negotiations are successful, Hairston will replace Frank Lawson, who in September announced plans to retire.

“It’s an honor for us to have someone at that level with that degree of integrity interested in this position,” Lawson said in a statement. “I have a lot of respect for John Hairston.”

Lawson’s exact retirement date is still open-ended, but he’s expected to step down sometime this spring, EWEB spokesperson Aaron Orlowski told RTO Insider. Negotiations with Hairston are expected to conclude within the next couple weeks, he said.

The final salary package must be approved in a public vote by the utility’s board, said Orlowski, who confirmed the posted pay range for the position is $350,000 to $475,000/year. In 2025, the board set Lawson’s total compensation at $405,564 annually.

EWEB said in its news release that 18 people applied for the role of general manager, and the board selected two finalists for additional interviews.

“We saw a very clear picture. We saw a very clear vision from both candidates,” EWEB Commissioner Tim Morris said. “I think the vision lines up with our mission and vision and values as an organization.”

Hairston assumed the role of BPA administrator in January 2021 after former chief Elliot Mainzer left the agency to become CEO of CAISO. Hairston joined the agency in 1991 and worked as chief operating officer and chief administrative officer. (See Hairston Appointed BPA Administrator.)

Hairston has guided BPA through significant decisions both for the agency and the region. For example, following a lengthy and sometimes heated stakeholder process, BPA decided in May 2025 to join SPP’s day-ahead market option Markets+ instead of CAISO’s Extended Day-Ahead Market. (See BPA Chooses Markets+ over EDAM.)

Under Hairston, BPA paused certain transmission planning processes and launched the Grid Access Transformation project to tackle an unprecedented interconnection queue. The most recent study includes 61 GW of new generation, compared with 5.9 GW in 2021. (See BPA Tx Planning Overhaul Prompts Concern for Northwest Clean Energy Compliance.)

Hairston’s potential departure would come after an especially tumultuous year for BPA on the staffing front. Like other federal agencies, BPA in 2025 confronted an exodus of experienced employees after the Trump administration offered federal workers buyouts and imposed a blanket hiring freeze — despite the power marketing administration’s status as a self-funding entity. (See BPA Employees Confront Trump’s ‘Fork in the Road’.)

BPA lost about 200 workers — 6% of its workforce — and rescinded 90 job offers because of those policies. As of late 2025, BPA was still looking to fill 155 positions after its hiring freeze was lifted. (See BPA Looks to Fill 155 Positions After Hiring Freeze.)

Cleanview: Data Centers’ Speed-to-market Goals Lead to Inefficient Gas Generation

Many of the hyperscale data centers being built around the country are using less efficient, dirtier natural gas generation as part of their race to get more computing power online, says a new report from clean energy advocate Cleanview.

Some 46 facilities with 56 GW of power demand are planning to build their own behind-the-meter generation, which represents 30% of all planned data center capacity in the country, according to Cleanview research.

“There’s been this huge surge in data center demand and data centers wanting to connect to the grid, and that has resulted in the timeline to connect to the grid exploding,” Cleanview CEO Michael Thomas said. “It can now take as long as seven years in some markets like Virginia to connect. And then it’s also put a huge amount of pressure on turbine supplies.”

Just three manufacturers make the most efficient combined-cycle natural gas turbines, and the wait times for them have grown in recent years. Some had thought that combination would throttle data center development, but Thomas said developers have found creative ways to secure generation capacity, with many facilities already under construction.

“What these data center developers are doing is installing gas turbines on semitrucks and driving them in so they can install them in weeks, not years,” Thomas said. “They are repurposing aero-derivative turbines that were originally designed for airplanes, warships and in some cases even cruise ships. And then they’re using these backup generators and engines that companies like Caterpillar have traditionally sold as backup power to be used a small number of hours in a year, and they’re using those essentially 24/7.”

Those types of generators are less efficient than combined-cycle plants, and they produce more pollution, whether local pollutants like nitrogen dioxide that can sicken their neighbors or climate pollution.

The Stargate Project in New Mexico, being built by OpenAI and Oracle, is about 2 GW and will emit 15 million tons of CO2 per year.

“Over the last 20 years, New Mexico, as a whole state, has decarbonized its economy by 15 million tons, and they’re one of the leaders,” Thomas said. “And so that single data center would wipe out all of the state’s decarbonization.” (According to its website, Cleanview’s mission is “to accelerate the clean energy transition.”)
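
A rough, illustrative conversion of the reported Stargate figures into an implied emissions intensity is below. It assumes near-continuous operation at the stated 2 GW, an assumption of this sketch rather than anything in the report; for comparison, combined-cycle gas plants typically emit roughly 0.35 to 0.4 tons of CO2/MWh.

```python
# Implied CO2 intensity from the reported 2 GW and 15 million tons/year.
# The capacity factor is an assumption of this sketch, not a reported value.

CAPACITY_MW = 2_000
EMISSIONS_TONS_PER_YEAR = 15e6
CAPACITY_FACTOR = 0.9   # assumed near-continuous operation

annual_mwh = CAPACITY_MW * 8_760 * CAPACITY_FACTOR
intensity = EMISSIONS_TONS_PER_YEAR / annual_mwh

# ~0.95 t CO2/MWh under these assumptions -- well above typical combined-cycle
# intensities, in line with the report's point about less efficient generation.
print(f"implied intensity: ~{intensity:.2f} t CO2/MWh")
```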

Massive data center developments using whatever generation they can get their hands on are a growing trend, one that started in Memphis, Tenn.

“A little more than a year ago, this was just a niche phenomenon,” Thomas said. “xAI, famously owned by Elon Musk, was one of the first to do it in Memphis. A few others have kind of experimented with it, but it was really niche. Now it’s become one of the most popular strategies. So, 90% of the projects that we identified, representing about 50 gigawatts, were announced in 2025 alone. We’ve seen this huge explosion in that trend.”

The report was based on data from the facilities’ permits, SEC filings, utility regulatory filings and press releases, though Thomas noted the press releases often focus on cleaner generation and leave out the use of inefficient generators.

Musk’s Colossus AI data center was built in an area of Memphis that already was overburdened with pollution, and that led to significant pushback. The NAACP sued xAI over air permits and has launched an effort to fight similar projects as they arise. (See NAACP Event Examines Data Center Impact on Environmental Justice.)

Most of the data centers identified in the report are being built in more rural areas, in part to avoid the political pushback encountered by xAI, but also to gain better access to natural gas and an easier permitting process. Only one of the data centers from the report using behind-the-meter generation is in a city, Thomas said.

That data center, in San Jose, Calif., “will be built with Bloom fuel cells, which results in far less air pollutants,” he added.

Wind, solar and storage do not face the same supply timelines as combined-cycle generation, but their greater land requirements can make developments more difficult, Thomas said.

“These are already thousands of acres for the data center alone, and then if you add on top of that many more thousands of acres for solar and wind, data center developers might be concerned that it’s just harder to lock up that land, or it’s harder to permit that, or maybe it sparks more backlash,” Thomas said.

Across the 56 GW highlighted in the report, there are significant differences in how the data centers plan to relate to the grid in the near term and over time, said former FERC Commissioner Allison Clements. She’s now a consultant at 804 Advisory and a partner at Appleby Strategy Group, which advises data center developers.

“For some, onsite generation is intended as a bridge until more reliable grid power becomes available,” Clements said. “In any case, the report affirms that the high-octane speed-to-power craze is real, and that substantial capital is willing to pay big bucks and take on stranded-cost risk.”

The enduring strength of that demand is uncertain, but utilities would be wise to unlock more capacity on their systems quickly, she said.

“Regulators can support these efforts by moving swiftly to align incentives around fast, lower-cost tools like advanced transmission technologies, rapid battery deployment and portfolios of distributed energy resources,” Clements said.

The Cleanview report’s findings also caught the attention of another close watcher of growing power demand: Grid Strategies Vice President John Wilson highlighted the report at the National Association of State Energy Officials conference Feb. 4.

Wilson is behind the firm’s load forecasting reports, which show up to 90 GW of data centers planned to come online in the next five years, though that could be limited to 65 GW because of chip shortages. (See Grid Strategies: Pace of Load Growth Continues to Speed up.)

The Cleanview report shows that many of those data centers are not using the cleanest gas generation, he said.

“Most of this natural gas generation that’s going in is not highly efficient, modern gas generation, it is less efficient — whatever they can get, literally generators-on-the-back-of-a-truck kind of generation,” Wilson said. “This is what is in their permits.”

In five years, the industry could add more data center demand to the national grid than ERCOT’s record peak, Wilson said. “A year ago, we were really not looking at this, and now 40% plus of the large load growth is from these gigawatt-scale data centers, 500 MW-plus — mostly AI,” Wilson said. “The idea of a gigawatt-scale load was just not something that most utilities considered a possibility five years ago, much less 2, 3, 4 GW at a single location.”

Traditionally such major power demands would be served by combined-cycle turbines, but the demand for those has led to lengthy lead times, which clash with the massive financial incentives for data center developers, Thomas said.

“A data center like this, built by a developer, can, right now, sell that capacity for between $10 billion and $12 billion per gigawatt,” he added. “So, the opportunity of coming online in just six months or a couple years early is huge, and so they’re willing to pursue these strange strategies.”

AI applications are not making that much money yet, but the firms involved in the industry, like Meta and Microsoft, are among the largest companies in history, with massive balance sheets. They worry about being left behind by a potentially major leap forward in technology. They had been sitting on large stores of cash for years, which now are being spent on data centers and related infrastructure, Thomas said.

The trend the Cleanview report put firm numbers around had been picked up by the Energy Information Administration (EIA), which recently posted about the possibility of using old jet engines from a facility on Davis-Monthan Air Force Base in Arizona colloquially known as “the Boneyard.” Data centers in Texas recently deployed modified jet engines as generators that can each produce 48 MW, EIA said.

The engines from the fallow planes at the desert facility could produce a total of 40 GW, which exceeds the current installed generating capacity in the state of Arizona by 10%, EIA said. But the engines are old, averaging more than a decade in age, and the military has its own uses for them — so the capacity actually available is far less.
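
Working backward from the figures EIA cited, the short sketch below (illustrative arithmetic, not EIA’s own calculation) shows what the 40-GW estimate implies about the number of engines and Arizona’s installed capacity.

```python
# Implications of the 40-GW "Boneyard" estimate, using figures quoted above.
UNIT_MW = 48        # output of one modified jet-engine generator
TOTAL_GW = 40       # EIA's estimate for the stored engines

engines_implied = TOTAL_GW * 1_000 / UNIT_MW
az_capacity_implied_gw = TOTAL_GW / 1.10   # "exceeds ... by 10%"

print(f"implied number of engines: ~{engines_implied:,.0f}")                    # ~833
print(f"implied Arizona installed capacity: ~{az_capacity_implied_gw:.0f} GW")  # ~36 GW
```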

The burst of data centers being built with creatively sourced generators means additional demand for natural gas, which increasingly is being exported via LNG and faces higher demand from the combined-cycle plants that also are being built, said Public Citizen Energy Program Director Tyson Slocum.

“The era of cheap gas is over,” Slocum said. “All of this new gas, built for power generation, is going to be very expensive.”

In 2025, the eight LNG export terminals used more of the fuel than the 74 million Americans served by natural gas utilities, he added.

So far, data centers have affected power prices most visibly in PJM, where its capacity prices have surged as its reserve margins have narrowed. While supply might catch up to demand and lead to lower capacity prices eventually, Slocum asked how many more billions of dollars that would take.

“I’ve heard this argument in competitive markets since the beginning — ‘Well, folks just need to pay a little more, and then the market will balance itself out,’” Slocum said. “And then right when it’s supposed to balance out, they’ll say, ‘Gosh, we need more transmission.’ Or whatever the argument is going to be, there’s always some caveat.”

If data centers are not part of an AI bubble and LNG exports continue unabated, gas prices will be high, and that translates directly into higher energy prices across the country, he added. Layering the legitimate pollution concerns on top of those costs, he said, raises questions about the value of the “AI race.”

“My big issue here is that we’ve got big tech and their supporters in the administration saying the artificial intelligence race is of national security importance,” Slocum said. “Well, says who? Says a bunch of tech companies that stand to make massive profits by commodifying and locking us into their products? We have not had a national conversation about the scope of AI’s application in our society or in our economy.”

Pilot Project will Site Small Data Centers Near Stranded Power

A new collaboration is working to develop models for the faster setup of smaller-scale, real-time data processing centers.

EPRI, InfraPartners, NVIDIA and Prologis will assess the ways data centers in the 5- to 20-MW range can be built at or near utility substations that have available capacity.

The effort was announced Feb. 3 at the DTECH transmission and distribution conference in San Diego.

The goal is to speed deployment by making better use of underused infrastructure. The partners hope to have at least five pilot sites in development nationwide by the end of 2026, and develop a replicable, scalable model for wider use.

The focus is on inference data processing, which supports artificial intelligence in nearly every sector of the economy, EPRI said.

Unlike AI model training, which often is carried out in larger facilities over longer time frames, AI inference provides real-time responses and can work from smaller facilities.

When AI inference is distributed, rather than centralized at a single hyperscale facility, it is closer to the end users of data, which can reduce response time.
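
The physics behind that response-time claim is simple propagation delay: light travels through optical fiber at roughly 200 km per millisecond, so round-trip time scales with distance to the data center. The distances in the sketch below are hypothetical, and real-world latency adds routing, queuing and processing delays not modeled here.

```python
# Propagation-only round-trip time over fiber for hypothetical distances.
FIBER_KM_PER_MS = 200.0   # light in fiber travels at ~2/3 the vacuum speed

def round_trip_ms(distance_km: float) -> float:
    """Fiber propagation round trip; ignores routing, queuing and processing."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (50, 500, 2_500):   # hypothetical distances to an inference site
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.1f} ms round trip")
```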

EPRI said this edge-of-grid distribution also can reduce transmission congestion, improve system flexibility and help integrate renewable energy.

EPRI President Arshad Mansoor said: “This collaboration with Prologis, NVIDIA, InfraPartners and the utility community highlights the type of innovative actions required to meet the moment. Using existing grid capacity to bring inference compute closer to where it’s needed — quickly and reliably — is a win for all.”

Power industry R&D organization EPRI will identify areas with capacity and fiber connections that could host the pilot projects; later, it will collect and analyze the results to inform future best practices.

Industrial real estate investment trust Prologis will identify suitable land and buildings that could be used for rapid deployment and will coordinate development and planning.

Graphic processing unit designer/manufacturer NVIDIA will deliver optimized computing platforms, offer technical guidance and facilitate connections to potential customers.

Data center builder InfraPartners will provide AI data centers manufactured offsite and designed for high-density power and cooling.

Participating utilities will assess distribution capacity, guide siting and interconnection, and ensure operational requirements are met.

Marc Spieler, senior managing director for the global energy industry at NVIDIA, said: “AI is driving a new industrial revolution that demands a fundamental rethinking of data center infrastructure. By deploying accelerated computing resources directly adjacent to available grid capacity, we can unlock stranded power to scale AI inference efficiently.”