Power Constraints Challenge the AI Boom

By James O'Shea
An Amazon data center adjacent to a nuclear power plant in Pennsylvania. (AFP)

As America’s big artificial intelligence (AI) companies unveil plans to build multibillion-dollar data centers, their digital ambitions to support miraculous AI technologies face a far more mundane roadblock: electrical power.

Creating and sustaining AI systems requires massive amounts of electricity. To meet the demand, AI companies are embarking on an unprecedented investment spree in data centers, warehouses of computers, and the power gear needed to run AI systems around the clock, 365 days a year. Indeed, data centers now rank among the fastest-growing sources of electricity demand, driven by the supercomputers of the future. Consulting firm Deloitte estimates that power demand from AI data centers could grow more than 30-fold in less than a decade, reaching 120 gigawatts (GW), up from just 4 GW in 2024.

Power demand from AI data centers could grow more than 30-fold in less than a decade, reaching 120 gigawatts

Yet utility companies designed their electrical systems for a different era. Much of America's power infrastructure dates back to the 1960s and '70s. The grid, which connects power companies to a web of energy resources, is old, fragmented, capital-intensive, and sorely in need of an upgrade to meet the surge in demand from AI companies. America is not alone in its power problems. Europe, too, struggles with limited capacity amid energy deficits caused by the war in Ukraine. And Asian countries are racing to implement upgrades designed to attract AI investment. Just as problematic, all of these systems face time, structural, and regulatory constraints that could easily take the wind out of financial markets now suspected of inflating a bubble in AI stock valuations, though some analysts say the valuations are justified.

The US power system, with its aging electrical grid, presents a unique problem. (AFP)

AI Data Centers Could Lose Their Spark

Of all the power systems in the world, America's confronts one of the steepest conundrums. Companies such as OpenAI, the developer of ChatGPT, plan to spend billions of dollars on data centers to develop generative AI, systems their builders say will eventually perform tasks and jobs better than humans. Setting aside criticism that AI companies exaggerate how close they are to achieving those lofty goals, the industry has launched an unprecedented spending spree to build the data centers meant to reach them. Analysts at the investment bank Morgan Stanley say tech companies and their partners worldwide will spend about $3 trillion by 2028, requiring them to borrow nearly $1 trillion from banks and investors to finance their data dreams.

AI companies need massive amounts of power for two reasons. First, training AI models like GPT-4 or Claude requires thousands of specialized chips running simultaneously for weeks or months to perform trillions of mathematical calculations, the computations that adjust a model's parameters to fit its training data. A single training run for a cutting-edge model can burn as much power as hundreds of American homes consume in a year.

Second, simply running AI services requires significant electric power. Every time someone queries OpenAI's ChatGPT or Anthropic's Claude, or generates an image, computer servers must process the request. When millions of people use these services simultaneously, data centers must run continuously, placing significant strain on the power systems that energize the machines. The specialized chips performing the relentless calculations also generate heat, requiring additional energy for cooling systems.

The result challenges the capacity of existing power systems that were originally designed to keep the lights on, not to meet the power needs of competing AI giants with open wallets.

The result challenges the capacity of existing power systems that were originally designed to keep the lights on, not to meet the power needs of competing AI giants

US Power Grid: Old and Inadequate

Although the development of generative AI creates challenges worldwide, America’s power system, with its fractured electrical grid, poses a unique problem. For historical and technical reasons, the current US power grid evolved into an ungainly system with three main interconnections: one serving the Eastern US, one serving the West, and one serving only Texas. The tripartite grid structure serves hundreds of different utilities, ranging from investor-owned corporations to city-owned systems. Regulatory authority is split among the federal government, public utility commissions, and regional transmission organizations, each using different yardsticks to oversee the regulated rates customers pay. If one section of the country runs short of power, it can technically cover the deficit by tapping into another regional grid that possesses an energy surplus.

The trouble is that the three American interconnections have a limited ability to exchange power with one another, partly because the equipment for efficient connections is dated and partly because of a functional mismatch. Engineers designed the grid system mainly for emergencies and small power transfers, not for the large-scale commercial operations required by the data centers envisioned by AI companies. Power and utility companies also designed transmission lines to move electricity across relatively short distances, from power plants to substations to consumers. AI development would concentrate demand in clusters of data centers where transmission lines are inadequate to support the gigawatt-scale loads they require. Building new transmission lines can take a decade or more because of permitting (licensing) battles, land acquisition disputes, and interstate regulatory conflicts.

Texas Grid Vulnerable to State’s Maverick Ways

OpenAI's Stargate project demonstrates the challenges AI companies face in meeting the anticipated demand for ChatGPT. OpenAI, which has yet to turn a profit, says it and some partners plan to spend $500 billion on numerous data centers under the Project Stargate umbrella. Two of the data centers are in Texas, which will make them part of ERCOT, the grid run by the Electric Reliability Council of Texas.

ERCOT's independence from other grids is a matter of pride in Texas; it allows the state to avoid federal regulation. But its maverick status also makes it vulnerable. ERCOT's limited ties to the Eastern and Western grids mean the state often can't import power easily, as during a 2021 winter storm that knocked out 30 to 40 percent of its electric generation capacity. If the peak electricity needs of Stargate's Texas data centers coincide with the rest of the state's demand, OpenAI could face a disaster. An AI training run interrupted by grid instability can corrupt weeks of computational work, wasting both time and the enormous amount of energy already expended.

AI companies facing power shortages will struggle to get help from elsewhere, such as Europe. When Vladimir Putin shocked the European Union by invading Ukraine, the EU struck back with sanctions, banning imports of most of Russia's cheap oil. The embargo fed an energy crisis that increased Europe's reliance on expensive liquefied natural gas (LNG), accelerated the adoption of renewable energy, and reduced industrial demand. The crisis, though it has eased in recent months, led to high electricity and energy costs, hurting EU citizens and the competitiveness of European industries.

Asia Too Faces Power Crunch

Even in the absence of any EU market turmoil, Europe couldn't buy or sell power with America: transmission lines spanning oceans are economically unfeasible with current technology.

That makes Asia a non-starter too, not only because of transmission issues but also because the region faces a significant electrical capacity shortfall for its own data centers. In Southeast Asia, data center electrical demand is expected to more than double by 2030, driven by regional hubs in Singapore and southern Malaysia that are already under strain. Singapore, a major data center hub, imposed a moratorium on new data centers from 2019 to 2022 due to severe power and land constraints. The city-state recently announced it would free up 300 megawatts of data center capacity in the near term, likely by reallocating resources and adopting efficiency enhancements. Other Asian nations face similar capacity constraints.

Even in America, where AI companies are leading the charge, several cities have delayed or blocked data center construction. Data centers can drive up utility bills, a particularly sensitive subject for local politicians. In Prince William County, Virginia, a $24.7 billion data center construction project is facing delays due to legal challenges and local opposition. It is one of 16 projects worth $64 billion that have faced delays or roadblocks since 2023, according to Data Center Watch, which tracks opposition to data centers in the US.

Trump’s Executive Order to Ease Problems

A fundamental challenge common to Asia, Europe, and the US is that improving power grid infrastructure for data centers takes 5 to 10 years, counting planning, licensing, and construction, while the data centers themselves can be operational within 2 to 3 years. The result is a timing mismatch that applies across much of the world.

A fundamental challenge common to Asia, Europe, and the US is the time it takes to improve power grid infrastructure for data centers

The Trump administration issued an executive order in July 2025 to start addressing some of the problems. The order calls for streamlining permit processes, accelerating data center grid connection requests, preventing "premature" decommissioning of older power plants that rely on fossil fuels such as coal, and imposing new barriers on wind and solar power, sources that critics of the order say are essential to meeting the surging demand.

Although the administration’s approach could help alleviate some problems, it has drawn criticism from environmental groups. At the same time, some industry advocates question whether executive action alone can achieve the reforms they believe are needed to resolve most problems.

Companies want to build their own data centers and solutions. (AFP)

AI Advocates Want to Go Nuclear

Companies that want to build data centers have their own solution: they want to go nuclear. Microsoft recently signed a 20-year agreement to purchase electricity from the restarted Three Mile Island nuclear plant, the site of a partial meltdown in 1979. Amazon has announced more than $20 billion in investment to build AI-ready data center campuses in Pennsylvania powered by carbon-free nuclear energy, and has purchased a data center adjacent to a nuclear power plant in the state. Some advocates urge quick approval of smaller nuclear plants adjacent to planned data centers.

Unlike wind and solar power, which generate electricity intermittently, nuclear power plants typically generate power constantly, supplying the grid with enough energy to run continuously 24/7, the schedule that data centers covet. Companies also want to explore on-site power generation with nuclear microreactors and dedicated renewable power installations that have power storage mechanisms.

Although some of the administration's and the AI industry's proposals could be beneficial, analysts question whether the industry can implement them within the desired time frame. The nuclear projects, for example, will not begin to make a dent in demand until around 2030, and the modest size of the next generation of reactors means dozens of new plants would have to be built to meet the annual needs of large tech companies' projects, a tall order.

Electric Needs and New Power Equation

The grid’s shortcomings won’t necessarily stop AI in its tracks, but they will shape where, when, and how development occurs. A nation that wants to lead in AI must recognize that the path to dominance runs through copper cables, transformer substations, and transmission lines as well as through semiconductor advances, more powerful chips, and neural network architectures.

A nation that wants to lead in AI must recognize that the path to dominance runs through copper cables, transformer substations, and transmission lines

AI companies' hunger for power exposes challenges that government, industry, and AI advocates must address. The ability to generate electricity is a crucial resource that governments worldwide should master, particularly given the winner-takes-all mentality of the AI giants. If these conflicting power demands go unresolved, capital and capacity will flow toward countries willing to accommodate the AI juggernaut. A new form of geopolitical competition will flourish in which technological leadership and the capacity to deliver energy will prevail, with gigawatts, or even terawatts, as the prize.

James O’Shea

James O’Shea is an award-winning American journalist and author. He is the past editor-in-chief of The Los Angeles Times, former managing editor of the Chicago Tribune, and chairman of the Middle East Broadcasting Networks. He is the author of three books, including The Deal from Hell, a compelling narrative about the collapse of the American newspaper industry. He holds a master’s degree in journalism from the University of Missouri.