The AI Race Isn’t Just About Chips. It’s About Power.

By David Hodges

The global competition around artificial intelligence is usually framed as a race for chips, models, and computing capacity. That is only part of the story. The binding constraint is increasingly electrical power. In the U.S., data centers already account for roughly 4% to 5% of electricity use, and EPRI projects that figure could rise to 9% to 17% by 2030 as AI deployment accelerates.

That reality should change how we talk about AI leadership.

AI is not just a software race. It is an infrastructure race. The countries that can most quickly permit, build, connect, and operate reliable power will have a decisive advantage in the next phase of digital growth. Every breakthrough in artificial intelligence still depends on physical systems: generation, substations, transmission, cooling, backup architecture, and the industrial work required to make it all run.

The scale of demand is no longer hypothetical. EPRI estimates U.S. data center electricity use in 2024 at roughly 177 to 192 terawatt-hours, with the potential to climb significantly by the end of the decade. New AI-oriented campuses are no longer 20- or 40-megawatt facilities, but increasingly in the 100-megawatt to 1-gigawatt range. That is no longer a conventional commercial load. It is grid-shaping infrastructure.

Meanwhile, the grid is already under pressure. Lawrence Berkeley National Laboratory reports that nearly 2,300 gigawatts of generation and storage capacity were seeking interconnection at the end of 2024. These queues are a reminder that demand growth does not automatically translate into available power. A project is not "powered" because someone wants to build it; it is powered when generation, transmission, interconnection, and on-site systems align on schedule.

That is why reliability matters as much as supply. AI infrastructure does not tolerate instability. These facilities require high-density power, continuous uptime, advanced cooling, and redundant systems that perform exactly as designed. For many projects, the question is not just whether enough electrons exist on paper, but whether reliable, dispatchable power can be delivered when the facility needs to go live.

This is where the policy conversation often misses the mark. The debate still centers on semiconductor manufacturing and model development. Those are critical issues, but the economic winners in AI will also be determined by permitting speed, grid upgrades, transmission buildout, generation strategy, and the industrial workforce capable of executing complex facilities under hard deadlines.

A practical energy mix will be required. Renewables will continue to expand and should remain part of the buildout, but the operational profile of AI infrastructure means firm, dependable power will remain essential. In the near term, that points to a larger role for natural gas, hybrid systems, grid modernization, and, in some cases, on-site or dedicated power solutions.

There is also a broader local dimension. Communities are right to ask what large data center clusters mean for land use, water demand, local grids, and long-term planning. The answer is not to stop building, but to plan better, site smarter, invest earlier, and be clear-eyed about what this infrastructure requires. Done well, these projects can expand the tax base, create jobs, and strengthen regional energy systems. Done poorly, they create friction, delay, and distrust.

The AI race will not be decided by algorithms alone. It will be decided by whether we can build the physical backbone on which advanced computing depends: power plants, transmission, substations, cooling systems, and the teams that bring it all online.

The next chapter of AI will be measured in gigawatts.

David Hodges is Chief Operating Officer of LP Energy Services Group, an industrial services company supporting energy, infrastructure, and mission-critical facilities across North America.

This article was originally published by RealClearEnergy and made available via RealClearWire.

5 Comments
KevinM
March 21, 2026 6:36 pm

If you build it in USA or Europe you get higher cost.
If you build it in China you get mandatory answers that flatter CCP.
India looks like the answer to me… if you trust that government to last.

Intelligent Dasein
March 21, 2026 6:49 pm

Energy costs are about to skyrocket due to the ill-advised war with Iran, and AI is a meme stock bubble that does not produce a real return on investment. Ergo, it’s not happening.

We’re going to see an unprecedented crash of the G7 very soon.

Reply to  Intelligent Dasein
March 21, 2026 7:16 pm

I kind of agree. Energy prices are going to be volatile in the near term, and they are going to be a problem for some regionally (looking at you Germany); but overall, the damage is not going to be that great.

I liken AI to the internet bubble. Yes, there is a big crash and subsequent shakeout ahead, but the survivors will emerge as well-positioned near-monopolies. Think Sears vs. Amazon.

GeorgeInSanDiego
March 21, 2026 7:00 pm

I’m more concerned about the possible social upheaval. Artificial intelligence looks to me to be about to do to white collar workers what automation did to blue collar workers.

March 21, 2026 7:09 pm

To give you an idea of just how fanatical the AI developers are about reliability, I know of a project that proposes 300% gas turbine capacity (one operational, one spinning reserve, one on cold standby), fed by a dedicated gas pipeline. On top of all of that, they were looking at on-site compressed natural gas storage because "What if there is a problem with the pipeline?"