The Two Biggest Myths About AI Data Centers

By Ross Pomeroy

Across the country, Americans are flocking to city council, planning commission, and water board meetings trying to stop data centers from being built. Driving this NIMBYism are two main arguments: data centers will drive up electricity rates, and data centers will guzzle local water resources. Data centers are prodigious consumers of both power and water, so these views are understandable. They are also (mostly) wrong. 

Let’s examine them in order.

First, data centers have historically reduced inflation-adjusted electricity rates, or at least kept them in check. They’ve done this because power markets in the U.S. do not simply operate on supply and demand. For power to be available any time you want it, the grid has to produce and carry just as much or more electricity than what customers collectively request. But this demand fluctuates heavily, and many large power plants can’t simply be turned off or on at a moment’s notice. Moreover, solar and wind power aren’t always available when they are needed most. This means that utilities typically build more power plants than necessary to ensure reliability during peak demand. As a result, America’s utilities operate at an average load factor – the actual amount of energy consumed compared to total potential energy that could have been used based on peak demand – of just 53 percent, according to a recent analysis from Duke University.*

Unlike intermittent sources of demand, like lighting or air conditioning, data center electricity demand is both large and consistent, permitting utilities to utilize more of their grid capacity. With higher load being pushed through the grid, fixed generating and infrastructure costs can thus be spread across more consumers, reducing costs for all. This explains why Charles Rivers Associates recently found that data center buildouts did not trigger increases in retail utility rates over the past decade.
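The load-factor arithmetic above can be sketched numerically. The 53 percent figure is from the article; the dollar and megawatt values below are hypothetical, chosen only to illustrate how a large, steady load spreads fixed grid costs over more delivered energy:

```python
# Illustrative sketch of the load-factor effect described above.
# The 53% load factor is from the article; all dollar and MW figures
# are hypothetical, chosen only to show the arithmetic.

def load_factor(avg_demand_mw: float, peak_demand_mw: float) -> float:
    """Average demand as a fraction of peak demand."""
    return avg_demand_mw / peak_demand_mw

def fixed_cost_per_mwh(annual_fixed_cost: float, avg_demand_mw: float) -> float:
    """Fixed generation/wires costs spread over the energy actually delivered."""
    hours_per_year = 8760
    return annual_fixed_cost / (avg_demand_mw * hours_per_year)

# A grid with a 10,000 MW peak running at the ~53% load factor cited above.
fixed_cost = 2_000_000_000  # $/yr of fixed costs (hypothetical)
base_avg = 5_300            # MW average demand -> 53% load factor
print(load_factor(base_avg, 10_000))                      # 0.53
print(round(fixed_cost_per_mwh(fixed_cost, base_avg), 2)) # ~$43/MWh

# Add a 1,000 MW data center running flat out: if the steady load barely
# moves the peak, the same fixed costs spread over more MWh.
with_dc = base_avg + 1_000
print(load_factor(with_dc, 10_000))                       # 0.63
print(round(fixed_cost_per_mwh(fixed_cost, with_dc), 2))  # ~$36/MWh
```

The point of the sketch is only the direction of the effect: a higher load factor means more megawatt-hours carrying the same fixed costs, so the per-MWh burden falls for everyone.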

The speed of the current AI data center buildout could imperil this historical trend, however. If demand rises too quickly, faster than the grid can adapt, costs will almost certainly rise for ratepayers. But this scenario can be prevented provided policymakers incentivize or require data center operators to install their own on-site power generation or battery storage, or mandate that they pay for any necessary grid upgrades to deliver the power they need. In part due to public backlash, large data center operators by and large haven’t been resisting these measures.

“The hyperscalers are increasingly on board with paying for any necessary grid and generation upgrades,” Brian Potter, Senior Infrastructure Fellow at the Institute for Progress, told RealClearScience.

Worries over rising electricity rates typically attract the most attention, but close behind are concerns about water shortages caused by data centers’ usage. Here, Potter also says the anger is overblown.

“Data centers use a lot of water compared to a single family home, but not all that much when you compare it to other industrial uses (which I think is the proper comparison). Data centers use less water than golf courses, less water than steel mills. The state of Arizona alone uses on the order of six to seven times as much water as data centers do for growing crops in the desert, and the data centers are generating vastly more economic value than growing alfalfa.”

AI data centers are being built, whether most Americans like them or not. Around 3,000 were planned or under construction as of December. The good news is that there actually seems to be more to like than to hate.

*This section was corrected 3/21 to clarify the meaning of “load factor” and explain how the power grid actually works. H/T to retired electrical engineer and RCS reader Ken Davis.

This article was originally published by RealClearScience and made available via RealClearWire.

Beta Blocker
March 23, 2026 2:18 pm

Only so much power generation equipment will be available to service an ever-increasing worldwide demand for electricity. This fact alone has the potential to increase the price of electricity for the average ratepayer if AI data centers begin out-competing the power utilities for priority access to the power generation equipment needed to expand our supply of electricity.

Dieter Schultz
Reply to  Beta Blocker
March 23, 2026 5:09 pm

Only so much power generation equipment will be available to service an ever-increasing worldwide demand for electricity.

Small modular reactors, once the manufacturing processes are sussed out, will be adding to worldwide capacity for generating the electricity for these data centers at ever-increasing rates.

Yes, they will start off with low production rates, but within 5-10 years they will be building and installing new nuclear power reactors at rates that will likely match the growth in demand for the electricity required by our grid.

My educated guess is that within 10 years the electrical needs of the new data centers will be fully met by the increased capacity that the manufacturing system will bring online.

rovingbroker
Reply to  Dieter Schultz
March 24, 2026 3:01 am

I remember learning about this in school. I think they called it, “Supply and Demand.”

KevinM
Reply to  Beta Blocker
March 23, 2026 7:23 pm

Only so much power generation equipment will be available…
Why? Did someone forget how to build it?

Beta Blocker
Reply to  KevinM
March 23, 2026 11:02 pm

Ramping up production of turbines, generation units, transformers, and a variety of other system components is not a quick and easy thing. A decade might be needed for equipment suppliers to expand the supply chain sufficiently to meet worldwide demand.

Dave Andrews
Reply to  Beta Blocker
March 24, 2026 8:16 am

Yep. The IEA says 2,500 GW of projects are already stalled in grid connection queues worldwide, and meeting electricity demand to 2030 will require annual grid investment to increase by 50% from today’s $400bn.

They also note that there is a mismatch in the time required to plan and build new grid infrastructure compared to generation projects or data centres. Planning, permitting and completing new grid infrastructure can take 5 – 15 years whereas new builds on the supply and demand side are faster at 1-5 years for wind and solar, 1-3 years for data centres and 1-2 years for EV charging infrastructure.

At the same time they note prices for key grid components have nearly doubled in the last 5 years.

IEA ‘Electricity 2026’ (Feb. 2026)

real bob boder
Reply to  Beta Blocker
March 24, 2026 12:25 pm

Nonsense, there is nothing difficult about ramping up production of these types of items.

cipherstream
Reply to  KevinM
March 24, 2026 8:35 am

Probably not so much about “how to build” as “where they are built”. A lot of the transmission components found at substations are built outside of the country (in the USA at least) and so the transmission system build out is going to be a bottleneck.

mikeq
Reply to  Beta Blocker
March 23, 2026 11:35 pm

Traditionally, energy intensive industries have had to provide for their own energy needs through on-site generation, with grid connections required primarily for back-up purposes and a secondary role of grid feed-in if they have a surplus.

In addition, water intensive industries usually have to provide their own water treatment systems to re-circulate water for re-use and to control pollution and contaminants in discharges.

These issues are routine in mature energy and water intensive industries.

That these issues are being presented as problems for energy and water intensive data centers is evidence of policy dysfunction, of data center operators improperly attempting to transfer responsibility to others, or of both.

All this shite and nonsense about a matter that was resolved for mature energy and water intensive industries generations ago.

Reply to  mikeq
March 24, 2026 4:31 am

The media love to talk about problems or exaggerate them or just invent them.

Bob
March 23, 2026 2:28 pm

Good information.

Victor
March 23, 2026 2:32 pm

AI data centers use large amounts of water.
What happens to the used water from AI data centers?
Does the water used by AI data centers disappear?
Is the water from AI data centers toxic?
Is it possible to use the water from AI data centers for irrigation?

Paul Seward
Reply to  Victor
March 23, 2026 3:33 pm

The water evaporates in cooling towers and can be seen escaping from the top of the towers. It is nontoxic, just H2O, but it is not reused and escapes.

AlbertBrand
Reply to  Paul Seward
March 23, 2026 3:49 pm

Evaporative cooling absorbs 540 calories per gram, which is equivalent to the heat needed to raise that gram of water’s temperature by 540 degrees centigrade. It is therefore quite efficient compared to, say, using a river to cool the plant as Indian Point did.
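The latent-heat figure in the comment above can be checked back-of-envelope. The 540 cal/g value is from the comment; the 1 MW load used below is an arbitrary illustration:

```python
# Back-of-envelope check of the latent-heat arithmetic above.
# Evaporating water absorbs ~540 cal/g (~2,260 kJ/kg) with no temperature
# rise, which is why a little evaporation rejects a lot of heat.

LATENT_HEAT_KJ_PER_KG = 2260      # ~540 cal/g * 4.184 J/cal
SPECIFIC_HEAT_KJ_PER_KG_C = 4.184 # liquid water

# Heat absorbed by evaporating 1 kg equals heating that same kilogram
# by ~540 C (if liquid water could get that hot):
equivalent_temp_rise_c = LATENT_HEAT_KJ_PER_KG / SPECIFIC_HEAT_KJ_PER_KG_C
print(round(equivalent_temp_rise_c))  # ~540

# Water evaporated to reject 1 MW of heat continuously (1 MW = 1,000 kJ/s):
heat_mw = 1.0
kg_per_s = heat_mw * 1000 / LATENT_HEAT_KJ_PER_KG
m3_per_day = kg_per_s * 86400 / 1000  # 1,000 kg of water per cubic meter
print(round(kg_per_s, 3))             # ~0.442 kg/s
print(round(m3_per_day, 1))           # ~38.2 m^3/day per MW of heat
```

So each megawatt of heat rejected purely by evaporation consumes on the order of 40 cubic meters of water a day, which is the quantity the water-use debate is really about.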

oeman50
Reply to  AlbertBrand
March 25, 2026 5:42 am

Once-through cooling (like at Indian Point and many other power plants) does not consume water when it is “used.” The water is returned to the water body, although at a higher temperature. This does increase evaporation from the water body, but at 1/3 of the rate of the water loss to the atmosphere from a cooling tower.

sidabma
Reply to  Paul Seward
March 23, 2026 3:54 pm

We have a technology that can recover over 95% of the water in the cooling tower plume. Water in many communities is a precious commodity, and they just can’t afford to supply and then lose that much water. This is why we developed a system that will capture that water so that it can be reused.

KevinM
Reply to  Paul Seward
March 23, 2026 7:38 pm

There are several comments here about evaporative cooling, which IS a standard industrial cooling system. Are we sure the data centers we’re talking about use evaporative cooling? I had an understanding that they used large heat exchangers for water cooling.

Paul Seward
Reply to  KevinM
March 24, 2026 10:14 am

If water to water heat exchangers are used, then the water is returned to its source (at a higher temperature) and is not lost or wasted. If the heat exchanger is not leaking, then there would not be any cross contamination and could be used for irrigation.

starzmom
Reply to  Victor
March 23, 2026 3:47 pm

If the water is used for cooling, a lot will be lost to evaporation. Otherwise, it would go back to the original water source, most likely. Most industries have to treat industrial waste water back to close to drinking water standards.

Jeff Alberts
Reply to  starzmom
March 23, 2026 4:34 pm

Doesn’t evaporation come back down as rain?

Reply to  Jeff Alberts
March 23, 2026 8:13 pm

In Arizona, during the summer monsoon, maybe. Mostly it will end up elsewhere.

KevinM
Reply to  starzmom
March 23, 2026 7:40 pm

Are we sure the standard data center design uses cooling towers?

KevinM
Reply to  KevinM
March 23, 2026 7:49 pm

Link to the best answer to my question:
“https://eatyourfrog.substack.com/p/the-fallacy-of-closed-loop-cooling”

TLDR: Yes evaporative cooling. It used to be air cooling, but new designs use evaporators. My knowledge was outdated.

Reply to  Victor
March 23, 2026 4:56 pm

Is the water from AI data centers toxic?

Yes. It is a potent “greenhouse” gas.

If CO2 gets a reprieve under the EPA rescinding the endangerment finding then water vapour from data centre cooling towers may also get a reprieve.

In places with cool climates, the data centre heat could be used for district heating. It will cause an increased heat island effect, so we may as well put the heat to good use on its way to the atmosphere.

Dieter Schultz
Reply to  Victor
March 23, 2026 5:28 pm

Does the water used by AI data centers disappear? … Is it possible to use the water from AI data centers for irrigation?

It’s not quite as simple as that… evaporative cooling is highly efficient, allowing for a significant reduction in electricity usage, especially in low-humidity climates like the deserts found in the SW.

So the trade-off is between water usage and electricity usage for cooling the data centers. One source reports that “evaporative cooling can significantly reduce electricity usage in data centers, consuming only about one-fourth of the power used by traditional mechanical air conditioning systems”. Evidently, a 50% relative humidity level seems to be the optimal target for the server farm’s atmosphere.

But, in all uses of the water, the water goes into the atmosphere and is not economically recoverable.

Another consideration I’ve seen in data centers is the choice among full, partial, or no cooling, which involves a trade-off between server failure rates (and replacement costs) and cooling costs.

What some research shows, at least in some climates, is that, given a known failure rate at the higher ambient temperatures, it is cheaper to just swap out the servers that fail than it is to cool, whether 100% mechanically or with some mix of evaporative and mechanical systems.

Admittedly, the stats I’ve seen are from non-AI, non-GPU server farms that might have different thermodynamics, but, generally speaking, evaporative water cooling is not like using water to cool heat-based electricity generators.
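The water-versus-electricity trade-off described above can be roughed out numerically. The one-fourth power figure is from the comment; the 100 MW IT load, the chiller coefficient of performance of 4, and the neglect of fan/pump heat are assumptions made only for illustration:

```python
# Rough sketch of the water-vs-electricity cooling trade-off above.
# All numbers are illustrative assumptions, not measured figures.

IT_LOAD_MW = 100            # heat to reject (hypothetical facility)
MECH_COP = 4.0              # assumed chiller coefficient of performance
EVAP_POWER_FRACTION = 0.25  # evap uses ~1/4 the power of mechanical AC
LATENT_HEAT_KJ_PER_KG = 2260

mech_cooling_power_mw = IT_LOAD_MW / MECH_COP
evap_cooling_power_mw = mech_cooling_power_mw * EVAP_POWER_FRACTION
power_saved_mw = mech_cooling_power_mw - evap_cooling_power_mw

# Water evaporated to carry away the IT heat (fan/pump heat ignored):
water_m3_per_day = IT_LOAD_MW * 1000 / LATENT_HEAT_KJ_PER_KG * 86400 / 1000

print(round(power_saved_mw, 2))  # 18.75 MW of avoided cooling power
print(round(water_m3_per_day))   # ~3,823 m^3/day of evaporated water
```

Under these assumptions the operator trades roughly a few thousand cubic meters of water per day for tens of megawatts of avoided cooling load, which is exactly the trade that looks different in a desert than it does in a wet climate.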

sidabma
March 23, 2026 2:36 pm

These new AI Data Centers need to build away from the subdivisions. The neighbors do not like the night security lights or the drone of the cooling fans.
The AI Groups also need to generate their own electricity in a Community Power Plant. These power plant buildings can be built around the natural gas turbines or boilers. These buildings can be constructed to be pleasing to the eye and the area they are in.
Now we want to have these Community Power Plants to operate at over 90% energy efficiency. Natural gas is America’s clean energy source. The residential and industrial areas have been applying this energy saving ~ emission reducing technology since the early 1980’s. It is now time especially with AI to make their electricity producing facilities to also operate at over 90% energy efficiency.
There is a program called 45Q that is already being used in another industry to keep the CO2 created during combustion from going into the atmosphere. For the AI Data Group it’s very simple. All they have to do is Host all of the exhaust created during combustion to a 3rd party. They can reward them with the 45Q and then claim that they are using Emission Free Electricity in their AI Data Center.
This is better than using solar or wind electricity. The Community Power Plant will provide the Data Center with 24/7/365 reliable, steady electricity. Because this electricity is delivered directly, there are no “line losses”. The AI Group only pays for the electricity they consume as delivered by their own power plant.
America under President Trump has become Energy Independent and Energy Strong. Let’s now show the world how Energy Efficient America can be.

Rud Istvan
March 23, 2026 2:45 pm

With respect to data center water usage, the ‘complaint’ is largely wrong due to a big misunderstanding. Lots of water is used to cool the electronics, but the only system water ‘consumed’ is that lost to evaporative cooling. Physics says a little evaporation does a lot of cooling, as steam electric generating plants have shown for many decades. I see more NIMBY than anything else, since almost all of the big water volume is recycled over and over.

Electricity is more complicated in theory, but much simplified by practical realities. The grid cannot possibly build out fast enough to handle these big new data center loads at the rate they want to be built. So the obvious simple practical solution is data center self generation, which is easy and relatively fast with CCGT and the US abundance of natgas. Which is what the real (now being built) mega data centers have no choice but to do.

Separate observation. I don’t think ‘3000’ new ‘AI’ data centers are going to happen. The analogies to the just-pre-2000 ‘dot.com’ bubble are astounding to someone who professionally lived thru it. Sun Microsystems (“We put the dot in dot.com”) did not survive; the surviving post-bubble husk was absorbed into Oracle. Same will happen here. Just dunno yet who the Sun Microsystems equivalents are. When I do, it will be the second biggest short opportunity of my life.

sidabma
Reply to  Rud Istvan
March 23, 2026 4:45 pm

Rud. President Trump did not hit Iran’s power plants because he realized what it would do to the country – long term. I only heard that when our military went into Venezuela they did something to “short circuit” their electrical grid so they could do what they had to do in the safety of darkness and confusion.
I somehow think it is not a question of if, but when, our electrical grid is going to feel the effects of a terrorist attack. If a good number of our GW power plants are seriously put out of commission, where will that leave America, besides being in the dark? For how long?
I am really thinking that for America’s energy security having a few thousand or hundred thousand 25 or 50 MW coal and natural gas Community Power Plants operating and on standby would only be a wise decision. It’s a lot easier for the grid to be power “balanced” (solar & wind) utilizing these smaller power plants and having the GW power plants operating at a fixed percentage.
If done right a Community Power Plant could be constructed to blend in and not look like an industrial eyesore. And if done right and in the right locations these Community Power Plants could be operated at near zero emissions. The AI Data Groups and or the communities could Host the exhaust to a 3rd party who would turn the exhaust into good paying full time jobs for the local community. This beats wasting all these Btu’s and CO2 and Water. Utilize it, Don’t waste It.

Reply to  sidabma
March 24, 2026 10:38 am

when is our electrical grid going to feel the effects

I’m actually very surprised that it hasn’t already

Beta Blocker
Reply to  Rud Istvan
March 23, 2026 5:32 pm

“So the obvious simple practical solution is data center self generation, which is easy and relatively fast with CCGT and the US abundance of natgas.”

In a world economy with an ever-growing demand for electricity, there will be competition for access to a limited supply of power generation systems and equipment, including access to CCGT power generation systems and equipment.

If AI firms can successfully outbid the power utilities for access to limited supplies of CCGT systems and equipment, then those AI firms will own much, if not most, of the new-build CCGT power generation capacity we install in the United States, not the power utilities.

IMHO, if that scenario happens, we will see AI data centers make their profit from selling electric power to their customers, rather than selling AI data support services. In that case, their customers will be the power utilities whose only option for access to new capacity is to buy it from the AI data centers.

I am strongly suspicious that this kind of thing, selling electric power rather than data services, is what the CEO’s who will be building the new AI data centers actually have in mind.

oeman50
Reply to  Beta Blocker
March 25, 2026 5:47 am

Manufacturers’ output for gas turbines is already under contract for several years. Plus try to get an air permit for any gas fired power plant in a blue state.

George V
March 23, 2026 2:59 pm

I’ve not heard any discussion of how much natural gas will be needed to generate the datacenter’s electricity. What will drive up retail electricity rates might be a nat. gas shortage. Heaven forbid that coal fired power plants (Ick!! Ack!! Patooey!!!) be built to power the AI datacenters.

March 23, 2026 3:02 pm

“The state of Arizona alone uses on the order of six to seven times as much water as data centers do for growing crops in the desert, and the data centers are generating vastly more economic value than growing alfalfa.”

Be careful with this messaging; it sounds tone-deaf about core values. When push might come to shove, don’t put data centers up against farming, even for animal feed. The only reason for high water consumption at a data center would be evaporative cooling for lowest-cost heat rejection. Just accept the premium operating cost of dry heat rejection in locations where a tight water supply is likely to become an issue.

Thank you for listening.

Lark
Reply to  David Dibbell
March 28, 2026 4:26 pm

I believe Mr. Pomeroy was referring to the fact that the Arizona state government is allowing Saudi corporate farms to drain the AZ aquifer to grow alfalfa to feed cattle overseas.
I’m pretty sure Arizonans would prefer data centers, and that this is not tone-deaf to core values unless corruption is such a value there. (Given their government, that’s a possibility.)

hdhoese
March 23, 2026 3:05 pm

To what extent are data centers like banks, storing and moving information instead of money? Money is not always well used, but neither is data. Are both, while necessary, not the end product?

mleskovarsocalrrcom
March 23, 2026 3:29 pm

Someone school me please. Doesn’t air conditioning create more water than it uses? Other than toilets and drinking water for the operators how does a data center use water?

Jeff Alberts
Reply to  mleskovarsocalrrcom
March 23, 2026 4:55 pm

I don’t think AC creates water, it just condenses it from the surrounding air.

mleskovarsocalrrcom
Reply to  Jeff Alberts
March 24, 2026 8:13 pm

Same difference. Water is a byproduct of the condensation. It doesn’t use water for anything.

Reply to  mleskovarsocalrrcom
March 23, 2026 8:17 pm

For cooling. All those electrons running through all of that equipment puts out tons of heat.

mleskovarsocalrrcom
Reply to  johnesm
March 24, 2026 8:20 pm

For cooling what? That’s what the AC is for. Unless they make computers differently than when I was in the business, there’s no water involved. We would size AC needs by the BTU output of the system being added, or if it was a new install/data center, add up all the systems’ needs …. +200 BTU for each person 🙂

Reply to  mleskovarsocalrrcom
March 25, 2026 6:13 am

I guess it HAS changed, then. They make water-cooled PCs for high-end gaming – and these systems (CPUs in particular) almost certainly run hotter than that. Nothing that unusual anymore about water-cooled computers.

mleskovarsocalrrcom
Reply to  Tony_G
March 25, 2026 7:27 am

The PCs are a closed cooling system, much like a car ICE engine. Nothing like the rows and racks of servers in today’s data centers. Have you ever been inside a raised floor data center? They don’t need water cooling because the ambient air is always cool (around 70F), and the AC removes the moisture in the air that keeps heat from dissipating and the humidity that affects the computer components. I guess the answer to my question is nobody knows, or they aren’t telling 🙂

Reply to  mleskovarsocalrrcom
March 25, 2026 12:11 pm

The water-cooled PCs were an example because it appeared you had never heard of water cooled computers at all.

But I found this for you: https://www.datacenters.com/news/why-liquid-cooling-is-becoming-the-data-center-standard

mleskovarsocalrrcom
Reply to  Tony_G
March 25, 2026 4:58 pm

“A closed-loop system circulates coolant through these plates, removing heat at the source.” Just like with the PC. The water/coolant isn’t being consumed and that’s my point. I’m also betting the servers are still in AC controlled centers as that heat “removed at the source” still needs to go somewhere. I just don’t see where water requirements are a problem for data centers.

Reply to  mleskovarsocalrrcom
March 26, 2026 7:13 am

You appeared to be questioning the use of water at all. You started with “for cooling what?”
So I don’t understand what you were asking then.

mleskovarsocalrrcom
Reply to  Tony_G
March 26, 2026 10:49 am

The premise of the post was to debunk two myths about AI data centers … one being the use of water. I wanted to know how anyone even thought excess water was being used. Bed time for this thread 🙂

March 23, 2026 4:45 pm

Any data centre proponents in Australia will be required to build their own “renewables” power supply or lock in PPAs with solar and wind farms.
https://www.energy.gov.au/business/equipment-guides/data-centres

Where circumstances and space allows, onsite solar PV arrays are an effective way to reduce energy costs and greenhouse gas emissions. However, since the energy density of data centres is so high, it’s sometimes not practical to use on-site renewables such as solar or wind. In such cases, power purchase agreements (PPAs) for offsite renewable energy are a viable alternative. Companies are also investigating more novel options, including energy storage solutions.

None of that will be low cost base load power. It means there will be no large scale data centres in Australia.

March 23, 2026 6:08 pm

How much do they use? Inquiring minds want to know. There were no specifics in the article.

Allen Pettee
March 23, 2026 6:47 pm

The biggest myth is that “AI” is intelligent. It is not. It is just a faster acting supercomputer system.

1saveenergy
Reply to  Allen Pettee
March 24, 2026 2:47 am

AI is just a fast consensus browser, not accurate or intelligent.

Keitho
Editor
March 24, 2026 1:52 am

Is the used water contaminated or can it be returned to the supply source?

Sparta Nova 4
Reply to  Keitho
March 24, 2026 5:42 am

Cooling water is, or should be, only contaminated with thermal energy.

March 24, 2026 4:38 am

Supposedly, the chip makers are trying to create less power demanding chips. That could help this problem. At least for the short term.

cipherstream
March 24, 2026 8:45 am

Another popular complaint is the infrasound effects experienced by some people who reside near data centers. I believe the water use issue affects the availability of water to citizens of the communities in which the data centers are being built. The drawdown by the data centers lowers the water table so that the minimum depth for water retrieval is now deeper than the existing wells are drilled. The water table doesn’t recover in these situations, so you must “rework” your well to get access to the water at the new depth. This would likely be the case whether it was a data center or another industrial plant with the same water requirements built at that location. It just so happens that the data center is the antagonist, as that is what was being built.

Reply to  cipherstream
March 24, 2026 11:25 am

IF they’ve drilled their own well for supply, drill another one to the same aquifer and reinject the used water.

mcmgk5
March 25, 2026 1:15 pm

Somebody explain to me how a data center actually consumes water . . .

Reply to  mcmgk5
March 25, 2026 1:22 pm

cooling