Essay by Eric Worrall
Researchers have discovered that trying to make AI reason like humans leads to a 30x to 700x rise in energy consumption.
Hugging Face Says AI Models With Reasoning Use 30x More Energy on Average
Models that “think” through problems step by step before providing an answer use considerably more power than older models.
Edd Gent
Dec 15, 2025

It’s not news to anyone that there are concerns about AI’s rising energy bill. But a new analysis shows the latest reasoning models are substantially more energy intensive than previous generations, raising the prospect that AI’s energy requirements and carbon footprint could grow faster than expected.
…
And a new analysis from researchers at Hugging Face and Salesforce suggests that the latest generation of models, which “think” through problems step by step before providing an answer, use considerably more power than older models. They found that some models used 700 times more energy when their “reasoning” modes were activated.
“We should be smarter about the way that we use AI,” Hugging Face research scientist and project co-lead Sasha Luccioni told Bloomberg. “Choosing the right model for the right task is important.”
…
Read more: https://singularityhub.com/2025/12/15/hugging-face-says-ai-models-with-reasoning-use-100x-more-energy-than-those-without/
The numbers from Hugging Face are an eye-opener:
AI Energy Score v2: Refreshed Leaderboard, now with Reasoning 🧠
Community Article
Published December 4, 2025
Sasha Luccioni
Boris Gamazaychikov

Today, we’re excited to launch a refreshed AI Energy Score leaderboard, featuring a new cohort of text generation models and the introduction of reasoning as a newly benchmarked task. We’ve also improved the benchmarking code and submission process, enabling a more streamlined evaluation workflow. With the increased interest of the community towards measuring and comparing the energy use of AI models, our benchmarking efforts are more important than ever to inform sustainably-minded AI development and policymaking.
…
According to our analysis, reasoning models use, on average, 30 times more energy than models with no reasoning capabilities (or with reasoning turned off). Honing in on specific models with and without the reasoning functionality enabled, we can see a huge difference: between 150 and 700 times more energy used by the same model with reasoning enabled compared to without:
| Model name | Params | Reasoning | GPU energy (Wh) per 1k queries | Energy increase due to reasoning |
|---|---|---|---|---|
| DeepSeek-R1-Distill-Llama-70B | 70B | Off | 49.53 | |
| DeepSeek-R1-Distill-Llama-70B | 70B | On | 7,626.53 | 154x |
| Phi-4-reasoning-plus | 15B | Off | 18.42 | |
| Phi-4-reasoning-plus | 15B | On | 9,461.61 | 514x |
| SmolLM3-3B | 3B | Off | 18.35 | |
| SmolLM3-3B | 3B | On | 12,791.22 | 697x |

This can be explained in large part by the number of output tokens generated by the models themselves (for “reasoning” through their answers): models with reasoning enabled use between 300 and 800 times more tokens than their base equivalents. This adds up as reasoning models are used more and more in consumer-facing tools and applications, since they will tend to output longer responses (which has also been found in recent research).
Read more: https://huggingface.co/blog/sasha/ai-energy-score-v2?utm_source=chatgpt.com
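A quick back-of-the-envelope check of those leaderboard figures; a minimal sketch (the Wh-per-1k-queries values are copied from the table above, and the multipliers and per-query numbers are just simple division, not new measurements):

```python
# Back-of-the-envelope check of the Hugging Face leaderboard figures quoted above.
# Values are the "GPU energy (Wh) per 1k queries" numbers from the table.
models = {
    "DeepSeek-R1-Distill-Llama-70B": (49.53, 7626.53),
    "Phi-4-reasoning-plus": (18.42, 9461.61),
    "SmolLM3-3B": (18.35, 12791.22),
}

for name, (off_wh, on_wh) in models.items():
    increase = on_wh / off_wh    # energy multiplier from turning reasoning on
    per_query = on_wh / 1000     # Wh for a single query with reasoning on
    print(f"{name}: ~{increase:.0f}x more energy, ~{per_query:.1f} Wh per reasoning query")

# Prints roughly 154x, 514x and 697x, matching the table,
# and ~12.8 Wh per query for SmolLM3-3B with reasoning on.
```

That ~12.8 Wh figure is where the “nearly 13 watt-hours per question” number below comes from.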
The last model, SmolLM3-3B, used nearly 13 watt-hours to answer a single question when reasoning was turned on.
I guess this explains why the US energy grid is buckling. Spend a few hours chatting with high-end AI systems, as many people increasingly do, and you’ve consumed as much energy as your entire house uses in 24 hours.
What seems clear is any efficiency advances are not going to be used to reduce energy consumption. All the efficiency improvements are going to be fed into sustaining ever more powerful AI models, to gain a competitive advantage over rivals.
As for greens, they’re going to be left gasping in the coal smoke. Nobody is going to hold back from developing and using this new marvel in our midst, because they are worried the oceans might rise by a few inches in a hundred years.
Doesn’t look like there will be much coal smoke
Chart: Clean energy remains dominant in the US — despite Trump
https://www.canarymedia.com/articles/clean-energy/us-new-wind-solar-batteries-2025-trump
Some people will be really sad when the AI bubble bursts.
Quickest to build (even if that were true), but what good is that if all you have is unreliable and inflexible? Worse than pointless.
They are building CAPACITY, not ENERGY 🙂
And only potential capacity, if the conditions at any given time are good enough.
Gee, if W&S are so cheap and easy to deploy, why is there a backlog in gas turbines?
“…trying to limit the development of solar, battery, and wind energy…”
Not limiting anything, just not subsidizing it.
“…These are the quickest sources of energy to deploy…”
But useless for data center power supply.
“…even if climate concerns weren’t a factor…”
Climate concerns are not a factor.
“Not limiting anything, just not subsidizing it.”
Got it in one.
Except battery subsidies are as high as ever. That is why batteries are booming.
Does Trump plan to stop those subsidies?
“These are the quickest sources of energy to deploy…”
Luser misinformation. Gas is very quick to build.
You have to destroy a whole heap of habitat and environment to implement wind or solar.
When I would prepare a “forest cutting plan” for a nice THINNING of a forest here in Wokeachusetts, the state and enviro groups acted as if I was building a giant nuclear reactor. The plan would be evaluated by numerous agencies, all hoping to find something they wouldn’t like, to prove they’re worth their huge salaries. That drove me crazy and is why I decided to retire at the young age of 75, though I was fully capable of continuing.

When a solar “farm” was proposed next to my ‘hood in north central Wokeachusetts, I contacted those agencies and mentioned all the rare and endangered species ON the site. One agency guy told me to forget it, “because the people at the top have decided to move ahead with green energy”. So, an agency that might have stopped my excellent forestry project because of some rare insect didn’t have the guts to speak up when several species of wildlife were actually threatened, and not just insects: coyotes, box turtles, rare species of salamanders, rare species of birds, etc.
Undependables are useless for AI
Weather dependency has never been positive for any form of “intelligence”.
Watch while Google with massive solar arrays, batteries, and gas turbines leaves everything else in the comment section dust.
After spending the past week with my two granddaughters, I finally came up with the perfect analogy for Wind and Solar power… it’s like using a 2 year old to run your energy system! 🙂 Snuggly, irratic, sleeps or runs randomly. Plenty of tantrums. An AI will love it!
Does “irratic” mean they’re irritating?
Delusional as always.
Wind and solar provide very little energy in the US.
The dip in coal has been taken over by GAS.
If you click on the chart, it will expand and become clear. Click on the “x” in the circle to contract the chart and return to Comments.
Harold:
Thanks for the tip.
Maybe you can help me: How do you add a chart (like the one bnice did above) to a comment?
After you post a comment, click on the sun and mountain icon in the lower right corner of the comment box. The instruction is: “Attach image to this comment.” You need to create a file with the images. You can use screen capture to obtain an image to put into the file. After you click on the image icon, the image file will appear on the screen; click on the image and it will appear below the comment box, and the image file will disappear. You can attach only one image to a comment. This feature is easy to use and unique to WUWT.
To create a file with images in MS 11, ask Bing or Copilot: How do I create a file with images which I want to attach to a comment at the WUWT website? Since I am 81 and have little computer skills, my tech-savvy son does image capture and storage for me.
Shown below is the home page of the late John L. Daly’s website, “Still Waiting For Greenhouse”. From the home page, go to the end and click on the selection: Station Temperature Data. On the “World Map” click on Oz. There is a list of stations. Click on “Adelaide”. The chart shows the average annual temperature from 1857 to 1999. No effect of CO2 on weather and temperature in Adelaide.
As has been pointed out many many many times, not all users have the “add image” icon available. If you’re going to be Mr Explainer every time someone adds an image, at least be thorough.
Thank you for the info. I just try to be helpful. My OS is MS 11. Why don’t some commenters have the “add image icon”?
When I see a typo or incorrect info in a posted comment, I mention to the commenter that there is a procedure for correcting typos or adding more info. If you move the pointer to the lower right corner of the comment box, a small gear wheel will appear. If you click on it, the instruction “Manage Comment” appears. Click on that and the comment appears along with the instruction “Edit”. After making corrections, you click on “Save”. You have a five-minute window for making corrections after posting a comment.
BTW, did you go to the late John Daly’s website? If everybody learned of this website, all this nonsense about greenhouse gases, global warming and climate change would vanish overnight.
“ limit the development of solar, battery, and wind energy”
Yes…. Wind and solar, with or without batteries…. are of very limited real usefulness.
And totally unsustainable.
Right the old AI bubble burst will save you … sure 🙂
GREEN MADNESS: The Waste and Destruction of One Industrial Wind Turbine Project:
https://www.youtube.com/watch?v=ubmYlwMpUVY
The Cassadaga Wind Farm has a nameplate capacity of approximately 125 MW and a projected annual output of around 397,353 MWh, although one source indicates an actual annual output of 164,250 MWh.
Capacity vs. Output Details
Nameplate Capacity: The project has an installed, or nameplate, capacity of approximately 125 to 126 MW. This is the maximum potential power the facility could produce under ideal, continuous operating conditions.
Projected Annual Output: The facility was initially expected to operate at an annual net capacity factor of approximately 36%, which translates to a projected annual energy production of up to 397,353 MWh.
Actual Annual Output: A power plant profile published in October 2024 states the project currently generates 164,250 MWh of electricity annually.
Capacity Factor Implication: The actual annual output of 164,250 MWh corresponds to a significantly lower capacity factor than the projected 36%.
Calculation: (164,250 MWh) / (125 MW * 24 hours/day * 365 days/year) ≈ 15% capacity factor
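For anyone who wants to rerun that arithmetic, a minimal sketch using the same figures quoted above:

```python
# Capacity factor = actual annual output / output if the plant ran flat out all year.
nameplate_mw = 125                            # nameplate capacity quoted above
actual_mwh_per_year = 164_250                 # reported annual output
max_mwh_per_year = nameplate_mw * 24 * 365    # 1,095,000 MWh

capacity_factor = actual_mwh_per_year / max_mwh_per_year
print(f"Capacity factor ≈ {capacity_factor:.0%}")   # ≈ 15%, versus the projected 36%
```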
Expanding production of gas turbines ain’t gonna be a big problem.
I actually don’t think AI power consumption will increase as sharply as projected.
As a kid reading Popular Science in the ’50s, I’d read descriptions of how a computer as powerful as the brain would be as big as the Empire State Building and require the power of Niagara.
These data centers are matching that, but they do something quite different from brains, crunching the entire contents of the Web. And the chips, evolved from GPUs, are far from optimal for the tasks. They are incredibly power hungry, as much as a toaster in a few cm².
By comparison, our brains draw ~15-20 watts.
“By comparison, our brains draw ~15-20 watts.”
Will AI have intuition? That takes time and experience, and the way the brain operates still seems different from all these powerful circuits. Is the human bias the ‘madness of crowds’ or being skeptical?
Will AI sleep 8 hr a day?
Will AI forget to put a spoon in the dish washer?
Will millions of AIs take out mortgages they can’t afford?
If all of the queries can be reduced to a few thousand that come up again and again, then the whole program can be reduced to a look-up table. Once someone has asked AI to photoshop a sea otter playing violin, then it’s done – the AI has a photo of a sea otter playing violin. Someone just needs to tell AI to build a photo database with a structure that includes “animal” and “instrument” as keys.
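A toy sketch of that look-up-table idea (everything here, including the function name, is hypothetical; it just shows a cache keyed on the request’s parameters so a repeated request isn’t regenerated):

```python
# Hypothetical cache of already-generated images, keyed by (animal, instrument).
# If the same combination has been requested before, the stored result is returned
# instead of spending GPU time generating it again.
image_cache: dict[tuple[str, str], str] = {}

def get_image(animal: str, instrument: str) -> str:
    key = (animal.lower(), instrument.lower())
    if key not in image_cache:
        # Placeholder for the expensive image-generation step.
        image_cache[key] = f"generated image of a {animal} playing the {instrument}"
    return image_cache[key]

print(get_image("sea otter", "violin"))   # generated once
print(get_image("Sea Otter", "Violin"))   # served from the cache on repeat requests
```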
Without a superconductor or optical breakthrough it will be worse than predicted. As AI starts on more and more complex tasks, the consumption goes up with it.
No AI is in any way “crunching the entire contents of the Web”.
The waste heat from Data/AI centers could be captured and fed into district heating networks, which would offset it somewhat.
But building all those heat networks (as per Miliband’s insane vision for the UK) would involve decades of chaos trenching down all the roads in our towns and cities!
Maybe they could put it in bottles, and ship it that way 😉
Like blinker fluid….. or wiring harness smoke…. 🙂
Squelch oil, grid squares…
When I was stationed at Ft Riley, we got one guy so often with that stuff, that when we told him to check the water in the APC batteries, he wouldn’t believe us that batteries had water, until we showed him.
“…Choosing the right model for the right task is important…”
Maybe we could use an AI to do that. /s
This isn’t really all that surprising. Brains and thinking require a large amount of energy. In biological evolution there is probably selection against the evolution of large brains. Metabolic energy used for nerve cells and a large brain comes at the expense of metabolic energy that could be used to produce more offspring, grow muscle tissue, antlers or tusks, or survive a period of food scarcity. In the earth’s four-billion-year history, humans are the first animals to evolve a large thinking brain.
In humans about 20% of our metabolic energy is consumed by the brain. For comparison, a chimpanzee’s brain consumes only about 10% of the chimpanzee’s metabolic energy. It is interesting to note that brain expansion in our earliest ancestors in genus Homo didn’t actually start until after our ancestors switched to a more calorie-rich, meat-supplemented diet, and until after the use of fire allowed our ancestors to unlock more energy from their food by cooking it. Additionally, there were some changes in the brain’s blood circulation which allowed for more cooling of the brain.
I’m not saying it is exactly the same for artificial intelligence. Just saying it isn’t surprising that “thinking” (or whatever AI is doing) requires a lot of energy.
I would call it comparing/analyzing rather than thinking.
The quoted text shows zero-sum thinking. A better way of looking at it is that growth of AI and its use of energy can both lift our standard of living and, by economies of scale, reduce our energy unit cost. We must already be using AI to try to develop nuclear fusion – now there is an amazing way forward. With AI’s energy guzzling we have the potential for mankind’s greatest advance ever, dwarfing agriculture and the industrial revolution. I would still put democracy first, though.
I asked AI to make me a nuclear fusion power plant. It said I would have it in 20 years. Now I see solar panel ads on every search result.
The “AI” responded: “OK, you’re a nuclear fusion power plant.”
To not see ads, get an ad blocker. Do a search for a free ad blocker.
Europe (and Vermont) need to ban AI data centers so we can move ahead elsewhere with plentiful energy.
Are there any AI data centers in VT?
The latest generations of AI chips/modules are dramatically more power efficient, while also producing significantly higher throughput. This is why they also drive so much demand that orders are being taken months before the 1st modules are even produced.
Adding context for the article’s chart:
“An average U.S. home uses around 10,700 kilowatt-hours (kWh) of electricity annually, which breaks down to roughly 900 kWh per month or 30 kWh per day”
30 kWh = 30,000 Wh.
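Putting that household figure next to the roughly 13 Wh per reasoning query quoted in the article, a minimal sketch of the comparison (round numbers only):

```python
# Round numbers from the quote above: ~30 kWh per household per day,
# compared with ~13 Wh per reasoning query (the SmolLM3-3B figure from the article).
daily_household_wh = 30_000
average_watts = daily_household_wh / 24                                   # = 1,250 W continuous draw
wh_per_reasoning_query = 13
queries_per_household_day = daily_household_wh / wh_per_reasoning_query   # ≈ 2,300 queries

print(f"{average_watts:.0f} W average draw; "
      f"one household-day ≈ {queries_per_household_day:.0f} reasoning queries")
```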
1,250 watts every hour of every day … sun, rain, wind or no wind.
It’s actually worse as it peaks morning and evening.
Someone correct me if I’m wrong, but I don’t see a problem here. The tech companies chip in to build more power, nuclear and clean fossil fuel required. Use AI responsibly; you don’t use a rock truck to transport a lawn chair. This article is pretty much nonsense.
The whole “responsibly” in your answer is the problem. It’s a race, both military and commercial, and they don’t care about responsibility; good luck trying to enforce it, as it’s a war. First one there wins, and there are no prizes for second.
I think your numbers are a bit off. You indicate that the example AI with reasoning ON consumes 13 watt-hours per question. Then you indicate that a couple of hours of asking questions is as much power as your house consumes in a day. That is simply ridiculous. I live in a small 2-bedroom apartment and my average electricity consumption is about 1,000 kWh per month. That puts my daily consumption at 33 kWh, which is 33,000 watt-hours per day. You are surely not saying one could ask 2,500 questions of an AI in a couple of hours now, are you?
I’m not challenging that AI with reasoning consumes 1 or 2 orders of magnitude more energy per question. I’m saying it’s really not that much of a portion of one’s total energy consumption, as you alluded to. My high-end computer system, which I use for my hobby of flight simulation, consumes about 500 watts, and a 2-hour flight is then about 1,000 watt-hours. That would be equivalent to about 77 questions asked of this AI “energy hog”.

Or my air conditioning system (I live in south Florida, so even now, Dec 20, the AC runs), which consumes 3,000 watts, runs for between 4 and 6 hours per day. So it eats 12,000 to 18,000 watt-hours per day. Your AI consumption example is completely off base. Does it tax an already burdened power grid? Maybe, especially if the green nutbar movement keeps shutting down reliable hydrocarbon-fueled generating plants. But it is hardly the most worrying addition to overall power consumption, and certainly not as serious as owning an EV, where a Tesla with an 85 kWh battery requires close to 100,000 watt-hours to fully charge. (Well, you are never supposed to drain any battery below 20% charge, but my point is charging is not 100% efficient.)
Charging one’s virtue signalling Tesla every few days is a far cry from asking a few questions to a reasoning AI every day in terms of burdening the overall energy grid.
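For what it’s worth, the comparisons above check out arithmetically; a minimal sketch using the same round numbers and the article’s ~13 Wh-per-query figure:

```python
# Rough query-equivalents for the loads mentioned above, at ~13 Wh per reasoning
# query (the SmolLM3-3B figure from the article). Load figures are the commenter's.
WH_PER_QUERY = 13

loads_wh = {
    "2 h of a 500 W flight-sim PC": 500 * 2,               # 1,000 Wh
    "5 h of a 3,000 W air conditioner": 3_000 * 5,         # 15,000 Wh (midpoint of 4-6 h)
    "one full charge of an ~85 kWh EV battery": 100_000,   # commenter's rough figure
}

for label, wh in loads_wh.items():
    print(f"{label}: ~{wh / WH_PER_QUERY:,.0f} reasoning queries")
# Roughly 77, 1,150 and 7,700 queries respectively.
```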
“Researchers have discovered trying to make AI reason like humans leads to between a 30x to 700x rise in energy consumption.”
If it’s reasoning like a lefty, it doesn’t take any energy.
Human thought runs on protein chemistry, not silicon chips. There is probably a great reason for that.
Takes more energy to deliver the right answer as compared to an answer that simply sounds right.
Politicians and MSM case in point.