From Yale News: Climate models have underestimated Earth’s sensitivity to CO2 changes, study finds
A Yale University study says global climate models have significantly underestimated how much the Earth’s surface temperature will rise if greenhouse gas emissions continue to increase as expected.
Yale scientists looked at a number of global climate projections and found that they misjudged the ratio of ice crystals and super-cooled water droplets in “mixed-phase” clouds — resulting in a significant under-reporting of climate sensitivity. The findings appear April 7 in the journal Science.
Equilibrium climate sensitivity is a measure used to estimate how Earth’s surface temperature ultimately responds to changes in atmospheric carbon dioxide (CO2). Specifically, it reflects how much the Earth’s average surface temperature would rise if CO2 doubled its preindustrial level. In 2013, the Intergovernmental Panel on Climate Change (IPCC) estimated climate sensitivity to be within a range of 2 to 4.7 degrees Celsius.
The Yale team’s estimate is much higher: between 5 and 5.3 degrees Celsius. Such an increase could have dramatic implications for climate change worldwide, note the scientists.

“It goes to everything from sea level rise to more frequent and extreme droughts and floods,” said Ivy Tan, a Yale graduate student and lead author of the study.
Trude Storelvmo, a Yale assistant professor of geology and geophysics, led the research and is a co-author of the study. The other co-author is Mark Zelinka of Lawrence Livermore National Laboratory’s Program for Climate Model Diagnosis and Intercomparison.
A key part of the research has to do with the makeup of mixed-phase clouds, which consist of water vapor, liquid droplets, and ice particles in the upper atmosphere. A larger amount of ice in those clouds leads to a lower climate sensitivity — something known as a negative climate feedback mechanism. The more ice you have in the upper atmosphere, the less warming there will be on the Earth’s surface.
“We saw that all of the models started with far too much ice,” said Storelvmo. “When we ran our own simulations, which were designed to better match what we found in satellite observations, we came up with more warming.”
Storelvmo’s lab at Yale has spent several years studying climate feedback mechanisms associated with clouds. Little has been known about such mechanisms until fairly recently, she explained, which is why earlier models were not more precise.
“The overestimate of ice in mixed-phase clouds relative to the observations is something that many climate modelers are starting to realize,” Tan said.
The researchers also stressed that correcting the ice-water ratio in global models is critical in the lead-up to the IPCC’s next assessment report, expected in 2020.
Support for the research came from the NASA Earth and Space Science Fellowship Program, the National Science Foundation, and the U.S. Department of Energy.
(Note: title and date were corrected about 15 minutes after publication – this study was published in 2016, not 2017, and it was sent to WUWT as a tip. The error of not noticing that the tip was for a study over a year old is mine. – Anthony)
On the current Yale News website… models say no GW pause.
El Nino and the end of the global warming hiatus
A new climate model developed by Yale scientists puts the “global warming hiatus” into a broader historical context and offers a new method for predicting global mean temperature.
http://news.yale.edu/2017/04/26/el-ni-o-and-end-global-warming-hiatus
Very funny article. They claim a lower rate of El Nino was the reason for the pause but give no evidence that they understand what a normal rate would be. Of course, the opposite conclusion — that it was a high rate of El Nino that led to the warming — is never considered.
I think Mr Palmer makes the essential point and backs it up with mathematical logic: if the Yale team are right, why have we not seen a much greater temperature rise over the last 30, 40, 100 years? If they have no explanation for that, I prefer to believe the low-sensitivity estimates, especially when you look far back into Earth history and see relatively reasonable temperatures with far higher CO2 levels. Or is the Yale team asking us to believe the laws of physics operated differently then?
Or is the Yale team asking us to believe the laws of physics operated differently then?
Absolutely – human-caused CO2 repealed the laws of physics – it’s basic science. (sarc)
As a matter of fact, the low sensitivity estimates are based on actual energy-balance data, not climate models. The only way they could be strongly too low is if HADCRUT/GISS etc. temperatures are also way too low and/or heat transfer to the deep ocean is vastly underestimated (which is only possible if ice in Greenland and Antarctica is growing fast and thereby hiding the thermosteric sea-level rise).
This has got to be the dumbest study anyone has ever done.
Look, we KNOW what climate sensitivity is. Global warming theory states temperatures increase by 1.3C for each doubling of carbon dioxide concentrations. This is due to the direct, physical effects of additional carbon dioxide in the atmosphere. The theory is that increasing surface temperatures will engender a positive feedback loop, that is to say warming will lead to even more warming (climate sensitivity).
Carbon dioxide concentrations have increased since 1950 from 270 ppm to 400 ppm, or 130 ppm, which is 48% of the first doubling. Ergo, assuming everything is proportional, the temperature increase to date should be 48%*1.3 degrees = 0.62 degrees C based on carbon dioxide alone.
Okay, so how much actual warming have we observed since 1950? Around 0.60 degrees C. This strongly implies that climate sensitivity is close to zero.
ZERO.
A high number for climate sensitivity, based on some sort of esoteric study of cloud droplets, must be flawed because we have run this experiment in the real world, and the real world says the number is zero.
ZERO.
This Yale study is like someone studying aerodynamics and determining that it is scientifically impossible for birds to fly.
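The back-of-envelope calculation above assumes warming scales linearly with added ppm. CO2 forcing is conventionally treated as logarithmic, so the “fraction of a doubling” comes out somewhat larger. A minimal sketch of both calculations, using the commenter’s own numbers (270 → 400 ppm, 1.3 C per doubling) purely for illustration:

```python
import math

ECS_NO_FEEDBACK = 1.3   # deg C per doubling, the commenter's assumed value
C0, C1 = 270.0, 400.0   # ppm, the commenter's start/end concentrations

# Linear interpolation, as in the comment above:
linear_fraction = (C1 - C0) / C0
print(f"linear:      {linear_fraction:.0%} of a doubling -> "
      f"{linear_fraction * ECS_NO_FEEDBACK:.2f} C")

# With the conventional logarithmic forcing law, the fraction of a doubling is:
log_fraction = math.log(C1 / C0) / math.log(2)
print(f"logarithmic: {log_fraction:.0%} of a doubling -> "
      f"{log_fraction * ECS_NO_FEEDBACK:.2f} C")
```

The linear version reproduces the 48% / 0.62 C figures quoted above; the logarithmic version gives roughly 57% of a doubling, i.e. a somewhat larger expected no-feedback warming.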
You mean since 1850, when it was about 280 ppm.
In 1950, CO2 was around 310 ppm, if the reconstruction is to be believed.
http://wwws3.eea.europa.eu/data-and-maps/figures/atmospheric-concentration-of-co2-ppm/image_xlarge
This study is for equilibrium climate sensitivity. If we are now at equilibrium, why are the oceans still gaining heat?
Are they?
They just blew off a lot of heat in a super El Nino. The Arctic Ocean is colder this year than it has been for a good long while.
Chimp:
“Why are the oceans gaining heat?”
A common belief – except there is very little data to reach any conclusion. Simply put, scientists’ ability to measure ocean heat content with a reasonable level of reliability did not exist until the last 8-10 years.
Mie,
Those “data” are largely imaginary for centuries before the 21st (OK, 2000 was still technically in the 20th).
Argo, as you know, is a development of this century, and still far from complete or reliable:
http://www.argo.ucsd.edu/
Argo deployments began in 2000, and by November 2007, the millionth profile was collected. Today, even with close to 400 active floats, there are still some areas of the ocean that are over-populated while others have gaps that need to be filled with additional floats. Today’s tally of floats is shown in the figure above and additional float statistics can be found here. To maintain the Argo array, national programs need to provide about 800 floats per year.
The original global Argo array was designed for the open ocean excluding seasonal sea-ice zones and marginal seas. Thanks to both two-way communication and ice-sensing algorithms on floats, these technical limitations are largely mitigated. The concept of Argo has always been of a spatially complete global array. Therefore, including seasonal sea-ice zones and marginal seas moves the target number of Argo floats from 3000 to 3800.
In addition to the globalization of core Argo described above, there are several Argo enhancements that are in various stages of development and implementation. These include extended coverage to the ocean bottom, additional floats equipped with bio-geochemical sensors, and enhanced spatial coverage in boundary current regions and equatorial regions.
Besides float deployment, Argo has worked hard to develop two separate data streams: real time and delayed mode. A real time data delivery and quality control system has been established that delivers 90% of profiles to users via two global data centers (GDACs) within 24 hours. A delayed mode quality control system (DMQC) has been established and 65% of all eligible profiles have had DMQC applied.
Float reliability has improved almost every year and the float lifetime has been extended. Argo has developed a large user community in universities, government labs and meteorological/climate analysis/forecasting centers. The need for global Argo observations will continue indefinitely into the future, though the technologies and design of the array will evolve as better instruments are built, models are improved, and more is learned about ocean variability.
At 400 floats, Argo is between 1 and 2 orders of magnitude short of having a sufficient number. Even if they could keep them from clustering.
Beyond that, the belief that you can use sensors rated at 0.1C to measure something to 0.01C is absurd.
Mark,
I agree, but they’re better than what went before. Unless they’re worse than nothing, being so inadequate as to give a false picture, even of the areas they do cover. Overrepresent, in fact.
More data is better. My complaint is focused solely on how some people use that data and the unsupported conclusions they derive from that data.
Such as the claim that we know the temperature of the oceans, sea bed to sea surface, with an accuracy of 0.01C.
I guess throwing out data as was done with Argo floats is acceptable science in the world of MieScatter. No reason other than it felt right. Sorry dude, Argo is not a scientific source of data.
[snip – you are using a fake name, fake IP address, and fake email to troll here – banned- Anthony]
MieScatterBrain,
It’s that big bright thingy in the sky that heats the oceans. Downward longwave radiation can only penetrate the ocean surface about 50 micrometers, and the ocean surface is always cooler than the layer 50 micrometers below the surface. So, bottom line, CO2 does not heat the oceans.
[snip – you are using a fake name, fake IP address, and fake email to troll here – banned- Anthony]
MieScatterBrain,
Listen dumbass, it’s the physical properties of water. Backradiation can only penetrate a few micrometers. You are clueless.
Dear MieScatterBrain,
Did I say there was no heat transfer in water? No, stupid. And then you give some moronic analogy of boiling water on a stove.
[snip – you are using a fake name, fake IP address, and fake email to troll here – banned- Anthony]
MieScatterBrain,
The ocean surface is COOLER than the thin skin layer below. See:
https://disc.gsfc.nasa.gov/oceans/science-focus/modis/MODIS_and_AIRS_SST_comp.html
Notice Figure 2(a) for the temperature profile at night. Yes, backradiation “shines” at night with an average of about 320 W/m2. The ocean surface is COOLER. If backradiation were heating the oceans (especially at night), the gradient should be the opposite.
I’m done with you, since it’s obvious you’re a science bullsh***er.
>>
In the meantime my egg is ready.
<<
Heat transfer by conduction is different from heat transfer by radiation. The physics of a liquid-air boundary is extremely complex. Your inappropriate and trivial example about boiling an egg is ludicrous.
>>
Clinging to denial of CO2-caused ocean warming means rejecting thermodynamics. It’s absurd.
<<
Thermodynamics is more about what is possible rather than what is probable. How about we put your raw egg in the ocean and cook it there. I’ll even breathe heavily over it to provide more CO2 to hurry the process along.
Jim
Tenn,
“Okay, so how much actual warming have we observed since 1950? Around 0.60 degrees C. This strongly implies that climate sensitivity is close to zero.”
Doesn’t it imply that “feedbacks” are zero and sensitivity is around 1.3 °C?
1) The world hasn’t really warmed that much since 1950. Whatever warming has occurred can’t be attributed to man-made CO2.
2) It cooled dramatically from 1950 (starting in the ’40s, raising fears of a return to ice age conditions) until c. 1977, when the PDO flipped, then warmed slightly until the late ’90s, since when GASTA, if such a thing exists, has stayed flat, fluctuating with ENSO.
3) Although CO2 has risen steadily since 1950, no correlation has been observed between its increase and temperature.
4) Hence it’s possible that net feedbacks at least cancel out whatever GHE more CO2 might occasion.
No. Feedbacks = climate sensitivity. The 1.3 degrees is the amount due to the carbon dioxide alone. For climate theory, this number is a given. In fact, I completely agree with this number – for each doubling of carbon dioxide concentrations, the temperature will increase by around 1.3 degrees C.
Naturally, since the response is per doubling, the increase is going to be less for each incremental unit of carbon dioxide. Doubling from 250 to 500 ppm = 1.3 degrees. Doubling from 500 to 1,000 ppm = 1.3 degrees. As you can imagine, the total increase from this source is highly self-limiting.
The premise of climate sensitivity is that warming begets more warming. That is, a small amount of warming from increased carbon dioxide concentrations would be amplified by some mechanism. Say, the warming increases the amount of total water vapor in the atmosphere, and water vapor, being a greenhouse gas, amplifies the warming. Climate sensitivity has to be high to create the kind of climate catastrophe scenarios that Global Warming Enthusiasts seem to prefer. Carbon dioxide, alone, cannot create the disaster they envision. So you need a big number for climate sensitivity – otherwise there is no crisis. Problem is, every speck of data we have points in the opposite direction – climate sensitivity is low. Very low. Maybe even a negative number.
Which makes perfect sense. If climate sensitivity were high, and temperature could easily “run away” because of small warming periods, then how could the climate have remained relatively stable for thousands, even millions, of years?
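The self-limiting behavior described above follows directly from a logarithmic response. A short sketch, using the commenter’s 1.3 C per doubling (illustrative only, not a claim about the true sensitivity):

```python
import math

SENS_PER_DOUBLING = 1.3  # deg C, the no-feedback value used in the comment above

def warming(c_start, c_end):
    """Warming for a concentration change, assuming a purely logarithmic response."""
    return SENS_PER_DOUBLING * math.log(c_end / c_start) / math.log(2)

# Each doubling gives the same warming...
print(warming(250, 500), warming(500, 1000))   # both 1.3

# ...so each extra 100 ppm matters less than the last:
for c in (300, 400, 500, 600):
    print(f"{c} -> {c + 100} ppm: +{warming(c, c + 100):.2f} C")
```

Running it shows the per-100-ppm increment shrinking from about 0.54 C (at 300 ppm) to about 0.29 C (at 600 ppm), which is the “highly self-limiting” point made in the comment.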
[snip – you are using a fake name, fake IP address, and fake email to troll here – banned- Anthony]
To me, the real bottom line on the effects of CO2 as a climate driver is simply this – if nothing happened, so to speak, other than data fiddling between 1998 and 2015 while CO2 increased steadily, it is NOT the control knob. The pause, whether it exists now or not, is absolute proof that the models do NOT work, that CO2 is not a control knob, and that current climate changes are natural. I don’t care how much they try to excuse the pause by claiming natural variability has hidden the CO2 effect. They can write a million papers about what might happen in the future, but if the theory can’t explain the present or recent past, it is just pure bull droppings. No real science works “part of the time”; it either works or it’s fantasy.
Well, and if they concede that natural variability is the cause of the pause, why couldn’t natural variability be the cause of the increase? It certainly was prior to 1950.
The fact that the late 19th century warming and the early 20th century warming are virtually indistinguishable in slope and duration from the late 20th century warming leaves little room for human activity, aside perhaps from generally cleaner air, causing slightly more warming. China and India, however, muddy those waters, to mix physical-state metaphors.
The early 18th century warming was even greater in amplitude and lasted longer than the late 20th century warming. Which is not surprising, as it was a rebound from the depths of the LIA during the chilly Maunder Minimum.
So the models that have been completely wrong for the last 40 years can be relied upon to be completely wrong for at least another 80 years?
A higher climate sensitivity leads directly to more missing heat.
It looks like April is going to have a colder temperature anomaly than previous months, so temperatures are still going down. No good for alarmists. Half or more of the warming from the past El Niño is already gone. If we go back to the 2003-2014 average, defending a high climate sensitivity is going to be a tough sell.
Have you read any of the long-run sensitivity papers by people like Reto Knutti and Kyle Armour? Which ones?
Nope, none of those. Whatever I tried to read about sensitivity calculations, I couldn’t make sense of half of it. I don’t even fully understand Lewis’s posts at Climate Etc. I don’t think it is worth the effort and time to study the issue, especially considering that after 35 years very little progress has been made in the determination of ECS.
“Without human caused global warming, we would have expected to cool since 1998 instead of warm.”
NO. After the 1998 El Niño ejected massive amounts of heat into the atmosphere, one would expect temperatures to decline or remain stable for a period regardless of the then present amount of atmospheric CO2.
All your other comments above cannot overcome the deficiency of hard science showing that one extra CO2 molecule per ten thousand other air molecules can produce the effects of CAGW. And don’t attempt to persuade us by employing the old trick of correlation.
Bottom line is this: There is insufficient understanding of climate to make costly economic decisions.
Rubbish01 + Rubbish 02 * Rubbish 3 /Rubbish 05 – Rubbish 06 * Rubbish 07 = Rubbish
Somebody may remember that I wrote a story about the reproduction of Myhre’s equation, which is the basis of the CS calculations:
RF = 5.35*ln(C/280)
where C is the CO2 concentration in ppm. I did the very same calculations in order to find out if I could get the same result. My equation is also logarithmic, but with a different coefficient:
RF = 3.12*ln(C/280)
I also found that there is no positive water feedback doubling the warming effects of GH gases. Therefore, the ECS value is only 0.6 degrees Celsius. The IPCC’s model-calculated temperature is today about 50% above the UAH temperature value of the 2000s. Below is a figure showing a lot of information.
The temperature graph shows very clearly that the global temperature goes up and down according to ENSO events. It is also very clear that during ENSO events the changes in absolute water content double the original temperature changes, i.e., there is a positive water feedback in ENSO temperature changes. But it is also very clear that in long-term changes (more than 10 years) there is no positive water feedback: the average absolute water content is about constant. For example, from 1979 to 2000 the UAH temperature increased 0.35C but the absolute humidity decreased a bit. So where is the positive water feedback?
By the way, there is also a Factor X in the figure, which shows the driving force needed to explain the observed temperature when the water and CO2 effects (according to Ollila) are decreased. The CO2 effect by Myhre & IPCC went through the roof when the present temperature pause emerged after 2000.
Dr. Antero Ollila
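The two coefficients quoted above are easy to compare directly. A minimal sketch of the present-day forcing each implies, converted to a temperature with an illustrative no-feedback sensitivity parameter (the value of LAMBDA is an assumption for this sketch, not a figure from the comment):

```python
import math

C0, C = 280.0, 400.0   # ppm: preindustrial baseline and roughly present-day

def forcing(c, coeff):
    """Radiative forcing in W/m^2 at concentration c, for a given ln-law coefficient."""
    return coeff * math.log(c / C0)

f_myhre = forcing(C, 5.35)   # Myhre et al. coefficient, used by the IPCC
f_ollila = forcing(C, 3.12)  # Ollila's alternative coefficient, quoted above

print(f"Myhre : {f_myhre:.2f} W/m^2 at {C:.0f} ppm")
print(f"Ollila: {f_ollila:.2f} W/m^2 at {C:.0f} ppm")

# Illustrative no-feedback conversion factor, deg C per W/m^2 (an assumption):
LAMBDA = 0.3
print(f"Implied warming: {LAMBDA * f_myhre:.2f} C vs {LAMBDA * f_ollila:.2f} C")
```

With these numbers the Myhre coefficient gives roughly 1.9 W/m2 of present-day CO2 forcing versus roughly 1.1 W/m2 for the Ollila coefficient, so the two equations differ by the same ~70% ratio at any concentration.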
Thanks.
Warmunistas don’t believe me when I tell them that the assumed net positive feedbacks from rising CO2 are not in evidence, but merely assumed by so-called “climate scientists”. The net feedback effects might well be negative, so that TCS, if there be such a thing, could be less than 1.2 degrees C per doubling of CO2.
Yup, exactly. The magical mechanism creating a positive value for climate sensitivity is not in evidence at all. It was always just an assumed number.
Climate activists intentionally blur the distinction between the actual temperature rise due to carbon dioxide and the assumed number for climate sensitivity. They pretend they are the same thing.
The only sensible climate-change answer for this is that it must precipitate a bit more, and that’s your climate change.
In 2008, Jim Hansen gave a lecture claiming that the ECS is 3°C. It is duly reported on at this site:
Jim Hansen’s AGU presentation: “He’s ‘nailed’ climate forcing for 2x CO2”
https://wattsupwiththat.com/2008/12/21/jim-hansens-agu-presentation-hes-nailed-climate-forcing-for-2x-co2/
For these young whippersnappers* to publish a paper that sees Hansen’s 3° and raises him 2° is bold. Very Bold.
*the lead author is a graduate student, of all things, and the PI is a mere assistant professor. The impudence of these people. The nerve.
Yale scientists looked at a number of global climate projections and found that they misjudged the ratio of ice crystals and super-cooled water droplets in “mixed-phase” clouds — resulting in a significant under-reporting of climate sensitivity.
No doubt there are many overlooked processes which would tend to make things hotter. But, there are very likely as many or more that would tend to make them cooler. As they are only looking for the things that make it hotter, that is what they find.
Confirmation bias is what it is all about.
Apparently it is only CO2’s direct forcing of about 1C per doubling that precipitates these feedbacks, resulting in this purported 5C long-term temperature rise not seen at least during this interglacial. Amazing stuff.
Wow! Note the precision: 5-5.3 degrees C. Clearly the science is now settled. Stop funding all other models.
Yeah, and wasn’t CO2 supposed to be responsible for cooling of the upper atmosphere?
If the no feedback CS be 1.2 degrees C per doubling, then IMO the range should be 0.0 to 2.4 degrees C. It’s improbable that net feedbacks could yield a near quadrupling, ie to 4.5 degrees C.
This range includes most of the most reasonable estimates and measurements. The “canonical” 3.0 degrees C is clearly too high, IMO. It was derived from nothing more than the average of two WAGs back in the ’70s and hasn’t changed since. The higher guess of 4.0 degrees was by Hansen, so should be rejected on that basis alone. Even the lower, better guess of 2.0 degrees can now be seen to be too high as well.
Not improbable. Impossible. We ran the experiment on the entire Earth already, and the data says their result is crap. Full stop.
Based on the results of the experiment we have performed on the entire earth since 1950, climate sensitivity is very low. It is certainly less than 1 degree, probably less than 0.5 degrees.
Hence negative feedbacks of 0.2 to 0.7 degrees C, if the no feedbacks number be 1.2 degrees C.
Chimp, many of the studies showing about 1.2 C climate sensitivity have used Myhre’s RF value for CO2 without questioning it. They should have calculated it themselves.
Perhaps I did not understand, but these folks are saying that there should be more warming as a result of additional CO2 in the atmosphere. The actual temperature rise has already been way less than predicted, so doesn’t that mean that all of these models are even more screwed up than anyone ever thought they were?
They could not possibly be more screwed up than I have thought they are.
They are worse than worthless, wanton wastes of tax receipts, GIGO computer games perpetrated by corrupt, rent-seeking, trough-feeding, anti-scientific, second and third-rate Watermelon ideologues.
Was there a pine cone tree in their study/paper?
Yes, they modeled one.
Because as we all know, a better fit means a better projection.
sigh/sarc
As with all of these models, the first question always has to be “show me how you validated it”. Without validation, it’s just castles in the sky… or out-and-out propaganda, as in this case.
“The overestimate of ice in mixed-phase clouds relative to the observations is something that many climate modelers are starting to realize,” Tan said.
____________________________________________
Tan starting to realize. Sad.
This is very unsettling.