From the University of Exeter comes this statistical prediction that doesn’t seem to be getting a lot of press, and rightly so. Read on.
Future Climate Change Revealed by Current Climate Variations
Uncertainty surrounding the extent of future climate change could be dramatically reduced by studying year-on-year global temperature fluctuations, new research has shown.
A team of scientists from the University of Exeter and the Centre for Ecology and Hydrology has pioneered a new process to reduce uncertainty around climate sensitivity – the expected long-term global warming if atmospheric carbon dioxide is stabilised at double pre-industrial levels.
While the standard ‘likely’ range of climate sensitivity has remained at 1.5-4.5°C for the last 25 years, the new study, published in the leading scientific journal Nature, has reduced this range by around 60 per cent.
The research team believe that by dramatically reducing the range of climate sensitivity, scientists will have a much more accurate picture of long-term changes to the Earth’s climate.
Lead-author Professor Peter Cox from the University of Exeter said: “You can think of global warming as the stretching of a spring as we hang weights from it, and climate sensitivity as related to the strength of the spring.
“To relate the observed global warming to climate sensitivity you need to know the amount of weight being added to the spring, which climate scientists call the ‘forcing’, and also how quickly the spring responds to added weight. Unfortunately, we know neither of these things very well”.
The research team made their breakthrough by moving their focus away from global warming trends to date, and instead studying variations in yearly global temperatures.
Co-author Professor Chris Huntingford, from the Centre for Ecology and Hydrology, explained: “Much of climate science is about checking for general trends in data and comparing these to climate model outputs, but year-to-year variations can tell us a lot about longer-term changes we can expect in a physical system such as Earth’s climate.”
Mark Williamson, co-author of the study and a postdoctoral researcher at the University of Exeter, carried out the calculations to work out a measure of temperature fluctuations that reveals climate sensitivity.
This metric of temperature fluctuations can also be estimated from climate observations, allowing the model line and the observations to be combined to estimate climate sensitivity.
Using this approach, the team derive a range of climate sensitivity to doubling carbon dioxide of 2.8+/-0.6°C, which reduces the standard uncertainty in climate sensitivity (of 1.5-4.5°C) by around 60%.
Mark said: “We used the simplest model of how the global temperature varies, to derive an equation relating the timescale and size of the fluctuations in global temperature to the climate sensitivity. We were delighted to find that the most complex climate models fitted around that theoretical line”.
Explaining the significance of the results, Professor Cox added:
“Our study all but rules-out very low or very high climate sensitivities, so we now know much better what we need to. Climate sensitivity is high enough to demand action, but not so high that it is too late to avoid dangerous global climate change”.
The research was supported by the European Research Council (‘ECCLES’ project), the EU Horizon 2020 Programme (‘CRESCENDO’ project), and the UK’s Natural Environment Research Council.
The paper: https://www.nature.com/articles/nature25450
Emergent constraint on equilibrium climate sensitivity from global temperature variability
Abstract
Equilibrium climate sensitivity (ECS) remains one of the most important unknowns in climate change science. ECS is defined as the global mean warming that would occur if the atmospheric carbon dioxide (CO2) concentration were instantly doubled and the climate were then brought to equilibrium with that new level of CO2. Despite its rather idealized definition, ECS has continuing relevance for international climate change agreements, which are often framed in terms of stabilization of global warming relative to the pre-industrial climate. However, the ‘likely’ range of ECS as stated by the Intergovernmental Panel on Climate Change (IPCC) has remained at 1.5–4.5 degrees Celsius for more than 25 years [1]. The possibility of a value of ECS towards the upper end of this range reduces the feasibility of avoiding 2 degrees Celsius of global warming, as required by the Paris Agreement. Here we present a new emergent constraint on ECS that yields a central estimate of 2.8 degrees Celsius with 66 per cent confidence limits (equivalent to the IPCC ‘likely’ range) of 2.2–3.4 degrees Celsius. Our approach is to focus on the variability of temperature about long-term historical warming, rather than on the warming trend itself. We use an ensemble of climate models to define an emergent relationship [2] between ECS and a theoretically informed metric of global temperature variability. This metric of variability can also be calculated from observational records of global warming [3], which enables tighter constraints to be placed on ECS, reducing the probability of ECS being less than 1.5 degrees Celsius to less than 3 per cent, and the probability of ECS exceeding 4.5 degrees Celsius to less than 1 per cent.
https://wattsupwiththat.files.wordpress.com/2018/01/cox-et-al-2018.pdf
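Before getting to my take: for readers who want a feel for the metric the abstract is describing, here is a minimal sketch (mine, not the authors’ code) of the kind of fluctuation statistic involved, assuming the simple one-box picture described in the press release. The idea is to detrend the annual global temperature series and combine the standard deviation of the residuals with their one-year autocorrelation; the synthetic input series and variable names below are purely illustrative.

```python
import numpy as np

def variability_metric(annual_temps):
    """Cox-style fluctuation metric from an annual global-mean temperature series.

    Sketch only: detrend with a simple linear fit, then combine the standard
    deviation of the residuals with their lag-1 autocorrelation as
    psi = sigma / sqrt(-ln(alpha1)).  The paper uses windowed detrending;
    this is the bare-bones version.
    """
    t = np.arange(len(annual_temps))
    resid = annual_temps - np.polyval(np.polyfit(t, annual_temps, 1), t)
    sigma = resid.std(ddof=1)
    alpha1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    if not (0.0 < alpha1 < 1.0):
        raise ValueError("lag-1 autocorrelation outside (0, 1); metric undefined")
    return sigma / np.sqrt(-np.log(alpha1))

# Illustrative use with a synthetic AR(1) 'temperature' record (not real data)
rng = np.random.default_rng(0)
T = np.zeros(150)
for i in range(1, 150):
    T[i] = 0.6 * T[i - 1] + rng.normal(scale=0.1)
T += 0.008 * np.arange(150)          # add a weak warming trend
print("psi =", round(variability_metric(T), 3))
```

In the paper’s framework, a metric of this kind is what then gets regressed against ECS across the CMIP5 models.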
Here’s how I see it:
They seem to overlook one very important thing. In their method, they look at “variations in yearly global temperatures”. They are assuming that the envelope created by the variations will reveal an underlying trend, and from that, a measure of climate sensitivity by comparing it to model output. Their analogy in the press release, using a weighted spring, reveals their thinking: they treat Earth’s climate as a “constrained system”.
Earth’s climate does have some constraints, but it also has chaos, and the chaotic nature of the myriad of forces in Earth’s atmosphere is often pushed beyond what is considered normal for such constraints. Chaos itself becomes a “forcing”. It is why we get occasional extremes of weather and climate. Edward Lorenz was the first to describe the chaotic nature of the atmosphere with his “butterfly effect” paper in 1972. http://eaps4.mit.edu/research/Lorenz/Butterfly_1972.pdf
Lorenz describes the evidence that the atmosphere is inherently unstable as “overwhelming”.
It’s that instability that they are trying to quantify and put an envelope around, but it is a fool’s errand in my opinion because there’s so much noise in that chaos.
To see why, have a look at this presentation from Stephens et al. 2014. http://wind.mit.edu/~emanuel/Lorenz/Lorenz_Workshop_Talks/Stephens.pdf
That team asks: “Is Earth’s climate system constrained?”
Their answer is that it is,
– The reflected energy from Earth is highly regulated, and this regulation is by clouds. The most dramatic example of this appears in the hemispheric symmetry of reflected solar radiation
– Hemispheric OLR also appears regulated by clouds
but… Stephens et al. also use the CMIP5 models, and say this about them:
– Models don’t have the same behavior as the observed Earth – they lack the same degree of regulation and symmetry. Does this really matter? It seems so.
Yes, the problem is clouds. And as almost anyone in climatology knows, models don’t do clouds well. If you search the literature you’ll find statements suggesting clouds limit warming, and statements suggesting clouds enhance warming. There’s no good agreement on what effect clouds have actually had on long-term climate trends. But the key component of clouds, water vapor, has been revealed as a primary forcing, as our presentation at AGU16 demonstrated.
In the Cox et al 2018 paper, they say:
“… the emergent relationship from the historical runs and observational constraint can be combined to provide an emergent constraint on ECS.”
On the face of it, that “seems” reasonable; however, the flaw here is that they are doing this:
“We use an ensemble of climate models to define an emergent relationship.”
First, taking an average of model output averages the models’ errors along with their predictions. Models don’t do clouds well, and as Stephens et al. note, “Models don’t have the same behavior as the observed Earth – they lack the same degree of regulation and symmetry.” On top of that, Cox et al. are comparing to the highly biased and adjusted surface temperature record for confirmation. So all Cox et al. are doing is making the classical statistical blunder of “correlation is not causation”. They are looking to the surface temperature record to confirm the models, but that record is itself highly dependent on clouds as well as being highly adjusted. It has a wide envelope of base noise from the “chaos” that creates weather extremes.
This paper in 2013 says this about CMIP5 and clouds: http://onlinelibrary.wiley.com/doi/10.1029/2012JD018575/full
“Despite a variety of discrepancies in the simulated cloud structures, a universal feature is that in all models, the cloud parameterization errors dominate, while the large-scale and the covariation errors are secondary. This finding confirms the deficiency in the current state of knowledge about the governing mechanisms for subgrid cloud processes…”
Really, in my view, all they have done is to plot the envelope of possible values, then constrain it (figure 4A), and come up with a new ECS average based on that assumed constraint.

There’s more noise than signal, and from that they create a statistical probability of a climate sensitivity of 2.8°C. I think they are fooling themselves. “The first principle is that you must not fool yourself — and you are the easiest person to fool.” – Richard Feynman
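For what it’s worth, the basic “emergent constraint” recipe is easy to sketch: regress ECS on the fluctuation metric across the model ensemble, then read off the ECS value implied by the observed metric. The numbers below are invented purely to show the mechanics; they are not the paper’s data, and the uncertainty handling is deliberately crude.

```python
import numpy as np

# Hypothetical (psi, ECS) pairs for a model ensemble (all numbers invented)
psi_models = np.array([0.09, 0.11, 0.12, 0.14, 0.15, 0.17, 0.18, 0.20])
ecs_models = np.array([1.9, 2.3, 2.4, 2.9, 3.1, 3.6, 3.8, 4.3])

# Step 1: the "emergent relationship" across models (ordinary least squares)
slope, intercept = np.polyfit(psi_models, ecs_models, 1)

# Step 2: apply a (made-up) observational estimate of the metric to that line
psi_obs, psi_obs_err = 0.14, 0.02
ecs_central = slope * psi_obs + intercept

# Step 3: crude error propagation (ignores the regression scatter for brevity)
ecs_err = abs(slope) * psi_obs_err
print("constrained ECS ~ %.1f +/- %.1f C (illustrative only)" % (ecs_central, ecs_err))
```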
Basically, they are comparing two smoothed time series (HadCRUT4 and CMIP5 model mean) to come up with an ECS value. Statistician William Briggs points out the folly of this: http://wmbriggs.com/post/195/
Now I’m going to tell you the great truth of time series analysis. Ready? Unless the data is measured with error, you never, ever, for no reason, under no threat, SMOOTH the series! And if for some bizarre reason you do smooth it, you absolutely on pain of death do NOT use the smoothed series as input for other analyses! If the data is measured with error, you might attempt to model it (which means smooth it) in an attempt to estimate the measurement error, but even in these rare cases you have to have an outside (the learned word is “exogenous”) estimate of that error, that is, one not based on your current data.
If, in a moment of insanity, you do smooth time series data and you do use it as input to other analyses, you dramatically increase the probability of fooling yourself! This is because smoothing induces spurious signals—signals that look real to other analytical methods.
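Briggs’s warning is easy to demonstrate. In this little sketch (mine, not his), two completely independent white-noise series are given a simple running-mean smooth; the correlation between the raw series hovers near zero, while the correlation between the smoothed versions is routinely several times larger.

```python
import numpy as np

rng = np.random.default_rng(42)

def running_mean(x, window):
    """Simple moving average (shortens the series by window - 1 points)."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

n, window, trials = 120, 15, 1000
raw_r, smooth_r = [], []
for _ in range(trials):
    a, b = rng.normal(size=n), rng.normal(size=n)      # two independent noise series
    raw_r.append(abs(np.corrcoef(a, b)[0, 1]))
    sa, sb = running_mean(a, window), running_mean(b, window)
    smooth_r.append(abs(np.corrcoef(sa, sb)[0, 1]))

print("mean |correlation|, raw series     :", round(float(np.mean(raw_r)), 2))
print("mean |correlation|, smoothed series:", round(float(np.mean(smooth_r)), 2))
```

That inflated correlation is exactly the kind of “spurious signal” he is talking about.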
Their figure 1A in Cox et al 2018 compares smoothed series. The dots represent yearly averages of global temperature.

The surface temperature record is highly smoothed, and the model mean they chose is smoothed.
And there’s also a bit of cherry-picking on their part:
If we instead use all 39 historical runs in the CMIP5 archive, we find a slightly weaker emergent relationship, but derive a very similar emergent constraint on ECS (Extended Data Table 2).
So, why limit the number of models used? It’s because it makes them believe they are more certain of the ECS value.
I’m also reminded of this quote:
“If your experiment needs statistics, you ought to have done a better experiment.” – Ernest Rutherford
As Dr. Judith Curry says, “Climate [is] a wicked problem”, and that’s why (from the Exeter press release) “…the standard ‘likely’ range of climate sensitivity has remained at 1.5-4.5°C for the last 25 years…”. I don’t think this study has contributed any precision to that problem.
There are other observational estimates of climate sensitivity, and they come up with much lower values.
Willis came up with this:
The results were that the equilibrium climate sensitivity to a change in forcing from a doubling of CO2 (3.7 W/m2) are 0.4°C in the Northern Hemisphere, and 0.2°C in the Southern Hemisphere. This gives us an overall average global equilibrium climate sensitivity of 0.3°C for a doubling of CO2.
https://wattsupwiththat.com/2012/05/29/an-observational-estimate-of-climate-sensitivity/
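Willis’s post has the details, but the general shape of such observational estimates is simple: regress temperature change against forcing change to get a sensitivity in °C per W/m², then scale by the ~3.7 W/m² conventionally assigned to a CO2 doubling. The sketch below uses made-up series chosen only to land near his quoted hemispheric numbers; it is not his data or his exact method.

```python
import numpy as np

def ecs_from_regression(delta_forcing, delta_temp, f_2xco2=3.7):
    """Slope of temperature vs forcing (deg C per W/m^2), scaled to a CO2 doubling."""
    lam = np.polyfit(delta_forcing, delta_temp, 1)[0]
    return lam * f_2xco2

# Made-up hemispheric 'observations' (W/m^2 and deg C), for illustration only
forcing = np.linspace(0.0, 2.0, 40)
temp_nh = 0.108 * forcing + np.random.default_rng(1).normal(scale=0.02, size=40)
temp_sh = 0.054 * forcing + np.random.default_rng(2).normal(scale=0.02, size=40)

ecs_nh = ecs_from_regression(forcing, temp_nh)   # comes out near 0.4 C
ecs_sh = ecs_from_regression(forcing, temp_sh)   # comes out near 0.2 C
print("NH %.1f C, SH %.1f C, simple average %.1f C per doubling"
      % (ecs_nh, ecs_sh, (ecs_nh + ecs_sh) / 2.0))
```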
Dr. Roy Spencer came up with this:

In this case, we see that a climate sensitivity of only 1.5 C was required, a 40% reduction in climate sensitivity. Notably, this is at the 1.5C lower limit for ECS that the IPCC claims. Thus, even in the new pause-busting dataset the warming is so weak that it implies a climate sensitivity on the verge of what the IPCC considers “very unlikely”.
He adds:
The simplicity of the model is not a weakness, as is sometimes alleged by our detractors — it’s actually a strength. Since the simple model time step is monthly, it avoids the potential for “energy leakage” in the numerical finite difference schemes used in big models during long integrations. Great model complexity does not necessarily get you closer to the truth.
In fact, we’ve had 30 years and billions of dollars invested in a marching army of climate modelers, and yet we are no closer to tying down climate sensitivity and thus estimates of future global warming and associated climate change. The latest IPCC report (AR5) gives a range from 1.5 to 4.5 C for a doubling of CO2, not much different from what it was 30 years ago.
Climate sensitivity remains the “Holy Grail” of climate science; a lot of people think they know where it is hidden, but so far it seems nobody has the actual location of it.
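For readers unfamiliar with what a “simple model” with a monthly time step looks like in practice, here is a generic one-box energy-balance sketch. It is not Dr. Spencer’s actual model; the heat capacity, feedback parameter and forcing ramp are made-up illustrative values.

```python
import numpy as np

def one_box_model(forcing, heat_capacity=7.3, lam=2.5, dt_years=1.0 / 12.0):
    """Forward-Euler one-box energy balance model: C dT/dt = F(t) - lam * T.

    heat_capacity in W yr m^-2 K^-1, lam in W m^-2 K^-1, forcing in W m^-2,
    stepped monthly. All parameter values here are illustrative, not tuned
    to any observed record.
    """
    T = np.zeros(len(forcing) + 1)
    for i, F in enumerate(forcing):
        T[i + 1] = T[i] + dt_years * (F - lam * T[i]) / heat_capacity
    return T[1:]

months = np.arange(100 * 12)                                  # a century, monthly
forcing = 3.7 * np.log1p(months / (100.0 * 12)) / np.log(2)   # toy ramp to ~2xCO2
lam = 2.5
T = one_box_model(forcing, lam=lam)
print("warming after 100 yr: %.2f C   equilibrium (ECS) for this lam: %.2f C"
      % (T[-1], 3.7 / lam))
```

The point of the exercise is simply that the whole machinery fits in a dozen lines, and the implied ECS is just the doubling forcing divided by the assumed feedback parameter.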

All I had to read was “models”. They have not been able to account for any feedbacks, negative or positive. So they should start with the logarithmic curve shown by David Archibald, which NOAA/NASA are quite aware of. That makes the climate sensitivity about 0.1 deg.
It doesn’t rule out very low sensitivities at all.
Global temperatures have only been showing about a 0.15°C per decade increase, and over a century this equals the very low sensitivity of 1.5°C.
Even if all this warming were blamed on humans, it would still fall at the very low end of sensitivity. The fact is the warming is not all caused by humans, and natural cycles have at least contributed towards it. There would have been no pause if natural events had no influence at all. The pause on its own hinted that humans had no more than 50% influence.
BUT,
The AMO and ENSO have been responsible for this change, so there is nowhere that human warming fits in. And that is even with the confirmation bias of adjusting data to match the models more closely, increasing warming when it failed to materialise.
http://www.woodfortrees.org/plot/nsidc-seaice-n/from:1979/normalise/plot/esrl-amo/from:1979/plot/uah6/from:1979
If the AMO continues as in past cycles, it will fit two negative phases into this century, so there will be no chance of even matching the last century. The fiddling/faking of data in HADCRUT and GISS/NOAA will only get worse, to hide the lack of future warming. With only one warm phase remaining this century, there can only be about a 0.4°C increase with an honest observational data set.
At least one third of that is just from the Adjustments to the data.
So, Hansen’s misapplication of the electrical Bode equations didn’t work out so well, and now, we’re going to explain climate change with Hooke’s Law of springs?
Sounds like a 7th grade science experiment to me!
The IPCC published the exact same range of values for the climate sensitivity of CO2 in their first report as they published in their latest report. So after more than two decades of effort they have learned nothing that would allow them to narrow their range of guesses one iota.
The initial radiometric calculations of the climate sensitivity of CO2 came up with a value of 1.2 degrees C, not including any feedbacks. One researcher has pointed out that these calculations do not take into consideration that a doubling of CO2 in the Earth’s atmosphere will cause a slight decrease in the dry lapse rate in the troposphere, enough to decrease the climate sensitivity of CO2 by more than a factor of 20. So a better number for the climate sensitivity of CO2, not including feedbacks, would be less than 0.06 degrees C.
The big issue has been H2O feedback, and its related uncertainty is largely responsible for the wide range of guesses. The idea here is that an increase in CO2 causes warming, which causes more H2O to enter the atmosphere. H2O is the primary greenhouse gas, so more H2O causes even more warming, which causes even more H2O to enter the atmosphere, and so forth. A typical assumption is that the positive feedback of H2O causes an amplification of the warming effects of CO2 by roughly a factor of 3. It is the uncertainty of this feedback factor that causes the wide range of climate sensitivity guesstimates. What this calculation ignores is that while H2O is the primary greenhouse gas in the Earth’s atmosphere, H2O is also a major coolant in the Earth’s atmosphere, moving heat energy from the Earth’s surface (which is mostly some form of H2O) to where clouds form, via the heat of vaporization. The overwhelming cooling effect of H2O is evidenced by the fact that the wet lapse rate is significantly less than the dry lapse rate in the troposphere. So instead of an amplification factor of 3, a more realistic amplification factor would be 1/3, yielding a climate sensitivity of CO2 of less than 0.02 degrees C, which is quite trivial. If the IPCC were not so political they would be adding calculations like mine to their range of guesstimates.
Another concern is that the radiant greenhouse effect that the AGW conjecture depends upon has not been observed, in a real greenhouse, on Earth, or anywhere else in the solar system. The radiant greenhouse effect is fiction. Taking this into account, the climate sensitivity of CO2 is zero. So a better and more realistic range of estimates for the climate sensitivity of CO2 would be between 0.0 and 0.02 degrees C.
The real Earth’s climate history pretty well agrees – ECS of CO2 level changes is essentially zero.
According to the Cato Institute table at the end, the “Holy Grail” will be somewhere about 2.5, so 2.8+/-0.6°C is plausible. That is still a sharp increase, though I suppose it wouldn’t be too bad for some people, and polar bears too.
Anthony,
Should be Climate is a wicked problem
[Thank you. Peer-review works! .mod]
BTW, there should be a whole post/discussion of wicked problems
no, The Climate AGENDA is a wicked problem
Couldn’t agree more. If things are warming, “climate” is a non-problem. When it starts cooling THAT will be a problem, which will have its effects multiplied heavily if we squander our resources chasing our tails to “fix” the non-problem of AGW.
I’m an engineer whose computer at graduation was a slide rule. You had to estimate the answer to place the decimal correctly. You had, in the background, the terror of possibly making an error. People died because of errors!
The computer was a wonderful invention, but it did let grossly incompetent people in the door and this at a time when “industrial democracy” resulted in universities throwing their doors open so wide that they had to invent ‘faculties lite’ to welcome illiterates and numerically challenged students and professors in. It was simply a matter of money, which put scholarship in the back seat.
Simple Excel made calculations across the whole range of statistical techniques easy to do (it’s noteworthy that Phil Jones admitted he didn’t know how to use Excel), so you could try out dozens of techniques on stuff that didn’t even represent data, and even invent tailor-made stat treatments until you got what you wanted. This is how Mann made the hockey stick, using an invention that turned all noise into hockey sticks. He even employed a contaminated proxy upside down, and repeated this in subsequent renditions. Even better than Climategate would be to see all the trials they did and rejected! Hey, the worst of them won’t let you see the code. I suspect it would be a Rube Goldberg concoction.
Briggs’s admonition is simple to understand. If you manipulate data to make a smooth graph, any statistical technique you use afterwards recognizes your improved “fit” as superior data with acceptably narrow error bars!
Bad stats isn’t the worst of it. They are going to get the right answer however it’s arrived at, because that’s what they are paid for.
Widespread access to cheap computers has also led to abuse in the world of public health where people churn through huge amounts of data to identify correlations which are in fact purely chance findings. There is no doubt that since the 1980s much damage to society has arisen from cheap computers and recreational drugs – sometimes simultaneously.
The new lower limit of 2.2 C is larger than the old lower limit of 1.5 C, which was already larger than the maximum possible effect as limited by the laws of physics of about 1.1 C. All they did was move the range further away from what the physical ECS actually is.
Looks like post-1999 both the high and low models run hot relative to observations.
The interval of most interest has the lowest correlation.
I am constantly surprised at how climate science authors forget both the scientific method and ethics. The abstract talks about policy with reference to the Paris Agreement, yet the source temperature data set has its uncertainty level set by hypothetical means that bear little relevance to actual measurement standards for verification. Pure theory is being touted as important. I am also surprised at myself, because I keep repeating this point on this site like Mugatu and the Crazy Pills!!
But anyway, if a purely scientific discussion is being made then at least state the limits of results based on input assumptions.
But then it would look like what it is: a what-if exercise.
See also the post at And Then There’s Physics: https://andthentheresphysics.wordpress.com/2018/01/20/narrowing-the-climate-sensitivity-range/
Which is more drivel assuming that CO2 “causes” warming, a (non)”fact” not in “evidence.”
Of course many a real scientist will point out that climate scientists needed to focus on trends because the data points arising annually are inherently so variable that it is almost impossible to draw meaningful conclusions in the short term.
The good thing about loan sharks is that they are often legally obliged to quote an APR on the products they offer the public. Being forced to do this makes it harder for them to deceive their customers/victims. But climate scientists have no such restrictions, so they effectively jump between the time domain and the frequency domain as and when they think their audience is not watching the pea very closely. Thus: long-term temperature trends are disappointingly normal-looking? Never mind. No real change in the science, but let’s focus on making claims about individual years instead. That way, nobody has to wait to see what actually happens to the trends when they are not producing the desired effects soon enough. So previously we had hot hot hot “global warming weather”, then it became longer-term “climate change” because nobody really felt much hotter, and then it becomes “weird” weather events again because climate takes a long time… and so the wheels on the bus go round and round… https://www.youtube.com/watch?v=gOSabglQAvc
“We use an ensemble of climate models to define an emergent relationship.”
I don’t understand this. If you don’t understand a system well enough to establish an emergent relationship, why would a model you had programmed do it? Worse, if one model doesn’t do it, why would an ensemble?
Methinks somebody should go back and do Systems Engineering 101 again.
The ensemble fallacy, i.e. thinking that a collection of only partially valid models would produce more reliable results, is somewhat odd.
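That point can be made concrete with a toy example: averaging many models does shrink their independent noise, but any error they share (a common structural bias in cloud treatment, say) survives the averaging untouched. The numbers below are entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
truth = 1.0            # the 'real' quantity the models try to reproduce
shared_bias = 0.5      # structural error common to every model
n_models = 40

# Each model = truth + shared bias + its own independent noise
models = truth + shared_bias + rng.normal(scale=0.3, size=n_models)
ensemble_mean = models.mean()

print("RMS error of individual models          : %.2f" % np.sqrt(np.mean((models - truth) ** 2)))
print("error of the ensemble mean              : %.2f" % abs(ensemble_mean - truth))
print("shared bias that averaging cannot remove: %.2f" % shared_bias)
```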
Monkeys throwing darts at a wall covered with different possible future climate states are about as “scientific” as “climate models” at this point. “Climate models” do nothing but spit out the false validation of the incorrect input assumptions.
30 years!
A trillion dollars.
A hundred thousand of mankind’s brightest minds [ Ah! Sir! Surely you do but Jest, Sir ! ]
A few hundred thousand runs of mankind’s most complex numerical models.
A massed array of the supporting quagmire of the Elites of politics, of the media , of the Inner City goat cheese circle Expert Greens, of Ivory towered academics who know nought of the real world outside, and a vast array of the good, the great and the most outspoken of the ignorants and incompetents of this Earth.
Just One assumption they in their wisdom, all have!
That a minor and not very concentrated atmospheric gas is in complete control of Earth’s critical temperatures.
Nations rise, Nations fall,
They bury the great and not so great,
They bury the believers and the unbelievers and the d—-rs
El Ninos come and go.
La Ninas come and go, neither predicted with any accuracy and sometimes not predicted at all.
The great sea ice and land ice sheets of the Arctic and Antarctic come and go while the Great Minds of mankind argue if they will come again or go again next year or five years or ten years .
Nature laughs!
The simple minds of the media and the politicians and the greens who can’t pronounce or spell “CO2” talk in horrified whispers of the deadly “Carbon” which will destroy the planet.
When such a catastrophe is due is, after 30 years of assiduous research, still the subject of debate, of argument, and of growing contempt towards those who know they know, but know not why they know.
The Great Minds of Politics and Media and Academia and Climate Science, and the Experts from inside the Green Goat Cheese inner city Circle, continue to grapple with finding a solution as to “when”, not “IF”, such a simple three-atom atmospheric gas, essential to all life on earth and found only by careful measurement, will exercise its power and destroy everything on this planet, or instead reveal how little consequence it really has as a gas and thereby destroy that very mirage-and-cloud-created pseudo-scientific edifice the Elites have so carefully built up and have treasured and protected over the decades past against the mass assaults of the ignorant hordes of Unbelievers, the D—–rs, and those ever more ignorant and revolting peasants far below them in status.
“They seem to overlook one very important thing. In their method, they look at “variations in yearly global temperatures”. They are assuming that the envelope created by the variations will reveal an underlying trend, and from that, a measure of climate sensitivity by comparing it to model output. Their analogy in the press release, using a weighted spring, reveals their thinking: they treat Earth’s climate as a “constrained system”.
Earth’s climate does have some constraints, but it also has chaos, and the chaotic nature of the myriad of forces in Earth’s atmosphere is often pushed beyond what is considered normal for such constraints. Chaos itself becomes a “forcing”. It is why we get occasional extremes of weather and climate. Edward Lorenz was the first to describe the chaotic nature of the atmosphere with his “butterfly effect” paper in 1972. http://eaps4.mit.edu/research/Lorenz/Butterfly_1972.pdf”
1. No, they are NOT assuming the envelope reveals an underlying trend.
2. An analogy is just that. It is not critical to the actual math. Analogies are just meant to simplify. Like the greenhouse and blanket analogies, they are NOT substitutes for the math; they are just cartoon versions that hopefully illuminate. But all analogies fail. That’s why you do the math.
3. Climate is not chaotic. Chaos is not a forcing. Forcings have units. Watts. The butterfly effect will work on small scales and on certain metrics. It does not change the energy balance. If it could change the energy balance then we would live in a MORE SENSITIVE climate.
Ok, the GHE is gone for good.
I am still working on finishing my paper, but things are pretty much settled already, and it gets better every time I check against empirical data. Yet the story is so absurd, because it is soooo simple and obvious.
https://www.weather.gov/jetstream/energy
There is a lot of nonsense in this consensus GH model. The most significant part, however, is the 12% (342 × 0.12 = 41 W/m2) going into space. This is the window causing diurnal temperature variations. As these are average values, a clear sky will have a larger window, something like 63–71 W/m2. This loss of heat will drop temperatures during the night, as we know.
At the same time we are being told that clouds have a net cooling effect of about 18 W/m2 on average. “On average” means including all sky conditions. As clouds can only have that effect if they are present, and assuming they cover 35% of the surface, the negative 18 W/m2 must be attributed to that 35%. So with a complete cloud cover of 100%, the negative effect should be about 18/0.35 = 51 W/m2.
During a clear night, temperatures will fall by 1.5–2 K per hour. If the negative cloud effect were true, temperatures would need to fall by at least 1 K per hour as an average over night and day. But temperatures do not fall at all with clouds. Rather, and that is what I have been researching, temperatures increase with clouds.
Clouds having a positive net effect is a death sentence for the GH concept.
I do not see how that can possibly be true.
It is not just the day-vs-night effect of clouds; it is daytime high-energy solar radiation (0.08–12 eV) compared to nighttime LWIR (0.08 eV).
The daytime radiation warms the oceans to depth; deprive them of that with cloud and it has a major effect on long-term energy storage.
Nighttime cloud merely slows the rate of energy loss from the surface.
There is a problem using year to year average temperature changes to calculate sensitivity.
A nominated year showing a temperature rise, on average, will have some locations where the change is less than the average, and even some locations where there has been a fall. While the averaging process hides this, the mechanism still exists. The work is deficient if it cannot explain why these negative excursions can and do happen. What happened, at that time in history, at one site with a rise while another had a fall? Geoff.
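Geoff’s point is easy to illustrate with synthetic numbers: a positive global-mean change for a given year can coexist with a large fraction of individual locations showing a fall, and the mean alone tells you nothing about that spread.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical year-on-year temperature changes at 2000 'stations' (deg C):
# the global mean change is +0.2 C, but local variability is much larger.
local_changes = rng.normal(loc=0.2, scale=1.0, size=2000)

print("global mean change      : %+.2f C" % local_changes.mean())
print("stations showing a fall : %.0f %%" % (100 * (local_changes < 0).mean()))
```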
I would like to briefly copy here what the IPCC has written about the ECS. IPCC summarizes the differences between ECS and TCR (IPCC has changed the term TCS to TCR, Transient Climate Response) in AR5 like this (p. 1110): “ECS determines the eventual warming in response to stabilization of atmospheric composition on multi-century time scales, while TCR determines the warming expected at a given time following any steady increase in forcing over a 50- to 100-year time scale.” And further, on page 1112, IPCC states that “TCR is a more informative indicator of future climate than ECS”.
Even the IPCC says that ECS values are very theoretical. The IPCC uses the TCR/TCS model in calculating the RCP warming values at the end of this century.
Still too high — by a factor of 2.
Anything higher than zero is too high, given the Earth’s climate history. And the more people start to look to that for the correct answer, the sooner the Climate Fascists will launch a campaign to “adjust” that climate history, like the Ministry of Truth in Orwell’s “1984.”
The ultimate irony for me is that Bill Nye, who apparently is somewhat qualified as an engineer, would never accept such magnitude of a fudge factor in his specialty, but would smear those who point it out.
I hate living in 2018 sometimes…
‘Using this approach, the team derive a range of climate sensitivity to doubling carbon dioxide of 2.8+/-0.6°C’
I am amused they used a decimal point.
Correction: 2.8 C is not a “statistical prediction.” For there to be a “statistical prediction” there has to be a “statistical population” underlying a “statistical model” but this population does not exist. Long ago, Svante Arrhenius supplanted probability theory and statistics through an application of the reification fallacy.