From the University of Exeter, comes this statistical prediction that doesn’t seem to be getting a lot of press, and rightly so. Read on.
Future Climate Change Revealed by Current Climate Variations
Uncertainty surrounding the extent of future climate change could be dramatically reduced by studying year-on-year global temperature fluctuations, new research has shown.
A team of scientists from the University of Exeter and the Centre of Ecology and Hydrology has pioneered a new process to reduce uncertainty around climate sensitivity – the expected long-term global warming if atmospheric carbon dioxide is stabilised at double pre-industrial levels.
While the standard ‘likely’ range of climate sensitivity has remained at 1.5-4.5°C for the last 25 years, the new study, published in the leading scientific journal Nature, has reduced this range by around 60 per cent.
The research team believe that by dramatically reducing the range of climate sensitivity, scientists will have a much more accurate picture of long-term changes to the Earth’s climate.
Lead-author Professor Peter Cox from the University of Exeter said: “You can think of global warming as the stretching of a spring as we hang weights from it, and climate sensitivity as related to the strength of the spring.
“To relate the observed global warming to climate sensitivity you need to know the amount of weight being added to the spring, which climate scientists call the ‘forcing’, and also how quickly the spring responds to added weight. Unfortunately, we know neither of these things very well”.
The research team made their breakthrough by moving their focus away from global warming trends to date, and instead studying variations in yearly global temperatures.
Co-author Professor Chris Huntingford, from the Centre for Ecology and Hydrology, explained: “Much of climate science is about checking for general trends in data and comparing these to climate model outputs, but year-to-year variations can tell us a lot about longer-term changes we can expect in a physical system such as Earth’s climate.”
Mark Williamson, co-author of the study and a postdoctoral researcher at the University of Exeter, carried out the calculations to work out a measure of temperature fluctuations that reveals climate sensitivity.
This metric of temperature fluctuations can also be estimated from climate observations, allowing the model line and the observations to be combined to estimate climate sensitivity.
Using this approach, the team derive a range of climate sensitivity to doubling carbon dioxide of 2.8+/-0.6°C, which reduces the standard uncertainty in climate sensitivity (of 1.5-4.5°C) by around 60%.
Mark said: “We used the simplest model of how the global temperature varies, to derive an equation relating the timescale and size of the fluctuations in global temperature to the climate sensitivity. We were delighted to find that the most complex climate models fitted around that theoretical line”.
Explaining the significance of the results, Professor Cox added:
“Our study all but rules out very low or very high climate sensitivities, so we now know much better what we need to. Climate sensitivity is high enough to demand action, but not so high that it is too late to avoid dangerous global climate change”.
The research was supported by the European Research Council (‘ECCLES’ project), the EU Horizon 2020 Programme (‘CRESCENDO’ project), and the UK’s Natural Environment Research Council.
The paper: https://www.nature.com/articles/nature25450
Emergent constraint on equilibrium climate sensitivity from global temperature variability
Abstract
Equilibrium climate sensitivity (ECS) remains one of the most important unknowns in climate change science. ECS is defined as the global mean warming that would occur if the atmospheric carbon dioxide (CO2) concentration were instantly doubled and the climate were then brought to equilibrium with that new level of CO2. Despite its rather idealized definition, ECS has continuing relevance for international climate change agreements, which are often framed in terms of stabilization of global warming relative to the pre-industrial climate. However, the ‘likely’ range of ECS as stated by the Intergovernmental Panel on Climate Change (IPCC) has remained at 1.5–4.5 degrees Celsius for more than 25 years [1]. The possibility of a value of ECS towards the upper end of this range reduces the feasibility of avoiding 2 degrees Celsius of global warming, as required by the Paris Agreement. Here we present a new emergent constraint on ECS that yields a central estimate of 2.8 degrees Celsius with 66 per cent confidence limits (equivalent to the IPCC ‘likely’ range) of 2.2–3.4 degrees Celsius. Our approach is to focus on the variability of temperature about long-term historical warming, rather than on the warming trend itself. We use an ensemble of climate models to define an emergent relationship [2] between ECS and a theoretically informed metric of global temperature variability. This metric of variability can also be calculated from observational records of global warming [3], which enables tighter constraints to be placed on ECS, reducing the probability of ECS being less than 1.5 degrees Celsius to less than 3 per cent, and the probability of ECS exceeding 4.5 degrees Celsius to less than 1 per cent.
https://wattsupwiththat.files.wordpress.com/2018/01/cox-et-al-2018.pdf
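For readers who want to poke at the method, the paper’s variability metric, Ψ = σ_T/√(−ln α₁), with σ_T the standard deviation and α₁ the one-year-lag autocorrelation of detrended annual temperatures, can be sketched in a few lines of Python. The AR(1) series below is synthetic and purely illustrative, standing in for an observed annual temperature record:

```python
import numpy as np

def psi_metric(temps):
    """Cox et al. (2018)-style variability metric:
    psi = sigma_T / sqrt(-ln(alpha_1)), computed on linearly detrended data."""
    t = np.arange(len(temps))
    detrended = temps - np.polyval(np.polyfit(t, temps, 1), t)
    sigma = detrended.std()
    alpha1 = np.corrcoef(detrended[:-1], detrended[1:])[0, 1]
    return sigma / np.sqrt(-np.log(alpha1))

# Synthetic AR(1) "annual temperature anomaly" series -- illustrative only
rng = np.random.default_rng(0)
n, alpha_true = 150, 0.6
x = np.zeros(n)
for i in range(1, n):
    x[i] = alpha_true * x[i - 1] + rng.normal(scale=0.1)

print(round(psi_metric(x), 3))
```

The paper then regresses ECS against this Ψ across the model ensemble and reads off the observed Ψ; the sketch above only shows what the metric itself measures.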
Here’s how I see it:
They seem to overlook one very important thing. In their method, they look at “variations in yearly global temperatures”. They are assuming that the envelope created by the variations will reveal an underlying trend, and from that, a measure of climate sensitivity by comparing it to model output. Their analogy in the press release, using a weighted spring, reveals their thinking: they treat Earth’s climate as a “constrained system”.
Earth’s climate does have some constraints, but it also has chaos, and the chaotic nature of the myriad forces in Earth’s atmosphere often pushes the system beyond what is considered normal for such constraints. Chaos itself becomes a “forcing”. It is why we get occasional extremes of weather and climate. Edward Lorenz was the first to describe the chaotic nature of the atmosphere with his “butterfly effect” paper in 1972. http://eaps4.mit.edu/research/Lorenz/Butterfly_1972.pdf
Lorenz describes the evidence that the atmosphere is inherently unstable as “overwhelming”.
It’s that instability that they are trying to quantify and put an envelope around, but it is a fool’s errand in my opinion because there’s so much noise in that chaos.
To see why, have a look at this presentation from Stephens et al. 2014. http://wind.mit.edu/~emanuel/Lorenz/Lorenz_Workshop_Talks/Stephens.pdf
That team asks: “Is Earth’s climate system constrained?”
Their answer is that it is,
– The reflected energy from Earth is highly regulated, and this regulation is by clouds. The most dramatic example of this appears in the hemispheric symmetry of reflected solar radiation
– Hemispheric OLR also appears regulated by clouds
but… Stephens et al. also use the CMIP5 models and say this about them:
– Models don’t have the same behavior as the observed Earth – they lack the same degree of regulation and symmetry. Does this really matter? It seems so.
Yes, the problem is clouds. And as almost anyone in climatology knows, models don’t do clouds well. If you search the literature you’ll find statements suggesting clouds limit warming, and statements suggesting clouds enhance warming. There’s no good agreement on what effect clouds have actually had on long-term climate trends. But the key component of clouds, water vapor, has been revealed as a primary forcing, as our presentation at AGU16 demonstrated.
In the Cox et al. 2018 paper, they say:
“… the emergent relationship from the historical runs and observational constraint can be combined to provide an emergent constraint on ECS.”
On the face of it, that “seems” reasonable, however, the flaw here is that they are doing this:
“We use an ensemble of climate models to define an emergent relationship.”
First, making an average of model output also averages their errors along with their predictions. And if models don’t do clouds well, and “Models don’t have the same behavior as the observed Earth – they lack the same degree of regulation and symmetry,” and because they are comparing to the highly biased and adjusted surface temperature record for confirmation, then all Cox et al. are doing is making the classic statistical blunder of mistaking correlation for causation. They are looking to the surface temperature record for confirmation of the models, but the surface temperature record is itself highly dependent on clouds, as well as being highly adjusted. It has a wide envelope of base noise from the “chaos” that creates weather extremes.
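The averaging point can be seen in a toy Monte Carlo sketch (all the numbers here are invented for illustration): averaging an ensemble shrinks the models’ independent errors, but any error the models share, such as a common flaw in a cloud scheme, passes straight through to the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = 1.0          # the quantity the models try to reproduce (arbitrary units)
shared_bias = 0.5    # an error common to every model (e.g. a shared cloud-scheme flaw)
n_models = 39

# Each model = truth + shared bias + its own independent error
models = truth + shared_bias + rng.normal(scale=0.3, size=n_models)
ensemble_mean = models.mean()

# Averaging shrinks the independent errors (roughly 0.3/sqrt(39)),
# but the shared bias survives averaging untouched.
print(round(ensemble_mean - truth, 2))  # close to 0.5, not 0
```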
This paper in 2013 says this about CMIP5 and clouds: http://onlinelibrary.wiley.com/doi/10.1029/2012JD018575/full
“Despite a variety of discrepancies in the simulated cloud structures, a universal feature is that in all models, the cloud parameterization errors dominate, while the large-scale and the covariation errors are secondary. This finding confirms the deficiency in the current state of knowledge about the governing mechanisms for subgrid cloud processes…”
Really, in my view, all they have done is to plot the envelope of possible values, then constrain it (figure 4A), and come up with a new ECS average based on that assumed constraint.
There’s more noise and less signal, and from that they create a statistical probability of a climate sensitivity of 2.8C. I think they are fooling themselves. “The first principle is that you must not fool yourself — and you are the easiest person to fool.” – Richard Feynman
Basically, they are comparing two smoothed time series (HadCRUT4 and CMIP5 model mean) to come up with an ECS value. Statistician William Briggs points out the folly of this: http://wmbriggs.com/post/195/
Now I’m going to tell you the great truth of time series analysis. Ready? Unless the data is measured with error, you never, ever, for no reason, under no threat, SMOOTH the series! And if for some bizarre reason you do smooth it, you absolutely on pain of death do NOT use the smoothed series as input for other analyses! If the data is measured with error, you might attempt to model it (which means smooth it) in an attempt to estimate the measurement error, but even in these rare cases you have to have an outside (the learned word is “exogenous”) estimate of that error, that is, one not based on your current data.
If, in a moment of insanity, you do smooth time series data and you do use it as input to other analyses, you dramatically increase the probability of fooling yourself! This is because smoothing induces spurious signals—signals that look real to other analytical methods.
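Briggs’ warning is easy to demonstrate. In this purely illustrative sketch, a running mean applied to pure white noise, which contains no signal whatsoever, manufactures strong autocorrelation that a downstream analysis would happily mistake for structure:

```python
import numpy as np

rng = np.random.default_rng(42)
noise = rng.normal(size=2000)  # pure white noise: no signal at all

def lag1_autocorr(x):
    return np.corrcoef(x[:-1], x[1:])[0, 1]

# 10-point running mean, the classic "smoother"
window = 10
smoothed = np.convolve(noise, np.ones(window) / window, mode="valid")

print(round(lag1_autocorr(noise), 2))     # near 0: no structure
print(round(lag1_autocorr(smoothed), 2))  # strongly positive: structure invented by smoothing
```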
Their figure 1A in Cox et al 2018 compares smoothed series. The dots represent yearly averages of global temperature.
The surface temperature record is highly smoothed, and the model mean they chose is smoothed.
And there’s also a bit of cherry-picking on their part:
If we instead use all 39 historical runs in the CMIP5 archive, we find a slightly weaker emergent relationship, but derive a very similar emergent constraint on ECS (Extended Data Table 2).
So, why limit the number of models used? It’s because it makes them believe they are more certain of the ECS value.
I’m also reminded of this quote:
“If your experiment needs statistics, you ought to have done a better experiment.” – Ernest Rutherford
As Dr. Judith Curry says, “Climate [is] a wicked problem”, and that’s why (from the Exeter press release) “…the standard ‘likely’ range of climate sensitivity has remained at 1.5-4.5°C for the last 25 years…”. I don’t think this study has contributed any precision to that problem.
There are other observational estimates of climate sensitivity. They come up with much lower values.
Willis came up with this:
The results were that the equilibrium climate sensitivity to a change in forcing from a doubling of CO2 (3.7 W/m2) are 0.4°C in the Northern Hemisphere, and 0.2°C in the Southern Hemisphere. This gives us an overall average global equilibrium climate sensitivity of 0.3°C for a doubling of CO2.
https://wattsupwiththat.com/2012/05/29/an-observational-estimate-of-climate-sensitivity/
Dr. Roy Spencer came up with this:
In this case, we see that a climate sensitivity of only 1.5 C was required, a 40% reduction in climate sensitivity. Notably, this is at the 1.5C lower limit for ECS that the IPCC claims. Thus, even in the new pause-busting dataset the warming is so weak that it implies a climate sensitivity on the verge of what the IPCC considers “very unlikely”.
He adds:
The simplicity of the model is not a weakness, as is sometimes alleged by our detractors — it’s actually a strength. Since the simple model time step is monthly, it avoids the potential for “energy leakage” in the numerical finite difference schemes used in big models during long integrations. Great model complexity does not necessarily get you closer to the truth.
In fact, we’ve had 30 years and billions of dollars invested in a marching army of climate modelers, and yet we are no closer to tying down climate sensitivity and thus estimates of future global warming and associated climate change. The latest IPCC report (AR5) gives a range from 1.5 to 4.5 C for a doubling of CO2, not much different from what it was 30 years ago.
Climate sensitivity remains the “Holy Grail” of climate science; a lot of people think they know where it is hidden, but so far it seems nobody has the actual location of it.
There is no chaos, just various noise and large scale variables all interacting in a way complex enough that people shrug and call it chaotic. But if all variables, and how they interact with each other, are known, that’s when chaos becomes ordered and predictable.
https://www.nonlin-processes-geophys.net/17/431/2010/npg-17-431-2010.pdf
Their climate sensitivity to CO2 is still too high. It’s pretty obvious that without 1987/1998/2016 scale El Ninos, there is no warming, and the 2016 El Nino was probably the last big one for a few decades if the PDO cycle is any indication. The 2016 El Nino could be a lot like the 1940 one, the last major belch of heat from the Pacific before 30 years of conditions that favor cooling/heat going into the equatorial Pacific.
Actually not true in the mathematically precise sense of chaos discovered by Ed Lorenz. Any nonlinear dynamic system behaves chaotically in the math sense. Nonlinear just means feedbacks. Dynamic means those feedbacks are not instantaneous. Clouds are an example of both. Nonetheless, we know that there are at least two strange attractors in Earth’s present geological configuration (since the closure of the Panama isthmus): glacials and interglacials. The Holocene climate proxy wobbles are not just noisy in the statistical sense; they are also chaotic around the interglacial attractor.
> Nonlinear just means feedbacks. Dynamic means those feedbacks are not instantaneous.
While I understand where you are going, neither of those statements is correct or applicable. Drop a steel ball bearing on a glass plate from ever-greater heights for a single example of why both are incorrect.
But what I’m saying is that chaos theory is just an illusion caused by incomplete knowledge. More complete knowledge makes what appears to be an unpredictable system predictable. Lorenz’s weather models weren’t the same as reality, just like they aren’t today, due to incomplete knowledge, and that’s why complex systems seem unpredictable.
That or an infinite number of realities exist and an infinite number more are created and are diverging at every single ‘moment’, anything is possible I suppose.
Non linear absolutely does not mean ‘just feedbacks’. I am surprised at you, Ristvan.
Non linear means non linear, such that the combined effect of many coupled partial differential equations does not equal the sum of the effects of the parts.
Non linear means there IS NO GENERAL SOLUTION to the problem of integrating partial differential equations to arrive at a prediction. You have NO CHOICE but to do stepwise integrations on a vast data set using a computer program.
Non linear also implies chaos is likely to be present.
Non linear also implies edge effects. You aim to hit the President. A 5mph wind means you miss him, a 0 mph wind means you hit him, a 2.5mph wind means he is what? HALF dead?
Anyway, there is an issue with this paper, and it is the underlying assumption that it’s CO2, and not something else – like the inherently chaotic nature of climate – driving ‘climate change’.
The easiest way to make a system more linear is to add negative feedback. link Adding positive feedback will make it a lot less linear.
Given the relative stability of the planet’s climate, it’s likely that such feedbacks as exist are negative.
For CAGW theory to work, positive feedbacks are necessary.
Without positive feedback there is no cause for alarm.
Well, unusually for this blog, it’s possible to say that in this instance you are totally, utterly and completely wrong, and do not really understand what is meant by ‘chaos’ …
Incomplete knowledge is GUARANTEED by quantum theory. Schrödinger’s benighted moggy…
And, more complete knowledge cannot, in the case of certain functions, make the future any more predictable, since the non linear nature of the relationships means there is no way to reduce the complexity of the problem to less than it is in real life.
As I think Roger Penrose points out in one of his books, ‘to predict the future of the universe requires a computer the size of the universe’
And indeed that is what in a sense the universe IS.
Some dynamic systems are, to coin a phrase ‘algorithmically incompressible’ – we cannot reduce them to a simpler form, they are the simplest form, already…no (mathematical) models can be built to model them (accurately)
These are problems at the cutting edge of applied mathematics. We know we can’t model them accurately. We don’t know if we can even draw a boundary and say ‘they won’t exceed that’.
The astounding assertions of climate scientists that they can have even a cat’s chance in hell of predicting the climate shows an astounding naïveté that can only be understood as an expression of profound ignorance as to the mathematical class of problem they are attempting to solve.
Only by ignoring 50% of the problem and modelling it in terms of a ‘fudge factor constant’ can they get any answers at all
In reality climate science isn’t settled. It’s not even cutting edge; it’s so far beyond the cutting edge that any answers it comes up with can be instantly disregarded as totally meaningless.
Climate scientists are not good mathematicians nor good physicists. Look at Mann – a third-rate mind with a fourth-rate personality, who confuses correlation with causation and dismisses lack of correlation as no refutation of causality.
I mean the man is a clown, a total joke.
Yet millions believe him.
No, I understand what ‘chaos’ is in chaos theory. What people seem to confuse is that it is not just a dynamic system that is very sensitive to initial conditions; it means that the system becomes unpredictable after some amount of time due to imprecision.
And notice how in my OP I was referring to the real climate system, not the climate models; that’s my entire point of posting this about chaos. Chaos applies to the attempt at modeling reality; it does not apply to reality itself.
In the case of climate models, I think they suffer more from our ignorance of the interplays within the system (inaccuracy) than they do from imprecision of the initial conditions.
Here are a couple links claiming low CO2 sensitivity:
Recent CO2 Climate Sensitivity Estimates Continue Trending Towards Zero
“A recently highlighted paper published by atmospheric scientists Scafetta et al., (2017) featured a graph (above) documenting post-2000 trends in the published estimates of the Earth’s climate sensitivity to a doubling of CO2 concentrations (from 280 parts per million to 560 ppm).
The trajectory for the published estimates of transient climate response (TCR, the average temperature response centered around the time of CO2 doubling) and equilibrium climate sensitivity (ECS, the temperature response upon reaching an equilibrium state after doubling) are shown to be declining from an average of about 3°C earlier in the century to below 2°C and edging towards 1°C for the more recent years.”
http://notrickszone.com/2017/10/16/recent-co2-climate-sensitivity-estimates-continue-trending-towards-zero/#sthash.vnJbRTvd.dpbs
The 75 papers here:
75 Papers Find Extremely Low CO2 Climate Sensitivity
http://notrickszone.com/50-papers-low-sensitivity/#sthash.8K6XmGUz.dpbs
It seems like a lot of people have raked over this topic quite a bit already.
Chaos describes a condition in which a small change in initial conditions would result in a large change in conditions at some later time. It isn’t a result of incomplete knowledge (though it has consequences for predictions made from incomplete knowledge). Nonlinearity is necessary, but not sufficient, for chaos.
“You can think of global warming as the stretching of a spring as we hang weights from it,”… I never thought of it that way and probably never will.
Just like the name “global warming”, the hanging weights from a spring analogy suggests they only ever imagine things in one direction. They forgot to include the negative mass weights.
Not to mention the completely imaginary weights they use when the temperature doesn’t go the way they want it to.
It is just that the springs made by the old timers were just so much more robust.
Lee That must be why Weights and Measures must certify a commercial scale for accuracy annually. 🙂
Agree. A bit like trying to describe Einstein’s theory of relativity using Newton’s equations for gravity and motion… in that using Newton’s equations would miss Einstein’s points entirely.
They think of it as “Grant Fruit” hanging from the “Government Tree” ripe for picking…
Read Piers Forster’s praise for this new paper as he stepped through the methodology and result. Thought it made no sense. This post confirms that suspicion. All the energy-budget observational approaches since 2013 are pegging ECS between 1.5 and 1.8. That also comports with net feedback above CO2 alone at Bode f~0.25-0.3 rather than the 0.65 needed for an ECS of 3. Easy analysis. Implicit in the IPCC AR4 ECS analysis is water vapor at f=0.5 and the residual (mostly clouds) at 0.15. In reality, as Willis has repeatedly shown, clouds are ~f=0. And as other posts have shown, water vapor is overstated because precipitation is understated in CMIP5, except for INM-CM4. Water vapor f at 0.25-0.3 matches observationally derived ECS. A good example paper is Lewis and Curry 2014.
Yes, I noticed that Piers left out any mention of ECS and TCR central estimates that fell below 2C.
Climate sensitivity amounts to a feedback loop. Mechanical and electronic systems dominated by positive feedback loops are highly unstable. It seems highly unlikely that the earth, which has been around for a long time, and habitable by plants and animals, is dominated by destabilizing climate feedback loops – either warming or cooling. Systems dominated by positive feedback don’t tend to be very stable.
The obvious answer is that the Earth has a very low climate sensitivity (<1 degree C). It simply has to be lower than the direct effects of carbon dioxide.
Negative feedback makes systems stable. The fact that the Earth’s temperature has not varied outside of narrow bands for millions of years suggests that the feedback is very effective at keeping the temperature in check, and therefore the negative feedback must be quite a strong influence.
Positive feedback makes a system unstable. Negative feedback makes it stable.
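A toy step model makes this concrete (the feedback values and forcing below are arbitrary illustration, not measured climate numbers): with net negative feedback the temperature relaxes to a finite equilibrium, while flipping the sign makes it run away.

```python
# Toy energy-balance step model: dT/dt = forcing + feedback * T
# feedback < 0 (net negative): T settles at -forcing/feedback
# feedback > 0 (net positive): T runs away

def simulate(feedback, forcing=1.0, dt=0.1, steps=500):
    T = 0.0
    for _ in range(steps):
        T += dt * (forcing + feedback * T)
    return T

print(round(simulate(feedback=-0.5), 2))  # converges to ~2.0 (= -1.0 / -0.5)
print(simulate(feedback=+0.5) > 1e5)      # diverges: True
```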
So if the University of Exeter believes that natural variations are narrower than previously thought, that makes the lower and upper error bars closer to the middle.
The problem is that if the middle is based on climate models, which have over-predicted warming in the past (by about a factor of 2), the top, middle, and bottom of the error bars are all too high, and need to be shifted downward.
It might be that nobody knows how clouds behave, either warming or cooling the planet. Let me give my personal best guess – negative feedback loop. When the planet warms the clouds cause cooling and when the planet cools the clouds cause warming. Why would I say such a thing?
1) There are negative feedback loops all around us. Everywhere we look in nature there seem to be negative feedback loops, why should this one be any different?
2) The earth has been incredibly stable climatically for a long long time. Even when we have had major asteroid events or cataclysmic volcanic activity the climate moves back to “normal.” That has to be from a feedback mechanism that returns the system to its normal operating state. Clouds are the only thing big enough to qualify.
Basically my suggestion would be to assume the system is engineered, then check the system architecture. I would bet you will get a lot closer to the truth than assuming it is a random mess set to fall apart as soon as we make even the slightest perturbation to the system.
So RCP 8.5 in AR5 needs to be tossed out, and any composite models that include RCP 8.5 need to be invalidated.
By tossing all the models that are proven to be manically high, this lowers the “model mean” some small way towards reality.
They will still be all totally wrong.. But that doesn’t matter to them at all.
Considering that almost all of global warming is due to adjustments…..sensitivity could possibly be zero
L,
+1
B
No, it means the warming is almost undetectable so far. But not zero. Satellites prove that independently. People really should read Zeke Hausfather’s explanation of TOBS at Climate Etc. two or three years back. The TOBS adjustment is definitely necessary, but in my opinion there is a big risk it is being used to inflate an otherwise small trend due to confirmation bias. As I said in a thread earlier today, the changepoint analysis might detect shelter changes, like their scheduled washing, as a breakpoint, which would cause some drifting bias. I lack imagination on how one could detect that; of course, it predicts there are more upward trend-anomaly steps than downward steps, which is a testable prediction.
Latitude: Which means the proposed effects are illusory to nonexistent. Meanwhile, drivers are hardly necessary nor predictable in a dynamic system with varying inputs. It’s like trying to chart the effects of a toy propeller in a windstorm. http://www.wnd.com/2017/07/study-blows-greenhouse-theory-out-of-the-water/?fref=gc&dti=536840413144498
“Chaos” is just a spiffy word that means “order so complex that there’s little chance in hell us humans will ever comprehend it”. So, it is a short, fancy word for “incomprehension” or the chosen markers, thereof, that we might call “forecasts”. And the average of multiple runs of incomprehension is supposed to be a BETTER rendition of incomprehension???
The average of chaos is, at best, chaos — at worst, undefinable.
I might have said something useful there. Sometimes I can’t tell until weeks later, when I read what I wrote again with insight gained during the interval from first writing to later reading.
I propose a new definition of “climate sensitivity” — a measure of how pissed off somebody gets when you say to them that CO2 has little effect on Earth’s climate.
Great climate sensitivity is when somebody gets really steamed. Zero sensitivity is when somebody yawns agreeably.
(Robert, all I had to do was see the word yawn and I yawned)…
afonzarelli.
That wasn’t the power of suggestion, but rather the effect of my boring commentary. (^_^)
I have good days and bad days.
My understanding is that “chaotic” systems in the Lorenz sense are not random or indeterminate. They are produced by systems of differential equations (e.g. Navier-Stokes). The output of these systems is completely deterministic in that the same input at t=0 will always produce the same output at t=n, but the output looks chaotic because it displays a “sensitive dependence on initial conditions”. Very small input differences can cause very large output differences.
A butterfly flapping its wings at Beijing may cause a hurricane in Florida. The sensitivity of chaotic systems can be such that the effects of changes in conditions that are less than the standard error of their measurement can cause dramatically different results.
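Sensitive dependence is easy to show numerically. This sketch integrates the classic Lorenz-63 system with a crude forward-Euler step (the parameters are the textbook values; the setup is illustrative only) and perturbs one trajectory by a billionth of a unit:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz-63 system
    x, y, z = state
    return np.array([
        x + dt * sigma * (y - x),
        y + dt * (x * (rho - z) - y),
        z + dt * (x * y - beta * z),
    ])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])  # the "butterfly": a billionth of a unit

for _ in range(5000):  # 50 time units
    a, b = lorenz_step(a), lorenz_step(b)

print(np.linalg.norm(a - b) > 1.0)  # True: the trajectories have completely separated
```

Both runs are perfectly deterministic; only the ninth decimal place of one coordinate differs, yet after a few dozen time units the two trajectories are as far apart as any two random points on the attractor.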
“Abandon hope, all ye who enter”.
Well Walter, the first poster who seems to understand it.
The point being that even if we have ultimate knowledge of initial conditions, random quantum noise in the system precludes it from ever being an accurate predictor. It’s all very well saying ‘if we only knew the state of every molecule in the atmosphere…’; even if we did, there is still the issue of quantum uncertainty, which can have macro effects. Schrödinger’s cat may live, or it may die, let alone a butterfly in Beijing (traditionally I think it’s usually in Brazil, but it makes no difference).
What we are running into is the fundamental difference between a model of the world, and the world itself.
Even if the model is accurate, even if we have perfect knowledge, quantum uncertainty means we don’t know where the roulette ball will end up.
Especially as the process of acquiring perfect knowledge is enough to change the outcome anyway.
“If a tree falls in the wood and no one is there to hear it does it make a sound”? asks the Zen master…
“If a tree falls in the wood, and no one is there to observe it has it actually fallen?” asks the quantum physicist. And the answer is “We have no way of telling”…
Science, as most people with a limited grasp of science understand it, is a process of making models that correspond to a reality as observed accurately by the ‘detached observer’.
And these models then predict accurately the behaviours of the system
But there are catches. Quantum physics has destroyed the validity of the ‘detached observer’ and ‘observed accurately’. Chaos theory and non linear partial differential equations show that even if the model is an accurate reflection of the underlying system, it doesn’t give predictable results, because of the issues raised above. Finally, the mathematical methods available mean that only approximate solutions may be possible – even if the starting conditions are exactly known, and the equations are 100% accurate.
This is literally horrifying. And it is the hallmark of non linearity. You might be out by a million miles because you only went to 15 decimal places instead of 16….
I’ve looked into this. The mathematics doesn’t yet exist to predict the climate, and no one knows when or if it ever will.
The models are approximate, the data is incomplete, the equations are chaotic and the computers are inadequate.
It is a right bugger’s muddle.
But plenty of people who are stupid enough to fail to appreciate how tough the real issues are will be only too happy to claim that they have the solutions.
There is the additional problem of the open system: even if we did know the initial state of every molecule in the system (and the exact behavior of every process affecting them), the system would still fail as a predictor, because new molecules are constantly being introduced (and lost) by meteor impacts, volcanic activity, etc., and those, and their effect on the system, cannot be accounted for in the model.
There is another criterion of chaotic systems besides “sensitive dependence on initial conditions” – that this sensitivity to initial conditions leads to unpredictability in the long run.
The progress that certain fields have made that concern predictions, such as meteorology, suggest that it is a lack of knowledge of dynamic systems that leads to their apparent unpredictability.
Also, the classic Lorenz chaotic system is only chaotic (unpredictable) when you try to predict a value further forward in time than some horizon x. The system is predictable up to x (whatever x is).
This is why making climate predictions out to 100 years is so laughable. It is positively impossible, but the closer to "now" we make predictions of future weather (or climate), the closer to reality we'll get with said prediction.
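The predictability-horizon point is easy to demonstrate numerically. The sketch below (a minimal illustration of the general idea, not anything from the paper under discussion) integrates the classic Lorenz-63 system from two initial states differing by one part in 1e10: the two runs are indistinguishable for a while, then diverge to the full size of the attractor.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One 4th-order Runge-Kutta step of the Lorenz-63 system."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def trajectory(x0, n_steps, dt=0.01):
    s = np.array(x0, dtype=float)
    path = [s.copy()]
    for _ in range(n_steps):
        s = lorenz_step(s, dt)
        path.append(s.copy())
    return np.array(path)

# Two initial conditions differing by 1e-10 in x only.
a = trajectory([1.0, 1.0, 1.0], 4000)
b = trajectory([1.0 + 1e-10, 1.0, 1.0], 4000)

# Early on the runs track each other; by the end of the integration
# they have decorrelated completely -- the "predictable up to x" horizon.
separation = np.linalg.norm(a - b, axis=1)
```

Shrinking the initial perturbation only pushes the divergence time out logarithmically; it never removes it, which is the formal content of the "sensitive dependence" criterion discussed above.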
Leo
Thank you for both of your insightful and articulate comments.
There is probably a formal definition along these lines: there exists some delta > 0 such that, for every interval epsilon around an initial point A at t=0, some solution starting within epsilon of A ends up more than delta away from A's solution at some later time t=tau, or some such.
Reply to Leo Smith says: January 22, 2018 at 12:50 pm
Bingo. As a simple example.
A few years ago, maximum ocean wave heights were thought to be explainable by linear modeling. At that time, linear modeling indicated that the "rogue waves" (in the range of 30 meters in height) being reported by mariners could not exist. It was said such waves were scientifically impossible.
These "scientific" predictions were dashed when it was demonstrated that 25-30 meter high waves existed off the coast of South Africa. Linear theory was initially "salvaged" when the South African waves could be explained by modeling the interactions between waves and the strong currents in the area.
Then, a couple of years ago, incidents were documented in the North Atlantic that could not be explained by current augmentation using the linear modeling. Subsequent German satellite monitoring of the North Atlantic confirmed the presence of 25-30 meter high waves there.
Since that time it has been observed that the wave patterns the Germans discovered in the North Atlantic are strikingly similar to patterns seen in quantum wave theory. So, it's back to the drawing board. Maybe linear modeling of ocean waves will be salvaged again, or maybe not. We'll see.
But, from my perspective, the bottom line is that even when the best of us believe we have "it all figured out", we are later confronted with contrary evidence and compelled to admit we were wrong. And… maybe admit the world is a bit more complicated than we once firmly believed.
They should have read Mr Eschenbach's analysis of the CERES data before they wrote that paper.
Love the title of the paper. Are there any awards for the titles of these warmest papers?
Emergent constraint on equilibrium climate sensitivity from global temperature variability
What papers are they warmer than? ;-D
I have yet to see any credible evidence that TCS is greater than ~1C/(2xCO2}.
See Christy and McNider 1994 and 2017. Their calculated TCS of 1.1C (+/- tolerances) attributes ALL modern warming in the satellite era to increased atmospheric CO2, allowing for major (century-scale) volcanoes. For clarity, they included NO natural warming, which obviously does exist.
That 1.1C is maximum TCS, whereas actual TCS is probably less than 1C, and possibly much less. There is no catastrophic global warming crisis.
I have yet to see any credible evidence that TCS is greater than ZERO.
No correlation whatsoever in the paleoclimate record on geologic time scales (geocarb), PLUS significant episodes of anti-correlation that simply wouldn’t be possible if there WERE any “climate sensitivity” to CO2 levels.
On shorter time scales where a correlation does exist (ice core reconstructions), it is exactly in REVERSE, i.e., TEMPERATURE drives CO2 level, not the other way around, since CO2 level FOLLOWS temperature changes, up AND down, with a similar time lag. PLUS temperatures always begin their DECLINE when CO2 levels ARE AT THEIR HIGHEST – AND are STILL RISING, which again underscores the complete lack of any CO2 influence on temperature.
CO2 induced climate catastrophe is 20-21st Century mythology, nothing more. It will leave a stain on the field of science as bad as those left by Lysenkoism and Eugenics.
“AGW is not” – there is more evidence to disprove CAGW, for example:
The mainstream climate debate is essentially an argument about the magnitude of Climate Sensitivity (TCS or ECS) to increasing atmospheric CO2:
Global warming alarmists say TCS is greater than or equal to about 3C/(2xCO2), which is false extremist nonsense, for which there is no credible evidence;
Global warming skeptics say TCS is less than or equal to about 1C/(2xCO2), which is so low that there is no real global warming crisis.
On January 31, 2008 (now ten years ago), I published that the rate of change of atmospheric CO2 (that is, dCO2/dt) changes ~contemporaneously with atmospheric temperature, so that its integral, CO2, lags temperature by ~9 months in the modern data record. In fact, atmospheric CO2 lags temperature at all measured time scales, from the ~9-month lag for ENSO cycles to the ~800-year lag inferred in the ice core data for much longer cycles.
Paper at http://icecap.us/images/uploads/CO2vsTMacRae.pdf
Excel sheet at http://icecap.us/images/uploads/CO2vsTMacRaeFig5b.xls
IF CO2 were the primary driver of global temperature, as the warming alarmists allege, then CO2 would lead temperature at all time scales, not lag it. Richard Feynman called this principle "Causality". In layman's terms, "the future cannot cause the past" (at least in this space/time continuum). 🙂
This does not prove that CO2 has NO impact on temperature, but it DOES mean that the impact of CO2 on temperature is very small and not at all catastrophic. TCS must be very low, probably much less than 1C/(2xCO2).
It is apparent that temperature significantly drives CO2, and it is obvious that temperature drives CO2 more than CO2 drives temperature. If it were otherwise, this clear dCO2/dt vs. temperature signal, and the resulting ~9-month lag of CO2 after temperature, would not exist.
This does not preclude other drivers of CO2 such as fossil fuel combustion, deforestation, other land use changes, etc. That last sentence is the one most people ignore when they argue about my conclusion: not all increasing CO2 is necessarily caused by increasing temperature, and yet the clear signal of dCO2/dt vs. temperature survives loud and clear, not only in the satellite era but all the way back to the origin of quality CO2 data in 1958, and I suggest long before then.
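The lag described above is the kind of thing that falls out of a simple cross-correlation. The sketch below uses synthetic data constructed to have exactly the posited structure (the CO2 growth rate tracking temperature contemporaneously) and recovers the resulting lag of the CO2 level behind temperature. It illustrates the method only; it is not a reproduction of MacRae's actual analysis or data, and the period and series lengths are illustrative assumptions.

```python
import numpy as np

# Synthetic monthly series: a temperature anomaly oscillating with a
# ~44-month (ENSO-like) period. All numbers here are illustrative.
months = np.arange(480)
period = 44
temp = np.sin(2.0 * np.pi * months / period)

# Posited structure: dCO2/dt tracks temperature contemporaneously,
# so the CO2 *level* is (up to a constant) the running sum of temp.
co2 = np.cumsum(temp)
co2 = (co2 - co2.mean()) / co2.std()

def lag_of_max_correlation(x, y, max_lag=24):
    """Lag k (months) at which y, shifted back by k, best matches x."""
    lags = np.arange(-max_lag, max_lag + 1)
    trim = max_lag  # drop the wrapped-around edges introduced by np.roll
    corrs = [np.corrcoef(x[trim:-trim], np.roll(y, -k)[trim:-trim])[0, 1]
             for k in lags]
    return int(lags[int(np.argmax(corrs))])

best_lag = lag_of_max_correlation(temp, co2)
# Integrating a sinusoid delays it by a quarter period, so the
# recovered lag comes out near period/4, i.e. ~11 months here.
```

The quarter-period delay is generic: whenever a level is the integral of a rate that tracks some driver, the level lags the driver, which is why a lag by itself cannot settle what else is feeding the level.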
Humlum et al reached similar conclusions in 2013 here:
http://www.sciencedirect.com/science/article/pii/S0921818112001658
“Highlights:
– Changes in global atmospheric CO2 are lagging 11–12 months behind changes in global sea surface temperature.
– Changes in global atmospheric CO2 are lagging 9.5–10 months behind changes in global air surface temperature.
– Changes in global atmospheric CO2 are lagging about 9 months behind changes in global lower troposphere temperature.
– Changes in ocean temperatures explain a substantial part of the observed changes in atmospheric CO2 since January 1980.
– Changes in atmospheric CO2 are not tracking changes in human emissions.”
https://www.facebook.com/photo.php?fbid=1551019291642294&set=a.1012901982120697.1073741826.100002027142240&type=3&theater
I suggest that the global warming alarmists could not be more wrong. These are the true facts, which are opposite to their alarmist claims:
1. CO2 is plant food, and greater atmospheric CO2 is good for natural plants and also for agriculture.
2. Earth’s atmosphere is clearly CO2-deficient and the current increase in CO2 (whatever the causes) is beneficial.
3. Increased atmospheric CO2 does not cause significant global warming – regrettable because the world is too cold and about to get colder, imo.
Regards to all, Allan
For an actually sensible critique of this paper, read http://www.realclimate.org/index.php/archives/2018/01/the-claim-of-reduced-uncertainty-for-equilibrium-climate-sensitivity-is-premature/
I’ll see your spam….and raise you
Heat loss form Earth’s interior responsible for sliding ice sheets
The descent of Greenland's shrinking glaciers is well documented, but the latest research — published this week in the journal Scientific Reports — is the first to link the ice loss with escaped heat from Earth's interior.
http://www.breitbart.com/news/heat-loss-form-earths-interior-responsible-for-sliding-ice-sheets/
“Scientists estimated 100 megawatts per square meter of energy was transferred from the Earth’s interior to the fjord.”
I think Breitbart flubbed it there by a factor of about a billion. It should of course be milliwatts.
However a high geothermal flux in this area isn’t unexpected given that this ice-stream is unique in Northern Greenland in being warm-based.
And Latitude: all ice sheets slide thanks to gravity, but some ice-sheets slide faster.
A link to the actual paper:
https://www.nature.com/articles/s41598-018-19244-x
Gees Tom, Thanks for pointing out that so-called climate scientists have made ZERO progress in like 30+ years.
They are locked into this zero-science mode by their ridiculously erroneous assumptions about CO2.
Their problem to solve, or to remain looking like a bunch of headless chooks.
Model schmodel.
This post may pull Mosher out of his mining job in Korea? 🙂
…Mark said: “We used the simplest model…”
Lol.
Why use a model at all?
We have quality temperature data back to 1979 (more if we get the OLD Surface Temperature data “pre-adjustments”) and good CO2 data back to 1958.
When we run a full-Earth-scale test we calculate TCS equal to no more than ~1C/(2xCO2). That is all.
There is no credible global warming crisis – it exists only in the fevered minds of scoundrels and imbeciles.
And you only get that ~1C by ASSUMING that ALL of the warming over that time frame was CAUSED by CO2 level changes, when that isn’t the case. So even less of a (non-)crisis.
““We used the simplest model…””
What was that movie producer’s name again ???
When Exeter Uni, deepest climate trough in the world, so I have heard, try to reduce sensitivity estimates a bit…….
you KNOW they are worried about the coming cooling trend. 🙂
I think this is the real money quote
What they really needed was a paper that indicates that
1) It’s still a problem… AND
2) It’s not too late to act.
As many papers from the 1990s and 2000s have already had their "Too Late" thresholds crossed
All the Arctic Ice will vanish in 2000, 2002, 2007, 2012, 2017 etc. and the climate will pass a dangerous tipping point.
It MUST STILL be a problem
BUT
CAN’T be too late to act.
not too high….not too low……just right
Three bears sensitivity…just what they needed
Send grant money.
Is that Mama Polar Bear, Papa Polar Bear, and little Baby Polar Bear?
The Goldilocks period of enlightenment….. and sensitivity.
Goldilocks.
“Food time”, says bad Pa-Pa bear !!
While there has been an ongoing debate on this in the public media, with arguments that very little time is available for action, I wonder if there are scientific reviews of how exactly different scientists have described the time window to act. Many prominent scientists have expressed their opinion informally, but I think far fewer such statements appear in the scientific literature. Are there any attempts to collect them together, showing how the 'date of doom' has changed? Global warming art, anyone?
Exactly. How convenient!
Exactly the reply I was going to write – you beat me to it.
Yes I previously made basically the same comment on the same quote, the last time (?) this (toilet) “paper” was the subject of a post.
How convenient, I believe was the theme of my remarks. Sounds like every “sales” pitch EVER. “It’s not too late if you ACT NOW!” It’s genuinely shocking to me that otherwise intelligent people can’t see through this crap.
I believe those “Otherwise Intelligent People” actually do see through it, and post here about it.
Unfortunately we don’t have more of those same “Otherwise Intelligent People” in control of Government…Until now that is.
Investor’s Business Daily editorial was headlined:
The Climate-Change Doomsday Just Got Cancelled
“As it happens, though, on the same day the Nature study was published, NASA released its latest report on global temperatures, declaring that 2017 was the second hottest year on record, with 2016 the hottest.
“Guess which story made front page news?
“The New York Times put the NASA story on its main webpage, and ignored the Nature study entirely.”
https://www.investors.com/politics/editorials/climate-change-doomsday-temperature-increase-co2/
I am surprised they are still arguing over the feedback effect of clouds. The simple answer is that clouds serve as the moderator. On hot days they keep things cooler, on cold nights they keep things warmer. In other words, they serve as negative feedback on both excessive heating and excessive cooling.
dbak,
But between heating and cooling is the neutral point. What physics do we have to describe how this neutral point is created? Geoff.
Am I missing something? What about the spread, quantity and quality of the historic data available and used, covering all areas of the globe since the pre-industrial age? It seems to me that that is far more critical when attempting to establish any such sensitivity!
The photograph above showing cavemen lighting a fire is hilarious if you consider the precautionary principle! Imagine if the warmists had been around then and had managed to stop any attempts by man to make fire, for fear of burning out a cave, or decimating a forest, or killing off the flora and fauna that the cavemen totally depended upon for food! In such a case, what alternative would the human race have had to stay alive, keep warm in the Ice Ages and develop?
If there are 7 billion cavemen you can’t stop them.
Human access to reliable energy MUST BE CURTAILED !!!
or stopped completely! *except for the likes of Gore , DePaprio etc.
Given that "more CO2 makes it warmer… all things being equal" –
Were all things equal during the period in question of this study?
Andrew
So in other words, they are having a harder and harder time explaining the increasing discrepancies between real data and models. The data discrepancy keeps increasing no matter how they adjust the data, so they are inventing some gobbledygook to protect their fake jobs. Hey, look over there, a squirrel.
RCP8.5 has been harshly criticized with good reason. I think that this paper is an attempt to abandon RCP8.5 in the hope of being able to defend the 2 degree we’re all going to die position at the trough.
Step by step, they are inching closer to what the data has always shown.
True sensitivity is probably more like 0.2 to 0.3 C.
It’s hidden in Davy Jones locker at the bottom of the deep, blue sea.
(OOPS! Sorry. That’s where the missing heat is.)
All I had to read was “models”. They have not been able to account for any feedbacks, negative or positive. So they should start with the logarithmic curve shown by David Archibald, which NOAA/NASA are quite aware of. That makes the climate sensitivity about 0.1 deg.
It doesn’t rule out very low sensitivities at all.
Global temperatures have been showing only about a 0.15 C per decade increase, which over a century equals the very low sensitivity of 1.5 C.
Even if all this warming were blamed on humans, it still falls at very low sensitivity. In fact the warming is not all caused by humans; natural cycles have at least contributed to it. There would have been no pause if natural events had no influence at all. The pause on its own hints that humans had no more than 50% of the influence.
BUT,
The AMO and ENSO have been responsible for this change, so there is nowhere for human warming to fit in, even with the confirmation bias of adjusting data to match the models more closely, adding warming where it failed to materialise.
http://www.woodfortrees.org/plot/nsidc-seaice-n/from:1979/normalise/plot/esrl-amo/from:1979/plot/uah6/from:1979
If the AMO continues as in past cycles, it will fit two negative phases into this century, so there is no chance of even matching the last century. The fiddling/faking of data in HADCRUT and GISS/NOAA will only get worse, to hide the lack of future warming. With only one warm phase remaining this century, there can only be about a 0.4 C increase in an honest observational data set.
At least one third of that is just from the Adjustments to the data.
So, Hansen’s misapplication of the electrical Bode equations didn’t work out so well, and now, we’re going to explain climate change with Hooke’s Law of springs?
Sounds like a 7th grade science experiment to me!
The IPCC published the exact same range of values for the climate sensitivity of CO2 in their first report as in their latest report. So for more than two decades of effort they have learned nothing that would allow them to narrow their range of guesses one iota.
The initial radiative calculations of the climate sensitivity of CO2 came up with a value of 1.2 degrees C, not including any feedbacks. One researcher has pointed out that these calculations do not take into consideration that a doubling of CO2 in the Earth's atmosphere will cause a slight decrease in the dry lapse rate in the troposphere, enough to decrease the climate sensitivity of CO2 by more than a factor of 20. So a better number for the climate sensitivity of CO2, not including feedbacks, would be less than 0.06 degrees C.
The big issue has been H2O feedback, and its related uncertainty is largely responsible for the wide range of guesses. The idea here is that an increase in CO2 causes warming, which causes more H2O to enter the atmosphere. H2O is the primary greenhouse gas, so more H2O causes even more warming, which causes even more H2O to enter the atmosphere, and so forth. A typical assumption is that the positive feedback of H2O amplifies the warming effect of CO2 by roughly a factor of 3, and it is the uncertainty of this feedback factor that causes the wide range of climate-sensitivity guesstimates. What this calculation ignores is that while H2O is the primary greenhouse gas in the Earth's atmosphere, H2O is also a major coolant, moving heat energy from the Earth's surface (which is mostly some form of H2O) to where clouds form, via the heat of vaporization. The overwhelming cooling effect of H2O is evidenced by the fact that the wet lapse rate is significantly less than the dry lapse rate in the troposphere. So instead of an amplification factor of 3, a more realistic amplification factor would be 1/3, yielding a climate sensitivity of CO2 of less than 0.02 degrees C, which is quite trivial. If the IPCC were not so political they would be adding calculations like mine to their range of guesstimates.
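The amplification bookkeeping in this comment follows the standard feedback form dT = dT0 / (1 − f), where f is the net feedback fraction. A minimal sketch of that arithmetic, using the commenter's own illustrative numbers (these are the comment's assumptions, not endorsed values):

```python
def equilibrium_response(dT0, f):
    """Equilibrium warming from a no-feedback response dT0 (deg C)
    and a net feedback fraction f (must satisfy f < 1)."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies a runaway response")
    return dT0 / (1.0 - f)

dT0 = 1.2  # the no-feedback figure quoted in the comment, deg C per doubling

# An overall amplification of ~3x corresponds to f = 2/3 ...
amplified = equilibrium_response(dT0, 2.0 / 3.0)   # 3.6 C
# ... while the commenter's preferred net damping to 1/3 is f = -2.
damped = equilibrium_response(dT0, -2.0)           # 0.4 C
```

The same formula also shows why published high-end estimates are so uncertain: as f approaches 1, small changes in the assumed feedback fraction produce very large changes in the equilibrium response.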
Another concern is that the radiant greenhouse effect that the AGW conjecture depends upon has not been observed, whether in a real greenhouse, on Earth, or anywhere else in the solar system. The radiant greenhouse effect is fiction. Taking this into account, the climate sensitivity of CO2 is zero. So a better and more realistic range of estimates for the climate sensitivity of CO2 would be between 0.0 and 0.02 degrees C.
The real Earth’s climate history pretty well agrees – ECS of CO2 level changes is essentially zero.
Judging by the Cato Institute table at the end, the "Holy Grail" will be somewhere about 2.5, so 2.8±0.6°C is plausible. That is still a sharp increase, though I suppose it wouldn't be too bad for some people, and for polar bears too.
Anthony,
Should be Climate is a wicked problem
[Thank you. Peer-review works! .mod]
BTW, there should be a whole post/discussion of wicked problems
no, The Climate AGENDA is a wicked problem
Couldn’t agree more. If things are warming, “climate” is a non-problem. When it starts cooling THAT will be a problem, which will have its effects multiplied heavily if we squander our resources chasing our tails to “fix” the non-problem of AGW.
I'm an engineer whose computer at graduation was a slide rule. You had to estimate the answer to place the decimal correctly. You had, in the background, the terror of the possibility of making an error. People died because of errors!
The computer was a wonderful invention, but it did let grossly incompetent people in the door and this at a time when “industrial democracy” resulted in universities throwing their doors open so wide that they had to invent ‘faculties lite’ to welcome illiterates and numerically challenged students and professors in. It was simply a matter of money, which put scholarship in the back seat.
Simple Excel made calculations across the whole range of statistical techniques easy to do (it's noteworthy that Phil Jones admitted he didn't understand how to do Excel), so you could try out dozens of techniques with stuff that didn't even represent data, and even invent tailor-made statistical treatments, until you got what you wanted. This is how Mann made the hockey stick, using an invention that turned all noise into hockey sticks. He even employed a contaminated proxy upside down, and repeated this in subsequent renditions. Even better than Climategate would be to see all the trials they did and rejected! But the worst of them won't let you see the code. I suspect it would be a Rube Goldberg concoction.
Briggs’s admonition is simple to understand. If you manipulate data to make a smooth graph, any statistical technique you use afterwards recognizes your improved “fit” as superior data with acceptably narrow error bars!
Bad stats isn’t the worst of it. They are going to get the right answer however its arrived at because that’s what they are paid for.
Widespread access to cheap computers has also led to abuse in the world of public health where people churn through huge amounts of data to identify correlations which are in fact purely chance findings. There is no doubt that since the 1980s much damage to society has arisen from cheap computers and recreational drugs – sometimes simultaneously.
The new lower limit of 2.2 C is larger than the old lower limit of 1.5 C, which was already larger than the maximum possible effect as limited by the laws of physics, about 1.1 C. All they did was move the range further away from what the physical ECS actually is.
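The "~1.1 C from the laws of physics" figure is the standard no-feedback (Planck-only) back-of-envelope. A sketch of that arithmetic, using the common 5.35·ln(C/C0) forcing fit and an effective emission temperature of ~255 K (both conventional textbook values; depending on the exact bookkeeping the result lands anywhere from ~1.0 to ~1.2 C):

```python
import math

sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_eff = 255.0     # effective emission temperature of the Earth, K

forcing = 5.35 * math.log(2.0)        # ~3.7 W m^-2 per CO2 doubling
planck = 4.0 * sigma * T_eff ** 3     # Planck response, ~3.8 W m^-2 K^-1
dT_no_feedback = forcing / planck     # ~1 C per doubling, before feedbacks
```

Everything above ~1 C in published ranges therefore comes from the assumed net feedbacks, which is exactly where the argument in this thread lives.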
Looks like post-1999 both the high and low models run hot relative to observations.
The interval of most interest has the lowest correlation.
I am constantly surprised at how climate science authors forget both the scientific method and ethics. The abstract talks about policy with reference to the Paris Agreement yet the source temperature data set has uncertainty level set by hypothetical means that bear little relevance to actual measurement standards for verification. Pure theory being touted as important. I am also surprised at myself because I keep repeating this point on this site like Mugatu and the Crazy Pills!!
But anyway, if a purely scientific discussion is being made then at least state the limits of results based on input assumptions.
But then it would look like what it is: a what-if exercise.
See also the post at And Then There’s Physics: https://andthentheresphysics.wordpress.com/2018/01/20/narrowing-the-climate-sensitivity-range/
Which is more drivel assuming that CO2 “causes” warming, a (non)”fact” not in “evidence.”
Of course many a real scientist will point out that climate scientists needed to focus on trends because the data points arising annually are inherently so variable that it is almost impossible to draw meaningful conclusions in the short term.
The good thing about loan-sharks is that they are actually often legally obliged to quote an APR on the products they offer the public. Being forced to do this makes it harder for them to deceive their customers/victims. But climate scientists have no such restrictions, so they effectively jump between the time-domain and the frequency-domain as and when they think their audience is not watching the pea very closely. Thus: Long term temperature trends are disappointingly normal-looking?… Never mind. No real change in the science, but let’s focus on making claims about individual years instead. That way, nobody has to wait to actually see what happens to the trends when they are not producing the desired effects soon enough. So previously we had hot hot hot “global warming weather”, but then it became longer term “climate change” because nobody really felt much hotter. And then it becomes “weird” weather events again because climate takes a long time….And so the wheels on the bus go round and round…..https://www.youtube.com/watch?v=gOSabglQAvc
“We use an ensemble of climate models to define an emergent relationship.”
I don't understand this. If you don't understand a system well enough to establish an emergent relationship, why would a model you had programmed find one? Worse, if one model doesn't do it, why would an ensemble?
Methinks somebody should go back and do Systems Engineering 101 again.
The ensemble fallacy, i.e. thinking that a collection of only partially valid models will produce more reliable results, is somewhat odd.
Monkeys throwing darts at a wall covered with different possible future climate states are about as “scientific” as “climate models” at this point. “Climate models” do nothing but spit out the false validation of the incorrect input assumptions.
30 years!
A trillion dollars.
A hundred thousand of mankind’s brightest minds [ Ah! Sir! Surely you do but Jest, Sir ! ]
A few hundreds of thousands of mankind’s most complex numerical model’s runs .
A massed array of the supporting quagmire of the Elites of politics, of the media , of the Inner City goat cheese circle Expert Greens, of Ivory towered academics who know nought of the real world outside, and a vast array of the good, the great and the most outspoken of the ignorants and incompetents of this Earth.
Just One assumption they in their wisdom, all have!
That a minor and not very concentrated atmospheric gas is in complete control of Earth’s critical temperatures.
Nations rise, Nations fall,
They bury the great and not so great,
They bury the believers and the unbelievers and the d—-rs
El Ninos come and go.
La Ninas come and go, neither predicted with much accuracy and sometimes not predicted at all.
The great sea ice and land ice sheets of the Arctic and Antarctic come and go while the Great Minds of mankind argue if they will come again or go again next year or five years or ten years .
Nature laughs!
The simple minds of the media and the politicians and the greens, who can't pronounce or spell "CO2", talk in horrified whispers of the deadly "Carbon" which will destroy the planet.
When such a catastrophe is due is, after 30 years of assiduous research, still the subject of debate, of argument, and of growing contempt towards those who know that they know, but know not why they know.
The Great Minds of Politics and Media and Academia and Climate Science, and the Experts of the Green Goat Cheese inner-city Circle, continue to grapple with finding a solution as to "when", not "IF", such a simple three-atom atmospheric gas (essential to all life on earth and found only by careful measurement) will exercise its power and destroy everything on this planet, or instead reveal how little consequence it really has as a gas, and thereby destroy that mirage-and-cloud-created pseudo-scientific edifice the Elites have so carefully built up, treasured and protected over the decades past against the mass assaults of the ignorant hordes of Unbelievers, the D—–rs, and those ever more ignorant and revolting peasants far below them in status.
“They seem to overlook one very important thing. In their method, they look at “variations in yearly global temperatures”. They are assuming that the envelope created by the variations will reveal an underlying trend, and from that, a measure of climate sensitivity by comparing it to model output. Their analogy in the press release, using a weighted spring reveals their thinking as believing Earths climate as being a “constrained system”.
Earth's climate does have some constraints, but it also has chaos, and the chaotic nature of the myriad forces in Earth's atmosphere often pushes beyond what is considered normal for such constraints. Chaos itself becomes a "forcing". It is why we get occasional extremes of weather and climate. Edward Lorenz was the first to describe the chaotic nature of the atmosphere, with his "butterfly effect" paper in 1972. http://eaps4.mit.edu/research/Lorenz/Butterfly_1972.pdf"
1. No, they are NOT assuming the envelope reveals an underlying trend.
2. An analogy is just that; it is not critical to the actual math. Analogies are just meant to simplify, like the greenhouse and blanket analogies. They are NOT substitutes for the math; they are just cartoon versions that hopefully illuminate. But all analogies fail. That's why you do the math.
3. Climate is not chaotic. Chaos is not a forcing. Forcings have units: watts. The butterfly effect will work on small scales and on certain metrics. It does not change the energy balance. If it could change the energy balance, then we would live in a MORE SENSITIVE climate.
Ok, the GHE is gone for good.
I am still working on finishing my paper, but things are pretty much settled already, and it gets better every time I check it against empirical data. Yet the story is absurd, because it is soooo simple and obvious.
https://www.weather.gov/jetstream/energy
There is a lot of nonsense in this consensus GH model. The most significant part, however, is the 12% (342 × 0.12 = 41 W/m2) going directly to space. This is the window causing diurnal temperature variations. As these are average values, a clear sky will have a larger window, something like 63-71 W/m2. This loss of heat will drop temperatures during the night, as we know.
At the same time we are told that clouds have a net cooling effect of about 18 W/m2 on average. "On average" means including all sky conditions. As clouds can only have that effect if they are present, and assuming they cover 35% of the surface, the negative 18 W/m2 must be attributed to those 35%. So with complete cloud cover of 100%, the negative effect should be about 18/0.35 = 51 W/m2.
During a clear night, temperatures fall by 1.5-2 K per hour. If the negative cloud effect were true, temperatures under clouds would need to fall by at least 1 K per hour, as an average over night and day. But temperatures do not fall at all with clouds. Rather, and that is what I have been researching, temperatures increase with clouds.
Clouds having a positive net effect is a death sentence for the GH concept.
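The arithmetic in the comment above is just the commenter's own bookkeeping (the −18 W/m² average and 35% cover are his assumptions, not established cloud-forcing values), but it is easy to make explicit:

```python
# The commenter's assumed inputs (illustrative, not established values).
avg_cloud_effect = -18.0   # claimed all-sky average net cloud effect, W/m^2
cloud_fraction = 0.35      # assumed average fractional cloud cover

# If the all-sky average is carried entirely by the cloudy fraction,
# the implied net effect under complete overcast is:
overcast_effect = avg_cloud_effect / cloud_fraction   # ~ -51 W/m^2
```

Whether observed overnight temperature behaviour under overcast skies is consistent with a net deficit of that size is exactly the empirical question the comment raises.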
I do not see how that can possibly be true.
It is not just the day-vs-night effect of clouds; it is daytime high-energy solar radiation (0.08-12 eV) compared to nighttime LWIR (~0.08 eV).
The daytime radiation warms the oceans to depth; deprive them of that with cloud and it has a major effect on long-term energy storage.
Nighttime cloud merely slows the rate of energy loss from the surface.
There is a problem with using year-to-year average temperature changes to calculate sensitivity.
A nominated year showing a temperature rise, on average, will have some locations where the change is less than the average, and even some locations where there has been a fall. While the averaging process hides this, the mechanism still exists. The work is deficient if it cannot explain why these negative excursions can and do happen. What happened, at that time in history, at one site with a rise, while another site showed a fall? Geoff.
I would like to briefly copy here what the IPCC has written about the ECS. The IPCC summarizes the differences between ECS and TCR (the IPCC has changed the term TCS to TCR, Transient Climate Response) in AR5 like this (p. 1110): "ECS determines the eventual warming in response to stabilization of atmospheric composition on multi-century time scales, while TCR determines the warming expected at a given time following any steady increase in forcing over a 50- to 100-year time scale." And further, on page 1112, the IPCC states that "TCR is a more informative indicator of future climate than ECS".
Even the IPCC says that ECS values are very theoretical. IPCC uses TCR/TCS model in calculating the RCP-warming values in the end of this century.
Still too high — by a factor of 2.
Anything higher than zero is too high, given the Earth’s climate history. And the more people start to look to that for the correct answer, the sooner the Climate Fascists will launch a campaign to “adjust” that climate history, like the Ministry of Truth in Orwell’s “1984.”
The ultimate irony for me is that Bill Nye, who apparently is somewhat qualified as an engineer, would never accept a fudge factor of such magnitude in his own specialty, yet will smear those who point it out.
I hate living in 2018 sometimes…
‘Using this approach, the team derive a range of climate sensitivity to doubling carbon dioxide of 2.8+’
I am amused they used a decimal point.
Correction: 2.8 C is not a “statistical prediction.” For there to be a “statistical prediction” there has to be a “statistical population” underlying a “statistical model” but this population does not exist. Long ago, Svante Arrhenius supplanted probability theory and statistics through an application of the reification fallacy.