
Guest essay by Eric Worrall
The researchers claim that adding fudge factors derived from historical data to correct the discrepancy between climate models and historical observations, producing a Frankenmodel mix of fudge factors and defective physics, will make climate predictions more reliable.
New approach to global-warming projections could make regional estimates more precise
Computer models found to overestimate warming rate in some regions, underestimate it in others
Date: May 15, 2018
Source: McGill University
Summary:
A new method for projecting how the temperature will respond to human impacts supports the outlook for substantial global warming throughout this century – but also indicates that, in many regions, warming patterns are likely to vary significantly from those estimated by widely used computer models.
…
“By establishing a historical relationship, the new method effectively models the collective atmospheric response to the huge numbers of interacting forces and structures, ranging from clouds to weather systems to ocean currents,” says Shaun Lovejoy, a McGill physics professor and senior author of the study.
“Our approach vindicates the conclusion of the Intergovernmental Panel on Climate Change (IPCC) that drastic reductions in greenhouse gas emissions are needed in order to avoid catastrophic warming,” he adds. “But it also brings some important nuances, and underscores a need to develop historical methods for regional climate projections in order to evaluate climate-change impacts and inform policy.”
…
Read more: https://www.sciencedaily.com/releases/2018/05/180515113555.htm
The abstract of the study:
Regional Climate Sensitivity‐ and Historical‐Based Projections to 2100
Raphaël Hébert, Shaun Lovejoy
First published: 13 March 2018
Abstract
Reliable climate projections at the regional scale are needed in order to evaluate climate change impacts and inform policy. We develop an alternative method for projections based on the transient climate sensitivity (TCS), which relies on a linear relationship between the forced temperature response and the strongly increasing anthropogenic forcing. The TCS is evaluated at the regional scale (5° by 5°), and projections are made accordingly to 2100 using the high and low Representative Concentration Pathways emission scenarios. We find that there are large spatial discrepancies between the regional TCS from 5 historical data sets and 32 global climate model (GCM) historical runs and furthermore that the global mean GCM TCS is about 15% too high. Given that the GCM Representative Concentration Pathway scenario runs are mostly linear with respect to their (inadequate) TCS, we conclude that historical methods of regional projection are better suited given that they are directly calibrated on the real world (historical) climate.
Plain Language Summary
In this paper, we estimate the transient climate sensitivity, that is, the expected short‐term increase in temperature for a doubling of carbon dioxide concentration in the atmosphere, for historical regional series of temperature. We compare our results with historical simulations made using global climate models and find that there are significant regional discrepancies between the two. We argue that historical methods can be more reliable, especially for the more policy‐relevant short‐term projections, given that the discrepancies of the global climate models directly bias their projections.
Read more (paywalled): https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1002/2017GL076649
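The method the abstract describes is, at heart, a linear regression of the temperature record against the anthropogenic forcing record, repeated for each 5° by 5° cell. Here is a minimal sketch of the global version (not the paper's code): the variable names, the toy forcing shape and the 3.71 W/m^2 doubling figure are placeholder assumptions standing in for real data.

```python
# Minimal sketch of a TCS-style estimate: regress a temperature anomaly
# series against an anthropogenic forcing series, then scale the slope
# to the forcing of a CO2 doubling. Synthetic data stand in here.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1880, 2018)

# Placeholder series; a real analysis would use observed anomalies and
# a published forcing reconstruction.
forcing = 0.00015 * (years - 1880) ** 2                       # W/m^2, toy
temp = 0.4 * forcing + 0.1 * rng.standard_normal(years.size)  # K, toy

slope, _ = np.polyfit(forcing, temp, 1)  # K per W/m^2

F_2XCO2 = 3.71  # canonical forcing for a CO2 doubling, W/m^2
print(f"TCS ~ {slope * F_2XCO2:.2f} K per doubling")
```

Repeating the same regression cell by cell would give the kind of regional TCS map the paper compares against the GCM runs.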
The researchers hope this mixture of historical fudge factors and defective physics will be more acceptable as the basis of climate policy decisions than just using the defective physics.
But of course! How could we have been so blind? The models have always been correct; it was the historical data all along. That’s why we’ve been altering the data sets. Now do you understand?
However, they will use astrology rather than astronomy for their solar system fudge factors. It’s a given, the way they pick and choose their needs.
Stop maligning astrologers. They don’t ‘adjust’ the positions of the planets, after all.
Astrologers now have a higher degree of accuracy than climate sci-fi prognosticators.
Cthulhu rules!
If they understood the history of astronomy, they would realize that they are proposing Ptolemaic epicycles all over again.
The more off base your underlying model is, the more ad-hoc mathematical adjustments you have to keep adding to track your actual observations.
It’s more circular thinking from the folks whose theory predicts everything… And therefore nothing.
More like epicyclic thinking.
I have decided that AGW-pushing climatologists are simply telephone psychics with a degree or two. They cannot tell you about the past, cannot predict the future and try to keep you on the line to keep the grant money flowing.
One of the authors is Shaun Lovejoy? A TOTAL ALARMIST? Look him up. I recall seeing his name as an ‘alarmist’ over at Dr. Judith Curry’s website, ‘Climate Etc.’.
Check him out. Please.
Check this link about Shaun at Dr. Curry’s website:
https://judithcurry.com/2015/10/23/climate-closure/
Yes, from a couple of years ago. But obviously Shaun’s ‘WARMIST ALARMIST’ position hasn’t changed . . .
( . . to quote Bugs Bunny, “WHAT A MAROON . . !” )
Adding in observations from reality to influence climate models can only be a good thing.
What else should models be influenced by? The biases of the programmer?
Now if we could just find an impartial way to select which historical observations to use, we’ll be sorted.
And the new ‘n improved GCM will still output ludicrous hysterical nonsense.
Actually, no. Models are generally initialized from real world data, and then the computations alone are supposed to produce the results. If you have to continuously update the internal data values using real world data, then your models are of no value for forecasting (projecting). For example, if you need to update the model every month of simulated time, then it is only good for projecting forwards a month. Not very useful for a climate model.
Katharine Hayhoe has done numerous studies of the effect of climate change 50-100 years out for very localized areas of the country, i.e. the state of Maine or NW Washington –
and this study for Southern California, 60 years out, due to climate change.
Gotta be impressed with being able to predict the weather – oops, I mean climate – 60 years out.
Oops, forgot to link the study: https://www.skepticalscience.com/california-weather-whiplash-fighting-to-stop-it.html
Do I need a sarc tag?
I think sarc is the default mode here, given the ludicrous nature of Warmist cant.
You just caused me to visit Skeptical Science. I need a shower!
Surely if an area is expecting heavy rain in one season and drought in another, the investment should be in reservoirs? Is that what California is doing?
Susan
PS: as Joseph told Pharaoh……
Susan, seriously… this is too obvious and simple.
You need much better solutions, like:
* making solar roofs compulsory
* painting roads white
* planting as many windmills as possible
* subsidizing rich guys buying electric toy cars with money taken from the average Joe paying his electricity bill
*…
But I guess you are not smart enough to understand how these actions will help water management.
I wonder why my previous comment went to moderation. This blog seems easily “triggered”.
………..damn computer
Only the “data” which fits the narrative.
fudge factor:
” a figure included in a calculation to ensure a desired result.”
They be guilty as charged.
Finagle’s constant
It is time to pull out the pitchforks and torches and run down the Climate Alarmists’ Frankenmodel hobgoblins.
So, more-and-better garbage in, more-and-better garbage out.
I found this analysis of IPCC climate models enlightening:
http://edberry.com/blog/climate-physics/agw-hypothesis/human-co2-not-change-climate/
The physics of the atmospheric CO2 concentration situation parallels a situation I have been considering:
The energy contribution of man’s CO2 is about 1% of the planet’s energy fluxes. How the heck can any honest investigator say that 1% drives the climate with any certainty?
There are situations where we are damn sure that even less than 1% of the flux drives the system. In fact, for the purposes of control, humans have engineered lots of devices where a small change in one part of the system has a huge effect on the whole: transistors, switches, gates, hydraulic presses, medicines, etc.
So the percentage of the planet’s energy flux under human control is irrelevant.
It’s an extension of the homeopathic principle – the less you have, the more its effect…
You know about the guy who died from a homeopathy overdose? He forgot to take his granule.
The question is, then, “IS the flux caused by CO2 in the Earth system similar to those fluxes in those engineering systems?”
Is CO2 like a strong poison, for example? I get the impression that it is NOT.
Are the boundary conditions defining the barriers or thresholds over which the CO2 flux traverses of a similar configuration to justify regarding the engineering analogy as a correct analogy? A switch, for example, has a pretty definite place to separate or connect an electrical current over a very small space. CO2 does NOT operate this way — it’s more “all over” in a much greater volume. [Forgive my engineering deficit that probably disables me from saying this in the best technical way, but, hopefully, you get some gist of what I’m trying to say.]
I have been arguing that CMIP6 modelers must all use the same historical data, especially for aerosols. That’ll separate the sheep from the goats.
Additionally, none of those spaghetti graphs, especially for hindcasts.
Yes, I have never understood this. Maybe Nick can comment. The idea that we take some failing models and some successful ones, then average them, and get a prediction, seems utterly irrational. The existence of the spaghetti graphs seems clear proof that the science is not settled and that we do not actually know what the future holds.
Why would anyone proceed like this? Like, imagine we are proposing to introduce a new drug to prevent dementia. So we construct a few dozen models, and we end up with predictions of dementia in the absence of the drug which are all over the place. We decline to take those models which have the best fit to recent dementia levels. Instead we take all we happen to have to hand, without any criteria as to why we include just those from the total set of all possible models, and average them? It’s nonsense.
Is the criterion for inclusion of a model maybe that one of our friends has constructed it? Regardless of its success in forecasting anything?
“A new method for projecting how the temperature will respond to human impacts supports the outlook for substantial global warming throughout this century”
And there you have it. As other climate scientists have noted, statistically there will be “overestimate warming rate in some regions, underestimate it in others” and a whole lot in the middle. In other words, we’re still warming, still CO2, and “substantial global warming throughout this century.”
Read a little more carefully. In the abstract the researchers admit “… the global mean GCM TCS is about 15% too high …”. That’s a pretty serious defect. To match long term projections the actual climate has to catch up to the model projections; but given that the models which provide long term predictions incorporate the defective physics which produces the 15% transient climate sensitivity discrepancy, it is far from certain that their long term predictions are any good.
But 15% is so much superior to 40% or 50%. Think of the children.
“…the defective physics which produces the 15% transient climate sensitivity discrepancy, it is far from certain that their long term predictions are any good.”
I’ll see your “defective physics”…and raise you a jiggered temperature history that constantly changes
We will never know what the temp is today…because their algorithm will change it tomorrow..and the next day..and the next..
That alone makes all climate models total trash…..
Scott K… Why has it taken these geniuses so many years to think of this???
That we have warmed since the bottom of the LIA was never in doubt.
The claim that it was caused by CO2 has never been proven and in fact has been disproven.
Curve matching is not the same as a valid model.
From one of the fathers of climate modelling we have this:
The use of fudge factors to make a model match historical data violates the basic fundamentals of a valid physics based model. We’ve known that ever since the very beginnings of computer modelling. What we have is nothing better than curve matching. Using it to predict future trends is just extrapolation … unfit for engineering purposes.
commieBob,
So much for the pious claims of Hayhoe and others that the models are all based on first principles.
But what are Hayhoe’s principles? First, second or third?
They are, but they then fail to predict the past.
The fraud (I don’t like saying that, but it is sadly true now) comes from not admitting that our understanding of first principles is therefore lacking.
They then add fudge factors in the hope (mistaken belief) that the fudge required to predict the past will be the same as the fudge required to predict the future. That’s very unlikely.
They also ignore the fundamental problem of starting conditions in a dynamic, complex, chaotic system. Get those wrong (and you must) and your forecasts go wrong immediately.
Lee,
I’m not sure that Hayhoe has any principles. 🙂
Agree completely CB. I’ve had experience with a few models that, it turned out, included “calibration factors” which were really just fudge factors to make results match real experimental data. Of course when inputs varied significantly from the calibration values, results became absurdly wrong.
Plain language summary:
Even though everything we reported here was already reported…. send more money so we can keep everyone in our group employed churning out more stuff already reported.
Ed note: The Climate Yobs-machine rolls ever onward
The thing is that we now have more than enough instrumental data and recent high resolution proxy data to accurately model how the “climate” should respond to a doubling of the atmospheric concentration of CO2 from 280-560 ppm.
The problem is that these sorts of “models” conclusively demonstrate that the “2.0 °C limit” won’t be breached before James T. Kirk assumes command of NCC-1701 USS Enterprise and that the “1.5 °C limit” won’t fall before Jonathan Archer assumes command of NX-01 USS Enterprise.
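As a back-of-envelope illustration only (my numbers, not the commenter’s), here is how such an extrapolation might run, assuming a low-end transient response and a fixed CO2 growth rate; every parameter below is an assumption chosen for the example.

```python
# Back-of-envelope only: year an assumed warming limit is crossed if
# dT = S * log2(C / C0) and CO2 grows at a fixed fraction per year.
# All parameter values are illustrative assumptions.
import math

S = 1.0          # assumed low-end transient response, K per doubling
C0 = 280.0       # preindustrial CO2, ppm
C_START = 410.0  # approximate CO2 at time of writing, ppm
YEAR_START = 2018
GROWTH = 0.005   # assumed CO2 growth rate, fraction per year

def crossing_year(limit_k):
    c, year = C_START, YEAR_START
    while S * math.log2(c / C0) < limit_k:
        c *= 1.0 + GROWTH
        year += 1
    return year

print("1.5 K limit crossed ~", crossing_year(1.5))  # ~2150 with these inputs
print("2.0 K limit crossed ~", crossing_year(2.0))  # ~2220 with these inputs
```

With a higher assumed sensitivity or growth rate the crossing dates move much earlier, which is the whole argument in a nutshell.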
The researchers were careful to avoid mentioning equilibrium response; they focussed on transient climate response…
ECS puts the 2.0 °C limit out beyond Jean-Luc Picard’s retirement date… assuming a realistic climate sensitivity.
In the abstract the researchers admit TCS is running 15% too hot, but they skirt around whether this has implications for ECS.
President Camacho will be in charge by then.
At that 2.0 C the plants will probably need the “lectrolites” in Brawndo.
JimG1.
. .NICE ! ( . .the electrolytes in ‘Brawndo’ . .! )
Looking at this, I doubt we’ll get there, at least not for long.
http://notrickszone.com/2018/05/03/its-here-a-1900-2010-instrumental-global-temperature-record-that-closely-aligns-with-paleo-proxy-data/
The GCMs will never work correctly until they at least recognize thermalization and Quantum Mechanics. Science shows that thermalization takes place because absorbed energy starts being shared by conduction with surrounding molecules within 0.0002 microseconds, while relaxation averages about 5 microseconds at room temperature. Common knowledge demonstrates thermalization by the observation that clear nights cool faster and farther in the desert than where it is humid.
Quantum Mechanics (Hitran does the calculations) shows that, at low altitude, radiant emission from the atmosphere is essentially all from water vapor.
http://energyredirect3.blogspot.com
Interesting, +10, and thanks for the link.
Your explanation in terms of net gas kinetics rather than photon re-emission is, dare I say, elegant. Well done.
Rule One of climate ‘science’: when reality and models differ in value, it is always reality which is wrong and needs adjusting.
Rule Two: Never let reality stand in the way of the next grant.
“It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.”
Richard P. Feynman
For the up/down/“back” radiation of greenhouse theory’s GHG energy loop to function as advertised, Earth’s “surface” must radiate as an ideal black body, i.e. 16 C/289 K, 1.0 emissivity = 396 W/m^2.
As demonstrated by my modest science-not-fudge experiment (1 & 2) the presence of the atmospheric molecules participating in the conductive, convective and latent heat movement processes renders this ideal black body radiation impossible. Radiation’s actual share and effective emissivity is 0.16, 63/396.
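For what it’s worth, the arithmetic in those two figures is easy to check; the sketch below verifies only the numbers as stated (taking the commenter’s 63 W/m^2 experimental figure on faith), not the physical interpretation, which the reply below disputes.

```python
# Arithmetic check only: Stefan-Boltzmann flux at 289 K, and the ratio
# the commenter calls an effective emissivity (63 W/m^2 is his figure).
SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W/m^2/K^4
T = 289.0               # 16 C expressed in kelvin

bb = SIGMA * T ** 4     # ideal black-body flux, ~396 W/m^2
print(f"BB flux at 289 K = {bb:.0f} W/m^2")
print(f"63 / {bb:.0f} = {63.0 / bb:.2f}")   # ~0.16
```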
Without this GHG energy loop, radiative greenhouse theory collapses.
Without RGHE theory, man-caused climate change does not exist.
Of course, that doesn’t stop hosts of pompous “experts” ‘splaining the mechanisms of this non-existent loop with quantum electro-dynamics, molecular level physics, photo-electrics which, when faced with real data, ends up as pretentious, handwavium, nonsense.
(1) https://principia-scientific.org/experiment-disproving-the-radiative-greenhouse-gas-effect/
(2) http://www.writerbeat.com/articles/21036-S-B-amp-GHG-amp-LWIR-amp-RGHE-amp-CAGW
Nic – Emissivity is a property of a surface. It has nothing to do with the environment. Apparently you misinterpreted your experiment. IMO no lab experiment will ever prove or disprove that CO2 has no significant effect on global climate. The so-called GHE exists and the planet would be colder if it didn’t.
On the up side, CO2 does not now, never has and never will have a significant effect on climate. The reason, demonstrated with thermalization and Quantum Mechanics, is that, at low altitude, photon energy absorbed by CO2 is rerouted to water vapor. http://energyredirect3.blogspot.com
Dan
The whole idea is based on the proposition that the reduction of emitted IR at the TOA due to more CO2 will cause the whole climate system to adjust to restore the balance. So the flea on the tail wags the dog.
Of course temperature is a function of total emission, not specific emission. The other processes in the atmosphere and their coupling are unknown. And the actual impact of CO2 beyond the lab is hard to describe. So you don’t even have a hypothesis (as was pointed out to me here a while back). You have a partial, partial process.
No worries though it’s all just a spherical horse in a vacuum winning all those races for the Sheik.
The theory that CO2/GHGs absorb/radiate/trap LWIR is an attempt to explain the mechanism behind the GHG energy loop, which I claim is nothing but a theoretical calculation, a “what if” scenario with no actual physical reality. Tough to explain what does not exist.
Emissivity is the ratio between what a “surface” actually radiates and what it would radiate were it a theoretical black body emissivity 1.0. (See above.)
The GHG energy loop which is the foundation of RGHE requires that the surface radiate as a BB at 16 C/289 K/396 W/m^2.
What my experiment demonstrates is that in the presence of other heat transfer pathways the surface can not radiate as a BB. In fact the surface LWIR is 63 W/m^2 which at 289 K yields an emissivity of 0.16.
Without this GHG energy loop there is no RGHE and no man-caused climate change and a whole lot of pretenders suddenly unemployed.
So they still can’t grasp that they are trying to model a vast, non-linear, closely coupled, chaotic system with a parameterized linear computer? Faster, bigger computers with tweaked data are still just spewing faster, bigger piles of effluent.
Using a starting point you cannot hope to get right!
Try landing on the Moon if you don’t know where you are starting from!
Well, just because astrologers and fortune tellers went out of fashion, humans didn’t quit the bad habit of acting as if the unpredictable were somehow predictable. So computer-assisted numerology took over.
I can’t count the number of times I have participated in dialogues like the one below with higher-level managers:
Manager (anxious): what will happen next?
Me: I can tell you there is no way to tell, because there are too many unknowns, both known unknowns and unknown unknowns.
Manager (even more anxious): yeah, sure, understood, but, nonetheless, what would you predict, if you could?
Me [coercing lots of people into giving me lots of painful numbers (that’s an important part of the alchemy of belief), putting them in a large matrix, multiplying, dividing, averaging and mixing numbers with a mystery method, etc., and finally getting a result in a big, nice, unreadable and very impressive matrix, with comments in a “summary for executives” to explain a final guess no more stupid than any other, anyway]: “There we are, boss, here is your guess. Of course you understand the result is of no value, and anything could still happen.”
Manager (happy): YES! Whatever the uncertainty, it still allows me to set a course of action.
The answer is simple: throw out the models, and vigorously study the regional data until some level of comprehension sets in. Then go back and attempt to formulate a new model that has a basis in reality.
Model it in four lines:
Change in CO2
Change in global average temperature due to CO2 change
Feedback effect on temperature change caused by CO2
Result
You can put in what you like for lines 1, 2 and 3.
But of course this model shows that lines 2 and 3 are just assumptions, not “emergent facts” as the climate scientists claim.
What sort of “science” is this?
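Taken literally, that four-line model really is only a few lines of code, which rather makes the point: everything interesting hides in the two assumed coefficients. A sketch with placeholder values (the no-feedback figure and the feedback multiplier below are illustrative assumptions, not settled numbers):

```python
# The 'four-line model', literally. Lines 2 and 3 are pure inputs:
# the output is whatever the assumptions make it.
import math

c0, c1 = 280.0, 560.0   # line 1: change in CO2, ppm (a doubling here)
per_doubling = 1.1      # line 2: assumed no-feedback warming, K per doubling
feedback = 2.7          # line 3: assumed feedback multiplier

result = per_doubling * feedback * math.log2(c1 / c0)  # line 4: result
print(f"{result:.1f} K")  # change lines 2-3 and the 'answer' follows
```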
Climatology has trumped the social sciences with regards to producing nearly impossible to reproduce and often inane or patently obvious statistics.
A variable TCS ???????
I thought it was a fixed constant based on “well mixed” gases and constant solar radiation, etc. and etc.
Isn’t this fundamental to how the models are supposed to be constructed?
Now they seem to want to set up a TCS for each cell, with the Earth carved up into 5° x 5° cells.
All this does is introduce a huge number of tuning variables. You over-fit, your parameters inadvertently model the wrong thing, and you extrapolate.
Then you do a spectacular crash and burn.
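The over-fit-then-extrapolate failure mode described above is easy to demonstrate with a toy example; nothing climate-specific about it, just noisy linear data and too many parameters.

```python
# Toy demonstration: over-fitting a noisy linear trend, then
# extrapolating beyond the calibration range.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 20)
y = 0.5 * x + rng.standard_normal(x.size)   # truth: linear plus noise

lin = np.polyfit(x, y, 1)    # 2 parameters: sensible
wild = np.polyfit(x, y, 15)  # 16 parameters: fits the noise
# (numpy may warn about poor conditioning here, which is rather the point)

x_future = 15.0              # outside the fitted range
print(np.polyval(lin, x_future))   # near the true value of ~7.5
print(np.polyval(wild, x_future))  # typically wrong by orders of magnitude
```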
Hey, my comment is straight-up science: hypothesis, experiment, observations, by someone actually trained in applying science.
Talk about fingers-in-your-ears denial!!
Dry labbing used to get you an “A.” Now, it’s the way to get you a gov’t grant.
Pitiful behavior!
GIPO. Garbage in, policy out.
A little Bondo and JB Weld and they’ll have those models going down the road in no time.
Please be advised that this researcher has already looked at some historical data, data that climate scientists appear to studiously ignore, such as the time series for the UAH satellite lower troposphere temperature relative to ground-based observatory CO2 concentration data.
The results to date show:
a. There is no statistically significant correlation between satellite lower troposphere temperature and atmospheric CO2 concentration, that is, the term CO2:temperature climate sensitivity is meaningless;
b. There is a statistically significant correlation between the temperature and the rate of change of CO2 concentration;
c. Temperature change precedes CO2 change, so the former cannot possibly be caused by the latter;
d. The temperature and the rate of change of CO2 concentration have practically identical autocorrelation coefficients with a prominent maximum at about 42 months;
e. This is confirmed by having practically identical Fourier amplitude spectra with a glaringly obvious peak at a period of 42 months which period does not appear to feature in discussion of any climate models.
The Fourier amplitude spectra display a number of maxima such as the Moon’s 27.21 day draconic period and its 29.53 day synodic period. This should be evident to anyone knowing that the Moon passes between the Sun and the Earth every month. However it would appear that climate scientists are incapable of performing such elementary time series analysis.
For more detail see:
https://www.climateauditor.com
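For anyone wanting to attempt a reproduction, here is a minimal sketch of the core comparison being described. The synthetic stand-in series and array names below are placeholders, not the commenter’s data or code, and the correlations you get will depend entirely on the real series used.

```python
# Sketch of the comparison described above: correlation of temperature
# with the CO2 level versus with the rate of change of CO2.
# Synthetic monthly series stand in for UAH TLT and Mauna Loa data.
import numpy as np

rng = np.random.default_rng(2)
n = 480                                                       # 40 years, monthly
co2 = 340.0 + 0.15 * np.arange(n) + rng.standard_normal(n)    # ppm, toy
temp = 0.002 * np.arange(n) + 0.2 * rng.standard_normal(n)    # K, toy

dco2 = co2[12:] - co2[:-12]   # centered 12-month difference: dCO2/dt, ppm/yr
temp_mid = temp[6:-6]         # align temperature with the centered differences

r_level = np.corrcoef(temp, co2)[0, 1]     # temperature vs. CO2 level
r_rate = np.corrcoef(temp_mid, dco2)[0, 1] # temperature vs. rate of change
print(f"corr(T, CO2)     = {r_level:.2f}")
print(f"corr(T, dCO2/dt) = {r_rate:.2f}")
```

A lead-lag version (point c) would repeat the second correlation at a range of shifts and look for the lag that maximizes it; the autocorrelation and spectral comparisons in points d and e are similarly a few lines with np.correlate and np.fft.rfft.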
Thank you Bevan – a few people like you are finally discussing the points I made in my January 2008 icecap.us paper, where I pointed out the close correlation between dCO2/dt and global temperature, similar to your points a to d. I have no opinion yet on your point e, etc.
It is amazing that this issue has largely been ignored for a decade by both sides of the fractious global warming debate. It seems like everyone is having too much fun arguing about global warming bullsh!t to just look at the incontrovertible facts, as evidenced by this data.
A few more thoughts here.
https://wattsupwiththat.com/2018/05/02/is-climate-alarmist-consensus-about-to-shatter/comment-page-1/#comment-2805310
https://wattsupwiththat.com/2018/05/09/clouds-and-el-nino/comment-page-1/#comment-2813713
Best, Allan