Study: Climate Models Need More Historical Fudge Factors

From the movie “Young Frankenstein” by Mel Brooks. Igor peruses the brain of “Abby Normal”

Guest essay by Eric Worrall

The researchers claim that adding fudge factors derived from historical data to correct the discrepancy between climate models and historical observations will make climate predictions more reliable, producing a Frankenmodel mix of fudge factors and defective physics.

New approach to global-warming projections could make regional estimates more precise

Computer models found to overestimate warming rate in some regions, underestimate it in others

Date: May 15, 2018

Source: McGill University

Summary:

A new method for projecting how the temperature will respond to human impacts supports the outlook for substantial global warming throughout this century – but also indicates that, in many regions, warming patterns are likely to vary significantly from those estimated by widely used computer models.

“By establishing a historical relationship, the new method effectively models the collective atmospheric response to the huge numbers of interacting forces and structures, ranging from clouds to weather systems to ocean currents,” says Shaun Lovejoy, a McGill physics professor and senior author of the study.

“Our approach vindicates the conclusion of the Intergovernmental Panel on Climate Change (IPCC) that drastic reductions in greenhouse gas emissions are needed in order to avoid catastrophic warming,” he adds. “But it also brings some important nuances, and underscores a need to develop historical methods for regional climate projections in order to evaluate climate-change impacts and inform policy.”

Read more: https://www.sciencedaily.com/releases/2018/05/180515113555.htm

The abstract of the study:

Regional Climate Sensitivity‐ and Historical‐Based Projections to 2100

Raphaël Hébert, Shaun Lovejoy

First published: 13 March 2018

Abstract

Reliable climate projections at the regional scale are needed in order to evaluate climate change impacts and inform policy. We develop an alternative method for projections based on the transient climate sensitivity (TCS), which relies on a linear relationship between the forced temperature response and the strongly increasing anthropogenic forcing. The TCS is evaluated at the regional scale (5° by 5°), and projections are made accordingly to 2100 using the high and low Representative Concentration Pathways emission scenarios. We find that there are large spatial discrepancies between the regional TCS from 5 historical data sets and 32 global climate model (GCM) historical runs and furthermore that the global mean GCM TCS is about 15% too high. Given that the GCM Representative Concentration Pathway scenario runs are mostly linear with respect to their (inadequate) TCS, we conclude that historical methods of regional projection are better suited given that they are directly calibrated on the real world (historical) climate.

Plain Language Summary?

In this paper, we estimate the transient climate sensitivity, that is, the expected short‐term increase in temperature for a doubling of carbon dioxide concentration in the atmosphere, for historical regional series of temperature. We compare our results with historical simulations made using global climate models and find that there are significant regional discrepancies between the two. We argue that historical methods can be more reliable, especially for the more policy‐relevant short‐term projections, given that the discrepancies of the global climate models directly bias their projections.

Read more (paywalled): https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1002/2017GL076649
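Stripped of the jargon, the TCS estimate the abstract describes boils down to a least-squares slope: regress a temperature series on the anthropogenic forcing series and scale by the forcing of a CO2 doubling. Here is a minimal sketch of that idea; the numbers are entirely synthetic for illustration (the forcing ramp, noise level and “true” sensitivity below are made up, and this is not the authors’ code):

```python
import numpy as np

F_2X = 3.71  # W/m^2 per CO2 doubling, a commonly used value

def transient_climate_sensitivity(temp_anomaly, anthro_forcing):
    """Least-squares slope of temperature vs. forcing, scaled to K per CO2 doubling."""
    slope, _intercept = np.polyfit(anthro_forcing, temp_anomaly, 1)
    return slope * F_2X

# Synthetic "historical" series: forcing ramping from 0 to 2.5 W/m^2 over
# 1850-2014, a made-up true sensitivity of 1.8 K per doubling, plus noise.
rng = np.random.default_rng(0)
years = np.arange(1850, 2015)
forcing = np.linspace(0.0, 2.5, years.size)
temp = (1.8 / F_2X) * forcing + rng.normal(0.0, 0.1, years.size)

tcs = transient_climate_sensitivity(temp, forcing)
print(f"estimated TCS: {tcs:.2f} K per CO2 doubling")
```

The paper applies the same idea cell by cell on a 5° by 5° grid, both to the observational data sets and to each GCM run, and then compares the resulting TCS maps; that comparison is where the claimed 15% global-mean discrepancy comes from.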

The researchers hope this mixture of historical fudge factors and defective physics will be more acceptable as the basis of climate policy decisions than just using the defective physics.

116 Comments
markl
May 16, 2018 2:47 pm

But of course! How could we have been so blind? The models have always been correct; it was the historical data all along. That’s why we’ve been altering the data sets. Now do you understand?

higley7
Reply to  markl
May 16, 2018 6:33 pm

However, they will use astrology rather than astronomy for their solar system fudge factors. It’s a given, the way they pick and choose their needs.

Ian Macdonald
Reply to  higley7
May 17, 2018 1:04 am

Stop maligning astrologers. They don’t ‘adjust’ the positions of the planets, after all.

Sara
Reply to  higley7
May 17, 2018 3:50 am

Astrologers now have a higher degree of accuracy than climate sci-fi prognosticators.
Cthulhu rules!

wws
Reply to  higley7
May 17, 2018 5:17 am

If they understood the history of astronomy, they would realize that they are proposing Ptolemaic epicycles all over again.
The more off base your underlying model is, the more ad-hoc mathematical adjustments you have to keep adding to track your actual observations.

Pop Piasa
Reply to  markl
May 16, 2018 7:22 pm

It’s more circular thinking from the folks whose theory predicts everything… And therefore nothing.

Reply to  Pop Piasa
May 16, 2018 7:38 pm

More like epicyclic thinking.

Weylan McAnally
Reply to  Pop Piasa
May 18, 2018 3:35 pm

I have decided that AGW-pushing climatologists are simply telephone psychics with a degree or two. They cannot tell you about the past, cannot predict the future, and try to keep you on the line to keep the grant money flowing.

Martin C
Reply to  markl
May 16, 2018 9:43 pm

One of the authors is Shaun Lovejoy? A TOTAL ALARMIST ? Look him up. I recall seeing his name as an ‘alarmist’ over at Dr. Judith Curry’s website, ‘Climate Etc.’.
Check him out. Please.

Martin C
Reply to  Martin C
May 16, 2018 9:47 pm

Check this link about Shaun at Dr. Curry’s website:
https://judithcurry.com/2015/10/23/climate-closure/
Yes, from a couple years ago. But obviously Shaun’s ‘WARMIST ALARMIST’ position hasn’t changed . . .
( . . to quote Bugs Bunny, “WHAT A MAROON!” )

May 16, 2018 2:48 pm

Adding in observations from reality, to influence climate models, can only be a good thing.
What else should models be influenced by? The biases of the programmer?
Now if we could just find an impartial way to select which historical observations to use we’ll be sorted.

WXcycles
Reply to  M Courtney
May 17, 2018 7:10 am

And the new ‘n improved GCM will still output ludicrous hysterical nonsense.

Paul Penrose
Reply to  M Courtney
May 17, 2018 9:58 am

Actually, no. Models are generally initialized from real world data, and then the computations alone are supposed to produce the results. If you have to continuously update the internal data values using real world data, then your models are of no value for forecasting (projecting). For example, if you need to update the model every month of simulated time, then it is only good for projecting forwards a month. Not very useful for a climate model.

joe - the non climate scientist
May 16, 2018 2:51 pm

Katharine Hayhoe has done numerous studies of the effect of climate change 50-100 years out for very localized areas of the country, i.e. the state of Maine or NW Washington,
and this study for southern California 60 years out due to climate change.
Gotta be impressed with being able to predict the weather oops – I mean climate 60 years out.

joe - the non climate scientist
Reply to  joe - the non climate scientist
May 16, 2018 2:52 pm

jorgekafkazar
Reply to  joe - the non climate scientist
May 16, 2018 3:05 pm

I think sarc is the default mode here, given the ludicrous nature of Warmist cant.

Peter Wilson
Reply to  joe - the non climate scientist
May 16, 2018 5:34 pm

You just caused me to visit Skeptical Science. I need a shower!

Susan Howard
Reply to  joe - the non climate scientist
May 16, 2018 8:14 pm

Surely if an area is expecting heavy rain in one season and drought in another the investment should be in reservoirs? Is that what California is doing?
Susan
PS: as Joseph told Pharaoh……

paqyfelyc
Reply to  joe - the non climate scientist
May 17, 2018 1:44 am

Susan, seriously… this is too obvious and simple.
You need much better solutions, like:
* making solar roofs compulsory
* painting roads white
* planting as many windmills as possible
* subsidizing rich guys buying electric toy cars with money taken from the average Joe paying his electricity bill
*…
But I guess you are not smart enough to understand how these actions will help water management.

paqyfelyc
Reply to  joe - the non climate scientist
May 17, 2018 1:46 am

I wonder why my previous comment went to moderation. This blog seems easily “triggered”.

Latitude
May 16, 2018 2:53 pm

………..damn computer

Felix
May 16, 2018 2:55 pm

Only the “data” which fits the narrative.

Tom in Florida
May 16, 2018 3:05 pm

fudge factor:
” a figure included in a calculation to ensure a desired result.”
They be guilty as charged.

MarkW
Reply to  Tom in Florida
May 16, 2018 5:47 pm

Finagle’s constant

Bill Powers
May 16, 2018 3:13 pm

It is time to pull out the pitchforks and torches and run down the Climate Alarmists Frankenmodel hobgoblins.

May 16, 2018 3:21 pm

So, more-and-better garbage in, more-and-better garbage out.
I found this analysis of IPCC climate models enlightening:
http://edberry.com/blog/climate-physics/agw-hypothesis/human-co2-not-change-climate/

Dave Fair
Reply to  Robert Kernodle
May 16, 2018 5:35 pm

The physics of the atmospheric concentration of CO2 situation is parallel to a situation I have been considering:
The energy contribution of man’s CO2 is about 1% of the planet’s energy fluxes. How the heck can any honest investigator say that 1% drives the climate with any certainty?

paqyfelyc
Reply to  Dave Fair
May 17, 2018 2:01 am

There are situations where we are damn sure that even less than 1% of the flux drives the system. In fact, for the purpose of control, humans have engineered lots of devices where a small change in one part of the system has a huge effect on the whole: transistors, switches, gates, hydraulic presses, medicines, etc.
So the % of planet energy flux in human control is irrelevant.

David Chappell
Reply to  Dave Fair
May 17, 2018 4:16 am

It’s an extension of the homeopathic principle – the less you have, the more its effect…

paqyfelyc
Reply to  Dave Fair
May 17, 2018 4:57 am

Do you know about the guy who died from a homeopathy overdose? He forgot to take his granule.

Reply to  Dave Fair
May 17, 2018 9:23 am

There are situations where we are damn sure that even less than 1% of the flux drives the system. In fact, for the purpose of control, humans have engineered lots of devices where a small change in one part of the system has a huge effect on the whole.

The question is, then, “IS the flux caused by CO2 in the Earth system similar to those fluxes in those engineering systems?”
Is CO2 like a strong poison, for example ? I get the impression that it is NOT.
Are the boundary conditions defining the barriers or thresholds over which the CO2 flux traverses of a similar configuration to justify regarding the engineering analogy as a correct analogy? A switch, for example, has a pretty definite place to separate or connect an electrical current over a very small space. CO2 does NOT operate this way — it’s more “all over” in a much greater volume. [Forgive my engineering deficit that probably disables me from saying this in the best technical way, but, hopefully, you get some gist of what I’m trying to say.]

Dave Fair
May 16, 2018 3:26 pm

I have been arguing that CMIP6 modelers must all use the same historical data, especially for aerosols. That’ll separate the sheep from the goats.
Additionally, none of those spaghetti graphs, especially for hindcasts.

michel
Reply to  Dave Fair
May 17, 2018 1:04 am

Yes, I have never understood this. Maybe Nick can comment. The idea that we take some failing models and some successful ones, then average them, and get a prediction, seems utterly irrational. The existence of the spaghetti graphs seems clear proof that the science is not settled and that we do not actually know what the future holds.
Why would anyone proceed like this? Imagine we are proposing to introduce a new drug to prevent dementia. So we construct a few dozen models, and we end up with predictions of dementia in the absence of the drug which are all over the place. We decline to take those models which have the best fit to recent dementia levels. Instead we take all we happen to have to hand, without any criteria as to why we include them from the total set of all possible models, and average them? It’s nonsense.
Is the criterion for inclusion of a model maybe that one of our friends has constructed it? Regardless of its success in forecasting anything?

Scott Koontz
May 16, 2018 3:26 pm

“A new method for projecting how the temperature will respond to human impacts supports the outlook for substantial global warming throughout this century”
And there you have it. As other climate scientists have noted, statistically there will be “overestimate warming rate in some regions, underestimate it in others” and a whole lot in the middle. In other words, we’re still warming, still CO2, and “substantial global warming throughout this century.”

goldminor
Reply to  Eric Worrall
May 16, 2018 4:20 pm

But 15% is so much superior to 40% or 50%. Think of the children.

Latitude
Reply to  Eric Worrall
May 16, 2018 4:40 pm

” the defective physics which produces the 15% transient climate sensitivity discrepancy it is far from certain that their long term predictions are any good.”
I’ll see your “defective physics”…and raise you a jiggered temperature history that constantly changes
We will never know what the temp is today…because their algorithm will change it tomorrow..and the next day..and the next..
That alone makes all climate models total trash…..

goldminor
Reply to  Scott Koontz
May 16, 2018 4:22 pm

@ Scott K…Why has it taken these geniuses so many years to think of this???

MarkW
Reply to  Scott Koontz
May 16, 2018 5:48 pm

That we have warmed since the bottom of the LIA was never in doubt.
The claim that it was caused by CO2 has never been proven and in fact has been disproven.

commieBob
May 16, 2018 3:27 pm

Curve matching is not the same as a valid model.

With four parameters I can fit an elephant, and with five I can make him wiggle his trunk. John von Neumann

From one of the fathers of climate modelling we have this:

Provided, however, that the observed trend has in no way entered the construction or operation of the models, the procedure would appear to be sound. Edward Norton Lorenz

The use of fudge factors to make a model match historical data violates the basic fundamentals of a valid physics based model. We’ve known that ever since the very beginnings of computer modelling. What we have is nothing better than curve matching. Using it to predict future trends is just extrapolation … unfit for engineering purposes.
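The extrapolation point is easy to demonstrate with a toy example (entirely synthetic, nothing to do with any actual climate model): a many-parameter polynomial tuned to a noisy linear record matches the calibration period beautifully and then falls apart the moment it is asked to project beyond it, while a fit with the correct functional form does not.

```python
import numpy as np

rng = np.random.default_rng(1)
t_hist = np.linspace(0.0, 1.0, 40)
y_hist = 0.8 * t_hist + rng.normal(0.0, 0.05, t_hist.size)  # truth: a linear trend

linear_fit = np.polyfit(t_hist, y_hist, 1)  # right functional form, 2 parameters
wiggly_fit = np.polyfit(t_hist, y_hist, 9)  # curve matching, 10 parameters

t_future = 1.5  # extrapolate 50% past the calibration window
truth = 0.8 * t_future
err_linear = abs(np.polyval(linear_fit, t_future) - truth)
err_wiggly = abs(np.polyval(wiggly_fit, t_future) - truth)
print(f"extrapolation error: linear {err_linear:.3f}, degree-9 {err_wiggly:.3f}")
```

Note that the degree-9 curve actually fits the calibration data better than the linear one; only the extrapolation exposes it, which is exactly why a historically tuned curve match is a risky basis for projections to 2100.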

Clyde Spencer
Reply to  commieBob
May 16, 2018 3:53 pm

commieBob,
So much for the pious claims of Hayhoe and others that the models are all based on first principles.

lee
Reply to  Clyde Spencer
May 16, 2018 9:13 pm

But what are Hayhoe’s principles? First, second or third?

Phoenix44
Reply to  Clyde Spencer
May 17, 2018 1:05 am

They are, but they then fail to predict the past.
The fraud (I don’t like saying that, but it is sadly true now) comes from not admitting that our understanding of first principles is therefore lacking.
They then add fudge factors in the hope (mistaken belief) that the fudge required to predict the past will be the same as the fudge required to predict the future. That’s very unlikely.
They also ignore the fundamental problem of starting conditions in a dynamic, complex, chaotic system. Get those wrong (and you must) and your forecasts go wrong immediately.

Clyde Spencer
Reply to  Clyde Spencer
May 17, 2018 7:39 pm

Lee,
I’m not sure that Hayhoe has any principles. 🙂

Rick C PE
Reply to  commieBob
May 16, 2018 3:57 pm

Agree completely CB. I’ve had experience with a few models that, it turned out, included “calibration factors” which were really just fudge factors to make results match real experimental data. Of course when inputs varied significantly from the calibration values, results became absurdly wrong.

May 16, 2018 3:45 pm

“Our approach vindicates the conclusion of the Intergovernmental Panel on Climate Change (IPCC)…
…But it also brings some important nuances, and underscores a need to develop historical methods for regional climate projections in order to evaluate climate-change impacts and inform policy.”

Plain language summary:
Even though everything we reported here was already reported…. send more money so we can keep everyone in our group employed churning out more stuff already reported.
Ed note: The Climate Yobs-machine rolls ever onward

Editor
May 16, 2018 3:47 pm

The thing is that we now have more than enough instrumental data and recent high resolution proxy data to accurately model how the “climate” should respond to a doubling of the atmospheric concentration of CO2 from 280-560 ppm.
The problem is that these sorts of “models” conclusively demonstrate that the “2.0 °C limit” won’t be breached before James T. Kirk assumes command of NCC-1701 USS Enterprise and that the “1.5 °C limit” won’t fall before Jonathan Archer assumes command of NX-01 USS Enterprise.

Reply to  Eric Worrall
May 16, 2018 3:57 pm

ECS puts the 2.0 °C limit out beyond Jean-Luc Picard’s retirement date… assuming a realistic climate sensitivity.

JimG1
Reply to  Eric Worrall
May 16, 2018 4:16 pm

President Camacho will be in charge by then.

JimG1
Reply to  Eric Worrall
May 16, 2018 4:25 pm

At that 2.0 C the plants will probably need the “lectrolites” in Brawndo.

Martin C
Reply to  Eric Worrall
May 16, 2018 9:50 pm

JimG1.
. .NICE ! ( . .the electrolytes in ‘Brawndo’ . .! )

WXcycles
Reply to  Eric Worrall
May 17, 2018 8:15 am

May 16, 2018 3:47 pm

The GCMs will never work correctly until at least they recognize thermalization and Quantum Mechanics. Science shows that thermalization takes place because absorbed energy starts being shared by conduction with surrounding molecules within 0.0002 microseconds while relaxation averages about 5 microseconds at room temperature. Common knowledge demonstrates thermalization by the observation that clear nights cool faster and farther in the desert than where it is humid.
Quantum Mechanics (Hitran does the calculations) shows that, at low altitude, radiant emission from the atmosphere is essentially all from water vapor.
http://energyredirect3.blogspot.com

goldminor
Reply to  Dan Pangburn
May 16, 2018 4:27 pm

Interesting, +10, and thanks for the link.

WXcycles
Reply to  Dan Pangburn
May 17, 2018 9:03 am

Your explanation in terms of net gas kinetics rather than photon re-emission is, dare I say, elegant. Well done.

knr
May 16, 2018 3:48 pm

Rule one of climate ‘science’: when reality and models differ in value, it is always reality which is wrong and needs adjusting.

Tom in Florida
Reply to  knr
May 16, 2018 4:06 pm

Rule Two: Never let reality stand in the way of the next grant.

May 16, 2018 3:55 pm

“It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.”
Richard P. Feynman
For the up/down/”back” radiation of greenhouse theory’s GHG energy loop to function as advertised earth’s “surface” must radiate as an ideal black body, i.e. 16 C/289 K, 1.0 emissivity = 396 W/m^2.
As demonstrated by my modest science-not-fudge experiment (1 & 2) the presence of the atmospheric molecules participating in the conductive, convective and latent heat movement processes renders this ideal black body radiation impossible. Radiation’s actual share and effective emissivity is 0.16, 63/396.
Without this GHG energy loop, radiative greenhouse theory collapses.
Without RGHE theory, man-caused climate change does not exist.
Of course, that doesn’t stop hosts of pompous “experts” ‘splaining the mechanisms of this non-existent loop with quantum electro-dynamics, molecular level physics, photo-electrics which, when faced with real data, ends up as pretentious, handwavium, nonsense.
(1) https://principia-scientific.org/experiment-disproving-the-radiative-greenhouse-gas-effect/
(2) http://www.writerbeat.com/articles/21036-S-B-amp-GHG-amp-LWIR-amp-RGHE-amp-CAGW

Reply to  nickreality65
May 16, 2018 8:33 pm

Nic – Emissivity is a property of a surface. It has nothing to do with the environment. Apparently you misinterpreted your experiment. IMO no lab experiment will ever prove or disprove that CO2 has no significant effect on global climate. The so-called GHE exists and the planet would be colder if it didn’t.
On the up side, CO2 does not now, never has and never will have a significant effect on climate. The reason, demonstrated with thermalization and Quantum Mechanics is that, at low altitude, photon energy absorbed by CO2 is rerouted to water vapor. http://energyredirect3.blogspot.com

Reply to  Dan Pangburn
May 17, 2018 1:50 am

Dan
The whole idea is based on the proposition that the reduction of emitted IR at TOA due to more CO2 will cause the whole climate system to correct to balance this. So the flea on the tail wags the dog.
Of course temperature is a function of total emission, not specific. The other processes in the atmosphere and their coupling are unknown. And the actual impact of CO2 beyond the lab is hard to describe. So you don’t even have a hypothesis (as was pointed out to me here a while back). You have a partial, partial process.
No worries though it’s all just a spherical horse in a vacuum winning all those races for the Sheik.

Reply to  Dan Pangburn
May 18, 2018 11:02 am

The theory that CO2/GHGs absorb/radiate/trap LWIR is an attempt to explain the mechanism behind the GHG energy loop which I claim is nothing but a theoretical calculation, a “what if” scenario, no actual physical reality. Tough to explain what does not exist.
Emissivity is the ratio between what a “surface” actually radiates and what it would radiate were it a theoretical black body emissivity 1.0. (See above.)
The GHG energy loop which is the foundation of RGHE requires that the surface radiate as a BB at 16 C/289 K/396 W/m^2.
What my experiment demonstrates is that in the presence of other heat transfer pathways the surface can not radiate as a BB. In fact the surface LWIR is 63 W/m^2 which at 289 K yields an emissivity of 0.16.
Without this GHG energy loop there is no RGHE and no man-caused climate change and a whole lot of pretenders suddenly unemployed.

James Beaver
May 16, 2018 4:13 pm

So they still can’t grasp that they are trying to model a vast non-linear closely coupled chaotic system with a parameterized linear computer? Faster, bigger computers with tweaked data are still just spewing faster, bigger piles of effluent.

Phoenix44
Reply to  James Beaver
May 17, 2018 1:07 am

Using a starting point you cannot hope to get right!
Try landing on the Moon if you don’t know where you are starting from!

paqyfelyc
Reply to  James Beaver
May 17, 2018 2:40 am

Well, just because astrologers and fortune tellers went out of fashion, humans didn’t quit the bad habit of acting as if the unpredictable were somehow predictable. So computer-assisted numerology took over.
I can’t count the number of times I participated in a dialogue such as the one below with higher-level managers:
manager (anxious): what will happen next?
me: I can tell you there is no way to tell, because there are too many unknowns, both known unknowns and unknown unknowns.
manager (even more anxious): yes, sure, understood, but, nonetheless, what would you predict, if you could?
me [coercing lots of people to give me lots of painful numbers – that’s an important part of the alchemy of belief – putting them in a large matrix, multiplying, dividing, averaging, mixing numbers with a mystery method, etc., and finally getting a result in a big, nice, unreadable and very impressive matrix, with comments in a “summary for executives” to explain a final guess, no more stupid than any other, anyway]: “there we are, boss, here is your guess. Of course you understand the result is of no value, and anything could still happen”
manager (happy): YES! Whatever the uncertainty, it still allows me to set a course of action.

goldminor
May 16, 2018 4:16 pm

The answer is simple: throw out the models, and vigorously study the regional data until some level of comprehension sets in. Then go back and attempt to formulate a new model that has a basis in reality.

Phoenix44
Reply to  goldminor
May 17, 2018 1:11 am

Model it in four lines:
Change in CO2
Change in global average temperature due to CO2 change
Feedback effect on temperature change caused by CO2
Result
You can put in what you like for lines 1, 2 and 3.
But of course this model shows that lines 2 and 3 are just assumptions, not “emergent facts” as the climate scientists claim.
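The four-line model above can be written out literally. Every number in it is an input assumption (the 1.1 K per doubling and the neutral feedback below are illustrative placeholders, not claimed values), which is the commenter’s point:

```python
import math

# Lines 1-3 are all input assumptions; the numbers below are illustrative only.
co2_before, co2_after = 280.0, 560.0   # line 1: change in CO2 (ppm)
per_doubling_no_feedback = 1.1         # line 2: K per CO2 doubling, assumed
feedback_multiplier = 1.0              # line 3: net feedback, assumed neutral

# Line 4: the result follows mechanically from whatever was assumed above.
delta_t = per_doubling_no_feedback * math.log2(co2_after / co2_before) * feedback_multiplier
print(f"projected warming: {delta_t:.1f} K")  # 1.1 K for these assumptions
```

Set the feedback multiplier to 3 and the same arithmetic yields alarming warming; nothing in the model itself arbitrates between the two choices.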

Michael Thies
May 16, 2018 4:18 pm

What sort of “science” is this?
Climatology has trumped the social sciences with regards to producing nearly impossible to reproduce and often inane or patently obvious statistics.

TonyL
May 16, 2018 4:20 pm

We find that there are large spatial discrepancies between the regional TCS from 5 historical data sets

A variable TCS ???????
I thought it was a fixed constant based on “well mixed” gases and constant solar radiation etc. and etc.
Isn’t this fundamental to how the models are supposed to be constructed?
Now they seem to want to set up a TCS for each cell with the Earth carved up into 5 X 5 degree cells.
All this does is introduce a huge number of tuning variables. You over-fit, your parameters inadvertently model the wrong thing, and you extrapolate.
Then you do a spectacular crash and burn.

May 16, 2018 4:41 pm

Hey, my comment is straight up science, hypothesis, experiment, observations, by someone actually trained in applying science.
Talk about fingers-in-your-ears denial!!

eric
May 16, 2018 4:44 pm

Dry labbing used to get you an “A.” Now, it’s the way to get you a gov’t grant.
Pitiful behavior!

Donald Kasper
May 16, 2018 5:21 pm

GIPO. Garbage in, policy out.

Jacob Frank
May 16, 2018 5:32 pm

A little Bondo and JB Weld and they’ll have those models going down the road in no time

May 16, 2018 5:57 pm

Please be advised that this researcher had already looked at some historical data, data that climate scientists appear to studiously ignore such as the time series for the UAH satellite lower troposphere temperature relative to ground based observatory CO2 concentration data.
The results to date show:
a. There is no statistically significant correlation between satellite lower troposphere temperature and atmospheric CO2 concentration, that is, the term CO2:temperature climate sensitivity is meaningless;
b. There is a statistically significant correlation between the temperature and the rate of change of CO2 concentration;
c. Temperature change precedes CO2 change so it cannot possibly be caused by the latter;
d. The temperature and the rate of change of CO2 concentration have practically identical autocorrelation coefficients with a prominent maximum at about 42 months;
e. This is confirmed by having practically identical Fourier amplitude spectra with a glaringly obvious peak at a period of 42 months which period does not appear to feature in discussion of any climate models.
The Fourier amplitude spectra display a number of maxima such as the Moon’s 27.21 day draconic period and its 29.53 day synodic period. This should be evident to anyone knowing that the Moon passes between the Sun and the Earth every month. However it would appear that climate scientists are incapable of performing such elementary time series analysis.
For more detail see:
https://www.climateauditor.com
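The lead-lag claim in point (c) is the sort of thing a simple cross-correlation scan can probe. A sketch of the generic technique on synthetic series (not the UAH or observatory data the comment refers to), where one series is built to lag the other by a known six steps:

```python
import numpy as np

rng = np.random.default_rng(2)
n, true_lag = 500, 6
raw = rng.normal(size=n + true_lag)
leader = raw[true_lag:]                        # leader[t] = raw[t + true_lag]
follower = raw[:n] + 0.3 * rng.normal(size=n)  # same signal, delayed and noisy

def corr_at_lag(x, y, lag):
    """Pearson correlation of x[t] against y[t + lag] (positive lag: x leads)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

best_lag = max(range(-24, 25), key=lambda k: corr_at_lag(leader, follower, k))
print(f"cross-correlation peaks at lag {best_lag} (imposed lag: {true_lag})")
```

Applied to real temperature and dCO2/dt series, the sign and position of the peak is what would support or refute the precedence claim; the 42-month peak and lunar periods mentioned above are the comment’s own findings, not reproduced here.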

Allan MacRae
Reply to  Bevan Dockery
May 16, 2018 9:22 pm

Thank you Bevan – a few people like you are finally discussing the points I made in my January 2008 icecap.us paper, where I pointed out the close correlation between dCO2/dt and global temperature, similar to your points a to d. I have no opinion yet on your point e, etc.
It is amazing that this issue has largely been ignored for a decade by both sides of the fractious global warming debate. It seems like everyone is having too much fun arguing about global warming bullsh!t to just look at the incontrovertible facts, as evidenced by this data.
A few more thoughts here.
https://wattsupwiththat.com/2018/05/02/is-climate-alarmist-consensus-about-to-shatter/comment-page-1/#comment-2805310
https://wattsupwiththat.com/2018/05/09/clouds-and-el-nino/comment-page-1/#comment-2813713
Best, Allan

Curious George
May 16, 2018 6:01 pm

Do I understand correctly that the famed Climate Sensitivity is not one global number, but that it is highly localized? What a beauty – now the uncertainty interval means simply that you could always find a locale for any particular value. That’s a great research project.

Michael Jankowski
May 16, 2018 6:21 pm

Before this thread gets derailed…the photo caption says “Fromm” instead of “From”…

Michael Jankowski
May 16, 2018 6:25 pm

“…Our approach vindicates the conclusion of the Intergovernmental Panel on Climate Change (IPCC) that drastic reductions in greenhouse gas emissions are needed in order to avoid catastrophic warming…”
Which part of their “approach” addresses that conclusion? Where was warming demonstrated to be “catastrophic?” And why would someone’s “approach” knowingly “vindicate the conclusion” of an intergovernmental party? Shouldn’t the results of a scientific study potentially do that rather than the approach?

May 16, 2018 6:27 pm

This latest rationalisation of methods from the developers of Climate Models makes Frankenstein look like scientific fact by comparison.

Dave Fair
Reply to  ntesdorf
May 16, 2018 9:02 pm

Demand that CMIP6 models all use the same inputs for hindcasts. Clouds, aerosols, etc.

May 16, 2018 7:04 pm

Please be advised that this researcher had already looked at some historical data, data that climate scientists appear to studiously ignore such as the time series for the UAH satellite lower troposphere temperature relative to ground based observatory CO2 concentration data.
The results to date show:
a. There is no statistically significant correlation between satellite lower troposphere temperature and atmospheric CO2 concentration, that is, the term CO2:temperature climate sensitivity is meaningless;
b. There is a statistically significant correlation between the temperature and the rate of change of CO2 concentration;
c. Temperature change precedes CO2 change so it cannot possibly be caused by the latter;
d. The temperature and the rate of change of CO2 concentration have practically identical autocorrelation coefficients with a prominent maximum at about 42 months;
e. This is confirmed by having practically identical Fourier amplitude spectra with a glaringly obvious peak at a period of 42 months which period does not appear to feature in discussion of any climate models.
The Fourier amplitude spectra display a number of maxima such as the Moon’s 27.21 day draconic period and its 29.53 day synodic period. This should be evident to anyone knowing that the Moon passes between the Sun and the Earth every month. However it would appear that climate scientists are incapable of performing such elementary time series analysis.
For more detail see:
https://www.climateauditor.com

Reply to  bmrgeophyz
May 16, 2018 7:58 pm

We heard you the first time. Thanks

Pop Piasa
May 16, 2018 7:44 pm

“…projections based on the transient climate sensitivity (TCS), which relies on a linear relationship between the forced temperature response and the strongly increasing anthropogenic forcing.”
Where is there proof of this linear relationship between anthropogenic forcing and temperature?
This is a craftily presented dose of pseudo-scientific jargon.

Phoenix44
Reply to  Pop Piasa
May 17, 2018 1:15 am

An assumption, built into their models.
In my work, I insist on modellers setting out all their assumptions, so that both they and we know where the results are coming from. Climate modellers hide their assumptions away and then claim that doing arithmetic to assumptions produces non-assumptions.
That is simply false. Any number that uses an assumption to get to that number is also an assumption.

michael hart
May 16, 2018 7:44 pm

“Plain Language Summary?”
Models which incorporate the worst of both worlds were called “semi-empirical” when I was a student.

Robert B
May 16, 2018 8:35 pm

Science is the art of discovering that you got it wrong. It’s not the art of convincing others that you are right. That’s politics.

Reply to  Robert B
May 16, 2018 9:18 pm

That’s correct. Science is never right. The best that we can hope for is that it’s not wrong.
AGW hypotheses are clearly wrong.

Ian Macdonald
Reply to  Robert B
May 17, 2018 1:17 am

Meanwhile, religion is the art of persuading people to believe things which patently don’t make sense.

dodgy geezer
May 16, 2018 9:23 pm

…Plain Language Summary?
In this paper, we estimate the transient climate sensitivity, that is, the expected short‐term increase in temperature for a doubling of carbon dioxide concentration in the atmosphere, for historical regional series of temperature. We compare our results with historical simulations made using global climate models…

Plainer Language Summary
The models are bust – their predictions don’t match reality.
So our method simply looks at reality, and picks times when the temperature is going up.
Since we are coming out of the LIA, we can easily show a historical rise over the Modern period.
Therefore we’re all going to die!!!!
P.S. Send Money…

Allan MacRae
May 16, 2018 9:30 pm

This McGill study is warmist bullsh!t. Climate sensitivity to atmospheric CO2 is clearly less than about 1C/(2xCO2) and probably much less – closer to 0.0C than 1.0C.
[excerpt from below]
There is ample Earth-scale evidence that TCS is less than or equal to about 1C/(2xCO2).
There is no credible evidence that it is much higher than that.
At this magnitude of TCS, the global warming crisis does not exist.
Best, Allan
https://wattsupwiththat.com/2018/05/05/the-biggest-deception-in-the-human-caused-global-warming-deception/comment-page-1/#comment-2809025
I suggest that global warming alarmism has never been credibly demonstrated to be correct and has repeatedly been falsified.
The argument is about one parameter – the sensitivity of climate to increasing CO2. Let’s call that TCS.
There is ample Earth-scale evidence that TCS is less than or equal to about 1C/(2xCO2).
There is no credible evidence that it is much higher than that.
At this magnitude of TCS, the global warming crisis does not exist.
Examples:
1. Prehistoric CO2 concentrations in the atmosphere were many times today’s levels, and there was no runaway or catastrophic warming.
2. Fossil fuel combustion and atmospheric CO2 strongly accelerated after about 1940, but Earth cooled significantly from ~1940 to ~1977.
3. Atmospheric CO2 increased after ~1940, but the warming rates that occurred pre-1940 and post-1977 were about equal.
4. Even if you attribute ALL the warming that occurred in the modern era to increasing atmospheric CO2, you only calculate a TCS of about 1C/(2xCO2). [Christy and McNider (1994 & 2017), Lewis and Curry (2018).]
There are many more lines of argument that TCS is low – these are a few.
Regards, Allan

tom0mason
May 16, 2018 10:10 pm

Climate communicreator Kathy Hayhoe recommends
“The models need more eye of newt and less toads’ skins when there’s a full moon.”

May 16, 2018 10:52 pm

I’m not quite sure if some commenters know the paper; it’s remarkable indeed. I cited some parts of the conclusions over at Judy’s, see https://judithcurry.com/2018/04/30/why-dessler-et-al-s-critique-of-energy-budget-climate-sensitivity-estimation-is-mistaken/#comment-871885
The core finding: “Given these facts, it is questionable how accurate the mean projection of the MME will be given that its spatial pattern of warming evolves little over the 21st century and remains very close to its RTCS over the historical period, which significantly differs over most of the globe from the actual RTCS of historical observations. Therefore, the resulting MME projection will suffer from the same faults as the historical runs; that is, it will be a projection of the faulty GCM climate in historical runs. GCMs are important research tools, but their regional projections are not yet reliable enough to be taken at face value. It is therefore required to develop further historical approaches that will thus project the real world climate rather than the GCM climates.”
(MME: multi-model mean; RTCS: regional transient climate sensitivity)
The paper clearly contradicts some approaches that increase the observed (relatively low) TCR of about 1.3 (per Lewis/Curry 2018) by arguing that models better simulate the possible “warming patterns”. It shows that those modelled patterns for the future are only extrapolations of wrong patterns from the present. The study bolsters the observed sensitivity values and is well worth reading. Google the DOI of the paper and you’ll find a full version. It’s always better to discuss a known issue than to write about a guess.

Sceptical lefty
May 16, 2018 11:47 pm

The (climate) science is barely relevant here. The appropriate perspective is basic psychology. You have to expect a modeller to defend his models. The alternative is to expect him to dispassionately recognise their inherent, irremediable defects and seek employment elsewhere. This is contrary to human nature.
Perceived self-worth is as important to modellers as to the rest of us.

Phoenix44
Reply to  Sceptical lefty
May 17, 2018 1:18 am

Modellers should be honest, and admit that they are ALWAYS just modelling assumptions.
As the doyen of modelling said: “all models are wrong, but some are useful.”
Models can be valuable, but only if they are used correctly.

May 17, 2018 12:47 am

Fewer cyclones with climate change – but why?
7 April 2011
“We can’t give a lucid answer at this time,” replied Knutson – which he admitted, was a concern.
One clue, however, came from the modelling.
Knutson said that removing carbon dioxide from the models wiped off around half of the cyclone’s predicted intensity.
https://www.newscientist.com/blogs/shortsharpscience/2011/04/fewer-cylones-with-climate-cha.html
[the link 404’s out . . . mod]

Reply to  Mark M
May 17, 2018 3:01 am

Link still works for me.
Apologies.

Reply to  Mark M
May 17, 2018 3:08 am

I tried ‘oogle’ of the headline: “Fewer cyclones with climate change – but why?”, and it was on the first page;
“Short sharp science: Fewer cyclones with climate change – but why?”
Hope that works for anyone interested.

Yirgach
Reply to  Mark M
May 17, 2018 11:46 am

Here’s the erudite (speculative) answer from the article:

Speculating, Steve Sherwood at the University of New South Wales, Sydney, Australia says that since CO2 absorbs heat, more CO2 in the atmosphere could affect where, and with what intensity, water vapour is being heated over the ocean.

Back to university for you Steve…

ScienceABC123
May 17, 2018 3:12 am

Comparing one computer model (even if that’s not what you called it) with another computer model is not science.

willhaas
May 17, 2018 3:47 am

If they had one reliable model without any fudge factors, then that would be the only model they would use. The fact that they have a plethora of models with fudge factors giving different results signifies that a lot of guesswork is involved. Only one model can possibly be right, but they do not know which one, and it may be that all their models are wrong. I believe that they have hard-coded the assumption that CO2 causes warming, so that their climate simulations all beg the question and are hence useless. Predictions made by models that are both wrong and useless are themselves wrong and useless. There is also plenty of scientific rationale to support the idea that the climate sensitivity of CO2 is zero.

Sara
May 17, 2018 4:26 am

Eric Worrall May 16, 2018 at 3:52 pm said: The researchers were careful to avoid mentioning equilibrium response, they focussed on transient climate response…
No offense meant to anyone at all, but some of this is getting sillier and sillier. Study after study with no practical solution pops up like weeds in the garden. Alarmism is turning into some weird cult mentality that seems to fill a need for unaddressed spirituality. Maybe these twits should go to church on Sunday. They might chill out a little.
I’m fascinated by silly stuff. Climate sci-fi is entering the Twilight Zone of ‘the sillier and more alarming, the better’, but the weather guessers can’t even provide a mildly accurate weather forecast any more. This Climate Sci-fi stuff is rapidly approaching the category of faddishness, something that will soon pass because it is baloney. But it gets the prognosticators money and attention and soothes their need for approval. It must be a lonely line of work when your only response comes from a computer screen.
However, some of the silly stuff coming out of this is useful for sci-fi and fantasy fiction, so I’m making notes of a lot of it in my Big What If? Notebook, for future use. For example, what if “the researchers” (unknown but closeted, like Irish monks copying stuff in cells in the 10th century) predict that a decades-long heat wave would sweep the entire planet in mid-winter (because they forgot that seasonal changes are hemispherical) and instead, an excessive snow load pounds the northern hemisphere and the southern hemisphere gets excessive rain and flooding. Oh, wait – that’s already happening, isn’t it?
The more I see of these research “results”, the more I think these people should find other jobs, something that connects them to the real world. Gardening on a rooftop in summer is a good start. I think what they’re trying to tell us is that, deep down inside, they just don’t like winter weather and want it to go away permanently, and it is NOT going to happen. I want to see all the alarmists mired in snow up to their hoohahs.
I may grumble a little about keeping my furnace running longer than normal, but I’m in the real world. The Climate Sci-Fi peeps are not.
Chthulhu rules! Thanks for listening!
Rant over. Breakfast!!!

Allan MacRae
Reply to  Sara
May 17, 2018 6:06 am

Hi Sara,
Andrew Weaver was Professor of Climate Modelling at the University of Victoria, British Columbia until recently, when he became head of the 3-elected-member fringe Green Party of British Columbia.
Pretty sure that was NOT a good thing for British Columbia or Canada.
Warmist climate models typically employ(ed) values of climate sensitivity that are (were) 3 to 10 times too high, and so these models deliberately run hot. Quelle surprise!
These climate models are not remotely scientific – they are political instruments – they just run hot, which is what they are designed to do. Why? To motivate corrupt and ignorant politicians and scare the idiots who vote for them.

Allan MacRae
Reply to  Allan MacRae
May 17, 2018 6:07 am

Global warming alarmism is a multi-trillion dollar scam – now the most costly scam in human history. Corrupt politicians love a big scam – the bigger the better – because they can skim off the most graft – a little scam is hardly worth the trouble.

Yirgach
Reply to  Sara
May 17, 2018 11:54 am

I want to see all the alarmists mired in snow up to their hoohahs.

Chthulhu rules!

In the same post no less!!
Thanks for that!

Kermit Johnson
May 17, 2018 5:20 am

I’m really pleased to see that this is becoming more and more apparent. Years ago when I would argue about using fudge factors in the models, almost no one, including scientists, seemed to know enough to comment. The interesting thing is – there is not just *a* fudge factor for the various models. Each model has its own fudge factor. This, IMHO, clearly indicates that it is not only the physics of the CO2/temp relationship that is unknown.
I’ve modeled commodity prices for about twenty-five years now. I started with an 80286 processor. The advantage I have had in thinking about these climate models is that, when I ran a model, I would bet on the output. It is very difficult to ignore when the model is not right when you look at your balance sheet. People who model climate do not have this advantage, it seems. (g)

Allan MacRae
Reply to  Kermit Johnson
May 17, 2018 6:54 am

Hi Kermit,
Since ALL the models run hot, and the modelers know it, they will not bet – because they would lose their money.
They make their money in other ways, by contributing to the global warming scam, getting research grants to scare people with fictions about very-scary global warming, travelling to conferences in exotic locales, getting money under-the-table for supporting green energy schemes, etc.
It’s a great life, being a global warming scam alarmist.

May 17, 2018 6:21 am

Ummmmm. Fudge……

May 17, 2018 6:31 am

Climate is controlled by natural cycles. Earth is just past the 2003 (+/-) peak of a millennial cycle and the current cooling trend will likely continue until the next Little Ice Age minimum at about 2650. See the Energy and Environment paper at http://journals.sagepub.com/doi/full/10.1177/0958305X16686488
and an earlier accessible blog version at http://climatesense-norpag.blogspot.com/2017/02/the-coming-cooling-usefully-accurate_17.html
Here is part of section 1 of the paper which deals with climate sensitivity.
“For the atmosphere as a whole therefore cloud processes, including convection and its interaction with boundary layer and larger-scale circulation, remain major sources of uncertainty, which propagate through the coupled climate system. Various approaches to improve the precision of multi-model projections have been explored, but there is still no agreed strategy for weighting the projections from different models based on their historical performance so that there is no direct means of translating quantitative measures of past performance into confident statements about fidelity of future climate projections. The use of a multi-model ensemble in the IPCC assessment reports is an attempt to characterize the impact of parameterization uncertainty on climate change predictions. The shortcomings in the modeling methods, and in the resulting estimates of confidence levels, make no allowance for these uncertainties in the models. In fact, the average of a multi-model ensemble has no physical correlate in the real world.
The IPCC AR4 SPM report section 8.6 deals with forcing, feedbacks and climate sensitivity. It recognizes the shortcomings of the models. Section 8.6.4 concludes in paragraph 4 (4): “Moreover it is not yet clear which tests are critical for constraining the future projections, consequently a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed”
What could be clearer? The IPCC itself said in 2007 that it doesn’t even know what metrics to put into the models to test their reliability. That is, it doesn’t know what future temperatures will be and therefore can’t calculate the climate sensitivity to CO2. This also begs a further question of what erroneous assumptions (e.g., that CO2 is the main climate driver) went into the “plausible” models to be tested anyway. The IPCC itself has now recognized this uncertainty in estimating CS – the AR5 SPM says in Footnote 16 page 16 (5): “No best estimate for equilibrium climate sensitivity can now be given because of a lack of agreement on values across assessed lines of evidence and studies.” Paradoxically, the claim is still made that the UNFCCC Agenda 21 actions can dial up a desired temperature by controlling CO2 levels. This is cognitive dissonance so extreme as to be irrational. There is no empirical evidence which requires that anthropogenic CO2 has any significant effect on global temperatures.
The climate model forecasts, on which the entire Catastrophic Anthropogenic Global Warming meme rests, are structured with no regard to the natural 60+/- year and, more importantly, 1,000 year periodicities that are so obvious in the temperature record. The modelers’ approach is simply a scientific disaster and lacks even average common sense. It is exactly like taking the temperature trend from, say, February to July and projecting it ahead linearly for 20 years beyond an inversion point. The models are generally back-tuned for less than 150 years when the relevant time scale is millennial. The radiative forcings shown in Fig. 1 reflect the past assumptions. The IPCC future temperature projections depend in addition on the Representative Concentration Pathways (RCPs) chosen for analysis. The RCPs depend on highly speculative scenarios, principally population and energy source and price forecasts, dreamt up by sundry sources. The cost/benefit analysis of actions taken to limit CO2 levels depends on the discount rate used and allowances made, if any, for the future positive economic effects of CO2 production on agriculture and of fossil fuel based energy production. The structural uncertainties inherent in this phase of the temperature projections are clearly so large, especially when added to the uncertainties of the science already discussed, that the outcomes provide no basis for action or even rational discussion by government policymakers. The IPCC range of ECS estimates reflects merely the predilections of the modelers – a classic case of “Weapons of Math Destruction” (6).

Reply to  Dr Norman Page
May 17, 2018 8:22 am

“The structural uncertainties inherent in this phase of the temperature projections are clearly so large, especially when added to the uncertainties of the science already discussed, that the outcomes provide no basis for action or even rational discussion by government policymakers. The IPCC range of ECS estimates reflects merely the predilections of the modelers – a classic case of “Weapons of Math Destruction” (6).”
Wow, well written and pretty damning. If you have to use fudge factors to even go back 150 years, then it is obvious you are not including relevant calculations in your model. Any such model is worthless.

Olen
May 17, 2018 7:09 am

Fudge factor
A fudge factor is an ad hoc quantity introduced into a calculation, formula or model in order to make it fit observations or expectations. Examples include Einstein’s Cosmological Constant, dark energy, dark matter and inflation.
And that’s why it is theory. Or is it? Stay tuned.
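A toy illustration of the definition above, with all numbers invented: a “model” with one ad hoc knob that is tuned purely so the hindcast matches the pretend observations.

```python
import numpy as np

# A toy "model": known physics (0.6 * forcing) plus one ad hoc knob, `fudge`
def model(forcing, fudge):
    return 0.6 * forcing + fudge * np.sqrt(forcing)  # the sqrt term is the ad hoc part

forcing = np.linspace(0.1, 3.0, 50)
observed = 0.6 * forcing + 0.25 * np.sqrt(forcing)  # pretend "observations"

# Tune the knob purely to minimise hindcast error: the resulting fudge factor
# is whatever value makes the model fit, not anything derived from physics
candidates = np.linspace(-1.0, 1.0, 2001)
errors = [np.mean((model(forcing, f) - observed) ** 2) for f in candidates]
best_fudge = candidates[int(np.argmin(errors))]
```

Note that a perfect hindcast fit says nothing about whether the sqrt term is the right physics, which is the whole objection to tuning.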

Allan MacRae
Reply to  Olen
May 17, 2018 8:18 am

To be clear, Catastrophic Man-made Global Warming is not a Theory, it is a Hypothesis, and it is a FAILED Hypothesis because it has been repeatedly disproved.
The argument is about one parameter – the sensitivity of climate to increasing CO2. Let’s call that TCS.
There is ample Earth-scale evidence that TCS is less than or equal to about 1C/(2xCO2).
There is no credible evidence that TCS is 3C to 10C/(doubling of CO2), as warmists have falsely claimed now and/or in the past.
At TCS less than or equal to 1C/(2xCO2), THE GLOBAL WARMING CRISIS DOES NOT EXIST, except in the fevered minds of warmist fanatics.
Examples of evidence of low TCS:
1. Prehistoric CO2 concentrations in the atmosphere were many times today’s levels, and there was no runaway or catastrophic warming.
2. Fossil fuel combustion and atmospheric CO2 strongly accelerated after about 1940, but Earth cooled significantly from ~1940 to ~1977.
3. Atmospheric CO2 strongly increased after ~1940, but the warming rates that occurred pre-1940 and post-1977 were about equal.
4. Even if you attribute ALL the warming that occurred in the modern era to increasing atmospheric CO2, you only calculate a TCS of about 1C/(2xCO2). [Christy and McNider (1994 & 2017), Lewis and Curry (2018).]
There are many more lines of argument that TCS is low – these are just a few.
Scoundrels and imbeciles continue to advocate that increasing atmospheric CO2 is causing dangerous global warming, wilder weather, etc. These warmist claims are not only unproven, they are false. Increasing atmospheric CO2 is beneficial to humanity and the environment – plant and crop growth is enhanced, and any resulting warming will be mild and beneficial.
Regards, Allan

Reply to  Olen
May 17, 2018 10:10 am

A fudge factor is an ad hoax quantity, more than it is an “ad hoc” quantity.

May 17, 2018 9:58 am

This Hebert &amp; Lovejoy paper is sensible and well argued. It shows that the incorrect normalized pattern of historical warming in global climate models looks almost the same as the future warming under RCP scenarios. It quantifies the historical and future warming pattern correlation in 32 CMIP5 models, at 0.95 to 0.98 depending on scenario. The CMIP5 ensemble-mean warming pattern does not evolve over the 21st century, even for RCP2.6, remaining virtually identical between 2000-2020 and each successive 20-year period up to 2080-2100. They conclude that this warming pattern is unrealistic and that projected warming is likely to be about 15% too high in the global mean, based on what happened over the historical period in models and in reality.
The study’s finding that the CMIP5 ensemble-mean global warming response in historical simulations is only 15% above that observed reflects its use of the same RCP forcing data for both models (for which it is arguably a reasonable approximation, on average) and observations (for which it is much too low). The 15% excess warming in models becomes a 40% excess when using extended AR5 forcings with revised, updated greenhouse gas forcing formulae and post-1990 ozone &amp; aerosol forcing changes, as in Lewis &amp; Curry 2018.
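The pattern-correlation statistic described above is straightforward to compute. Here is a sketch on invented 5° warming maps (not actual CMIP5 output), using an area-weighted (cos-latitude) centred correlation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented 5-degree-by-5-degree warming maps (not actual CMIP5 output)
lats = np.arange(-87.5, 90.0, 5.0)   # 36 latitude bands
lons = np.arange(2.5, 360.0, 5.0)    # 72 longitude bands
lat2d = np.broadcast_to(lats[:, None], (lats.size, lons.size))

hist_pattern = 1.0 + 0.8 * np.cos(np.deg2rad(lat2d)) + 0.1 * rng.standard_normal(lat2d.shape)
future_pattern = hist_pattern + 0.05 * rng.standard_normal(lat2d.shape)  # pattern barely evolves

def pattern_corr(a, b, w):
    """Area-weighted centred pattern correlation of two gridded fields."""
    wa, wb = np.average(a, weights=w), np.average(b, weights=w)
    cov = np.average((a - wa) * (b - wb), weights=w)
    var_a = np.average((a - wa) ** 2, weights=w)
    var_b = np.average((b - wb) ** 2, weights=w)
    return cov / np.sqrt(var_a * var_b)

w = np.cos(np.deg2rad(lat2d))  # grid-cell area ~ cos(latitude)
r = pattern_corr(hist_pattern, future_pattern, w)
```

A correlation near 1 between the historical and future fields is exactly the “pattern does not evolve” behaviour the paper reports.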

Reply to  niclewis
May 17, 2018 2:25 pm

In my opinion your forecasts are still much too high because you continue to ignore the millennial peak and project straight ahead beyond the millennial inflection point at about 2004.
See fig 12 from Energy and Environment paper at http://journals.sagepub.com/doi/full/10.1177/0958305X16686488
and an earlier accessible blog version at
http://climatesense-norpag.blogspot.com/2017/02/the-coming-cooling-usefully-accurate_17.html
Fig. 12. Comparative Temperature Forecasts to 2100.
Fig. 12 compares the IPCC forecast with the Akasofu (31) forecast (red harmonic) and with the simple and most reasonable working hypothesis of this paper (green line) that the “Golden Spike” temperature peak at about 2003 is the most recent peak in the millennial cycle. Akasofu forecasts a further temperature increase to 2100 of 0.5°C ± 0.2°C, rather than the 4.0°C ± 2.0°C predicted by the IPCC, but this interpretation ignores the millennial inflexion point at 2004. Fig. 12 shows that the well documented 60-year temperature cycle coincidentally also peaks at about 2003. Looking at the shorter 60+/- year wavelength modulation of the millennial trend, the most straightforward hypothesis is that the cooling trends from 2003 forward will simply be a mirror image of the recent rising trends. This is illustrated by the green curve in Fig. 12, which shows cooling until 2038, slight warming to 2073 and then cooling to the end of the century, by which time almost all of the 20th century warming will have been reversed.
As for TCS etc See also 6:31 am comment above
Best Regards Norman Page

Jim Whelan
May 18, 2018 4:38 pm

Not a surprise. I became a skeptic in the early days of the global warming claims when I studiously began looking into the claims. Back then most universities had open websites with easy access to almost all research information. I quickly encountered one of the main proponents making a statement similar to: “Some have questioned my removing contradictory data from my results. I claim this is justified because the theory is so obviously true that contrary results must be incorrect.” Clearly this individual wasn’t doing science. Ever since I have closely examined claims and found that very few of the warming alarmists are doing actual science.