Climate Modellers Waiting for Observations to Catch Up with Their Predictions

Guest essay by Eric Worrall

h/t Dr. Willie Soon; In climate science, when your model predictions are wrong, you wait for the world to correct itself.

New climate models predict a warming surge
By Paul Voosen, Apr. 16, 2019, 3:55 PM

For nearly 40 years, the massive computer models used to simulate global climate have delivered a fairly consistent picture of how fast human carbon emissions might warm the world. But a host of global climate models developed for the United Nations’s next major assessment of global warming, due in 2021, are now showing a puzzling but undeniable trend. They are running hotter than they have in the past. Soon the world could be, too.

In earlier models, doubling atmospheric carbon dioxide (CO2) over preindustrial levels led models to predict somewhere between 2°C and 4.5°C of warming once the planet came into balance. But in at least eight of the next-generation models, produced by leading centers in the United States, the United Kingdom, Canada, and France, that “equilibrium climate sensitivity” has come in at 5°C or warmer. Modelers are struggling to identify which of their refinements explain this heightened sensitivity before the next assessment from the United Nations’s Intergovernmental Panel on Climate Change (IPCC). But the trend “is definitely real. There’s no question,” says Reto Knutti, a climate scientist at ETH Zurich in Switzerland. “Is that realistic or not? At this point, we don’t know.”

Many scientists are skeptical, pointing out that past climate changes recorded in ice cores and elsewhere don’t support the high climate sensitivity —nor does the pace of modern warming. The results so far are “not sufficient to convince me,” says Kate Marvel, a climate scientist at NASA’s Goddard Institute for Space Studies in New York City. In the effort to account for atmospheric components that are too small to directly simulate, like clouds, the new models could easily have strayed from reality, she says. “That’s always going to be a bumpy road.”

In assessing how fast climate may change, the next IPCC report probably won’t lean as heavily on models as past reports did, says Thorsten Mauritsen, a climate scientist at Stockholm University and an IPCC author. It will look to other evidence as well, in particular a large study in preparation that will use ancient climates and observations of recent climate change to constrain sensitivity. IPCC is also not likely to give projections from all the models equal weight, Fyfe adds, instead weighing results by each model’s credibility.

Read more: https://www.sciencemag.org/news/2019/04/new-climate-models-predict-warming-surge

It’s nice to learn that the IPCC is considering using observations to constrain model projections.

Latitude
April 20, 2019 6:17 am

why are these people still in business?…

…even as screwed up as medical is…if 99% of your patients are dying, stop doing it

David Guy-Johnson
Reply to  Latitude
April 20, 2019 6:38 am

They’re saying that now that they don’t understand how their own models work? That’s priceless.

Latitude
Reply to  David Guy-Johnson
April 20, 2019 6:55 am

The biggest elephant in the room…is their stupid adjustments

They adjusted past temperatures down…to show a faster rate of warming – agenda
..when they plug that fake data into the models…the models show the same fake rate of warming

…the models will never be right

Richard Greene
Reply to  Latitude
April 20, 2019 7:36 am

Latitude:
The models will never be right until the precise causes of climate change are known, and they still may not be right if the causes are random, and not predictable, rather than cyclical.

Concerning the “adjustments” — they are important, but the infilling is more important — over half the grid cells in any given month have no temperature data, or are missing data — government bureaucrats get to wild guess the numbers.

These are the same bureaucrats hired because they believe in a coming global warming crisis, and they get lifetime job security if they keep predicting a coming global warming crisis.

There’s a conflict of interest — they own the climate models, make the warming predictions, and also own the historical temperature actuals … which they repeatedly change whenever they want.

They have a strong financial incentive to keep climate change fear alive — would you trust them to do honest, unbiased “infilling” ?

Bigger than the biggest elephant in the room, is the dinosaur in the house, that I’ll define later.

The average computer game climate “model”, excluding the Russian model that obviously colludes with Trump!, predicts future global warming at a rate QUADRUPLE the actual global warming rate since 1940, when the “era of man made CO2 began”, through 2018 — which was a total of about +0.6 degrees C. of intermittent global warming over 78 years, or +0.077 degrees C. per decade.

That would be +0.77 degrees C. per century.
That’s reality.
It actually happened.
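
A quick check of that arithmetic, as a minimal sketch (the +0.6 °C and 1940-2018 figures are the commenter’s numbers, not an official dataset):

```python
# Back-of-envelope check of the warming-rate arithmetic above.
# The inputs are the commenter's figures, not an official dataset.
total_warming_c = 0.6      # claimed warming in degrees C, 1940-2018
years = 2018 - 1940        # 78 years

per_decade = total_warming_c / years * 10
per_century = total_warming_c / years * 100

print(f"{per_decade:.3f} C per decade")    # ~0.077
print(f"{per_century:.2f} C per century")  # ~0.77
```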

But never mind reality.

Climate alarmists are not interested in the PAST, which was measured, maybe not that accurately, and can be studied — they only care about the FUTURE, which they claim to know with great precision, in spite of over 60 years of wrong predictions of the future climate!

The average computer game predicts about +3 degrees per century … which only happens in the climate alarmist fairy tale world — not in real life !

If you compare the average climate model with the UAH weather satellite data since 1979, the models predict TRIPLE the actual warming … but there’s no logical reason to start the comparison with 1979, ignoring CO2 emissions from 1940 through 1979.

The “dinosaur in the house” is keeping the above facts away from the general public, continuing to use the same, obviously wrong CO2 / warming formula for predictions since the 1970s, and NOT CARING that EVERY prediction about global warming since the late 1950s has been WRONG … because if the general public doesn’t know that, then to them it never happened !

My climate science blog:
http://www.elOnionBloggle.Blogspot.com

Latitude
Reply to  Richard Greene
April 20, 2019 7:47 am

“the models predict TRIPLE the actual warming …”

…adjustments to past temps triple the rate of warming too

Latitude
Reply to  Richard Greene
April 20, 2019 7:52 am

..and even more sinister

What they are claiming as actual measured temps…..is in reality that adjusted mess
…which means the models are even more wrong
and so is what they are claiming as the rate of warming

mario lento
Reply to  Richard Greene
April 20, 2019 11:03 am

The average model includes projections of temperature which use CO2 far below current levels… and these false models show less warming, but for the wrong reason. They are equally as bad as the models that show more warming.

Teddz
Reply to  Richard Greene
April 20, 2019 11:16 am

But it’s so easy to make assumptions, or reach conclusions based on data which is “hard” that turns out to be anything but.

The Rolls-Royce Trent engine that went wonky on the Qantas airplane a couple of years ago is an example of that. Despite quality assurance being in place, the oil delivery system to that engine failed because quality assurance failed. A precisely positioned hole – but one having a tolerance, and at the full extent of that tolerance – was then used as the position point for another hole, which was also at the full extent of its tolerance. The result was a pipe failure.

Climate data needs quality assurance and there needs to be an international standard for it.

Mark Luhman
Reply to  Richard Greene
April 22, 2019 12:19 pm

Latituder said: “The models will never be right until the precise causes of climate change are known, and they still may not be right if the causes are random, and not predictable, rather than cyclical.”

It’s even worse than that. Even if we knew what all the variables are, to model them at a fine enough resolution to be accurate, by the time the computer run ended the event would be millions of years in the past. That is true today and will be true tomorrow, even if we were to create a computer capable of doing calculations a billion times faster than today’s.

Climate is far too complex to model, period. Climate models are only useful in what-if simulations, and even then one cannot count on the answer. In reality they are very expensive computer games, with about the same use as someone playing Final Fantasy. Climate simulations are just that, fantasy; for the most part a giant waste of time and money.

DaveR
Reply to  Latitude
April 21, 2019 5:08 am

Latitude, that’s exactly what the Australian Bureau of Meteorology has just been caught out on.

Back in 2012 the BOM produced the ACORN-Sat 1 national temperature series which, you guessed it, cooled past temperature measurements and steepened the graph and increased the “rate of warming” (while conveniently leaving out older temperatures which were warmer than today).

Trouble was, average temperatures since 2012 didn’t play ball, and refused to follow the previously increased warming slope, tracking lower.

So in February 2019 they had to adjust the national temperatures again – and produce the ACORN-Sat 2 series – and you guessed it again, lowered historical temperatures a second time to hide the kink at 2012 in the 100 year temperature graph.

This time it was done in secret and the public didn’t even know it was being revised. But I guess that is what you do when it all goes AU (=wrong).

Unfortunately for the BOM, the magnitude of the adjustments, even for the original ACORN-Sat 1 series, is larger than the AGW signal they supposedly “clearly show”.

Ideology driving science. And there will have to be a third adjustment in about….2026.

Rich Davis
Reply to  David Guy-Johnson
April 20, 2019 6:59 am

But “the trend is definitely real”, says the nutburger.

Now the models are reality I guess. You can’t make this shtuff up.

Latitude
Reply to  Rich Davis
April 20, 2019 7:29 am

oddly enough…..when you eyeball the unadjusted temp data…and then eyeball where the models would be using that
…of course the models are more accurate

That says their “adjustments” are fake

Michael Jankowski
Reply to  Rich Davis
April 20, 2019 10:32 am

I think he’s saying there is a “trend” in models using a higher equilibrium sensitivity than they used to. He even says he’s not sure if it is realistic.

Reply to  David Guy-Johnson
April 20, 2019 8:46 am

From the quoted article: “… the trend “is definitely real. There’s no question,” … “Is that realistic or not? At this point, we don’t know.”

Let me help – something can’t be “definitely real” and not realistic at the same time. Amazingly, some of the climate scientists are finally realizing this and, ever so gently, questioning what might be wrong in their models.

They may suffer excommunication, or this may be the beginning of a serious change in the CAGW / CCC narrative. Let’s hope for both?

Reply to  David Guy-Johnson
April 20, 2019 9:19 am

Excellent summation, David Guy-Johnson!

Bob Weber
April 20, 2019 6:16 am

Kate Marvel is Gavin’s understudy. I sat next to them during the recent AGU fall meeting and then listened to her speak. She’s a very attractive ‘face’ for CO2 pseudoscience. Too bad for her she’s as wrong as Gavin.

The models run hot because they’re based on science fiction. Don’t worry about those observations though, ’cause given enough time, good old Gavin will adjust the temperature history so the models correspond.

Reply to  Bob Weber
April 20, 2019 9:14 am

“Climate computer models cited by the IPCC and other climate activists employ much higher assumed sensitivity values that create false alarm. The ability to predict is perhaps the most objective measure of scientific competence. All the scary predictions by climate activists of dangerous global warming and wilder weather have proven false-to-date – a perfectly negative predictive track record.”

Source:
HYPOTHESIS: RADICAL GREENS ARE THE GREAT KILLERS OF OUR AGE
https://wattsupwiththat.com/2019/04/14/hypothesis-radical-greens-are-the-great-killers-of-our-age/

[excerpt]

3. There is NO credible scientific evidence that climate is highly sensitive to increasing atmospheric CO2, and ample evidence to the contrary. Catastrophic humanmade global warming is a false crisis.

Competent scientists have known this fact for decades. In a written debate in 2002 sponsored by APEGA and co-authored on our side by Dr. Sallie Baliunas, Dr. Tim Patterson and me, we concluded:
http://www.friendsofscience.org/assets/documents/KyotoAPEGA2002REV1.pdf

“Climate science does not support the theory of catastrophic human-made global warming – the alleged warming crisis does not exist.”

“The ultimate agenda of pro-Kyoto advocates is to eliminate fossil fuels, but this would result in a catastrophic shortfall in global energy supply – the wasteful, inefficient energy solutions proposed by Kyoto advocates simply cannot replace fossil fuels.”

Many scientific observations demonstrate that both these statements are correct-to-date.

The current usage of the term “climate change” is vague and the definition is routinely changed in the literature, such that it has become a non-falsifiable hypothesis. It is therefore non-scientific nonsense.

“A theory that is not refutable by any conceivable event is non-scientific.” – Karl Popper

Climate has always changed. Current climate is not unusual and is beneficial to humanity and the environment. Earth is in a ~10,000 year warm period during a ~100,000 year cycle of global ice ages.

The term “catastrophic human-made global warming” is a falsifiable hypothesis, and it was falsified decades ago – when fossil fuel combustion and atmospheric CO2 increased sharply after ~1940, while global temperature cooled from ~1945 to ~1977. Also, there is no credible evidence that weather is becoming more chaotic – both hurricanes and tornadoes are at multi-decade low levels of activity.
https://www.thegwpf.org/content/uploads/2013/11/Khandekar-Extreme-Weather.pdf

Even if all the observed global warming is ascribed to increasing atmospheric CO2, the calculated maximum climate sensitivity to a hypothetical doubling of atmospheric CO2 is only about 1 degree C, which is not enough to produce dangerous global warming.
https://wattsupwiththat.files.wordpress.com/2017/11/2017_christy_mcnider-1.pdf
https://journals.ametsoc.org/doi/10.1175/JCLI-D-17-0667.1

Climate computer models cited by the IPCC and other climate activists employ much higher assumed sensitivity values that create false alarm. The ability to predict is perhaps the most objective measure of scientific competence. All the scary predictions by climate activists of dangerous global warming and wilder weather have proven false-to-date – a perfectly negative predictive track record.

Based on current knowledge, the only significant impact of increasing atmospheric CO2 is greatly increased plant and crop yields, and possibly some minor beneficial warming of climate.
__________________________________
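
A related back-of-envelope number often cited in this range is the no-feedback response to a CO2 doubling; here is a minimal sketch using textbook values (roughly 3.7 W/m² of forcing per doubling and a Planck response near 3.2 W/m² per K), not numbers taken from the excerpt or its references:

```python
# No-feedback warming for a doubling of CO2, using commonly cited textbook
# values; illustrative only, not drawn from the excerpt above.
import math

delta_f = 5.35 * math.log(2)   # W/m^2, canonical forcing for a CO2 doubling (~3.7)
planck_response = 3.2          # W/m^2 per K, approximate no-feedback (Planck) response

delta_t = delta_f / planck_response
print(f"no-feedback warming per doubling: {delta_t:.2f} C")  # ~1.2 C
```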

Geo
Reply to  Bob Weber
April 22, 2019 2:20 pm

They run hot because they have grossly overestimated climate sensitivity. Set sensitivity at zero and the models churn out results almost exactly in line with observations.

In fact, we are past the point for excuses. Use the current warming results to calibrate the model to correctly predict outcomes. I mean, it is no longer theoretical what the response of the climate will be to 400 ppm carbon dioxide. We know the answer, and it doesn’t match the model.

mario lento
Reply to  Geo
April 22, 2019 2:45 pm

Serious on-topic question: has anyone actually set “the ECS parameter” (assuming there is one) to nominal and then run the models to hindcast? If one sets ECS to a feedback of somewhere between null and 1C… could the models actually recreate past trends?

Or do the models output ECS instead of use it as a parameter?

Komrade Kuma
Reply to  Geo
April 25, 2019 10:48 am

Just a thought, but given that we live on a life-bearing planet which has behaved in a relatively stable way for thousands of years, with ice ages the big events; that occupies a ‘sweet spot’ in a solar system; spins on its own axis; orbits the system star; has its own moon orbiting it; and is affected by all the other solar-orbital influences that manifest from time to time, surely the default position would be that sensitivity to CO2 is pretty minimal, and zero would be a good starting assumption. This is especially so given that CO2 drives the biosphere’s evapotranspiration mechanism, which is pretty energy intense and works in the opposite direction to the alleged CO2 ‘greenhouse’ effect, as does all other surface evaporation.

The fact that the model meshes are too coarse to model cloud system mechanics means that they must simply ignore the evaporative effect except via a ‘fudge factor’.

Sorry but that is NOT a model. Mathematical automatons they may be but to call them ‘models’ is fraudulent.

Crispin in Waterloo
April 20, 2019 6:22 am

This is the thin edge of a very fat wedge: examining the models and comparing their predictions with reality – otherwise known as validation.

We already know where that goes at the moment: nowhere. The models do not predict future climate states and never have, so there is an incentive in some quarters not to check their skill.

The basic problem is having a desire to create a high sensitivity to CO2 while still forcing the output to replicate past measurements. This requires assigning a large cooling capacity to things like SO2 and volcanic or industrially sourced dust. It doesn’t work well even for the near future, even when a passable representation of the past is produced.

What should be happening is the measurements and proxies of contributing influences should be entered into a database. The history as measured should then be processed to yield not a temperature curve (because that is an input) but a sensitivity to CO2 using a “goal seek” function.

If everything else is known, and the CO2 sensitivity ECS is the unknown variable, that should be the output. We already know it will be somewhere in the range 0.3-1.3 degrees C. Let the optimization algorithm run for a few days and see what pops out, then run an out-of-sample prediction and see how it does. Validate the thing before spending trillions and deleting democracy, basically out of laziness.
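
As a cartoon of the “goal seek” idea described above, here is a minimal sketch that assumes a zero-dimensional equilibrium relation (ΔT = ECS × ΔF / F_2x) and uses scipy’s bounded scalar minimizer; the forcing and temperature series are invented placeholders, not real data:

```python
# Toy "goal seek" for ECS: given a forcing history and a temperature history,
# find the sensitivity that best reproduces the temperatures.
# Both series below are made-up placeholders, not real observations.
import numpy as np
from scipy.optimize import minimize_scalar

F_2X = 3.7  # W/m^2, canonical forcing for a doubling of CO2

forcing = np.linspace(0.0, 2.5, 80)   # hypothetical forcing path, W/m^2
observed_dt = 0.8 / F_2X * forcing + np.random.normal(0.0, 0.05, forcing.size)

def misfit(ecs):
    modelled_dt = ecs / F_2X * forcing            # equilibrium response, no ocean lag
    return float(np.sum((modelled_dt - observed_dt) ** 2))

best = minimize_scalar(misfit, bounds=(0.1, 6.0), method="bounded")
print(f"best-fit ECS: {best.x:.2f} C per doubling of CO2")
```

In a real exercise the forcing history, ocean heat uptake and internal variability all have to be specified first, which is exactly where the argument starts.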

Reply to  Crispin in Waterloo
April 20, 2019 10:37 am

If everything else is known, and CO2 sensitivity ECS is the unknown variable

But how can everything else be known?
The climate is the textbook case of a chaotic system. A butterfly wing can start a hurricane. It can leap from one supposedly stable state to another with no discernible forcing.

It cannot be modelled without a means of excluding certain parameters for simplicity. And which parameters those are cannot be known.

Yirgach
Reply to  Crispin in Waterloo
April 20, 2019 6:42 pm

Perhaps everything WILL be known when we understand fluid turbulence.
Well maybe that will be a start, but for now it’s the big unknown.
Feynman thought it was the number one problem.
Can’t model that, then you ain’t got squat.

Phoenix44
Reply to  Crispin in Waterloo
April 21, 2019 1:24 am

If we knew the ECS then you could run long term models on Excel with three lines – delta CO2, ECS, resulting increase in temperature.

Then put the change in temperature into your big global weather forecast models to find out what that causes in terms of drought, rain, hurricanes etc.

But instead they are trying to find the one thing we actually want to know – ECS – by throwing everything in together and stirring it around. Then if the result looks like reality they try to extract what they have done on ECS. It’s bonkers.
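
Phoenix44’s “three lines” really can be about that short; a minimal sketch assuming the standard logarithmic forcing relation and an illustrative ECS value:

```python
import math

ecs = 3.0                          # assumed sensitivity, C per doubling of CO2 (illustrative)
co2_start, co2_end = 280.0, 560.0  # ppm, preindustrial to a full doubling
delta_t = ecs * math.log(co2_end / co2_start) / math.log(2)

print(f"equilibrium warming: {delta_t:.2f} C")  # 3.00 C for a full doubling
```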

WXcycles
April 20, 2019 6:30 am

” … instead weighing results by each model’s credibility. …”

No cred at all, and not likely to gain any either, any time this century, or next.

observa
Reply to  WXcycles
April 20, 2019 8:58 am

Well you have to find the best fit curve don’t you? Otherwise there’ll be lots of bad fitting ones muddying the waters and impacting the correct policy prescriptions.

Hugs
Reply to  WXcycles
April 20, 2019 9:03 am

Quickly! Develop 1e20 models with low sensitivity. 🙂

Greg Cavanagh
Reply to  WXcycles
April 20, 2019 2:56 pm

Weighting the “most credible” is exactly what Mann did with the tree rings. And they’ll get exactly the same result.

A question to be asked from an experiment is: “Is there any possibility of a surprising result?”

PaulH
April 20, 2019 6:35 am

I think the IPCC is banking on the Gambler’s Fallacy: “We’ve been wrong so long, we know we’ll get it right someday!”

Chaamjamal
April 20, 2019 6:39 am

Climate sensitivity is the Achilles Heel of climate science.

https://tambonthongchai.com/2019/01/22/empirical-sensitivity/

DocSiders
April 20, 2019 6:41 am

I watched a YouTube climate debate the other day, and in that debate Lindzen pointed out the lack of predictive skill of the Models and at the same time also pointed out the unpredicted Pause. The scoffing response (from Richard Somerville, I believe) to Lindzen’s reply was to the effect that…”the oceans are a giant heat sink (well, no kidding) so there will naturally be a time-lagged response and temperatures will rise even faster than the model slopes indicate when temperatures do finally catch up.”

Lindzen did not get a chance to respond to that response. But of course the obvious answer could have been…”then why did none of the $100 Million Climate Models predict and track such an obvious lag?” And yes, why didn’t they? Because all of the Models have proven to have lousy predictive skill…at the level of Model falsification to my mind.

It is not unlikely that we’ll see a cooling trend before we see their •NEW• “slingshot” temperature rise. I wonder how the Alarmists will explain that away if it happens.
They will most likely blame the cooling on CO2 induced Climate Instability….and please God let the cooling trend occur soon.

The general public would be lost to the Alarmists for good then. Polling shows that the majority of the general public already doesn’t believe the Models work.

Serge Wright
Reply to  DocSiders
April 20, 2019 3:25 pm

If the oceans are a giant heat sink, storing the heat at depth, then they will act to constrain climate sensitivity.

HotScot
Reply to  DocSiders
April 20, 2019 3:35 pm

DocSiders

…and please God let the cooling trend occur soon.

No thanks, I’d rather it didn’t. The irony being that we sceptics must, perversely, hope for the very event we hope won’t happen to happen: that the world cools, simply to prove our point.

The fact is, there’s an easier and more productive solution. When, and if, the planet ever does reach 1.5C warmer and nothing catastrophic happens (in 12 years time of course) where the hell do the alarmists have left to go? The scam will have been stripped bare.

DocSiders
Reply to  HotScot
April 20, 2019 4:44 pm

The lying propaganda press already has people believing that severe weather events are increasing at a terrible rate and at far greater severity. The facts say just the opposite. The propaganda is successful so far (though it still hasn’t reached the political tipping point where people feel any urgent need to act). Meanwhile we’ll continue to waste around $200 Billion annually….and growing. And unelected government bureaucrats will use regulatory back door strategies to keep squeezing some more of the life out of the economy.

So no matter how benign the climate is as the 1.5 degree mark is crossed, the media bent perception will be that the climate did unprecedented damage.

However, it will be politically impossible to explain away a 0.3 degree GAT decline when your predictions were for a 100% CERTAIN 0.7 degree GAT increase predicted by the Super Accurate Newest CLIMATE MODELS. Even the most clueless casual observer will be hard to convince that ANY cooling fits in with the “settled science” that predicted lots of RAPID warming (the next IPCC report is expected to predict about twice the rate of warming…after a few more years of little to no change…to catch up to earlier predictions).

I sure don’t want any actual cooling….but I fear impoverished slavery and lasting damage to Constitutional freedoms a whole lot more.

April 20, 2019 6:41 am

instead weighing results by each model’s credibility.

And since you already know the answer ===> Wouldn’t one model be enough then?

MSimon
April 20, 2019 6:44 am

What did I say that got me sent to the bit bucket?

John F. Hultquist
Reply to  MSimon
April 20, 2019 9:14 am

Dear MS,
Chill. Once you click on ‘Post Comment’ things happen that no one understands or controls.
Sometimes one needs to turn the computer or phone off and take a hike. Not the figurative “take a hike” but actually go out and walk a bit, take a camera, take some photos.
This will be good for you, and it will give the ‘whatever’ in the system time to digest and post your enlightened comment.

Reply to  John F. Hultquist
April 20, 2019 9:23 am

Great advice, John F.!

Pop Piasa
Reply to  John F. Hultquist
April 20, 2019 1:44 pm

It makes you wonder if many comments are being diverted and delayed by snoopware that’s not been found yet on wordpress.

Joel O’Bryan
April 20, 2019 6:51 am

Problem for them is the higher the ECS the longer the time it takes to get there.

No worry, the Adjustocene no doubt will take care of that “little” problem. Like the BoM’s Darwin adjustment, which now lays the basis for NOAA to make another adjustment of their own to GHCN for that station, just in time for AR6.

Trump should sign an executive order renaming NOAA as the Bureau of ADjustments (BAD). And an order to rename NASA/GISS as the Center for Adjusted Climate Anomalies (CACA).
If he can’t directly force them to change their deceptive ways (or fire them since they are GS civil service), he can at least shame them.

Rich Davis
Reply to  Joel O’Bryan
April 20, 2019 7:28 am

But they have no shame Joel. They are doing Gaia’s work. The ends justify the means.

Rich Davis
Reply to  Joel O’Bryan
April 20, 2019 7:52 am

So I don’t think it’s a problem for them at all. That’s a feature, not a bug. If it is going to take three lifetimes to reach the climate catastrophe, with serious harm already baked in, but just not visible to us yet, how can we be so callous as to suggest we wait and see? What about the grandchildren?

If it’s going to take 200 years to reach the equilibrium, it isn’t going to be possible to falsify the theory before socialism is firmly in place everywhere.

If they can sell the idea that we are slowly charging up the vast capacitors of the oceans and that they will eventually discharge vast amounts of heat that we are just not seeing yet, then they can continue this farce for generations before it is definitively disproven. By then we’ll all be regimented in our socialist workers’ paradise and not permitted to notice. The glorious five-year plan has once again succeeded in keeping the global temperature in check.

Patrick healy
Reply to  Joel O’Bryan
April 20, 2019 11:37 am

Mr O’Bryan,
In certain parts of Britain Kaka is a colloquialism for feces.
It is usually used by very polite mammas so as not to teach their offspring naughty words.
So excellent idea – never mind the spelling, the word sound is spot on!

Steve Keohane
April 20, 2019 6:52 am

“… ‘equilibrium climate sensitivity’ has come in at 5°C or warmer. Modelers are struggling to identify which of their refinements explain this heightened sensitivity …”
They don’t check their programs’ results as they program? I guess another way to express it is, they don’t know what they are doing.

Duncan Smith
Reply to  Steve Keohane
April 20, 2019 7:13 am

Possibly they know exactly what they are doing: this higher (too high) sensitivity makes for great sound bites in the MSM to scare the public, without carrying on to explain the ‘uncertainties’, while the scientists/modelers exonerate themselves on page 97 of a report with a one-paragraph disclaimer. Otherwise, why broadcast it if the results make even climate scientists squeamish?

Reply to  Steve Keohane
April 20, 2019 8:08 am

They are also desperately trying to prop up the higher-end estimates of climate sensitivity. The Big Embarrassment is that after 40 years, the climate community is still unable to narrow the range down to something with any meaningful use. Stuck now at ECS = 1.5 K to 4.5 K.

And the recent pressure from observations (Lewis and Curry, and others) to lower the range to 1–3 K means the high-end alarmist numbers are crap. So along comes the “new” model to prop up the high end for their True Believers.

Phoenix44
Reply to  Steve Keohane
April 21, 2019 1:34 am

The trouble is that they are trying to get their models to “produce” ECS when ECS is an assumption they (indirectly) put in. They want ECS to be an emergent property so the models can “prove” ECS, but it is not; it is simply a product of their assumptions. Now ECS is going crazy because they have been tweaking assumptions all over the place to back-cast accurately, but all that shows is that their models are wrong. But of course they cannot admit that. It is a total mess, the consequence of using models for proof rather than as models of possible futures if the assumptions are correct.

Tom Abbott
April 20, 2019 7:00 am

From the article: “In assessing how fast climate may change, the next IPCC report probably won’t lean as heavily on models as past reports did, says Thorsten Mauritsen, a climate scientist at Stockholm University and an IPCC author. It will look to other evidence as well,”

Computer models are not evidence.

icisil
Reply to  Tom Abbott
April 20, 2019 7:24 am

They are evidence of climate scientists’ understanding of climate. Since they don’t perform well, that is the evidence that they don’t understand climate well.

WXcycles
Reply to  icisil
April 20, 2019 9:53 am

yeah but satellites can now see climate-change™

It used to be called weather but all things are made new again … and stuff.

Sun Spot
Reply to  Tom Abbott
April 20, 2019 8:02 am

“Computer models are not evidence.”
. . . +10,000

son of mulder
April 20, 2019 7:02 am

I’d be interested to see updates made to the famed Trenberth diagram of global energy balance, using the models to show the diagrams for 1880, 1997 (the original Trenberth diagram), today (based on actuals and the models) and that predicted when CO2 has doubled to twice 1880 values.

I think such a collection of diagrams would expose many inconsistencies between actual and theory.

TRM
April 20, 2019 7:03 am

We are so far from having anything close to accurate models it isn’t funny. To anyone who doubts that please read Dr Brown’s post

https://slashdot.org/comments.pl?sid=5790561&cid=48073849

It was included in one of his posts here as well.

cerescokid
Reply to  TRM
April 20, 2019 7:33 am

I could read his work all day long. I would love to have any representative from the establishment tell us what he has wrong.

Jeff Alberts
Reply to  cerescokid
April 20, 2019 8:34 am

I’m sure Mosher will get right on that.

Reply to  Jeff Alberts
April 20, 2019 10:13 am

Steven is a bit contradictory in his explanations. In a conversation on my blog, he stated he wasn’t producing anomalies, he was calculating probabilities, trying to predict what temperatures would be in areas where no stations were. Then I visited the BEST page, and the headline said 2018 was the 4th warmest year on record. Hm.

I’ve been crunching the NOAA GHCN-M reports for a couple of months now, creating very basic anomalies, and they show 1 to 1.25 degrees of warming over the last 120 years. The contiguous US shows only about 0.5 to 0.75 degrees of warming.

Even though my calculations are nowhere near as sophisticated as the pros’, they have to mean something, because they’re using only real, observed data.
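
A minimal sketch of that kind of “very basic anomaly” calculation (monthly station temperature minus the same station’s baseline mean for that calendar month); the file name, column names and baseline period here are hypothetical, not the actual GHCN-M layout:

```python
# Basic station anomalies: temperature minus the station/month baseline mean.
# File name, column names and baseline period are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("station_monthly_temps.csv")  # columns: station, year, month, temp_c

baseline = (
    df[(df.year >= 1951) & (df.year <= 1980)]
    .groupby(["station", "month"])["temp_c"]
    .mean()
    .rename("baseline_c")
)

df = df.join(baseline, on=["station", "month"])
df["anomaly_c"] = df["temp_c"] - df["baseline_c"]

# Crude global series: average all station anomalies within each year (no gridding).
annual = df.groupby("year")["anomaly_c"].mean()
print(annual.tail())
```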

Bob Weber
Reply to  Pat
April 20, 2019 8:05 am

…If the results are to be believed, the world has even less time than was thought to limit warming to 1.5°C or 2°C above preindustrial levels—a threshold many see as too dangerous to cross. With atmospheric CO2 already at 408 parts per million (ppm) and rising, up from preindustrial levels of 280 ppm, even previous scenarios suggested the world could warm 2°C within the next few decades.

They’re doubling down on their alarmism. The way it was said gives too much latitude for interpretation. Did they mean there would be an average 2°C change or was that a peak change? ‘The next few decades’ could mean 2, 3, or 4 decades – which is it? It’s too vague.

As the CMIP model projection(s) are based on projected higher human CO2 emission scenarios, and since the modelers truly believe CO2 is causal, it’s no surprise they ‘believe’ temps will increase faster from faster emissions output ‘over the next few decades’. Climate science is rigged on faulty ideas.

I have a different view of what causes weather and climate to change; ie, TSI changes via solar activity.

My view of the next decade is solely dependent on projected solar activity for solar cycle 25, which indicates a ‘warming surge’ from the top of the solar cycle like we had in SC24 with the large El Nino.

Personally I don’t put any credence into any ‘next gen climate models’ that are based on CO2 warming.

I conclude there is no possible way that a temperature increase of the size they promote can even happen, even under a 100% solar forcing regime. Therefore, I know these highly educated and well-paid and pampered people continue to be self-deluded.

It’s disgusting that people who are so wrong are getting all the press coverage and grant monies, and treated as though they are infallible.

DocSiders
Reply to  Bob Weber
April 20, 2019 3:46 pm

They are doubling down so they can get the proverbial “5 Year Plan” under way before the climate begins cooling again.

Jiim Rose
April 20, 2019 7:04 am

Please excuse my naivete. Could someone explain this to me? I thought the climate sensitivity was an input to the models. The article talks as if they are calculating it. This begs the question “What are the actual inputs to the models?”

Jeff Alberts
Reply to  Jiim Rose
April 20, 2019 11:22 am

Raises the question, not begs.

Adam Gallon
April 20, 2019 7:07 am

Of course they won’t give them equal weighting. They’ll weight the ones that give 4+ degrees all the weights.

Clyde Spencer
Reply to  Adam Gallon
April 20, 2019 9:12 am

Adam
It seems that there is a similar bias in reporting political events. The Fourth Estate has become a Fifth Column.

Jeff Alberts
Reply to  Adam Gallon
April 20, 2019 11:23 am

“Of course they won’t give them equal weighting. They’ll weight the ones that give 4+ degrees all the weights.”

Just ask M. Mann. He had to weight bristlecone pines 391 times higher than all the other proxies in order to achieve the hockey stick he was looking for.

Editor
April 20, 2019 7:16 am

Also interesting from the Science article is this note saying the next IPCC report is running late, in part due to these modeling problems:

Answers may come from an ongoing exercise called the Coupled Model Intercomparison Project (CMIP), a precursor to each IPCC round. In it, modelers run a standard set of simulations, such as modeling the preindustrial climate and the effect of an abrupt quadrupling of atmospheric CO2 levels, and compare notes. The sixth CMIP is now at least a year late. The first draft of the next IPCC report was due in early April, yet only a handful of teams had uploaded modeling runs of future projections, says Fyfe, an author of the report’s projections chapter. “It’s maddening, because it feels like writing a sci-fi story as the first-order draft.”

The ambitious scope of this CMIP is one reason for the delay. Beyond running the standard five simulations, centers can perform 23 additional modeling experiments, targeting specific science questions, such as cloud feedbacks or short-term prediction. The CMIP teams have also been asked to document their computer code more rigorously than in the past, and to make their models compatible with new evaluation tools, says Veronika Eyring, a climate modeler at the German Aerospace Center in Wessling who is co-leading this CMIP round.

Reply to  Ric Werme
April 20, 2019 9:02 am

““It’s maddening, because it feels like writing a sci-fi story as the first-order draft.”” It’s sci-fi all the way down.

Alan Ranger
April 20, 2019 7:29 am

“instead weighing results by each model’s credibility.”

What credibility?

Pop Piasa
Reply to  Alan Ranger
April 20, 2019 6:54 pm

Jeez, the only halfway credible model was done by the Russians, IIRC.

April 20, 2019 7:40 am

Higher published ECS numbers cause more climate extremism amongst those with a college level education in non-STEM subjects claiming “climate science” capabilities. It’s a big group they are mining for adherents.

M.W.Plia
April 20, 2019 7:54 am

“The results so far are “not sufficient to convince me,” says Kate Marvel, a climate scientist at NASA’s Goddard Institute for Space Studies in New York City.”

This is akin to holding up the cross to a vampire…how does she keep her job?

April 20, 2019 8:00 am

The chart by Dr. Roy Spencer provides a convincing illustration of today’s folly.
There hasn’t been such an “End of the World” mania since Millerism in the 1840s. When the first day that the world would end did not work out, Miller recalculated the numbers and, with even more conviction and support, crafted another day when the world would end.
The mania spread widely across the Eastern States and even to England.
When the worst disaster imaginable did not happen it became known as the “Great Disappointment”.
Fascinating conclusion.
Dr. Roy’s chart is dated 2013.
Is there a more current one?

Major Meteor
April 20, 2019 8:08 am

But, but… I thought the science was settled?

April 20, 2019 8:11 am

From the post: “IPCC is also not likely to give projections from all the models equal weight, Fyfe adds, instead weighing results by each model’s credibility.”
Great news! Because the credibility of any and all of these models approaches zero, the proper weighting gives the most credible composite result to the nearest integer: ECS = 0 deg C.

rah
April 20, 2019 8:22 am

“Climate Modellers Waiting for Observations to Catch Up with Their Predictions”

You sure as hell could have fooled me! I can’t see where they have waited even a nanosecond to pump out the results of their models as if they are reality. Alarmists live in a computer generated climate world as far as I can see. GIGO means nothing to them.

ferd berple
April 20, 2019 8:28 am

” … instead weighing results by each model’s credibility. …”

If the IPCC knows which model is the most credible, there is no need for any models. Simply use the credibility range as your forecast and throw out all the models.

My point is that by assigning credibility you have made a prediction. Which in itself establishes that there is no need for models. Simply use whatever criteria you use to establish “credible” as your forecast.

Thus the IPCC, by using credibility as a selection criterion, is in effect saying models are not necessary. We could just as well use wild guesses and then select the most credible.

Curious George
April 20, 2019 8:32 am

All these IPCC confidence levels should be multiplied by our confidence in the IPCC itself. 95% times zero is … ?

Eben
April 20, 2019 8:35 am

That pic on top is way out of date, somebody is not keeping up

John F. Hultquist
Reply to  Eben
April 20, 2019 9:26 am

You and Bob Hoye, at 8 am, should find the data and produce a new chart.
I’m busy, so it is up to you. Or someone else.

William Astley
April 20, 2019 8:36 am

It is a fact that CAGW has been falsified by the observations.

The observations in fact do not support AGW.

Rather than solve the problems:
What caused the temperature change in the last 30 years and what caused the atmospheric CO2 change, the cult just dug in deeper, moving to fake climate models.

https://wattsupwiththat.com/2015/05/12/22-very-inconvenient-climate-truths/

The 22 Inconvenient Truths
1. The Mean Global Temperature has been stable since 1997, despite a continuous increase of the CO2 content of the air: how could one say that the increase of the CO2 content of the air is the cause of the increase of the temperature? (discussion: p. 4)
2. 57% of the cumulative anthropic emissions since the beginning of the Industrial revolution have been emitted since 1997, but the temperature has been stable. How to uphold that anthropic CO2 emissions (or anthropic cumulative emissions) cause an increase of the Mean Global Temperature?
[Note 1: since 1880 the only one period where Global Mean Temperature and CO2 content of the air increased simultaneously has been 1978-1997. From 1910 to 1940, the Global Mean Temperature increased at about the same rate as over 1978-1997, while CO2 anthropic emissions were almost negligible. Over 1950-1978 while CO2 anthropic emissions increased rapidly the Global Mean Temperature dropped. From Vostok and other ice cores we know that it’s the increase of the temperature that drives the subsequent increase of the CO2 content of the air, thanks to ocean out-gassing, and not the opposite. The same process is still at work nowadays] (discussion: p. 7)
3. The amount of CO2 of the air from anthropic emissions is today no more than 6% of the total CO2 in the air (as shown by the isotopic ratios 13C/12C) instead of the 25% to 30% said by IPCC. (discussion: p. 9)
4. The lifetime of CO2 molecules in the atmosphere is about 5 years instead of the 100 years said by IPCC. (discussion: p. 10)

6. The absorption of the radiation from the surface by the CO2 of the air is nearly saturated. Measuring with a spectrometer what is left from the radiation of a broadband infrared source (say a black body heated at 1000°C) after crossing the equivalent of some tens or hundreds of meters of the air, shows that the main CO2 bands (4.3 µm and 15 µm) have been replaced by the emission spectrum of the CO2 which is radiated at the temperature of the trace-gas. (discussion: p. 14)

Hugs
Reply to  William Astley
April 20, 2019 9:14 am

‘The Mean Global Temperature has been stable since 1997’

Lovely. What about if you take a climatic 30 year trend? Do you see warming or not?

PS. Please don’t start telling me it’s El Nino. There’s warming, which you just said doesn’t exist. How serious is 1.8 Kelvin degrees per century? That’s the question.

William Astley
Reply to  Hugs
April 20, 2019 10:55 am

The observations do not support AGW. The models only make sense if the temperature rise correlates continuously with the CO2 rise; i.e., if AGW were real, temperature would have increased as a wiggly line.

CAGW is dangerous, as it does not scientifically exist, and the lying required to create CAGW and force us to spend money on green stuff that does not work is leading to left chaos.

CAGW is a fake problem, with a very, very expensive fake solution that does not work at a basic engineering level.

Six independent analysis results in peer-reviewed papers support the assertion that the increase in atmospheric CO2 was not caused by anthropogenic CO2 emissions.

Hugs
Reply to  William Astley
April 20, 2019 11:38 am

There is an allowance for variation like ENSO, so the correlation is not expected to be yearly.

Richard M
Reply to  Hugs
April 20, 2019 6:43 pm

If you want to ignore El Nino then we have been cooling at a rate of 1.2 C / decade over the past 3 years. That should get us into the next glaciation period quite quickly.

I think ignoring what is clearly noise in a trend is silly.

ferd berple
April 20, 2019 8:43 am

” … instead weighing results by each model’s credibility. …”

This is a very big deal. This is actually the death rattle of climate models, they just haven’t realized the implications:

Consider that the IPCC, for example, establishes that the model predictions of sensitivity between 2C and 4C, centered on 3C, are the most credible. In that case, there is no need for any models. The IPCC, by establishing a credibility criterion, has made a prediction that there will most likely be 3C of warming for a doubling of CO2, with warming between 2C and 4C possible.

As soon as you select the models for credibility, your credibility function is making a prediction about the future. And regardless of what the models say, it will be the credibility function that delivers your predicted value for future temperatures, not the models.

At the point the IPCC produces a credibility function to select the models, the models become redundant to the credibility function. You can simply replace all the models with the credibility function and the results would remain the same. Thus, the credibility function signals the death of climate models.

brent
April 20, 2019 8:54 am

Climate Change 2007: The Physical Science Basis
Summary for Policymakers
by Vincent Gray
http://www.pensee-unique.fr/GrayCritique.pdf

When one resorts to weasel wording such as the IPCC does, this should have been a big red flag!
IPCC doesn’t make “Predictions”; they only make “Projections”
Models are not “Validated”; They are only “Evaluated”

When the usual suspects are challenged wrt lack of “Predictions”, they blame uncertainty in emissions scenarios, completely evading the crux of the matter that the core models themselves have not been shown to have skill at prediction.

icisil
April 20, 2019 8:57 am

Historical revisionism takes time to catch up.

NOAA analysis determined Hurricane Michael was a Category 5 hurricane at landfall last October.

https://twitter.com/NOAAComms/status/1119238984423682051

icisil
Reply to  icisil
April 20, 2019 9:07 am

Max wind gust recorded on land was 119 mph.

n.n
April 20, 2019 9:05 am

Well, if Dr. Curry is correct, then the system response follows a stadium wave, which will produce periodic samples coincident with their hypotheses. Then with the leverage of the government, press, academia, and empathetic corporations, the convenient conclusion will be repeated ad infinitum until the narrative becomes obligatory as a matter of consensus.

Wharfplank
April 20, 2019 9:08 am

As we humans modify our definition of disastrous fiery floods I’m sure our models will adapt, also.

WXcycles
Reply to  Wharfplank
April 20, 2019 9:41 am

History already has.

E J Zuiderwijk
April 20, 2019 9:21 am

I feel a new round of temperature homogenizations coming on. Prepare for an even colder past.

ferd berple
April 20, 2019 9:24 am

in particular a large study in preparation that will use ancient climates and observations of recent climate change to constrain sensitivity.
===============
Mathematical nonsense due to bandwidth mismatch. Ancient proxies are low frequency signals, with as much as 800 years lag between temperature and CO2. Recent climate change is high frequency signals, with human effects going back at most 75 years according to the IPCC.

The very basis of climate science, establishing climate as the average of weather over 30 years, is nonsense. Earth is mostly covered in oceans, and most of the surface energy is stored in the oceans. The deep oceans have cycles on the order of millennia.

Trying to make sense of climate in terms of 30 year averages is the story of the blind men and the elephant. Depending upon where you take your 30 year sample, you are going to get a different answer as to the nature of climate.


H.I. McDonough PhD
April 20, 2019 9:28 am

My model of all past climate models predicts that they will be completely wrong again.

Curious George
April 20, 2019 9:50 am

My personal experience with climate modellers:
https://judithcurry.com/2013/06/28/open-thread-weekend-23/#comment-338257

Jim Masterson
Reply to  Curious George
April 20, 2019 11:12 am

One of the things I noticed recently (thanks to a troll, no less) is that GCMs are not climate models. They are global weather models. The climate part comes from averaging GCM weather for many years. The old standard was that climate is weather averaged over thirty years. However, if GCMs can only track actual weather for about two weeks (typical of chaotic systems), then this averaging of future weather beyond two weeks is just nonsense.

I’d like to see an actual climate model working. I doubt that anyone knows what those climate differential equations are–they barely know what some of the weather differential equations are.

Jim

RACookPE1978
Editor
Reply to  Jim Masterson
April 20, 2019 11:26 am

Jim Masterson

One of the things I noticed recently (thanks to a troll no less) that GCMs are not climate models. They are global weather models.

More accurately, they began as “Global Circulation Models” created to simulate local aerosols and dust particles in local regions (the LA Basin, for example); then they were “extrapolated from local air masses” to regional air masses to “analyze” the acid rain downwind of the assumed smokestacks emitting sulfate particles.
When that worked to eliminate the smokestacks, the same Finite Element cubes were multiplied and extrapolated across the globe to “analyze” the “Hole” over the Antarctic, to establish the theory needed to eliminate CFCs from air conditioners and HVAC systems.
When that worked, the same Global Circulation Models were extrapolated using the same equations (and more elaborate finite element boundary assumptions) to begin the “climate” models that are being used to destroy the economy and kill people.

The models are run for many “years” in model space to stabilize at some condition, then a single parameter is changed (usually CO2 concentration) then allowed to run again with the results printed (displayed) for each following year from the assumed start condition.
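
A toy version of that run protocol, as a minimal sketch (a zero-dimensional energy-balance cartoon with assumed feedback and heat-capacity numbers, not a real GCM): spin up to a stable control state, change one parameter (here a CO2-doubling forcing), and integrate forward again.

```python
# Toy version of the protocol described above: spin a model up to a stable
# control state, change one parameter (here, CO2 forcing), then run forward.
# Zero-dimensional energy-balance cartoon with illustrative numbers only.

LAMBDA = 1.2   # W/m^2 per K, assumed net feedback parameter
C_HEAT = 8.0   # W*yr/m^2 per K, assumed effective heat capacity
DT = 0.1       # timestep, years

def integrate(t_start, forcing, years):
    t = t_start
    for _ in range(int(years / DT)):
        t += DT * (forcing - LAMBDA * t) / C_HEAT
    return t

t_control = integrate(0.0, forcing=0.0, years=500)        # spin-up to a control state
t_doubled = integrate(t_control, forcing=3.7, years=200)  # after an abrupt CO2 doubling

print(f"control anomaly: {t_control:.2f} K, after doubling: {t_doubled:.2f} K")
```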

Reply to  RACookPE1978
April 20, 2019 11:42 am

“Climate is what you expect. Weather is what you get.”
–sometimes attributed to Mark Twain (or Robert Heinlein or . . . .)

Jim

Michael Jankowski
Reply to  RACookPE1978
April 20, 2019 5:36 pm

The irony in that would be that the climate models do a much better job extrapolated globally than they can do locally and regionally.

Scute
April 20, 2019 9:56 am

You need to update your graph to 2018. It was OK when it was 2 or maybe 3 years old but not now at 7 years.

ferd berple
Reply to  Scute
April 20, 2019 10:14 am

You need to update your graph to 2018.
===================
that simply allows the modellers to change their predictions and adjustments by making use of 5 more years of data. They will immediately turn around and, using these new numbers, say: “See, we got it right all along”.

It is really easy to get predictions right when you update them every year. It makes it look like you were doing a great job all along. We see this with solar predictions. They simply overwrite the predictions with actuals and shift the predictions to match. It makes it look like they were batting a thousand when in actual fact the only bats were in the belfry.

Scute
Reply to  ferd berple
April 21, 2019 10:10 am

I meant just update the observations to 2018 and leave the models. Since this is a post about comparing model projections with reality that would be fair i.e. how are model projections made in 2005 holding up today? Not very well. And so we can’t trust model projections, even when they do update them.

But my comment also calls out the use of this graph because the observations show increased temps after this time (2011). Although those temps are overwhelmingly still below the modelled projections, the extent to which they are is less than in 2011. That means this outdated graph is misleading. If they have a case (and they do) then present it with all the up-to-date data. Just because it’s less spectacular in recent years doesn’t mean they should hide it.

Wondering Aloud
April 20, 2019 10:05 am

As the models continue to fail completely, will the data sources continue to be “adjusted” (a.k.a. fudged) so they can all pretend the warming predicted is real? What does the graph look like if you replace the Observations line with the actual observations as opposed to the adjusted ones? About 1/2 a degree C lower?

Tom Abbott
Reply to  Wondering Aloud
April 21, 2019 5:38 am

“What does the graph look like if you replace the Observations line with the actual observations as opposed to the adjusted ones?”

Here’s a comparison:

http://www.giss.nasa.gov/research/briefs/hansen_07/

The actual observations are in the chart on the left, the Hansen 1999 US temperature chart, and the chart on the right is a bogus, bastardized modern-era ETCW (Early Twentieth Century Warming) Hockey Stick chart, which artificially cools the 1930’s into insignificance (and 1998, for that matter). The Climategate conspirators said they had to do something about the 1940s heat “blip”, their word for warming equal to today’s warming. And they did do something about the blip: they created the bogus, bastardized Hockey Stick charts in order to make it disappear.

Hansen 1999 is the real temperature profile of the globe, which shows the 1930’s as being as warm as today. That means there is no unprecedented warming today, which puts the lie to the CAGW speculation.

The bogus Hockey Stick charts were created out of thin air to mimic the CO2 chart. This way NASA and NOAA can scream “Hottest Year Evah!” and “Hotter and Hotter” and pretend that temperatures are rising in concert with CO2 levels. They are a bunch of liars. The only “evidence” they have for this is their bastardized temperature charts.

Note, the Hansen 1999 chart only goes to 1999, with 1998 being 0.5C cooler than 1934. Combine the Hansen 1999 chart with the UAH satellite chart and we see that 2016 was only 0.1C warmer than 1998, which makes 2016 0.4C cooler than 1934. We have been in a temperature downtrend since the 1930’s. Quite a different story than the one NASA and NOAA are telling.

http://www.drroyspencer.com/wp-content/uploads/UAH_LT_1979_thru_March_2019_v6.jpg

And it should also be noted that unmodified charts from all around the world resemble the Hansen 1999 chart profile with the 1930’s showing to be as warm as today. Again, no unprecedented warming in 2016 or today. The temperatures have actually dropped about 0.6C over the last three years, even as CO2 levels climb.

No unmodified temperature charts from around the world resemble the bogus, bastardized Hockey Stick chart. The Hockey Stick chart is the creation of the Climategate Data Manipulators and perpetrates an expensive, criminal lie on the world.

ferd berple
April 20, 2019 10:06 am

in particular a large study in preparation that will use ancient climates and observations of recent climate change to constrain sensitivity.
===============
In the past 1 million years, every ice age ended when CO2 was low. In the past 1 million years, every ice age began when CO2 was high.

According to GHG theory this should be impossible. Especially when going back 1 million years. Ice ages should end when CO2 is high, because high CO2 causes warming according to GHG theory. Ice ages should begin when CO2 is low, because low CO2 causes cooling according to GHG theory.

Our evidence of climate change going back at least 1 million years directly contradicts the GHG theory of climate change.

We know the changes in average insolation due to orbital irregularities are too small to trigger an ice age. Especially given CO2 levels and prevailing GHG theory.

The only logical conclusion is that the GHG theory of climate change is wrong. In any other branch of science, this contradiction would have long ago led to a reevaluation. However, due to political pressure from the IPCC this is not possible in climate science. There are too many jobs at risk.

The mistake was likely caused by the lack of resolution in the early ice cores. Only recently, just 20 years ago, was it discovered that temperature leads CO2. Before that it was assumed that CO2 was leading temperature, because that is what GHG theory required.

And in science, 20 years is nothing. Like ulcers. 80% of all doctors still believe that stress causes ulcers, because that is what they were taught. Even though the true cause was discovered 40 years ago to be due to bacteria.

The same is true of climate science. The vast majority of climate scientists believe that CO2 causes warming, because that is what they were taught. And since everyone says it is true, it must be true. Because if it is not true, a whole lot of people stand to lose their jobs. So they have a vested interest in shouting down anyone that says otherwise.

Until the current crop of climate scientists die off, and the lag time between temperature and CO2 is widely published in textbooks, there will be little progress made in climate science, because the ice cores show that climate science has at its foundation an incorrect assumption.

Climate science believes the GHG theory is correct, and there is no serious attempt by leading climate scientists to test this theory, because of entrenched interests. The failure of climate science and the IPCC to narrow the range of climate sensitivity after spending in excess of 100 billion dollars is strong evidence that climate science is based on a wrong assumption.

“Science advances one funeral at a time.” Max Planck

DH
April 20, 2019 10:24 am

Can somebody please point me to the original post where the figure by Roy Spencer/H. Hayden at the top of this post is presented and explained?
Or better any update of it?
Thanks

Berndt Koch
April 20, 2019 11:16 am

“..In the effort to account for atmospheric components that are too small to directly simulate, like clouds..”

worth repeating…

“..In the effort to account for atmospheric components that are too small to directly simulate, like clouds..”

So let me get this right, by implication, they are telling us they can’t model clouds??? If so that in itself should invalidate every one of the models…

Alan Tomalty
April 20, 2019 11:40 am

Reality is much more sinister than any of you have realized. Each generation of computer climate models has core code that translates the results of their radiative transfer equations into a warming projection. This code is highly guarded and only released to trusted modellers who agree to use the basic code produced for that generation. Climate modelling is now on its 6th generation.

Think about it. Why would there need to be a generation number in the first place? Why isn't each modeller free to run their own code without having to follow what is in each generation? Of course they get to tinker by adding in their own code, but that only tinkers at the edges and does nothing to change the core code.

Notice that none of the modellers know why the 6th-generation models are running way hotter. If there were no core code supplied with each generation, each modeller would know exactly why his code was running hotter, because he is constantly testing his code after every change. Each modeller always knows why his model is doing what it is doing between any two successive changes in the code; you don't make more than one change without testing. Every computer programmer in the world (and I have been one, having owned my own software company and been part owner of another) knows that not testing after each change is disastrous.

Initially the 6th generation was supposed to include solar forcing code with cosmic rays, etc., except that when they ran the tests the models were not showing any warming. So they cancelled the first release of the core 6th-generation code and went back to the drawing board. That is why the release of the 6th generation has been delayed. It now seems that whatever changes they came up with are making the models run even hotter, but don't ask the individual modellers at the universities and institutions that run these supercomputer GCMs, because they were simply handed the 6th generation of core code. The whole concept of climate model generations is to make sure they speak with the same voice. The Russian model is the only rogue system, and the others pay no attention to it, because the Russian modellers have refused to adhere to this generation concept.

Tom
April 20, 2019 12:40 pm

I think the satellite data shows about 0.7 C of warming since 1980. I don’t understand the chart. Please explain.

Richard M
Reply to  Tom
April 20, 2019 7:06 pm

The chart is only for the tropics and only the mid-troposphere. The reason this data is often examined is that this is where the tropical hot spot is supposed to appear.

Tom Abbott
Reply to  Richard M
April 21, 2019 5:46 am

Yes, and the tropical hotspot is “missing in action”. So much for the CAGW speculation. When a scientific speculation doesn’t pan out, like a prediction of a tropical hotspot that doesn’t appear, it’s time to question the speculation.

April 20, 2019 1:13 pm

“IPCC is also not likely to give projections from all the models equal weight, Fyfe adds, instead weighing results by each model’s credibility.”

Their credibility is already zero. Why bother? Just continue to publish the continually wrong stuff and pretend it is data. Business as usual. Scary stuff. Send money.

Mikey
April 20, 2019 3:06 pm

Gee, this isn’t like ‘1984’ at all, where they constantly rewrote history to prove how accurate their predictions were. Frauds.

Mikey
Reply to  Mikey
April 20, 2019 3:09 pm

Maybe they should call themselves The Ministry of Truth in Climate Science.

Tom Abbott
Reply to  Mikey
April 21, 2019 5:49 am

“Maybe they should call themselves The Ministry of Truth in Climate Science.”

I think you are on to something there, Mikey! 🙂

Ian Bryce
April 20, 2019 3:10 pm

The graph needs to be updated. 2013 was a long time ago!

yarpos
April 20, 2019 3:10 pm

I wonder how the IPCC feels equipped to assess the credibility of anything? That reads like something from The Onion.

April 20, 2019 3:18 pm

… the new models could easily have strayed from reality

Ya think ?

April 20, 2019 3:18 pm

Knutti, the Swiss Rahmstorf: no respect for facts or reality.

Zigmaster
April 20, 2019 3:53 pm

My observation is that when the observations don't match the computer models, they find it easier to change the observations. A whole generation has been indoctrinated to believe that the weather they observed is not as they remembered it. They also think that observations made by dead people over a hundred years ago can be changed because they don't match the models. The warmists are so brazen that they make changes to historical data with the most spurious of explanations, when anyone with half a brain knows it's done just to meet the narrative. It's interesting how trusting everyone is when you tell them that there is this major fraud going on and their retort is: so the governments and the universities and the weather bureaus and many major companies and institutions, and the media, are in cahoots to deceive the general population!

Well, Yes.

How or why it has happened is arguable, but that it has happened there is no doubt. Most observers feel it is driven by a desire for control and the power that control bestows, and others say it is purely motivated by greed, but I think it is mainly driven by the natural human reluctance to admit being wrong. Most people have invested too much emotional capital to accept that what they've believed all their life is actually wrong.

Tom Abbott
Reply to  Zigmaster
April 21, 2019 5:59 am

“It's interesting how trusting everyone is when you tell them that there is this major fraud going on and their retort is: so the governments and the universities and the weather bureaus and many major companies and institutions, and the media, are in cahoots to deceive the general population!”

The truth is that not everyone is in on the conspiracy. Only a few are: the Climategate Data Manipulators and the CAGW political activists. The rest of the world has to depend on what the Climategate Data Manipulators say, and because the majority of the population is not in a position to question these Climategate liars, they accept the CAGW claims of the conspirators as fact.

So you have a few who are in on the conspiracy, and the rest of the world has been duped into believing a lie. Even very smart people have been duped. Lack of information, or being fed false information, doesn't mean you are stupid, but it can lead one astray, as in the CAGW case.

nicholas tesdorf
April 20, 2019 3:54 pm

I’ve seen a lot more attractive and interesting Hot Models than the ones that the IPCC sponsors.

Tom Abbott
Reply to  nicholas tesdorf
April 21, 2019 6:01 am

Yeah, you ought to search for “Russian models”. Warning though, you may get distracted from climate models.

Michael
April 20, 2019 5:37 pm

The only solution to all of this climate nonsense is for President Trump to inform the head of the United Nations that unless the IPCC also includes a study of the natural factors in the world's weather, including the 30-year climate cycle, he will cease to pay the USA's contributions to the UN.

This must include the use of observations, both historical and present day, rather than just the crystal-ball computers presently used. And all data must be made available, including just how the so-called "Summaries" are actually compiled.

MJE VK5ELL

WXcycles
Reply to  Michael
April 20, 2019 7:23 pm

Better still, boot them out of New York. See how long the UN lasts when it's based in Paris.

Global UN-xit.

Mr Bliss
April 20, 2019 7:25 pm

“Modelers are struggling to identify which of their refinements explain this heightened sensitivity” – I doubt they are struggling to do anything but contain their unmitigated feelings of joy at such a catastrophic future. “Why didn’t we think of running the models hot before!!!”

Frank
April 20, 2019 9:26 pm

Paul Voosen: “Many scientists are skeptical, pointing out that past climate changes recorded in ice cores and elsewhere don’t support the high climate sensitivity”

How does Mr. Voosen explain the existence of glacials and interglacials? Low climate sensitivity means that it is difficult to change the temperature of our planet with a forcing. It means that our planet emits or reflects an extra 2 or 3 W/m2 per K of warming, instead of the 1 W/m2/K predicted by climate models. (If a doubling of CO2 reduces radiative cooling to space by 3.6 W/m2, then 3.6 K of warming with a response of 1 W/m2/K will restore balance at the TOA. If our planet emits and reflects 2 or 3 W/m2/K in response to warming, then balance will be restored by 1.8 or 1.2 K of warming.) Look back at the LGM, which was 6 K colder, and the opposite holds: our cooler planet was emitting or reflecting 6, 12 or 18 W/m2 less radiation. Total planetary irradiation isn't changed by orbital mechanics, only the location and seasonality of that irradiation. It's tough to believe in low climate sensitivity and still explain how our planet could have been 6 K colder during a glacial period, unless our planet is near a "tipping point" or has high climate sensitivity in the cold direction but not the hot direction.
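
A worked version of the back-of-envelope numbers above, assuming the simple zero-dimensional balance warming = forcing / feedback parameter (lambda, in W/m2 per K). The figures used (3.6 W/m2 for doubled CO2, lambda of 1, 2 or 3 W/m2/K, a 6 K colder LGM) are the round numbers from the comment, not authoritative values:

```python
# Zero-dimensional energy-balance arithmetic: warming = forcing / feedback parameter.
# All numbers are the round figures used in the comment, for illustration only.

forcing_2xco2 = 3.6          # W/m2, reduced radiative cooling for doubled CO2 (as stated)
feedbacks = [1.0, 2.0, 3.0]  # W/m2 per K of warming (candidate restoring strengths)

for lam in feedbacks:
    warming = forcing_2xco2 / lam
    print(f"lambda = {lam:.0f} W/m2/K  ->  equilibrium warming = {warming:.1f} K")
# -> 3.6 K, 1.8 K, 1.2 K

# The same arithmetic run backwards for the Last Glacial Maximum, taken as 6 K colder:
# a planet with restoring strength lambda must have been emitting/reflecting
# 6 * lambda W/m2 less than today.
lgm_cooling = 6.0  # K
for lam in feedbacks:
    print(f"lambda = {lam:.0f} W/m2/K  ->  LGM radiative deficit = {lgm_cooling * lam:.0f} W/m2")
# -> 6, 12, 18 W/m2
```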

High Treason
April 21, 2019 12:22 am

This chart needs an update: it is now 5 years old, so it is time for a review to see how far the models are from reality. It is also time to review the confidence level. I am betting the verdict will have to be "almost certain." Of course, the confidence levels are the exact opposite of normality: 100% wrong. Confidence cannot become more certain as the divergence between modelled predictions and measured reality grows; it is quite the reverse.

Semantic manipulation and sneaking in brazen lies is the MO of the alarmists. The brazen errors in the certainty reporting are no accident. If it is an accident, it is terminally gross incompetence. If it is not an incompetent accident, then it is deliberate fraud. I suspect the latter.

The flagrant certainty error should be a penny-drop moment for anyone with a brain.

François Marchand
April 22, 2019 11:46 am

Your figures are ten years old. Leave your air-conditioned office and go outside, where you will see olive trees that do not belong there…

Pamela Gray
April 22, 2019 1:06 pm

It would have been much faster, and cost a lot less, simply to publish: "We are putting lipstick on a pig."

DDP
April 22, 2019 7:28 pm

“IPCC is also not likely to give projections from all the models equal weight”

So, in other words, only the most "it's worse than we thought" models will be used. It's far easier to adjust the data to match half a dozen models than it is for 73, especially when the older "other evidence" needs to be adjusted to fit.