Guest Post by Willis Eschenbach
I see that Zeke Hausfather and others are claiming that 2018 is the warmest year on record for the ocean down to a depth of 2,000 metres. Here’s Zeke’s claim:

Figure 1. Change in ocean heat content, 1955 – 2018. Data available from the Institute of Atmospheric Physics (IAP).
When I saw that graph in Zeke’s tweet, my bad-number detector started flashing bright red. What I found suspicious was that the confidence intervals seemed far too small. Not only that, but the graph is measured in a unit that is meaningless to most everyone. Hmmm …
Now, the units in this graph are “zettajoules”, abbreviated ZJ. A zettajoule is a billion trillion joules, or 1E+21 joules. I wanted to convert this to a more familiar number, which is degrees Celsius (°C). So I had to calculate how many zettajoules it takes to raise the temperature of the top two kilometres of the ocean by 1°C.
I go over the math in the endnotes, but suffice it to say at this point that it takes about twenty-six hundred zettajoules to raise the temperature of the top two kilometres of the ocean by 1°C. 2,600 ZJ per degree.
Now, look at Figure 1 again. They claim that their error back in 1955 is plus or minus ninety-five zettajoules … and that converts to ± 0.04°C. Four hundredths of one degree Celsius … right …
Call me crazy, but I do NOT believe that we know the 1955 temperature of the top two kilometres of the ocean to within plus or minus four hundredths of one degree.
It gets worse. By the year 2018, they are claiming that the error bar is on the order of plus or minus nine zettajoules … which is three thousandths of one degree C. That’s 0.003°C. Get real! Ask any process engineer—determining the average temperature of a typical swimming pool to within three thousandths of a degree would require a dozen thermometers or more …
The claim is that they can achieve this degree of accuracy because of the Argo floats. These are floats that drift down deep in the ocean. Every ten days they rise slowly to the surface, sampling temperatures as they go. At present (well, as of three days ago) there were 3,835 Argo floats in operation.

Figure 2. Distribution of all Argo floats which were active as of January 8, 2019.
Looks pretty dense-packed in this graphic, doesn’t it? Maybe not a couple dozen thermometers per swimming pool, but dense … however, in fact, that’s only one Argo float for every 93,500 square km (36,000 square miles) of ocean. That’s a box that’s 300 km (190 miles) on a side and two km (1.2 miles) deep … containing one thermometer.
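For those who like to check, here’s that arithmetic as a quick sanity check in R, using a round-number global ocean area of about 361 million square kilometres (the exact answer shifts a bit depending on exactly what ocean area you use):

ocean_area_km2 = 3.61e8               # approximate global ocean surface area, sq km
n_floats = 3835                       # Argo floats in operation
area_per_float = ocean_area_km2 / n_floats
print(round(area_per_float))          # roughly 94,000 sq km of ocean per float
print(round(sqrt(area_per_float)))    # a box roughly 300 km on a side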
Here’s the underlying problem with their error estimate. As the number of observations goes up, the error bar decreases by one divided by the square root of the number of observations. And that means if we want to get one more decimal in our error, we have to have a hundred times the number of data points.
For example, if we get an error of say a tenth of a degree C from ten observations, then if we want to reduce the error to a hundredth of a degree C we need one thousand observations …
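Here’s a quick toy demonstration of that square-root rule in R. It uses made-up random readings, not the Argo or IAP numbers, just to show how the confidence interval of a mean shrinks with the number of observations:

set.seed(42)
sd_per_obs = 0.3                      # assumed spread of individual readings, °C (made up)
for (n in c(10, 1000)) {
  # distribution of the sample mean over many repeated draws
  means = replicate(5000, mean(rnorm(n, mean = 15, sd = sd_per_obs)))
  cat(sprintf("N = %4d observations: 95%% CI of the mean ~ +/- %.3f °C (theory %.3f)\n",
              n, 1.96 * sd(means), 1.96 * sd_per_obs / sqrt(n)))
}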
And the same is true in reverse. So let’s assume that their error estimate of ± 0.003°C for 2018 data is correct, and it’s due to the excellent coverage of the 3,835 Argo floats.
That would mean that we would have an error of ten times that, ± 0.03°C if there were only 38 ARGO floats …
Sorry. Not believing it. Thirty-eight thermometers, each taking three vertical temperature profiles per month, to measure the temperature of the top two kilometers of the entire global ocean to plus or minus three hundredths of a degree?
My bad number detector was still going off. So I decided to do a type of “Monte Carlo” analysis. Named after the famous casino, a Monte Carlo analysis implies that you are using random data in an analysis to see if your answer is reasonable.
In this case, what I did was to get gridded 1° latitude by 1° longitude data for ocean temperatures at various depths down to 2000 metres from the Levitus World Ocean Atlas. It contains the long-term monthly averages at each depth for each gridcell for each month. Then I calculated the global monthly average for each month from the surface down to 2000 metres.
Now, there are 33,713 1°x1° gridcells with ocean data. (I excluded the areas poleward of the Arctic/Antarctic Circles, as there are almost no Argo floats there.) And there are 3,825 Argo floats. On average some 5% of them share a gridcell with another float. So the Argo floats are sampling on the order of ten percent of the gridcells … meaning that despite having lots of Argo floats, at any given time about 90% of the 1°x1° ocean gridcells are not sampled. Just sayin’ …
To see what difference this might make, I did repeated runs, choosing 3,825 ocean gridcells at random. I then ran the same analysis as before—get the averages at depth, and then calculate the global average temperature month by month for just those gridcells. Here’s a map of the random gridcells chosen as simulated Argo locations for one run.

Figure 3. Typical simulated distribution of Argo floats for one run of Monte Carlo Analysis.
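For anyone who wants to replicate the experiment, here is the skeleton of one Monte Carlo run in R. The variable names are mine, for illustration only; it assumes you have already reduced the World Ocean Atlas data to one 0–2000 metre average temperature per gridcell, along with each gridcell’s area (area-weighting being one reasonable way to do the averaging):

# cell_temp: vector of 0-2000 m average temperature for each 1x1 degree ocean gridcell
# cell_area: vector of the corresponding gridcell areas (they shrink toward the poles)
one_run = function(cell_temp, cell_area, n_floats = 3825) {
  idx = sample(seq_along(cell_temp), n_floats)         # pick random "Argo" gridcells
  weighted.mean(cell_temp[idx], cell_area[idx]) -      # subsample average ...
    weighted.mean(cell_temp, cell_area)                # ... minus the full-ocean average
}
# errs = replicate(1000, one_run(cell_temp, cell_area))
# quantile(errs, c(0.025, 0.975))                      # spread of the sampling error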
And in the event, I found what I suspected I’d find. Their claimed accuracy is not borne out by experiment. Figure 4 shows the results of a typical run. The 95% confidence interval for the results varied from 0.05°C to 0.1°C.

Figure 4. Typical run, average global ocean temperature 0-2,000 metres depth, from Levitus World Ocean Atlas (red dots) and from 3,825 simulated Argo locations. White “whisker” lines show the 95% confidence interval (95%CI). For this run, the 95%CI was 0.07°C. Small white whisker line at bottom center shows the claimed 2018 95%CI of ± 0.003°C.
As you can see, using the simulated Argo locations gives an answer that is quite close to the actual temperature average. Monthly averages are within a tenth of a degree of the actual average … but because the Argo floats only measure about 10% of the 1°x1° ocean gridcells, that is still more than an order of magnitude larger than the claimed 2018 95% confidence interval for the IAP data shown in Figure 1.
So I guess my bad number detector must still be working …
Finally, Zeke says that the ocean temperature in 2018 exceeds that in 2017 by “a comfortable margin”. But in fact, it is warmer by only 8 zettajoules … which is less than the claimed 2018 error. So no, that is not a “comfortable margin”. It’s well within even their unbelievably small claimed error, which they say is ± 9 zettajoules for 2018.
In closing, please don’t rag on Zeke about this. He’s one of the good guys, and all of us are wrong at times. As I myself have proven more often than I care to think about, the American scientist Lewis Thomas was totally correct when he said, “We are built to make mistakes, coded for error” …
Best regards to everyone,
w.
PS—when commenting please quote the exact words that you are discussing. That way we can all understand both who and what you are referring to.
Math Notes: Here is the calculation of the conversion of zettajoules to degrees of warming of the top two km of the ocean. I work in the computer language R, and these are the actual calculations. Everything after a hashmark (#) in a line is a comment. (The functions sw_cp() and gsw_rho() are seawater specific-heat and density functions from R’s oceanographic packages; load the relevant packages before running.)
heatcapacity=sw_cp(t=4,p=100) # heat capacity, with temperature and pressure at 1000 m depth
print(paste(round(heatcapacity), "joules/kg/°C"))
[1] "3958 joules/kg/°C"
seadensity=gsw_rho(35,4,1000) # density, with temperature and pressure at 1000 m depth
print(paste(round(seadensity), "kg/cubic metre"))
[1] "1032 kg/cubic metre"
seavolume=1.4e9*1e9 # cubic km * 1e9 to convert to cubic metres
print(paste(round(seavolume), "cubic metres, per levitus"))
[1] "1.4e+18 cubic metres, per levitus"
fractionto2000m=0.46 # fraction of ocean above 2000 m depth per Levitus
zjoulesperdeg=seavolume*fractionto2000m*seadensity*heatcapacity/1e21
print(paste(round(zjoulesperdeg), "zettajoules to heat 2 km seawater by 1°C"))
[1] "2631 zettajoules to heat 2 km seawater by 1°C"
z1955error = 95 # 1955 error in ZJ
print(paste(round(z1955error/zjoulesperdeg,2),"°C 1955 error"))
[1] "0.04 °C 1955 error"
z2018error = 9 # 2018 error in ZJ
print(paste(round(z2018error/zjoulesperdeg,3),"°C 2018 error"))
[1] "0.003 °C 2018 error"
yr2018change = 8 # 2017 to 2018 change in ZJ
print(paste(round(yr2018change/zjoulesperdeg,3),"°C change 2017 - 2018"))
[1] "0.003 °C change 2017 - 2018"
Very timely analysis. The print media yesterday widely remarked on this paper. Dr Zeke Hausfather and Dr Cheng were quoted in some of the articles. USA Today writer Doyle Rice quotes Zeke as saying 2018 was the fourth warmest year on record for the atmosphere and the warmest year on record for the oceans. I noticed the New York Times has a front-page article about this. Science has a ‘Perspective’ article written by Dr Cheng, et al, that says the study refutes the argument of the global warming pause by supporting the case for the extra heat being absorbed in the ocean. 93% of the extra heat ends up in the ocean. The ocean warmed by a total of 0.1 C over the period 1990 to 2010. In this comment they discuss the difference between their findings and the previous estimates in units of W m^-2. The lay press focuses on the number 40%, modified with ‘greater’, to promote the sense of crisis now, I am sure.
This is all very fascinating. The IQ levels here are so cool to watch from someone who is just an average joe.
I do have a question though…. regardless of all the arguments for or against global warming…. If those who say it is warming are correct, then isn’t the proposed fix like pissing on a forest fire? I mean, with over three billion people pooping in ditches who are on the cusp of the benefits of fossil fuels to dig wells, grow crops, build schools, hospitals and roads, and industrialize, won’t their contributions overwhelm anything we could possibly do to combat it? I mean, is the west really willing to go to war with brown people so they don’t sell oil to black people? The middle east has nothing but oil and sand, and they can’t sell sand to feed their people.
Just asking….
Joe
relax.
there is no man made warming. I could not find it.
http://breadonthewater.co.za/2019/01/06/does-man-made-climate-change-exist/
Isn’t that great?
earth [God] is bigger than you thought?
In the head post Willis said
My understanding is that 1 exponetiated to 21st power ( 1^21) is still 1. So he must have meant 10E+21, a not insignificant difference. I’m sure it’s a typo but Willis claims to not like typos(neither do ,I or probably anyone cursed with the auditor gene) so maybe a good idea to fix.
258 comments and no correction; doesn’t anybody read that on which they comment?
Oops missed out an “n” above. Accursed typos.
By Lijing Cheng
, John Abraham
Zeke Hausfather
, Kevin E. Trenberth
Attempt number two.
In the head post Willis said
My understanding is that 1 exponetiated to 21st power ( 1^21) is still 1. So he must have meant 10E+21, a not insignificant difference. I’m sure it’s a typo but Willis claims to not like typos(neither do ,I or probably anyone cursed with the auditor gene) so maybe a good idea to fix.
258 comments and no correction; doesn’t anybody read that on which they comment?
That is the correct notation. E is symbolic for 10.
1E+21 = 1*10 to the 21st power.
Um…
” Willis claims to not like typos(neither do ,I or probably anyone cursed with the auditor gene…”
“exponetiated” ?
That is two comments, two corrections.
Just sayin’.
OK thanks, my mistake. So what purpose does the “1” serve, as 10^21 is the same as 1*10^21?
Consistency in the notations? But more importantly, it lays out your significant digits. 2 x 10^21 carries a different meaning than 2.000 x 10^21.
BTW:
“E-notation
Most calculators and many computer programs present very large and very small results in scientific notation, typically invoked by a key labelled EXP (for exponent), EEX (for enter exponent), EE, EX, E, or ×10^x depending on vendor and model. Because superscripted exponents like 10^7 cannot always be conveniently displayed, the letter E (or e) is often used to represent “times ten raised to the power of” (which would be written as “× 10^n”) and is followed by the value of the exponent; in other words, for any two real numbers m and n, the usage of “mEn” would indicate a value of m × 10^n. In this usage the character e is not related to the mathematical constant e or the exponential function e^x (a confusion that is unlikely if scientific notation is represented by a capital E). Although the E stands for exponent, the notation is usually referred to as (scientific) E-notation rather than (scientific) exponential notation. The use of E-notation facilitates data entry and readability in textual communication since it minimizes keystrokes, avoids reduced font sizes and provides a simpler and more concise display, but it is not encouraged in some publications”
https://en.wikipedia.org/wiki/Scientific_notation
Willis, in your Monte Carlo simulation, did you assume a normal random distribution (white noise) for the temperature data?
Or did you use the actual data and undersample it?
The reason I’m asking is 1×1 gridcell neighbors are likely strongly correlated, therefore the surface distribution is correlated and you get N^(something less than 1/2) increase in resolution with sample size.
So if your Monte Carlo analysis doesn’t use the same distribution of noise as the actual data, then your Monte Carlo analysis is optimistic in regards to increase of precision with samples.
I reread and found you used something analogous to the original data distribution.
I’m amused that the resolution is less than that of the monthly variation.
It would be interesting to do an analysis of whether resolution improves by N^(1/2). I suspect not.
Peter
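A quick toy check of this point in R, using an AR(1) series as a crude stand-in for correlated neighbouring gridcells (an illustration only, not the actual ocean data): with correlated readings, the error of the mean stays well above the naive 1/sqrt(N) estimate you would get assuming independent observations, because the effective number of independent samples is much smaller.

set.seed(1)
se_of_mean = function(n, rho = 0) {
  one_mean = function() {
    if (rho == 0) x = rnorm(n)                                          # independent readings
    else x = as.numeric(arima.sim(list(ar = rho), n)) * sqrt(1 - rho^2) # correlated, unit variance
    mean(x)
  }
  sd(replicate(500, one_mean()))
}
for (n in c(100, 10000))
  cat(sprintf("N = %5d   independent SE: %.4f   correlated (rho = 0.9) SE: %.4f\n",
              n, se_of_mean(n), se_of_mean(n, 0.9)))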
I’m thinking the improved precision from increased measurements applies only when multi-sampling a population (or batch) that has achieved steady-state, well-mixed conditions, a normal distribution IOW. Ocean temperature can vary greatly (as already discussed) from one point to the next, in a current or out of it, highly dependent upon the location sampled, and of course it will increase as the sun shines upon it, so it is hardly a steady-state nor uniform condition. Increased precision can arrive only if multiple buoys are able to simultaneously sample the same drop of water, which can’t happen.
But… what exactly are we looking for? It’s reading a temperature. But Zeke is looking for a heat content from each measurement, which can be calculated from temperature… So this is just further proof that Zeke doesn’t have the certainty he thinks he has.
By using a 1×1 degree grid, the further south or north of the equator you go, the narrower the 1×1 cells become (tall trapezoids). Would it not make more sense to use 60 x 60 nautical mile cells? This would allow the cells to have equal weight when calculating the average cell temperature. Or are you taking into account the smaller volume of each cell as you move N or S? Note that at the equator 60 Nm is about 1 degree. Curious.
I see the needed information is in the R code as “cubic metres, per levitus”
A little homework answers many questions and provides some context:
“The original plan advertised in the Argo prospectus called for a nearest-neighbour distance between floats, on average, of 3° latitude by 3° longitude.[4] This allowed for higher resolution (in kilometres) at itudes, both north and south, and was considered necessary because of the decrease in the Rossby radius of deformation which governs the scale of oceanographic features, such as eddies. By 2007 this was largely achieved, but the target resolution has never yet been completely achieved in the deep southern ocean”
https://en.wikipedia.org/wiki/Argo_(oceanography)
I think the typo in the copied text should have read:
“(in kilometers) at high latitudes, both north and south…”
IOW…they wanted better resolution at high latitudes.
Dan
[any] warming on earth is actually caused by increased UV radiation into the ocean [this is the process of increased UV via the window of ozone, peroxides and N-oxides and the related solar factors that produce these chemicals TOA]
this [warming] process would release both more H2O and more CO2 in the atmosphere.
the question is whether the entrapment of radiation of these components by earth 5-15 um is greater than the deflection of radiation 0-5 um
like I stated before ,
I don’t believe more CO2 or more H2O causes more warming. At least I could not find the trend here – and you say it should be a global trend?
http://breadonthewater.co.za/2019/01/06/does-man-made-climate-change-exist/
Graphs linked at https://wattsupwiththat.com/2019/01/11/a-small-margin-of-error/#comment-2586016 show no correlation of CO2 and T. We agree. However, WV and T track each other quite closely as shown here. Look closely, especially at the large moves, and WV change typically happens a few months before T change. The low effective thermal capacitance of the atmosphere allows the close tracking of a forcing thing and an energy thing.
Dan
you must realize of course that there must be a relationship (correlation) between (delta) heat and [CO2] and [H2O]?
the simplified reactions are:
H2O (l) + delta UV/heat / wind => delta H2O (g)
HCO3- + delta heat => CO2 (g) + OH- (there are giga tons of bi-carbonates in the oceans)
Isn’t that amazing?
Like someone once said here (on WUWT):
It is simple, really. It is warmer during the day than during the night because of the sun. It is warmer in summer than it is in winter because of the sun. So, more than likely, following simple logic, would you not say that if it is warmer now on earth than it was 100 years ago, it must also be because of the sun?
Hen,
It appears some perceive climate change must be caused by a single factor. IMO that would interfere with discovering the truth.
I found a mix of three factors that do an excellent job of explaining the measured average global temperature. The match is 98.3% 1895-2017. (I would not be surprised to discover that these three factors masked other unnamed factors.) The three factors are: sun 42%, SST cycle 23% and water vapor 35%. The contribution of the sun is quantified by a proxy which is the time-integral of SSN anomalies.
I ruled out CO2 as substantial contributor giving 8 examples of compelling evidence in Section 2 of http://globalclimatedrivers2.blogspot.com
Dan
yes , there might be some climate change but it is due to global cooling,
not warming.
Updating my file on Honolulu just now as an example, I find that average temperatures have been rising at a rate of 0.0038 K per annum since 1994, which is about 0.1 K in total since 1994.
The amount is so small that effectively we must agree that temperatures have stayed the same?
However, maximum temperatures have started dropping there now, at a rate of -0.01K/annum, effectively down -0.24K since 1994.
So, it is only a matter of another few years or so to find the overall cooling trend also becoming apparent in the average temperature of Honolulu.
Now, my total data set of 54 stations, properly balanced to zero latitude shows the cooling trend clearly.
So, either I have it wrong or everybody else got it wrong. Allow me to stick to the latter opinion…
Willis,
You say:
“Call me crazy, but I do NOT believe that we know the 1955 temperature of the top two kilometres of the ocean to within plus or minus four hundredths of one degree.”
Surely climate science has demonstrated its ability to precisely record temperatures over vast areas/volumes. After all, in 1999 Mann et al. reported Northern Hemisphere temperature changes for the year 1,000 A.D. to within 0.5°C using four proxy measurements for temperature and a hockey stick. /s
Using ASTM or ISO methods for contractual performance tests, you are not permitted to report temperatures with a precision greater than 1 K, regardless of the number of calibrated reference thermometers/thermocouples you may have used.
I made a reply to a comment above…
Here it is as a comment.
What is the expansion rate of water going from liquid to vapor? … because when I see clouds I see water as a liquid expanded to water as a vapor. So when you’re talking thermal mass …?
Clouds consist of bits of condensed water vapor, i.e. liquid water.
Thanks for the reply.
It just seems that water evaporating is water going from a liquid to a vapor.
Whatever, maybe I’m just thinking of a small margin of error.
If you can see it, it is not water vapor. Water vapor is a transparent gas. Evaporating water goes from a liquid to a gas (transparent water vapor) which mixes with the other gases in the atmosphere. At boiling (212 F, 1 atm) the specific volume of the vapor is about 27 cubic feet per pound. Evaporating at 70 F, the vapor pressure is only about 0.0247 atm and the specific volume of the saturated vapor is about 868 cubic feet per pound, vastly larger than that of the liquid it came from.
What you can see as a cloud is tiny bits of liquid water (or bits of ice) which has condensed from the water vapor. The bits are so tiny they stay suspended in the air (sort of like dust). If the bits grow big enough by merging with other bits or condensing more vapor they fall as rain (or snow or hail). I suspect that the confusion is in the definition of water vapor. The word vapor is commonly used to refer to something that you can see.
Dear Mr. Eschenbach,
shouldn’t there be a warming of the oceans?
https://www.mpg.de/research/deep-sea-hot-springs-atlantic?c=2249
“This could change our understanding of the contribution of hydrothermal activity to the thermal budget of the oceans.” I didn’t hear anything about new calculations except this:
https://www.nature.com/articles/s41586-018-0655-4.epdf?referrer_access_token=hd2Rppe7xz7wGl9YfvzxjtRgN0jAjWel9jnR3ZoTv0PvwTnwonfdy9SsXj9usRNiG5tsmK6ohRdqVzycb5LRF41M69MZZn-b0nEt5AATmu2MAu141oAl7VbaSVZjVsIDihJ3_mzODhSzfI_Hx3yMWyaHqSmJFv6DTj-TlBmUgWFNkivJEqAJ2VqNtLlL2HrETl5nNR4c5hVVVNYMvN2alOzqX3Gv1OCAXDgR8inO01lcZWFlc9fdm63s8coxMsbn&tracking_referrer=www.spektrum.de
https://www.nature.com/articles/s41586-018-0655-4
Unfortunately, no German sceptic is interested in focusing on this theme. Besides you, maybe?
Best regards
M. Koecher
Willis,
Late comment, but there was a link here from Climate Etc.
I made a quick check using Argo Marine atlas + spreadsheet, and believe that your analysis is flawed.
I fetched gridded 1×1 degree 0-2000 dbar depth-averaged temperatures, 60 N-60 S, for June 2018, both absolute temperatures and anomalies.
I used the “Flat Earth” approach and assumed that all gridcells have the same size.
In June 2018, the average temperature of all 28667 gridcells was 6.12 C with a 95% CI of +/-0.024 C.
Your data for June is substantially warmer, around 6.90 C, with much larger uncertainty, a 95% CI of +/- 0.07 C.
You have around 5000 more gridcells than me, so I suspect that you use marginal seas with a depth of less than 2000 m. These areas may get “spuriously” warm if deep cool water isn’t included in the 0-2000 m profiles. This may increase both the average and the variance in your data.
You have also forgotten that the use of anomalies removes a lot of error, in this case reducing it by an order of magnitude. The anomaly field (wrt 2004-2016) of June 2018 has a 95% CI of only 0.0021 C.
Furthermore, Zeke uses 12-month running means, so the monthly uncertainty should be divided by the square root of 12, reducing it to only 0.0006 C.
Thus, there is plenty of room for other uncertainties as well, so the +/- 0.003 C uncertainty (95% CI) for yearly data in the IAP dataset seems reasonable…
Olof
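Olof’s square-root-of-twelve point, illustrated with white noise in R (a toy sketch only; it assumes the twelve monthly errors are independent, which is the key assumption one might question):

set.seed(7)
monthly_err = matrix(rnorm(12 * 5000, sd = 0.002), ncol = 12)  # 5000 fake years of monthly errors
annual_err  = rowMeans(monthly_err)                            # error of the 12-month mean
c(monthly_sd = sd(as.vector(monthly_err)),
  annual_sd  = sd(annual_err),
  ratio      = sd(as.vector(monthly_err)) / sd(annual_err))    # ratio ~ sqrt(12) ~ 3.46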
the reason why you [climate] guys are never getting it right is because you do not balance your sampling stations / Argo floats etc. 50/50% NH/SH. Hence the sampling is biased towards the NH, which has been warming.
namely I found that the rates of warming nh and sh are not the same if you look at the weather stations over the past 40 years?
perhaps it has something to do with the movement of earth’s inner core / the magnetic north pole has been moving, meaning that the elephant in the room has been moving as well?
in my case a simple procedure of taking an equal number of sample stations NH/SH, balanced to zero latitude, gave a surprising result.
i.e. there is no man made global warming….
click on my name to read my reports on that.
Olof, you cannot assume that the gridcells are the same size. They vary in size from about 100 sq. km. at the poles to 12,300 sq km at the equator.
w.
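A quick check of those figures in R, approximating the Earth as a sphere of radius 6371 km (a rough approximation, ignoring the Earth’s flattening):

R_earth = 6371                                     # km, mean Earth radius
cell_area_km2 = function(lat_deg) {
  (pi * R_earth / 180)^2 * cos(lat_deg * pi / 180) # approx. area of a 1x1 degree cell at that latitude
}
round(cell_area_km2(c(0, 30, 60, 89.5)))
# about 12,400 sq km at the equator, ~6,200 at 60 degrees, ~100 near the poles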
Willis,
Yes, I know the importance of area-weighting, but I just imitate what I think you are doing. If you make random sampling from a gridded dataset, the data away from the equator gets overrepresented because the gridcells are smaller, and there are more of them than their area represents. This will make the average cooler and the variance larger (see below).
I can see the advantages with an equal area gridded dataset like Berkeley Earth’s, but they are quite uncommon.
The proper area-weighted temperature for June 2018 (0-2000 dbar, 60N-60S) is 6.44 C, whereas the not area-weighted gridcell average is 6.11 C (sorry, wrote 6.12 above). The temperature profiles at 60 N are typically 3 C colder than at the equator, and the temperature at 60 S is around 6 C colder than at the equator.
Hence, the not area-weighted mean will have a larger variance, because data that deviate from the mean gets overrepresented, so I think it is safe to use this approach since it wouldn’t exaggerate a low uncertainty.
Also, it would be very difficult (or impossible) to do parametric statistics (mean, SD, SE, CI, t-test, etc) if the data doesn’t have equal weight.
I still think the major errors in your analysis are that you sample data that doesn’t have full 0-2000 m profiles, and that you don’t use an anomaly approach to reduce error.
Olof
clearly, you must agree with me that it is impossible to try and estimate a global average from the temperature of all the waters from pole to pole. For one thing, there must be issues regarding calibration at each measuring point. It is like looking for a needle in a haystack.
You are all fooling yourselves with what you are doing..
If this were my job and if somebody paid me for my work, I would look at the data from each argo measuring point averaged over every calendar year in the past and determine the trend at each station by looking at the derivative of the least square equation giving me the speed of cooling/warming in K/annum. Then I would also make sure that the number of stations nh equals the number of stations sh and that all the stations looked at balance to zero latitude.
This eliminates a lot of error, especially that from calibration and longitude. If you do it right, you will [eventually] agree with me that it is globally cooling.
https://wattsupwiththat.com/2019/01/11/a-small-margin-of-error/#comment-2593050
Click on my name to read my reports.
I agree immediately that the oceans get warmer. This is the main driver for the global warming.
The reason is agriculture: the fertilizers are dumped into the oceans, which leads to algae flourishing. The water absorbs more sunlight in the upper layers and is warming.
This IS a significant problem.
Alex
you may be right,
in fact my results generally support the argument that turning a desert into an oasis traps heat [Las Vegas], whereas chopping down all the trees in an area leads to a loss of warmth [Tandil, ARG].
At the moment I am inclined to believe from all of my collected data that the good we do [greening] more or less balances the evil [the un-greening], leaving us with the natural factors determining life as we know it.
but in the case that you make for something similar in the water [due to more algae growth], you have to come up with some experiment and measurement that would prove it.
Let me know.
More false confidence from the usual suspects.
Mind you, Zeke has form for lying his pants off, and let’s not forget his intentional attempt to mislead by using different models to fool people into thinking the IPCC’s forecasts and temperature obs are close.
So
Is it cooling or warming?
The silence is deafening. We have record snowfall in just about every mountain range in the world, and this is due to global warming? Are you guys for real? And is there absolutely no one among all you clever guys who has measured that earth is actually cooling, not warming?
I am astonished. As you note, mostly in the NH, that you have to shovel snow in front of your houses later and later in the season, you will soon come to realize that it must be due to cooling, not warming.
It is just this very clever move of the IPCC to start talking about “climate change”. That way you can explain any type of natural climate change due to the GB cycle as being [largely] due to man-made [CO2].
By my calculations the [dust bowl] droughts are now coming back to the great plains of America. We already had our fair share of the drought time here in South Africa.
Namely, as the temperature differential between the poles and equator grows larger due to the cooling from the top, very likely something will also change on earth. Predictably, there would be a small (?) shift of cloud formation and precipitation, more towards the equator, on average. At the equator insolation is 684 W/m2 whereas on average it is 342 W/m2. So, if there are more clouds in and around the equator, this will amplify the cooling effect due to less direct natural insolation of earth (clouds deflect a lot of radiation). Furthermore, in a cooling world there is more likely less moisture in the air, but even assuming equal amounts of water vapour available in the air, a lesser amount of clouds and precipitation will be available for spreading to higher latitudes. So, a natural consequence of global cooling is that at the higher latitudes it will become both cooler in winter and/or warmer and drier in summer.
Best wishes to all
Henry