Below are the RSS annual averages for 1998 and for 2009 through 2015, along with monthly values for 2016. The Oct 3 and Oct 4 columns show the anomalies as downloaded on October 3 and October 4, 2016, that is, before and after the latest adjustment. Prior to 2009, the annual average differences varied from -0.001 to +0.001.
| Year / Month | Oct 3 | Oct 4 | Diff |
|---|---|---|---|
| 1998 | 0.550 | 0.550 | 0.000 |
| 2009 | 0.217 | 0.223 | 0.006 |
| 2010 | 0.466 | 0.475 | 0.009 |
| 2011 | 0.138 | 0.144 | 0.006 |
| 2012 | 0.182 | 0.188 | 0.006 |
| 2013 | 0.214 | 0.230 | 0.016 |
| 2014 | 0.253 | 0.273 | 0.020 |
| 2015 | 0.358 | 0.381 | 0.023 |
| Jan | 0.665 | 0.679 | 0.014 |
| Feb | 0.977 | 0.989 | 0.012 |
| Mar | 0.841 | 0.866 | 0.025 |
| Apr | 0.756 | 0.783 | 0.027 |
| May | 0.524 | 0.543 | 0.019 |
| Jun | 0.467 | 0.485 | 0.018 |
| Jul | 0.469 | 0.492 | 0.023 |
| Aug | 0.458 | 0.471 | 0.013 |
| Sep | | 0.576 | |
| Oct | | 0.350 | |
This topic was discussed in an informative article on WUWT in October, which I will build on to explain how the adjustments affect the possibility of a 2016 record in light of the October anomaly.
To begin, let us see how the RSS adjustments may affect the comparison between 1998 and 2016. The average of the first eight months using the October 3 numbers is 0.6446; using the October 4 numbers it is 0.6635. The difference between these numbers is 0.0189.
The average of all 10 numbers under the October 4 column is 0.6234. Using this number, 2016 would tie 1998 if the last two months of the year average 0.183. In other words, the last two months need to drop by an average of 0.167 from the October anomaly of 0.350.
As I said above, the average difference between the new and old numbers for the first eight months of 2016 was 0.0189. Now assume that the old numbers for the first ten months of 2016 were each 0.0189 lower than the present numbers. That would give an average of 0.6045. With that number, November and December would need to average 0.2775 for 2016 to set a record, an average drop of only 0.0725 from the present October anomaly of 0.350. (The comparison would differ if an older, lower October anomaly were available.)
Of course, it is much easier for the anomalies to drop by an average of 0.0725 than by 0.167. When the December numbers are in, we will know the impact of the RSS adjustment on the 2016 average and whether it breaks the 1998 record.
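For those who wish to check the arithmetic, here is a minimal sketch in Python. The helper function is my own, not anything from RSS; the monthly values are the October 4 column of the table above.

```python
def required_remaining_avg(months_so_far, record, months_in_year=12):
    """Average the remaining months must reach for the annual mean to equal `record`."""
    remaining = months_in_year - len(months_so_far)
    return (record * months_in_year - sum(months_so_far)) / remaining

# RSS, October 4 numbers, January to October 2016:
rss_oct4 = [0.679, 0.989, 0.866, 0.783, 0.543,
            0.485, 0.492, 0.471, 0.576, 0.350]

need = required_remaining_avg(rss_oct4, 0.550)  # 0.550 = 1998 average
print(round(need, 3))                           # 0.183 to tie 1998
print(round(0.350 - need, 3))                   # 0.167 drop from October

# The same with each month 0.0189 lower, i.e. the "old" numbers:
need_old = required_remaining_avg([m - 0.0189 for m in rss_oct4], 0.550)
print(round(need_old, 4))                       # 0.2775
```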
By comparison, here is what is necessary for UAH to set a record in 2016. After large drops from February to June, the anomalies changed course and rose: the average of the last four months is 0.418, which is 0.080 above the June anomaly of 0.338! Keep in mind that ENSO numbers dropped every month this year. To set a record in 2016, the UAH anomaly for the last two months has to average 0.219, a drop of 0.189 from the October anomaly of 0.408.
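The same sketch applies to UAH, using the monthly values from the Section 2 table and the 1998 UAH average of 0.484:

```python
uah_2016 = [0.540, 0.831, 0.733, 0.714, 0.544,
            0.338, 0.388, 0.434, 0.441, 0.408]  # January to October 2016

need = required_remaining_avg(uah_2016, 0.484)
print(round(need, 4))          # 0.2185, the 0.219 quoted above
print(round(0.408 - need, 4))  # 0.1895, roughly the 0.189 drop quoted above
```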
It is still possible for the 1998 record to stand after 2016 for both RSS and UAH; however, that would require a significant drop in the November anomaly from the October anomaly in each case, and a much larger one for RSS than for UAH.
Another impact of the RSS adjustment is on the length of the recent pause. Previously, the pause length was 18 years and 9 months. Naturally, with every annual average anomaly since 2009 going up, that pause has now shortened. The longest period of more than 18 years over which the slope is at its minimum starts in December 1997. Prior to the latest adjustments, the slope from December 1997 to August 2016 was 0.277 °C/century; comparing apples to apples, the new slope over the same period is 0.396 °C/century. That is an increase of 43%. It is now significantly harder for the pause to return using RSS.
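A minimal sketch of how such slopes are computed: fit an ordinary least-squares line to the monthly anomalies and rescale from degrees per month to degrees per century. The series below is a random placeholder; to reproduce the 0.396 figure, you would substitute the actual RSS monthly anomalies for December 1997 through August 2016 (225 values).

```python
import numpy as np

# Placeholder series for illustration only; replace with the real
# RSS TLT monthly anomalies for Dec 1997 .. Aug 2016 (225 values).
rng = np.random.default_rng(42)
anomalies = 0.0003 * np.arange(225) + rng.normal(0.2, 0.1, 225)

months = np.arange(len(anomalies))
slope_per_month = np.polyfit(months, anomalies, 1)[0]  # degrees per month
print(f"{slope_per_month * 12 * 100:.3f} degrees/century")
```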
Note: The October 4 numbers used in the analysis above may differ from the present RSS numbers by up to 0.002 due to additional minor recent adjustments. For the latest numbers from RSS, see the table below.
In the sections below, we present the latest facts. The information is arranged in two sections and an appendix. The first section shows for how long there has been no statistically significant warming on several data sets. The second section shows how 2016 so far compares with 2015 and with the warmest years and months on record; for three of the data sets, 2015 also happens to be the warmest year. The appendix illustrates sections 1 and 2 in a different way, using graphs and a table. Only the satellite data go to October.
Section 1
For this analysis, data were retrieved from Nick Stokes’ Trendviewer, available on his website. This analysis indicates for how long there has not been statistically significant warming according to Nick’s criteria. Data go to their latest update for each set. In every case, note that the lower error bar is negative, so a slope of 0 cannot be ruled out from the month indicated.
On several different data sets, there has been no statistically significant warming for between 0 and 23 years according to Nick’s criteria. Cl stands for the confidence limits, in °C/century, at the 95% level.
The details for several sets are below.
For UAH6.0: Since October 1993: Cl from -0.029 to 1.792. This is 23 years and 1 month.
For RSS: Since July 1994: Cl from -0.011 to 1.784. This is 22 years and 4 months.
For Hadcrut4.4: The warming is statistically significant for all periods above three years.
For Hadsst3: Since February 1997: Cl from -0.029 to 2.124. This is 19 years and 8 months.
For GISS: The warming is statistically significant for all periods above three years.
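For readers who want to see what such a test looks like, here is a minimal sketch of the basic idea: fit a trend and check whether the 95% confidence interval for the slope includes zero. This plain OLS version is only illustrative; Nick’s program applies its own criteria, including (as I understand it) a correction for autocorrelation that widens the intervals, so the dates above will not be reproduced exactly by it.

```python
import numpy as np
from scipy import stats

def trend_ci(anomalies, conf=0.95):
    """OLS slope in degrees/century with a two-sided confidence interval."""
    years = np.arange(len(anomalies)) / 12.0
    res = stats.linregress(years, anomalies)
    half = stats.t.ppf(0.5 + conf / 2, len(anomalies) - 2) * res.stderr
    return (res.slope - half) * 100, (res.slope + half) * 100

# Placeholder data for illustration only; e.g. UAH6.0 from October 1993
# to October 2016 would be 277 monthly values.
rng = np.random.default_rng(0)
lo, hi = trend_ci(rng.normal(0.3, 0.15, 277))
print(f"Cl from {lo:.3f} to {hi:.3f}")
print("not statistically significant" if lo < 0 < hi else "significant")
```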
Section 2
This section shows data about 2016 and other information in the form of a table. The five data sources are listed along the top and repeated at intervals so they remain visible throughout. The sources are UAH, RSS, Hadcrut4, Hadsst3, and GISS.
Down the column are the following:
1. 15ra: This is the final ranking for 2015 on each data set.
2. 15a: Here I give the average anomaly for 2015.
3. year: This indicates the warmest year on record so far for that particular data set. Note that the satellite data sets have 1998 as the warmest year and the others have 2015 as the warmest year.
4. ano: This is the average of the monthly anomalies of the warmest year just above.
5. mon: This is the month where that particular data set showed the highest anomaly prior to 2016. The months are identified by the first three letters of the month and the last two numbers of the year.
6. ano: This is the anomaly of the month just above.
7. sig: This is the first month for which warming is not statistically significant according to Nick’s criteria. The first three letters of the month are followed by the last two numbers of the year.
8. sy/m: This is the length, in years and months, of the period in row 7.
9. Jan: This is the January 2016 anomaly for that particular data set.
10. Feb: This is the February 2016 anomaly for that particular data set, etc.
19. ave: This is the average anomaly of all months to date.
20. rnk: This is the rank that each particular data set would have for 2016, without regard to error bars and assuming no changes to the current average anomaly. Think of it as an update 50 minutes into a game. (A sketch of how rows 19 and 20 can be computed follows this list.)
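Rows 19 and 20 are straightforward to reproduce. Here is a minimal sketch, with a helper name of my own choosing; for the rank, only past annual averages that beat the 2016 average to date matter, and error bars are ignored as described in row 20.

```python
def ave_and_rank(months_2016, past_annual_averages):
    """Row 19 (mean of the months so far) and row 20 (provisional rank)."""
    ave = sum(months_2016) / len(months_2016)
    rank = 1 + sum(1 for a in past_annual_averages if a > ave)
    return ave, rank

# RSS monthly anomalies from the table below, January to October 2016:
rss_2016 = [0.679, 0.991, 0.868, 0.784, 0.543,
            0.485, 0.492, 0.471, 0.578, 0.350]

ave, rnk = ave_and_rank(rss_2016, [0.550])  # 0.550 = 1998 record average
print(round(ave, 3), rnk)                   # 0.624, 1st
```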
| Source | UAH | RSS | Had4 | Sst3 | GISS |
|---|---|---|---|---|---|
| 1.15ra | 3rd | 3rd | 1st | 1st | 1st |
| 2.15a | 0.261 | 0.381 | 0.760 | 0.592 | 0.87 |
| 3.year | 1998 | 1998 | 2015 | 2015 | 2015 |
| 4.ano | 0.484 | 0.550 | 0.760 | 0.592 | 0.87 |
| 5.mon | Apr98 | Apr98 | Dec15 | Sep15 | Dec15 |
| 6.ano | 0.743 | 0.857 | 1.024 | 0.725 | 1.11 |
| 7.sig | Oct93 | Jul94 | | Feb97 | |
| 8.sy/m | 23/1 | 22/4 | | 19/8 | |
| Source | UAH | RSS | Had4 | Sst3 | GISS |
| 9.Jan | 0.540 | 0.679 | 0.906 | 0.732 | 1.16 |
| 10.Feb | 0.831 | 0.991 | 1.068 | 0.611 | 1.34 |
| 11.Mar | 0.733 | 0.868 | 1.069 | 0.690 | 1.30 |
| 12.Apr | 0.714 | 0.784 | 0.915 | 0.654 | 1.09 |
| 13.May | 0.544 | 0.543 | 0.688 | 0.595 | 0.93 |
| 14.Jun | 0.338 | 0.485 | 0.731 | 0.622 | 0.75 |
| 15.Jul | 0.388 | 0.492 | 0.728 | 0.670 | 0.84 |
| 16.Aug | 0.434 | 0.471 | 0.768 | 0.654 | 0.97 |
| 17.Sep | 0.441 | 0.578 | 0.714 | 0.607 | 0.91 |
| 18.Oct | 0.408 | 0.350 | | | |
| 19.ave | 0.537 | 0.624 | 0.841 | 0.646 | 1.03 |
| 20.rnk | 1st | 1st | 1st | 1st | 1st |
| Source | UAH | RSS | Had4 | Sst3 | GISS |
If you wish to verify all of the latest anomalies, go to the following:
For UAH, version 6.0beta5 was used.
http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tlt/tltglhmam_6.0beta5.txt
For RSS, see: ftp://ftp.ssmi.com/msu/monthly_time_series/rss_monthly_msu_amsu_channel_tlt_anomalies_land_and_ocean_v03_3.txt
For Hadcrut4, see: http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.5.0.0.monthly_ns_avg.txt
For Hadsst3, see: https://crudata.uea.ac.uk/cru/data/temperature/HadSST3-gl.dat
For GISS, see: http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
To see all points since January 2016 in the form of a graph, see the WFT graph below.

As you can see, all lines have been offset so they all start at the same place in January 2016. This makes it easy to compare January 2016 with the latest anomaly.
The thick double line is the WTI, which shows the average of RSS, UAH6.0beta5, HadCRUT4.4 and GISS. Unfortunately, WTI will not be updated until HadCRUT4.5 appears.
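For anyone without WFT at hand, the offsetting itself is trivial: subtract each series’ January 2016 value so every line starts at zero. Here is a minimal sketch using the numbers from the Section 2 table (Had4, Sst3 and GISS run only to September):

```python
jan_2016 = {"UAH": 0.540, "RSS": 0.679, "Had4": 0.906, "Sst3": 0.732, "GISS": 1.16}
latest   = {"UAH": 0.408, "RSS": 0.350, "Had4": 0.714, "Sst3": 0.607, "GISS": 0.91}

# Change from January 2016 to the latest available month:
for name, jan in jan_2016.items():
    print(f"{name}: {latest[name] - jan:+.3f}")
```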
Appendix
In this part, we summarize the data for each set separately.
UAH6.0beta5
For UAH: There is no statistically significant warming since October 1993: Cl from -0.029 to 1.792. (This is using version 6.0 according to Nick’s program.)
The UAH average anomaly so far for 2016 is 0.537. This would set a record if it stayed this way. 1998 was the warmest at 0.484. Prior to 2016, the highest ever monthly anomaly was in April of 1998 when it reached 0.743. The average anomaly in 2015 was 0.261 and it was ranked 3rd.
RSS
Presently, for RSS: There is no statistically significant warming since July 1994: Cl from -0.011 to 1.784.
The RSS average anomaly so far for 2016 is 0.624. This would set a record if it stayed this way. 1998 was the warmest at 0.550. Prior to 2016, the highest ever monthly anomaly was in April of 1998 when it reached 0.857. The average anomaly in 2015 was 0.381 and it was ranked 3rd.
Hadcrut4.5
For Hadcrut4.5: The warming is significant for all periods above three years.
The Hadcrut4.5 average anomaly so far is 0.841. This would set a record if it stayed this way. Prior to 2016, the highest ever monthly anomaly was in December of 2015 when it reached 1.024. The average anomaly in 2015 was 0.760 and this set a new record.
Hadsst3
For Hadsst3: There is no statistically significant warming since February 1997: Cl from -0.029 to 2.124.
The Hadsst3 average anomaly so far for 2016 is 0.646. This would set a record if it stayed this way. Prior to 2016, the highest ever monthly anomaly was in September of 2015 when it reached 0.725. The average anomaly in 2015 was 0.592 and this set a new record.
GISS
For GISS: The warming is significant for all periods above three years.
The GISS average anomaly so far for 2016 is 1.03. This would set a record if it stayed this way. Prior to 2016, the highest ever monthly anomaly was in December of 2015 when it reached 1.11. The average anomaly in 2015 was 0.87 and it set a new record.
Conclusion
Does it surprise you that the RSS adjustment made the pause much more difficult to resume and made it much easier for 2016 to break the 1998 record?
It would make sense to validate the ‘guesstimation’ algorithms used for those areas with missing surface stations. Surface stations with measurements should be guesstimated using the algorithms, and the real measurements compared with the guesstimates. My hypothesis is that the guesstimates will be found to be wildly inaccurate and ‘unsurprisingly’ all on the high side. Remember, the vast majority of surface temperatures are invented by these guesstimates. How many surface stations are measuring the poles, where these extreme temperatures are guesstimates? The warmest year evah is won by a mathematically invented few hundredths of a degree, based on these unvalidated guesstimates.
Ian W on November 20, 2016 at 3:51 am
It would make sense to validate the ‘guesstimation’ algorithms used for those areas with missing surface stations.
Even in the arctic regions, you have 251 active GHCN stations within 60N-70N, 44 within 70N-80N, and 3 within 80N-82.5N.
What they measure is, by the evidence, much higher than what satellites measure (a linear trend of 8.6 °C/decade for 60N-70N, and of over 12 °C/decade above that). But the average trend for the highest latitudes satellites provide data for (80N-82.5N) is far above 4 °C/decade too.
What puzzles me all the time is that:
– the same people who complain so loudly about “missing surface stations” are the same people who are fully satisfied with comparisons between satellite measurements and a handful of radiosonde measurements;
– scientifically approved engineering interpolation methods like kriging, used all around the world by tens of thousands of people in their daily work, are incessantly questioned by exactly one category of persons: climate sceptics lacking any real professional knowledge of what they doubt.
Apologies for a stoopid mistake: that should be °C/century!
Is it so surprising that people (self-proclaimed “climate scientists”) who repeatedly showed they cannot be trusted, and who would rather push their political agenda than do science, are casting doubt on everything they touch? Should they state that 2+2=4, many will begin to doubt that, too.
Climate “science” has tremendously damaged science.
And, remember, you’ll find as many “AGW believers lacking any really professional knowledge concerning what they believe in” as you’ll find “climate sceptics lacking any really professional knowledge concerning what they doubt about”.
paqyfelyc: are you a retired teacher? You behave so terribly similar…
Bindidon
Who cares who I am? What if I were a retired SS officer, the current president-elect, a janitor, a TV star, or your niece?
IMO it doesn’t smell very good when people worry more about who’s talking than about the validity of what is said.
This is the 138th comment on a post debating whether the butcher should rest his thumb on one side of the scale or the other, and how hard he should press.
How about just weighing the meat?
To the attention of all those commenters who write about “large modifications” performed this year in the RSS3.3 TLT dataset, here is a comparison of these so-called “large modifications” with the difference between UAH5.6 TLT and UAH6.0beta5 TLT:
http://fs5.directupload.net/images/161121/8m39gno4.jpg
Critique is a good tool indeed, but it must be used properly.
The differences between the differences 🙂 are even a bit more visible below:
http://fs5.directupload.net/images/161112/2q7jcz25.jpg
So this article is saying that it is now impossible to cherry-pick, with this multiply adjusted data, a starting point which shows a ‘pause’?
Meanwhile, Arctic sea ice sets a new record low for this time of year, Arctic temps have shown a 36 degree F anomaly, and the sea ice extent actually decreased over the weekend…
https://ads.nipr.ac.jp/vishop/#/extent
…a clear indicator of warming!
The long pause disappeared in February. However, the long pause was expected to return if there were a long and deep La Niña. But these adjustments have delayed the return of the pause further into the future, and possibly put into question whether it will ever return on RSS.
The point of the pause was that, according to AGW theory, it could happen for a short duration, but not for 15 years or more. Yet it happened nonetheless: the theory is broken, end of story. Even if the pause ended (or not… a random walk can run upward for a long period too, you know?).
“…a clear indicator of warming!” Not so clear. Melting Arctic glaciers revealing ancient forests from a few centuries ago, THAT is indeed a clear indicator of warming! Too bad they also show that Greenland was actually green, and that Alaska was too, well before the GHG hysteria. What do you think was the extent of Arctic sea ice back in those days?
“Indicators of warming” does not equate to “indicators of human-induced,” or for that matter, “indicators of CO2-induced,” warming. Since “natural variation” is poorly understood in some mighty significant ways, whether there is some minuscule upward trend in “average” temperature (another relatively meaningless metric) over some relatively short period of time doesn’t support any call to “action” in terms of “policy.” When you can scientifically prove that CO2 is the driver of warming, AND that human fossil fuel burning is the cause of rising CO2 levels, AND that the amount of warming caused thereby will be extreme and catastrophic in the real world (as opposed to the computer-model fantasy world), then come and present it. Until then, there are ACTUAL problems to use our resources to solve.
All the accounts I have seen of ‘ancient forests’ underneath glaciers have been from at least 1,000 years ago. Which ones are you referring to?
+1 AGW is not Science
@Phil : those from when Greenland was named, as should be obvious from my full sentence
paqyfelyc November 22, 2016 at 2:18 am
+1 AGW is not Science
@Phil : those from when Greenland was named, as should be obvious from my full sentence
OK, but where are these “melting arctic glaciers revealing ancient forests from a few century ago”?
I’m unable to find any reference to them.
Inside the search bar, I typed “ancient arctic forests” and pressed enter, and the first article that came up was:
https://wattsupwiththat.com/2014/02/06/another-hot-model-forest-emissions-wildfires-explain-why-ancient-earth-was-so-hot/
Werner Brozek November 22, 2016 at 10:02 pm
“OK but where are these “Melting arctic glaciers revealing ancient forests from a few century ago”
I’m unable to find any reference to them?”
Inside the search bar, I typed “ancient arctic forests” and pressed enter, and the first article that came up was:
Which doesn’t refer to “Melting arctic glaciers revealing ancient forests from a few century ago”, either!
The sentence that caught my eye was:
But keep looking with the search bar for something that hits the nail on the head in a better way for you.
That WUWT article was about Arctic forests in the Pliocene time, at least 2 million years ago. I think the claims about retreating glaciers refer to the Mendenhall, of which Wiki says:
“The most recent stumps emerging from the Mendenhall are between 1,400 and 1,200 years old. The oldest are around 2,350 years old. Some have dated around 1,870 to 2,000 years old.”
Thank you Nick!
Feel free to comment on
https://wattsupwiththat.com/2016/11/18/rss-resets-former-pause-length-and-2016-record-race-now-includes-september-and-october-data/#comment-2350606
Griff on November 21, 2016 at 5:38 am
…a clear indicator of warming!
You simply discredit yourself, at the least. As commenter paqyfelyc correctly notes, there is plenty of proof of that, but your chart does exactly the contrary, and thus your comment is…
… a clear indicator of alarmism.
If you want to show warming, allow me to propose as a source any data showing a trend over some longer period 🙂
It’s 10am.
I just woke up.
Maybe I’m still a little hazy.
This article put me in a bad mood.
It appears that I am looking at average temperature anomaly data in thousandths of a degree Centigrade.
There are no average temperature measurements that support three decimal places!
I doubt that an accuracy of +/- 0.1 degrees Centigrade is possible, since the satellite measurements are not direct measurements of the temperature, they are not made at the surface of the planet, the poles are not covered well, data from many satellites have to be combined, and there are many “adjustments” to the raw data too.
There are also humans involved who may have biases, although Mr. Spencer and his UAH team try very hard to avoid any appearance of bias; yet you chose to present mainly RSS data?
I can only come up with two reasons to use three decimal places:
(1) Trying to con people that the data are extremely accurate (as NASA does with two decimal places for their surface average temperature), or the one I believe is true
(2) Mathematical mass-turbation (the author loves to play with numbers).
Presenting average temperature data with three decimal places is bad science.
Don’t we already get enough bad science from the scaremongers?
Author JusttheFactsWUWT must be slapped upside the head and retired until he can come up with an article that refutes some of the climate scaremongers’ false claims.
Playing with thousandths of a degree C. (unintentionally) makes him a “useful i-diot” for the scaremongers, and here’s why:
— The scaremongers look at the FOREST when they claim +2 degrees warming will end life on Earth as we know it
— You probably expected me to say this article looks at the TREES in the forest.
— Wrong.
— This article looks at the LEAVES on the trees in the forest !
— The scaremongers LOVE to have skeptics debating tenths of a degree C. while they spin their +2 degree C. tipping point nonsense.
— Even better for skeptics to be busy discussing thousandths of a degree Centigrade!
Since no one seems to care about data margins of error for the average temperature of the surface of our planet, I’m unofficially declaring the following margins of error based on common sense — I would be happy if there is proof the margins are smaller:
Surface data since 1880 +/- 1 degree C.
Surface data since 1980 +/- 0.5 degree C.
Satellite data since 1979 +/- 0.25 degrees C.
In my opinion claims of a +/- 0.1 degree C. margin of error are unproven bull—-
I’m going back to sleep.
My free climate blog for non-scientists
Covers politics and science of climate
http://www.elOnionBloggle.Blogspot.com
Do not blame JusttheFacts. I wrote the article and he edited it. I accept the full blame for not realizing that my name was missing until late in the first day. I had no intention of mentioning this fact, but with your criticism of JusttheFacts, I had no choice.
Richard Greene on November 22, 2016 at 8:52 am
Your lack of knowledge and experience in the field debated here is horrifying.
The data as downloaded and presented by Werner Brozek are very often subject to further analysis, or even combination with other datasets. The more accuracy in the data, the better the further processing.
That’s the reason why, e.g., Roy Spencer publishes data with two digits behind the decimal point. Three would be even better.
I’m quite happy about that! It lets me compare his data with, e.g., GHCN or ERSST4, or compute linear trend estimates from it that otherwise would degenerate into bare nonsense.
If you don’t like this level of accuracy, then simply ignore it. But please don’t expect others to do such a stupid job for you.
Character attacks on me do not make YOU seem intelligent.
The article even had some numbers with four decimal places.
This is false precision.
And just a waste of time.
There is no three or two decimal point accuracy in average temperature data.
We were angry when NASA presented the average temperature of our planet with two decimal places while claiming hard to believe accuracy of +/- 0.1 degree C.
Three and four decimal places are even worse.
These data are not accurate to a tenth of a degree C., no matter who claims that accuracy. If I’m wrong about that, provide evidence that I am wrong.
A character attack is not evidence!
This is statistical mass-turbation by people who love playing with numbers!
Statistical analysis of inaccurate, rough data to three decimal places is a waste of time
If all skeptics were nasty like you, and in love with meaningless thousandths-of-a-degree false accuracy, the global warmunists would win!
The warmunists look at the forest — the +2 degree rise is the tipping point
The skeptics should refute them.
Instead, this article looks at the leaves on the trees in the forest.
Anomalies in thousandths of a degree C.???
This is not science.
It is mathematical mass-turbation that does nothing to refute the coming global warming catastrophe fantasy.
This is EXACTLY what the warmunists want the skeptics to spend time on.
They’d be happy if we looked at tenths of a degree.
Hundredths of a degree = even better to occupy our time.
Thousandths of a degree = a total waste of skeptics’ time.
As a retired physics teacher, I completely understand the rules for significant digits. But in these articles, I use the numbers that the various data sets provide. If I were to round off all numbers to the nearest whole degree, every number in my tables would be either a 1 or a 0. How useful would that be? Oh sure, I could put +/- 0.1 after each number, but why clutter things up needlessly? Just take all numbers with a grain of salt.
There is no two, three or four decimal place accuracy in average temperature data.
Was it not just a few years ago that NASA claimed the average temperature of our planet set a new record by something like two hundredths of a degree C. ?
And NASA did that while claiming a very hard to believe +/- 0.1 degree C. margin of error.
Bad math and bad science.
If you believe satellite data claims of +/- 0.1 degree C. accuracy, which I don’t, you could round off all data to the nearest one-tenth of a degree C.
I never suggested you must round to the nearest degree — that is a “red herring” you tossed in just to ridicule me.
Your article is a poster child for the false precision logical fallacy.
http://research.omicsgroup.org/index.php/False_precision
Any conclusions are nearly meaningless without reasonable margins of error considered.
I believe you have done similar articles in the past — please stop!
It is time to stop your number games, and write something useful to refute the coming global warming catastrophe myth.
Statistical analyses applied to rough, inaccurate average temperature numbers do not make the numbers more accurate — in fact, false conclusions are possible … but many people can be impressed by three decimal places, and that must be why you love false precision.
Richard Greene on November 27, 2016 at 1:07 pm
There is no two, three or four decimal place accuracy in average temperature data.
I quote the site you linked to:
However, in contrast, it is good practice to retain more significant figures than this in the intermediate stages of a calculation, in order to avoid accumulated rounding errors.
That is, as I told you more than once, exactly the reason why these numbers have so many digits behind the decimal point.
If, instead of having so much time to waste producing useless comments about data, you had to do daily professional work with that data, you would never write such comments.
So please let people do their job as they need to do.
True! I apologize for that!
However you did say:
So exactly what do you expect me to do? What I will do is use the numbers they give me, and at the end of the year I will see whether their new record, should it occur, is statistically significant or not. Rounding off all numbers in all intermediate months and subtracting rounded differences may give a totally different number than rounding once at the end.
Let me illustrate with an example. Suppose I had a number like 24.746 and decided that I could only go to the nearest 1/100. That would give 24.75. But then I changed my mind and decided I should go only to the nearest 1/10. Then 24.75 becomes 24.8. But if I had rounded 24.746 to the nearest 1/10 right away, I would have gotten 24.7.
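For anyone who wants to try this double-rounding effect themselves, here is a minimal sketch in Python; decimal with ROUND_HALF_UP matches the hand arithmetic above, whereas plain floats round half to even and can behave differently.

```python
from decimal import Decimal, ROUND_HALF_UP

x = Decimal("24.746")

# Round once, straight to the nearest 1/10:
once = x.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)        # 24.7

# Round twice: first to the nearest 1/100, then to the nearest 1/10:
twice = x.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)      # 24.75
twice = twice.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)   # 24.8

print(once, twice)  # 24.7 24.8
```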
So wait until the end of the year and I will decide which records are statistically significant or not.
In case you are wondering, all 5 data sets that I am covering may set a record in 2016, but only GISS has a chance of being statistically significant.
Werner: I give you one thumb up!
Do you know where the “tipping point” comes from? It was the German Hans Joachim Schellnhuber who compared the earth with a human body. He determined that when the body is 2° warmer, at 39 °C, it has a fever, and so does the earth when it is 2° warmer. That’s the scientific basis for the global warming terrorist faction.
I knew about Hans Joachim Schellnhuber and the 2 degrees, but I did not know he compared it to the human body.
https://www.youtube.com/watch?v=Z7uK92wKInw 4:30 ff (German language)
Danke Schoen!
Go watch the “No Certain Doom” video on this site. ALL these temps are irrelevant, as ALL the models have the same plus or minus errors that after a century of calculations equal plus or minus 14 C. None of the models means ANYTHING!
…to those who simply do not know the difference between actual in situ measurements, PowerPoint video numbers, and model projections.