Climate Deception: How The “Hottest” Temperature Game Is Played To Offset Prediction Failures
Guest essay by Dr. Tim Ball
Global temperature is not doing what the “official” Intergovernmental Panel on Climate Change (IPCC) predicted. Proponents of the claim that humans are the cause of warming, and the cooperative media, react by trying to deflect, divert and perpetuate fear. They exploit people’s lack of knowledge and understanding. A January 2013 ABC News headline, “2012 Was 9th Warmest Year on Record, Says NASA”, is a classic example of how the public is deliberately misled. It is deliberate because it distorts, is out of context, and exploits the manipulation of statistics, or, as Disraeli summarized, “Lies, damned lies and statistics.”
The deception begins with the headline but is expanded in the article. The challenge is to know what is actually being said. Initially, you need a translator, but you can develop sufficient propaganda detectors once the methods are identified. There are guidelines that work in most circumstances:
Don’t believe anything you read.
Question everything.
Be especially suspicious of numbers.
Know the source and its political bias.
If you’re affected by the story, get at least three other sources.
Remember that all government information and data is biased.
Be especially wary of stories that cite authorities.
The opening paragraph to the ABC story says,
“The year 2012 was the ninth warmest globally since record keeping began in 1880, said climate scientists today from NASA. NOAA, crunching the numbers slightly differently, said 2012 was the tenth warmest year, and both agencies said a warming pattern has continued since the middle of the 20th century.”
The implied threat is that the temperature continues its inexorable trend up. The record is 133 years long, with a general warming trend. When would you expect to find the warmest years? Figure 1 provides a hint.

Figure 1
Why are they drawing attention to this by focusing on the “ninth warmest”? Because for the last 15 years the trend has leveled off and declined slightly, in contradiction to their forecast. Figure 2 shows what is actually happening.

Figure 2
The IPCC claims with over 90 percent certainty that what Figure 2 shows is not supposed to happen. Here is the actual data:

Figure 3
Notice how the shift caused a change in terminology to divert attention from the fact that CO2 was no longer causing increasing warming. CO2 levels continue to rise, but temperatures don’t follow. It completely contradicts their predictions, which is why they want to divert attention.
How meaningful is the temperature increase? What is the accuracy of the measure? The IPCC says there was a “trend of 0.6 [0.4 to 0.8]°C (1901-2000)”, that is, for most of the period in the news story. Notice the error range is ±0.2°C, or ±33%. It is a meaningless record.
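As a quick sanity check, the relative error works out as follows (a sketch; the 0.6°C trend and the 0.4 to 0.8°C range are taken from the IPCC quote above):

```python
# IPCC stated trend for 1901-2000 and its uncertainty range (degC).
trend = 0.6
low, high = 0.4, 0.8

error = (high - low) / 2        # half-width of the range: 0.2 degC
relative = error / trend        # 0.2 / 0.6 = one third

print(f"error: +/-{error:.1f} degC ({relative:.0%} of the trend)")
```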
The story cites NOAA and NASA in the standard appeal to authority. However, it’s offset by the observation that they are “crunching the numbers slightly differently” to explain why they disagree between 9th and 10th on the list. How can that be? Aren’t they using the same data? All agencies produce different average temperatures because they select different stations and “adjust” them differently. NASA GISS consistently produces the higher readings, and was most active politically when James Hansen was in charge. Both agencies use the grossly inadequate surface station data.
Although the article limits its claim by acknowledging it is only the 9th warmest in the official record, most people believe it is the 9th warmest ever. It is a misconception deliberately created by political activists like Al Gore and not openly refuted by governments. It is like Gore’s claim that CO2 levels are the highest ever when they are actually the lowest in 300 million years.
So, how long and complete is the official record? A comprehensive study by D’Aleo and Watts, “Surface Temperature Records: Policy-Driven Deception?”, details what was done. Two graphs from NASA GISS show the general pattern.

Figure 4 (Source NASA GISS)
There are fewer than 1,000 stations with records of 100 years, and most of them are severely compromised by the growth of urban areas. Equally important is the decline in the number of stations they consider suitable, especially after 1990. This pattern also partly explains why the current readings are high (Figure 5): the temperature increases as the number of stations used is reduced.

Figure 5
Number of stations plotted against temperature.
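The mechanism the figure is said to illustrate can be shown with made-up numbers (the station values below are hypothetical, purely to illustrate the arithmetic): if the stations that drop out of a network are disproportionately the cold ones, a simple average of the survivors rises even though no individual station warmed.

```python
# Hypothetical annual mean temperatures (degC) for ten stations,
# with the cold (high-latitude/rural) stations listed first.
stations = [-8.0, -5.0, -2.0, 1.0, 4.0, 8.0, 11.0, 14.0, 16.0, 18.0]

before = sum(stations) / len(stations)        # all stations reporting

# Suppose the four coldest stations stop reporting after 1990.
survivors = stations[4:]
after = sum(survivors) / len(survivors)

print(f"average before dropout: {before:.1f} degC")
print(f"average after dropout:  {after:.1f} degC")
# Any 'warming' here is purely an artifact of which stations remain.
```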
Although they condition the terminology “hottest” with “on record”, most people assume it means “ever”. This implication was deliberate. The IPCC rewrote history by eliminating the Medieval Warm Period (MWP), which was warmer than today. Weather agencies increased the slope of the temperature record by lowering the old readings; New Zealand is a good example (Figure 6).

Figure 6
Global temperatures are not following “official” predictions, so those who used global warming for a political agenda try to defend the indefensible. This proves it is political, because the scientific method requires that you admit your science is wrong, determine why, and if possible make adjustments.
– See more at: http://drtimball.com/2013/climate-deception-how-the-hottest-temperature-game-is-played-to-offset-prediction-failures/
Man is an expert liar. Man is also an expert at deception, that is, at phrasing a fact so that it supports his opinion.
We need scientists to be honest enough to look at a fact and admit their opinion (hypothesis) was wrong. Most times, and in most fields (I hope!), that’s not an issue. A scientist is out to find out what is really going on.
Add politics (search for power) and greed (search for grants) and corruption enters in. (I suppose pride and search for fame is also involved.)
Robertv says: August 28, 2013 at 12:28 am
… But We The People don’t protest anymore because we fear our government….
>>>>>>>>>>>>>>>>>
We don’t protest any more because it could mean a ten-year jail sentence as a felony. With a felony conviction you may lose your right to vote permanently if you reside in certain states like Alabama, Florida, Delaware, Virginia and eight other states. In all but two states you lose the right to vote while in prison, and sometimes during parole and probation.
Washington Times:
From Huffington Post:
Voting rights of a felon in the USA:
If you thought the USA was a free country that respected its Constitution… Well, I guess you were wrong.
Jer0me says: August 28, 2013 at 4:01 am
I was also very interested in the sudden drop in the number of stations used for collecting data in 1990. I saw that animation here many years ago, where the stations were plotted on a world map, year by year. The massive drop-off in 1990 was startling.
I, for one, want to know why this was, and how much of the ‘global warming’ of the ’90s was caused by this…..
>>>>>>>>>>>>>>>
May I suggest you look at the articles by Verity at Digging in the Clay on and around this link. She did a lot of research and wrote several articles, as did E. M. Smith; see his “AGW is a Thermometer Count Artifact” and also his “Thermometer Zombie Walk”.
“Pseudepigraphy, the false use of more famous historical names to make your point or writings more significant, was not uncommon before modern times and the Internet.” ~J Gary Fox
There is the anecdote of the historian who commissioned one of his researchers to determine whether the authors referring to past works had accurately represented what the original source had said. They found that in most cases, the citations were not really accurate characterizations of previous works, and in some cases were complete fabrications. And it is worse with the internet, because others very easily repeat the mischaracterized reference.
“Pseudepigrapha (also Anglicized as “pseudepigraph” or “pseudepigraphs”) are
falsely attributed works;
texts whose claimed authorship is represented by a separate author; or
a work “whose real author attributed it to a figure of the past.” ~Wikipedia
Jordan says:
August 28, 2013 at 12:05 am
I’d like to know more about that figure 5.
We expect the average of the entire Earth surface to be around 15 deg C.
If we have poorly distributed measurement stations (e.g. tending to be concentrated in mid-latitudes, like the USA and Europe), their simple average would be a good deal lower than 15, even if there are many thousands of stations.
* * *
Is this what Figure 5 is showing?
PLUS… Thorsten: (August 28, 2013 at 4:02 am)
PLUS… Steve from Rockwood (August 28, 2013 at 5:19 am) AND others…
First, Fred Berple says: There is no evidence these stations were any more unreliable than stations anywhere else. That may be correct, but is the converse true? Is there any evidence that these stations were more reliable than stations anywhere else? We’ve heard for years that the U.S. network was “among the best” in the world, yet we have seen evidence presented by Mr Watts (among others) that many U.S. stations are, and continue to be, very poorly sited and influenced by the UHI, which NOAA implies it can address by corrections.
A bigger issue here is HOW the global mean temp is calculated using the base data from fluctuating numbers of stations. I have been told it is a mathematical construct based on grids. The grids range in shape from roughly rectangular (in a flat projection) near the Equator to triangular at the poles. A mean temp is created — mathematically — for each grid from those met stations that are deemed acceptable to the powers that be. From those mean grid temps, which represent huge geographic areas (especially nearer the Equator), a global mean is created. Two immediate problems:
1. Is that model in and of itself fully appropriate? Are there other procedures that are more robust statistically? I’m guessing here, but I’ll bet it was designed more for computing ease than anything else. We’ll disregard, for the moment, what relevance to reality is actually associated with a calculated “global mean temp.”
2. If there are no met stations inside the grid, temperature values are created — “transported” in by some mathematical construct — from neighboring grids, even if those grids have no reliable met stations within them. In the Arctic and Antarctic, as well as over great expanses of oceans there are no existing, permanent met stations with any real history much less those with >100 years. As a result, interpolation and extrapolation must necessarily be used to “fill the gaps.” As must extrapolation and interpolation based on other extrapolated/interpolated values if the neighboring cells also lack quality met stations.
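The grid-cell procedure described above can be sketched in a few lines (a simplified illustration with hypothetical station readings; real products use anomalies rather than absolute temperatures, latitude-dependent cell geometry, and infilling of empty cells, none of which is attempted here):

```python
import math

# Hypothetical station readings: (latitude, longitude, temperature degC).
readings = [
    (51.5, -0.1, 11.2), (48.9, 2.3, 12.0),      # Europe
    (40.7, -74.0, 12.5), (34.1, -118.2, 18.3),  # USA
    (-33.9, 151.2, 17.9),                       # one Southern Hemisphere station
]

CELL = 30  # degrees per grid-cell side (very coarse, for illustration)

def cell_of(lat, lon):
    return (int(lat // CELL), int(lon // CELL))

# Step 1: average the stations that fall within each cell.
cells = {}
for lat, lon, t in readings:
    cells.setdefault(cell_of(lat, lon), []).append(t)
cell_means = {c: sum(ts) / len(ts) for c, ts in cells.items()}

# Step 2: weight each sampled cell by cos(latitude), so cells nearer the
# poles, which cover less area, count less. Cells with no stations are
# simply absent here -- which is exactly the coverage problem discussed above.
num = den = 0.0
for (ci, cj), mean in cell_means.items():
    lat_center = ci * CELL + CELL / 2
    w = math.cos(math.radians(lat_center))
    num += w * mean
    den += w
print(f"area-weighted mean of sampled cells: {num / den:.1f} degC")
```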
Of course, there’s always the issue of the missing original met records, those once claimed to have been lost due to limited disc drive space (while the “value-added” dataset was retained), and the myriad adjustments over time, those same adjustments we now know (h/t whomever is behind Climategate 1.0) were not documented by the researchers as to what was done and on what basis.
The IPCC is persistently guilty of a kind of deceptive argument that is not the one described by Dr. Ball. In fact, in his current article, Ball makes an argument of this kind. I’m sure this error is inadvertent.
The kind of deceptive argument that I have in mind is an equivocation. I prove the existence of it and of the equivocation fallacy that results from it in the peer-reviewed article at http://wmbriggs.com/blog/?p=7923 .
An equivocation is an argument in which one or more terms change meaning in the midst of the argument. By logical rule, a proper conclusion may not be drawn from an equivocation. To draw an improper conclusion is the equivocation fallacy.
A term that is capable of changing meanings is said to be “polysemic.” In the language of climatology, the word-pair predict/project is polysemic in the frequently observed circumstance that the two words in the word pair are treated as synonyms. As I show in the above referenced article, “predict” has a meaning and “project” has a meaning and the two meanings differ. When the difference between the two meanings is observed, it is found that none of the climate models referenced by AR4 predict. All of them project. A model that projects is incapable of conveying information to a policy maker about the outcomes from his or her policy decisions. Thus, this type of model is useless for the purpose of guiding policy decisions on CO2 emissions. When well meaning people such as Dr. Ball use “project” and “predict” as synonyms this has the effect of covering up the failure of global warming research to provide a basis for making policy.
Monday, February 18, 2013: It’s the Sun, Stupid – The Minor Significance of CO2
1. The IPCC’s Core Problem
The IPCC/Al Gore-based Anthropogenic Global Warming scare has driven global governments’ climate and energy policies since the turn of the century. Hundreds of billions of dollars have been wasted on uneconomic renewable energy and CO2 emission control schemes, based on the notion that it is both necessary and possible to control global temperatures by reducing CO2 emissions. All this vast investment is based on a simple idea, as stated in the IPCC AR4 report:
“we conclude that the global mean equilibrium warming for doubling CO2, or ‘equilibrium climate sensitivity’, is likely to lie in the range 2°C to 4.5°C, with a most likely value of about 3°C. Equilibrium climate sensitivity is very likely larger than 1.5°C.”
These values can only be reached by adopting two completely unfounded and indeed illogical assumptions and procedures:
1. CO2 is simply assumed to be the main climate forcing. This is clearly illogical because at all time scales CO2 changes follow temperature changes.
2. Positive feedback from the other GHGs (notably water vapour and methane) is then added on to the effects of CO2 and attributed to it. Obviously, in nature, the increases in CO2 and humidity are both caused by rising temperatures. It is also impossible to have a net positive feedback, because systems with total positive feedback are not stable and simply run away to disaster. We wouldn’t be here to tell the tale if it were true.
From its inception the IPCC’s remit was to measure anthropogenic climate change; indeed, climate change was defined as anthropogenic until the 2011 SREX report, when the definition was changed. The climate science community simply designed their models to satisfy the political requirements of their funding agencies. Publications, academic positions, peer approval, institutional advancement and grants were unlikely to be forthcoming unless appropriate forecasts of catastrophic warming were dutifully produced. The climate models have egregious structural errors and, what is worse, in their estimates of uncertainty the IPCC reports for policymakers simply ignored this structural uncertainty and gave policy makers and the general public a totally false impression of the likely accuracy of their temperature forecasts. It is this aspect of the AGW meme which is especially unconscionable.
The inadequacy, not to say inanity, of the climate models can be seen by simple inspection of the following Figure 2-20 from the AR4 WG1 report.
Fig1
The only natural forcing is TSI, and everything else is anthropogenic. Under natural forcings should come such things as the Milankovitch orbital cycles, lunar-related tidal effects on ocean currents, the Earth’s geomagnetic field strength, and all the solar activity data time series, e.g. solar magnetic field strength, TSI, SSNs, GCRs (with their effect on aerosols, clouds and albedo), CHs, MCEs, EUV variations and associated ozone variations, and Forbush events. Unless the range and causes of natural variation are known within reasonably narrow limits, it is simply not possible to calculate the effect of anthropogenic CO2 on climate.
The result of this gross error of scientific judgement is seen in the growing discrepancy between global temperature trends and the model projections. The NOAA SSTs show that, with CO2 up 8%, there has been no net warming since 1997, that the warming trend peaked in 2003, and that there has been a cooling trend since that time.
ftp://ftp.ncdc.noaa.gov/pub/data/anomalies/annual.ocean.90S.90N.df_1901-2000mean.dat
The gap between projections and observations is seen below
Fig 2 (from Prof. Jan-Erik Solheim, Oslo)
2. The Real Climate Drivers
The Earth’s climate is the result of resonances between various quasi-cyclic processes of varying wavelengths. The long-wave Milankovitch eccentricity, obliquity and precessional cycles are modulated by solar “activity” cycles with millennial, centennial and decadal time scales. These in turn interact with lunar cycles and endogenous Earth changes in geomagnetic field strength, volcanic activity and, at really long time scales, plate tectonic movements of the land masses. The combination of all these drivers is mediated through the great oceanic current and atmospheric pressure systems to produce the Earth’s climate and weather.
To help forecast decadal and annual changes we can look at, e.g., the ENSO, PDO, AMO and NAO indices and, based on past patterns, make reasonable forecasts for varying future periods. Currently the PDO suggests we may expect 20-30 years of cooling in the immediate future. Similarly, for multidecadal, centennial and millennial predictions we need to know where we are relative to the appropriate solar cycles. The best proxies for solar “activity” are currently the Ap index and the GCR-produced neutron count; their past history can be retrieved from the 10Be data.
In a previous post on http://climatesense-norpag.blogspot.com on 1/22/13, “Global Cooling – Timing and Amount (NH)”, I made suggestions of possible future cooling based on a repetition of the solar millennial cycle. Here I point out for the modellers the value of using the Ap index as a proxy measure of solar activity. Compare the Northern Hemisphere HADSST3 temperature anomaly since 1910 with the Ap index since 1900. Because of the thermal inertia and slow change in the enthalpy of the oceans, there is a 10-12 year delay between the driver proxy and the temperature.
Fig 3 – From Hadley Center
Fig 4 From http://www.leif.org/research/Ap-1844-now.png
There are some good correlations. The 1900 and 1965 Ap lows correspond to the NH temperature minima at 1910 and 1975 respectively. The 1992 Ap peak (Solar Cycle 22) corresponds to the 2003 temperature high and trend roll-over, and as shown in the previous post referred to above it might well represent the roll-over of the millennial solar cycle which brought the Medieval and Roman warming peaks. The NH is used because it is more sensitive to forcing changes and its greater variability makes correlation more obvious.
As a simple conceptual model, the Ap index can be thought of as a simple proxy for hours of sunshine, especially when mentally integrated over a 10-12 year period. See Wang et al:
http://www.atmos-chem-phys.net/12/9581/2012/acp-12-9581-2012.pdf
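The lagged relationship described above (a solar proxy leading temperature by roughly a decade) can be tested numerically by scanning candidate lags for the one that maximizes the correlation. The sketch below uses purely synthetic, noise-free series, not the real Ap or HADSST3 data:

```python
import math

lag_true, n, pad = 11, 120, 24

# Synthetic "solar" driver: a slow sine with a 60-sample period.
solar = [math.sin(2 * math.pi * s / 60) for s in range(n + pad)]
# Synthetic "temperature": the same driver, delayed by lag_true samples
# and scaled (a deliberately clean toy response, with no noise added).
temp = [0.5 * solar[u - lag_true] for u in range(pad, pad + n)]

def corr(x, y):
    # Pearson correlation coefficient of two equal-length sequences.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

# For each candidate lag k, line temp up against the driver k samples earlier.
best_lag = max(range(pad + 1),
               key=lambda k: corr(solar[pad - k: pad - k + n], temp))
print(f"correlation peaks at lag {best_lag}")  # recovers the built-in delay
```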
As far as the future is concerned, the Solar Cycle 23/24 Ap minimum at the end of 2009 is as low as the 1900 minimum, and would suggest both a secular change in solar activity at about 2006 and a coming temperature minimum at about 2019/20. This change is also documented for TSI by Abdussamatov 2012: http://www.ccsenet.org/journal/index.php/apr/article/view/14754
Fig 5.
As a final example for this post, the following figure from Steinhilber et al. (http://www.pnas.org/content/early/2012/03/30/1118965109.full.pdf) shows the close correlation of successive Little Ice Age minima with cosmic ray intensity.
Fig 6
CONCLUSION: IT IS NOW CLEAR THAT THE Ap/GCR/10Be DATA ARE THE BEST PROXY MEASURES OF
THE EARTH’S TEMPERATURE DRIVER OVER MILLENNIAL, CENTENNIAL AND DECADAL TIME SCALES.
THE BEST WAY OF FORECASTING THE FUTURE IS TO PREDICT FUTURE SOLAR CYCLES AT THESE WAVELENGTHS, KEEPING IN MIND THE EARTH’S MAGNETIC FIELD.
The above is courtesy of Dr. Norman Page.
Salvatore Del Prete says:
August 27, 2013 at 12:31 pm
Past history shows beyond a shadow of a doubt that the sun is the driver of the earth climatic system, and this time is going to be no different.
If one goes back through past history, it shows that prolonged active solar periods have been associated with a rise in temperatures, while prolonged solar minimum periods have been associated with a drop in temperature.
However, neutral solar activity, or solar activity that has remained more or less the same over a long period of time, will show very weak or no correlation to the climate at all, or will even run counter to the climate, and that is where so many of you are getting tripped up.
So many are so short-sighted, and so many of you fail to grasp the secondary effects that come when the sun changes from a prolonged active state to a prolonged minimum state.
So many of you have no concept of climatic thresholds; so many of you don’t understand that the beginning state of the climate has much to do with how the climate will wind up, even if the same forcings are applied.
So many of you don’t understand that the climate is non-linear, and that cycles only work when the climate is in the same climatic regime, and even then they are a guide at best.
They say Bond events occur in a cycle every 1,470 years; that is a quasi-cycle at best, with a plus or minus 500-year difference from the mean, which in effect does not make it very cyclic.
So many of you ignore completely the real issue of why the climate has changed abruptly from time to time in a period of a few decades. Cycles do not explain, and cannot be made to fit in with, past abrupt climate changes that have taken place on earth.
My conclusion is that present-day mainstream climatologists are an embarrassment to this very interesting field and have set it back by decades, while their AGW theory will meet its end before this decade ends.
The temperature trend is going to be down once the maximum of solar cycle 24 passes by which is not very far off. I have mentioned the solar parameters needed to set all of this in motion many times in the past.
1. solar flux sub-90, sustained
2. Ap index 5.0 or less, sustained
3. solar wind 350 km/sec or lower, sustained
4. UV light off upwards of 50% in the extreme UV wavelengths, sustained
5. solar irradiance off upwards of 0.015%, sustained
The above, following several years of subdued solar activity in general, which we have had post-2005, in contrast to the very active solar conditions previously.
Clueless fools.
One last note on the sun versus the climate is as follows: the CATCH is that the degree of magnitude change of solar activity, and the duration of that change, has to reach a certain critical level in order to overcome random earthly climatic changes and/or influence those random earthly climatic items (such as ENSO, volcanic activity, and cosmic rays/clouds, to name a few), which will allow them to phase in line with the solar activity rather than show no correlation at all when solar activity is neutral or not changing over the course of many decades.
In addition, I maintain the GHG effect is a result of the climate, not the cause of the climate. It comes as a result of the amounts of CO2/water vapor that are in the atmosphere, which are tied to oceanic temperatures, which are tied to the total energy in the earth’s climatic system to begin with.
NOTE: a weakening geomagnetic field will serve to amplify any solar effects.
Solar changes from states of prolonged active to prolonged inactive conditions are the best explanation for all of the many erratic, jigsaw climatic changes over the earth throughout the ages.
I’d still like to understand more about that Figure 5. Does anybody know where it comes from and how it was calculated?
Look at the scale on the LHS: it ranges from 9.0 to 12.5. The legend says this is “Average-T”, so we can suppose it is an average temperature in degC. If it is an illustration of global mean temperature, we should expect a value of around 15 degC, +/- a few tenths (call this the “expected value”).
Regardless of how the average is being calculated in Figure 5, the method and/or data is clearly a very poor estimator of global average T: the error is systematically low by several degC. Eugene discusses gridding (say, averaging by grid and then combining into a global average), but surely this would be a method which aims to adjust for poor distribution, and we would get something much closer to the expected value from gridding (if there is enough data in each grid to address sampling error in the gridded data).
That’s why I suggested it could be a simple average over all data. It appears to be an estimate of average T, and not anomalies. If so, it would provide a very useful indicator of a sampling issue with the “instrumental record”.
Anomalies and differences are frequently argued to produce a useful measure of the TREND (emphasis) in average T, and it is suggested that this can be precise without needing to measure average T itself. There is a problem with this claim (possibly shown in the above graph): if the instrumental record is so poorly distributed that it produces a huge error in average T, it takes quite a leap to argue that the network can produce an acceptable assessment of the trend in average T. I certainly don’t find this convincing.
Perhaps I can finish by repeating a comment I made above: a small number of stations (with good spatial distribution) might do a much better job of estimating average T compared to a large number of stations with a biased distribution. If there is enough data available in the “instrumental record”, I think it would be better to extract the data from a subset than to try to use all of the data.
The subset would be a network of stations with good distribution which can actually get close to the expected value with an unbiased error distribution.
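The trade-off being argued here, that a small well-distributed network can beat a large biased one, can be illustrated with a toy planet (everything below is invented for the illustration: the latitude-only temperature function, the station counts, and the coefficients are all assumptions, tuned so the true global mean comes out near the 15 degC expected value mentioned above):

```python
import math, random

random.seed(0)

# Toy planet: a "true" temperature that depends only on latitude,
# tuned so the area-weighted global mean is about 15 degC.
def true_temp(lat_deg):
    return 28.5 * math.cos(math.radians(lat_deg)) ** 2 - 4.0

# Area-weighted true global mean (1-degree bands, weight ~ cos(latitude)).
bands = [l + 0.5 for l in range(-90, 90)]
weights = [math.cos(math.radians(l)) for l in bands]
true_mean = (sum(w * true_temp(l) for w, l in zip(weights, bands))
             / sum(weights))

# A large but biased network: 1000 stations crowded into 30-60 N.
biased_mean = sum(true_temp(random.uniform(30, 60))
                  for _ in range(1000)) / 1000

# A small, well-distributed network: 60 stations spread uniformly by
# area (sampling sin(latitude) uniformly gives equal-area coverage).
subset_mean = sum(true_temp(math.degrees(math.asin(random.uniform(-1, 1))))
                  for _ in range(60)) / 60

print(f"true global mean:           {true_mean:.1f} degC")
print(f"1000 mid-latitude stations: {biased_mean:.1f} degC")
print(f"60 well-spread stations:    {subset_mean:.1f} degC")
```

The large biased network lands several degrees cold, in the spirit of the 9.0-12.5 degC range read off Figure 5, while the small distributed one scatters around the true value.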
Stephen Wilde, you are exactly correct about the data changing due to a difference in the number of reporting stations.
As I have said AGW theory is already proven wrong(in the eyes of many) and will be obsolete before this decade is out.
It is the most asinine climate theory I have ever come across.
Figure 5 explains the problem quite well.
SATELLITE DATA should help to alleviate this problem as we move further into the future.
Terry Oldberg:
At August 28, 2013 at 12:29 pm
http://wattsupwiththat.com/2013/08/27/the-hottest-temperature-game/#comment-1402076
you are being egregious when you accuse Tim Ball of making an equivocation error in his article but you do not state what it is.
The accusation is especially offensive because your post demonstrates you do not understand what you have written.
You say
No.
I list some of the misunderstandings concerning equivocation and of IPCC predictions stated in your words I have quoted here.
1.
You are incorrect when you say
That is not true. The truth is
An equivocation is an argument in which one or more terms change meaning in the midst of the argument. By logical rule, a proper conclusion may not be drawn from an equivocation THAT AFFECTS THE CONCLUSION.
2.
You mislead when you claim
In its definitions the IPCC says
So, the IPCC defines a prediction as the projection with highest confidence.
The definition does NOT provide an equivocation because the definition makes a clear distinction between
a prediction (i.e. the forecast with highest confidence)
and
a projection (i.e. a forecast with less confidence than another forecast).
Furthermore, a projection can be converted to become a prediction if it gains confidence, and this does not create an equivocation.
It is important to note that a projection can become a prediction without there being an equivocation. And whether or not your paper has been peer-reviewed has no relevance to this.
3.
You are plain wrong when you say
The IPCC defines that a model’s projection with highest confidence is a prediction.
When the IPCC provides a forecast that the IPCC says is a prediction then the IPCC has made a prediction.
How and why the IPCC made that prediction does not – and cannot – prevent that prediction from being a prediction.
4.
You make a logical error when you refuse to accept a forecast as being a prediction when the forecaster states the forecast is a prediction.
The forecaster alone knows the intention of the forecast. And it is not possible for anyone else to know that the intention of the forecaster is other than what the forecaster says it is.
Therefore, when the forecaster says the forecast is a prediction then there is no possibility of anyone disproving it is a prediction: the most anybody can do is to show the prediction is improbable.
In light of the above, I am willing to accept that your unsubstantiated affront to Tim Ball derives from you not knowing what you are talking about. But, whatever your reason for that affront, you need to withdraw your assertion or substantiate it.
Richard
ferd berple says:
August 28, 2013 at 6:38 am
——————————————-
I am not convinced. Fig. 5 shows a step change, which should show up in the temperature record as an increase in 1990 followed by a constant level. But what we see is a gradual increase in global temperature to 1998 and then a levelling off for almost 16 years. There is no correlation between Fig. 5 and global temperatures.
Steve Keohane says:
August 28, 2013 at 7:29 am
——————————————-
Thanks Steve for the links. I reiterate that the trend in global temperatures does not correlate with Fig. 5 so I conclude the removal of Russian stations does not explain the warming. Otherwise we would see a step change in 1990 and we do not.
jbird says:
“If there is any truth left, it is found in this statement. Whether it is about global warming or some other issue, you simply can no longer trust our government to report the truth. I can’t believe it has come to this.”
It’s time to dust off those old government jokes that the then USSR populace used to tell, and make them applicable to the USA government.
Salvatore, you depend wholly on wiggle matching, and you don’t even do that right. Your premise is not a scientifically valid form of discourse for determining climate drivers. In fact, your comments about the solar/Earth climate connection are becoming more bizarre, and thus more easily attributed to someone who wiggle-matches without any kind of understanding of the two systems he so willingly wants to connect.
I have been wanting to see this graph for a while, so I finally plotted it on Wood for Trees.
I know, giant URL, but it is worth the look. It is the rolling decadal linear trend. It is interesting to see both the amplitude change and the slope change over the decades. I guess the level is the sum of the slopes preceding it.
I only plotted every second year to save time. Interesting to see that the only significant warming was for 3 of the decades (1989-1999, 1991-2001, and 1993-2003). What I would really like to see now is a plot of the slopes of the decadal linear trend.
http://woodfortrees.org/plot/rss/from:1983/to:2013/plot/rss/from:1983/to:1993/trend/plot/rss/from:1985/to:1995/trend/plot/rss/from:1987/to:1997/trend/plot/rss/from:1989/to:1999/trend/plot/rss-land/from:1991/to:2001/trend/plot/rss/from:1993/to:2003/trend/plot/rss/from:1995/to:2005/trend/plot/rss/from:1997/to:2007/trend/plot/rss/from:1999/to:2009/trend/plot/rss/from:2001/to:2011/trend/plot/rss/from:2003/to:2013/trend
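What the linked Wood for Trees URL is doing, fitting a straight trend line to each successive ten-year window, can be reproduced with ordinary least squares. This is a sketch on a synthetic series with roughly the shape described in the thread (warming to the late 1990s, then a plateau); it is not the actual RSS data:

```python
def ols_slope(xs, ys):
    # Ordinary least-squares slope of ys regressed on xs.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic monthly anomaly series for 1983-2013: a steady rise of
# 0.02 degC/yr up to 1998, then a plateau at 0.30 degC.
years = [1983 + m / 12 for m in range(30 * 12)]
series = [0.02 * (y - 1983) if y < 1998 else 0.30 for y in years]

# Fit a straight line to each ten-year window, stepped every two years,
# mirroring the rolling decadal trends in the Wood for Trees plot.
for start in range(1983, 2004, 2):
    window = [(y, s) for y, s in zip(years, series) if start <= y < start + 10]
    xs, ys = zip(*window)
    print(f"{start}-{start + 10}: slope {ols_slope(xs, ys):+.3f} degC/yr")
```

Windows lying entirely on the ramp report the full 0.02 degC/yr slope; windows entirely on the plateau report zero; windows straddling 1998 report something in between, which is the pattern of shrinking decadal trends the comment describes.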
Steve Keohane says: 7:29 am
Steve, I think this graph showing the physical decimation of station locations came out at the same time as the graph you refer to, and gives an idea of location and density change as well as actual vs. estimated temperatures.
http://i44.tinypic.com/23vjjug.jpg
Wow! Thanks for that JPG. The differences between 1965, 1985 and 2005 are just stunning, if not incriminating.
I gotta ask… Are they right? Where are the original images and the source data?
The originals seem to be from John Goetz, Feb. 10, 2008,
published by Steve McIntyre at ClimateAudit: “Historical Station Distribution”
1965 map
1985 map
2005 map
It is one thing to estimate grid cells where there are no thermometers. It is quite another thing to estimate grid cells when you discard real thermometers located in the cells.
See also WUWT, Oct. 15, 2012: “GHCN’s Dodgy Adjustments In Iceland”, or what I call The Smoking Shotgun in Iceland. Iceland is the test case that exposes the shenanigans going on in the adjustment of temperature records.
@ Jeff in Calgary
Thanks for the graph. Bookmarked it. I’m so weary of being told the ‘last decade’ was the hottest on record, which is true but irrelevant. It looks as though the 1997-2007 line is key here, because 2007-17 is likely to be cooler, and then they’ll have to shut [self-snip] up. But they won’t, of course, because they’ll say “we meant the last full decade and this one hasn’t finished yet”. You could bookmark this comment in return. That way you could present it, with your graph, date and all, to the first alarmist who trots out the above predicted quote on January 1st, 2018.
Would not the “ninth warmest year” be necessarily COOLER than eight previous years?
(Re: Glynn Mhor (I don’t spell my last name right, do I, heh) at 5:36pm). LOL, no. It is that 2012 year is the ninth time we’ve had a warmest year in a century*. From the Cult of Climatology bible: “‘You can have 12 warmest years in a century*,’ thus saith the High Prophet of Climatology.” Statisticus 1:21.
*”Century: any 100-year period starting after 1979.” (C.C. Book of Important Words, 2010 ed.)
Very childish stuff from lsvalgaard.
From 1900 to 1950 we see a gradual build-up of monitoring sites, to the point where we have pretty broad coverage of the globe.
Then in the 1960s we see a gradual decline in number of stations leading to a sudden drop in 1990.
The earlier history implies a natural growth in parallel with development, with probably no aim to produce a global figure.
The latter part can only imply some degree of cherry picking has occurred.