Guest Post by Leo Smith, Gary Pearse and Werner Brozek, Edited by Just The Facts


Image Credits: Picture Australia, and Doubleday and McClure 1897
“In 1894, the Times of London estimated that by 1950 every street in the city would be buried nine feet deep in horse manure. One New York prognosticator of the 1890s concluded that by 1930 the horse droppings would rise to Manhattan’s third-story windows.” – No Fracking Consensus
We may now laugh at the above perceived problem, or at the additional examples mentioned by Gary Pearse below. But how will people 50 years from now judge “us”? Will they laugh at our collective inability to see past our noses? Should we approach things differently?
The previous post with September data produced many gems among its more than 500 comments. Unfortunately, I cannot use all of the gems in this post; I intend to use others in later posts. For now, I will reproduce two comments, one by Leo Smith and the other by Gary Pearse.
Leo Smith:
November 7, 2015 at 2:55 am
Robert G Brown is one of the few people I look up to, not because he has solved the problem of Climate (change) but because he accurately understands the almost complete impossibility of understanding it!
Years ago someone said to me that, faced with a problem that you didn’t know how to solve, you must go back to first principles. None of the theories and equations helping? Start from scratch and develop new ones!
Robert G Brown reminds us that the ‘easy’ problems that we can solve with the application of linear differential equations (and in a sense, scientific theories are simply differential equations, like F=ma) have already been solved, and that what remain are the fiendishly hard problems that, even if we can identify the differential equations that govern system behaviour, are practically incalculable because of the inherent non-linearity of multiple terms, all of which affect each other.
In essence this approach – finding the underlying (partial) differential equations – won’t work, not because we get the raw science wrong, but because the integration of those partials over time leads us into sensitivity issues and chaotic behaviours that are essentially the nature of the beast. Larger and larger supercomputers merely extend the size of the area we can predict with some degree of accuracy, from the minuscule to the pathetically small.
And this is why even this brave attempt to identify all the variables, and even establish the correct partial differential equations based upon them will not result in a computer model that accurately predicts the climate.
The only approach that I have ever come across that partially works is to examine the possible cases and eliminate those that are completely unstable – that is, if we consider all possible climates in terms of stability, we will find that huge collections of them are so mightily unstable that, should a perturbation of the system by e.g. volcanic eruption or meteor strike or even releases of lots of lovely CO2 cause the system to enter such a region, the overwhelming tendency would be to revert back to a more stable region.
That is, we might be able to map climate into zones of possible quasi-stability and zones of impossible instability. If you like, instead of working out what the climate will be, we could at least ascertain what it simply couldn’t be. And then leave the rest as ‘what it could and might be’.
This alone is probably what an organisation like the IPCC should be tasked with – what are the possible states of future climate, what are their potential probabilities, and impacts, and how should we meet the challenges – not by attempting to stop them happening, but by identifying the physical and social and economic changes necessary to adapt to them.
In my time beyond engineering as a businessman, I learnt a golden rule. Do not expend effort on attempting to change that which is inevitable, nor on attempting to solve that which is – for whatever reason – effectively insoluble: rather use the techniques of pragmatism – as practised by both engineers, and oddly enough, the military – and consider all the possibilities, do the research or reconnaissance to ascertain which of them are likely, plan accordingly, make tentative steps forward, and as soon as it appears that the situation is not as it appeared to be, change the plan without shame.
In other words, going back to first principles, as a putative agent of government, the real question is not ‘where is the climate going’ but ‘where might the climate go, with what probability, and, given that it’s unlikely we can in all honesty stop it, what should be a meaningful response that preserves as much of civilisation as is practicable’?
I know that the final answer would be along the lines of :
‘Almost anywhere within a degree or two, a few cm or so of sea level, a few cm or so of rainfall, and indeed along any of the lines that the historical record has already shown us are certainly possible’, and as to what we ought to do about it, the final answer there would be: ‘be prepared, with a contingency fund, to meet whatever Nature sends, but don’t waste a single halfpenny on trying to stop it or second-guessing what it’s going to do, because frankly the mathematics is insoluble to that level of detail’.
And to PROVE that the ‘mathematics is insoluble to that level of detail’ is the first step.
It’s not just a matter of finding the right equations; don’t waste time on that. The simpler job is to prove that even if you did find them, they wouldn’t actually allow integration to a realistic and useful prediction anyway.
All we need to do is have enough of the relevant parameters to show that the problem is chaotic and non-linear, calculate the size of computer needed to give an answer in real time rather than hindcasting, and that will show that all climate science of the sort that is claimed to be ‘settled’ is in fact completely useless.
Not that it will change a damned thing politically, because the mathematics to do that would be beyond nearly everyone – especially ‘climate scientists’ who are mainly, at best, third rate alchemists – and as we know, that which passeth all understanding, is in the end a matter of faith to those whom it passeth….
(The above ends Leo’s post. A small numerical sketch of the sensitivity he describes follows below.)
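To illustrate the sensitivity Leo refers to, here is a minimal sketch (not taken from his comment) using the logistic map as a stand-in for a chaotic system: two runs that start one part in a million apart diverge completely within a few dozen steps, which is the practical obstacle to long-range prediction he describes. The function name and parameters are purely illustrative.

```python
# Minimal sketch (illustrative only): sensitive dependence on initial
# conditions in the chaotic logistic map x -> r*x*(1-x).
def logistic_trajectory(x0, r=3.9, steps=60):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.500000)   # reference run
b = logistic_trajectory(0.500001)   # perturbed by one part in a million

for step in (0, 10, 20, 30, 40, 50, 60):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
# The separation grows from 1e-6 to order 1 within a few dozen iterations,
# so tiny uncertainties in the starting state swamp any long-range forecast.
```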
Gary Pearse
November 7, 2015 at 1:05 pm
Leo, I believe your approach is eminently doable and, in part, is done! How many kinds of weather are there anyway? In the polar regions, what, 2-3, in the temperate zones 5 or 6, and in the tropics 2-3. Empirically it has reached these temperatures, these rainfalls/snowfalls (or lack thereof), these intensities and numbers of storms of a couple of types and the secondary effects – rates of sea-level change, droughts, fires etc. Also some physical, non weather stuff – volcanoes, tsunami, earthquakes, extraterrestrial bolides. We should be spending more money on tracking all the asteroids while we are at it and planning possible things that might be done.
I have a soft hypothesis – actually it might be better termed an axiom – that PREDICTIONS OF DOOMSTERS WILL NEVER COME TRUE. Such predictions are made using linear thinking of the kind discussed here, for which a supportive legitimate mathematical expression is impossible. In the case of Malthusian disasters, their predictions are even less possible because they miss out the confounding principal component of human ingenuity in their thinking. Our cities didn’t end up being buried in horse manure (Malthus), the industrial revolution didn’t starve itself out by 1900 because of the shortage of coal (Jevons), we didn’t starve to death by 2000 and run out of mineral resources (Club of Rome, Holdren, Ehrlich), nor did we freeze to death by that date with the imminent man-made new ice age on the way (by the same people). Saudi oil minister Sheik Ahmed Zaki Yamani said it best in a 2005 interview with the New York Times discussing peak oil: “The Stone Age didn’t end for lack of stone, and the oil age will end long before the world runs out of oil.”
The Club of Rome’s 1972 “Limits to Growth” and others by the same group in recent years were totally blown away. We have doubled the 1972 world population and have 7B people living better and longer than the 3.5B of 1972. That there are still apparently well educated persons making such doomster predictions is evidence more of their misfit psychology than the application of sound methods. All these predictions are made by biologists and social scientists whose training and knowledge are linear and more akin to accounting than to creative science. Such disciplines give the air of erudition but they are precisely the least equipped to make such predictions. Knowing the sex rituals of the chameleon, which do not change over a very long time if at all, or counting tiger turds in the jungle to calculate population, are not the kind of skills required to properly attempt to forecast the future of mankind and the planet.
Mention should also be made here of the inevitability of unexpected consequences (themselves arising from the same kind of unpredictability inherent in “doom” and climate science) that have and will abound in any action that might be designed by doomsters to correct the perceived fantasy. Some of their geoengineering ideas are downright scary and definitely not the work of engineers (although I guess you could buy one). These aspects definitely also brand doomster climate scientists as political activists and social scientists.
(The above ends Gary’s comment.)
Before continuing with my regular post, I would like to point out some highlights in the October data and put these highlights into perspective.
The GISS anomaly for October, at 1.04 (reported as 104 in GISS’s units of 0.01°C), smashed the previous all-time high of 0.97 from January 2007.
However for RSS, its October value of 0.440 was beaten by the 0.461 of October 1998. Furthermore, all of the first 10 months of 1998 beat 0.440, and it was also beaten for several months in 2010.
UAH6.0beta4 did have its highest October on record at 0.427. However, all of the first 9 months of 1998 beat that mark, and it was also beaten for several months in 2010.
Hadcrut4 set an October record at 0.811. However, this does not beat its all-time high anomaly of 0.832 set in January 2007.
GISS, Hadcrut4 and Hadsst3 will set new records in 2015; however, neither satellite data set will get higher than third place. I am assuming, of course, that Dr. Spencer is not about to be replaced very soon by you know who.
In the sections below, as in previous posts, we will present you with the latest facts. The information will be presented in three sections and an appendix. The first section will show for how long there has been no warming on some data sets. At the moment, only the satellite data have flat periods of longer than a year. The second section will show for how long there has been no statistically significant warming on several data sets. The third section will show how 2015 so far compares with 2014 and the warmest years and months on record so far. For three of the data sets, 2014 also happens to be the warmest year. The appendix will illustrate sections 1 and 2 in a different way. Graphs and a table will be used to illustrate the data.
Section 1
This analysis uses the latest month for which data is available on WoodForTrees.com (WFT). All of the data on WFT is also available at the specific sources as outlined below. We start with the present date and go to the furthest month in the past where the slope is at least slightly negative on at least one calculation. So if the slope from September is 4 x 10^-4 but it is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest if we say the slope is flat from a certain month. A rough sketch of this search is given just below.
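As a rough sketch of that search (not the exact WFT calculation), the following scans a list of monthly anomalies, ending with the latest month, for the earliest start month from which the ordinary least-squares slope to the present is non-positive. The function name and the demo data are purely illustrative.

```python
# Rough sketch (not the exact WFT calculation): find the earliest start month
# from which the least-squares trend to the present is zero or negative.
import numpy as np

def pause_start_index(anomalies):
    """Return the earliest index i such that the OLS slope of anomalies[i:]
    is <= 0 (using at least 12 months of data), or None if there is none.
    `anomalies` is a list of monthly values ending with the latest month."""
    y = np.asarray(anomalies, dtype=float)
    n = len(y)
    for i in range(n - 12):                  # require at least a year of data
        seg = y[i:]
        x = np.arange(len(seg))
        slope = np.polyfit(x, seg, 1)[0]     # degree-1 fit; slope per month
        if slope <= 0.0:
            return i                         # earliest start = longest flat period
    return None

# Illustrative data only: 230 months of trendless noise around 0.2.
rng = np.random.default_rng(0)
demo = rng.normal(0.2, 0.1, 230).tolist()
print(pause_start_index(demo))
```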
1. For GISS, the slope is not flat for any period that is worth mentioning.
2. For Hadcrut4, the slope is not flat for any period that is worth mentioning.
3. For Hadsst3, the slope is not flat for any period that is worth mentioning.
4. For UAH, the slope is flat since May 1997 or 18 years and 6 months. (goes to October using version 6.0)
5. For RSS, the slope is flat since February 1997 or 18 years and 9 months. (goes to October)
The next graph shows just the lines to illustrate the above. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the upward sloping blue line at the top indicates that CO2 has steadily increased over this period.
Note that the UAH5.6 series from WFT needed a detrend applied in order to show the zero slope of UAH6.0.
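For reference, a detrend of the sort WFT applies can be sketched as subtracting a straight ramp whose total rise over the plotted interval equals the chosen detrend amount. This is an illustrative sketch with placeholder numbers; the exact amount used for the graph above is not restated here.

```python
# Sketch of a WFT-style "detrend": subtract a linear ramp whose total change
# over the interval equals the chosen amount. Numbers are placeholders.
import numpy as np

def detrend(series, amount):
    """Subtract a straight line rising by `amount` over the whole series."""
    series = np.asarray(series, dtype=float)
    ramp = np.linspace(0.0, amount, len(series))
    return series - ramp

months = np.arange(222)                          # roughly 18.5 years of months
demo = 0.0005 * months                           # a small residual warming trend
flat = detrend(demo, 0.0005 * (len(months) - 1)) # remove that total rise
print(round(np.polyfit(months, flat, 1)[0], 10)) # remaining slope is ~0
```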

When two things are plotted as I have done, the left-hand scale only shows the temperature anomaly.
The actual numbers are meaningless since the two slopes are essentially zero. No numbers are given for CO2. Some have asked that the log of the CO2 concentration be plotted; however, WFT does not give this option. The upward-sloping CO2 line only shows that while CO2 has been going up over the last 18 years, the temperatures have been flat for varying periods on the two sets.
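For readers who do want the logarithm of CO2, a minimal matplotlib sketch along the following lines would work. The CO2 values below are placeholders rather than real Mauna Loa data; substitute the monthly series of your choice.

```python
# Minimal sketch: plotting the natural log of the CO2 concentration yourself,
# since WFT does not offer this option. The CO2 numbers are placeholders only.
import numpy as np
import matplotlib.pyplot as plt

months = np.arange(228)                          # about 19 years of months
co2_ppm = 363.0 + 0.17 * months                  # placeholder: roughly 2 ppm/year rise

plt.plot(1997 + months / 12.0, np.log(co2_ppm))  # log of the concentration
plt.xlabel("Year")
plt.ylabel("ln(CO2 concentration, ppm)")
plt.title("Log of CO2 concentration (placeholder data)")
plt.show()
```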
Section 2
For this analysis, data was retrieved from Nick Stokes’ Trendviewer available on his website. This analysis indicates for how long there has not been statistically significant warming according to Nick’s criteria. Data go to their latest update for each set. In every case, note that the lower error bar is negative so a slope of 0 cannot be ruled out from the month indicated.
On several different data sets, there has been no statistically significant warming for between 11 and 22 years according to Nick’s criteria. Cl stands for the confidence limits at the 95% level.
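As a rough illustration of the kind of calculation behind such confidence limits (this is not Nick Stokes’ actual code or necessarily his exact method), the sketch below fits an ordinary least-squares trend to a monthly series and widens the standard error with a simple AR(1) adjustment for autocorrelation before forming a 95% interval. The data and the choice of adjustment are assumptions for illustration only.

```python
# Rough sketch (not Nick Stokes' actual code): OLS trend on monthly anomalies
# with an AR(1)-style inflation of the standard error, giving an approximate
# 95% confidence interval on the trend. The data below are synthetic.
import numpy as np

def trend_with_ci(y, per_century=True):
    """Return (trend, lower, upper) for the monthly series y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    x = np.arange(n)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    # Naive standard error of the slope, then inflate it for lag-1 autocorrelation
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    se *= np.sqrt((1 + r1) / (1 - r1))
    half = 1.96 * se
    scale = 1200.0 if per_century else 1.0       # per month -> per century
    return slope * scale, (slope - half) * scale, (slope + half) * scale

# Synthetic placeholder: a tiny trend buried in autocorrelated noise.
rng = np.random.default_rng(1)
noise = np.zeros(270)
for i in range(1, 270):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 0.08)
series = 0.0001 * np.arange(270) + noise
print(trend_with_ci(series))   # a lower limit below zero means "not significant"
```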
The details for several sets are below.
For UAH6.0: Since January 1993: Cl from -0.018 to 1.669
This is 22 years and 10 months.
For RSS: Since April 1993: Cl from -0.033 to 1.566
This is 22 years and 7 months.
For Hadcrut4.4: Since January 2001: Cl from -0.048 to 1.334
This is 14 years and 9 months.
For Hadsst3: Since October 1995: Cl from -0.001 to 2.010
This is 20 years and 1 month.
For GISS: Since September 2004: Cl from -0.036 to 2.172
This is 11 years and 2 months.
Section 3
This section shows data about 2015 and other information in the form of a table. The five data sources are listed along the top, and the source row is repeated further down the table so it remains visible at all times. The sources are UAH, RSS, Hadcrut4, Hadsst3, and GISS.
Down the column are the following:
1. 14ra: This is the final ranking for 2014 on each data set.
2. 14a: Here I give the average anomaly for 2014.
3. year: This indicates the warmest year on record so far for that particular data set. Note that the satellite data sets have 1998 as the warmest year and the others have 2014 as the warmest year.
4. ano: This is the average of the monthly anomalies of the warmest year just above.
5. mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first three letters of the month and the last two numbers of the year.
6. ano: This is the anomaly of the month just above.
7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0. Periods of under a year are not counted and are shown as “0”.
8. sig: This is the first month for which warming is not statistically significant according to Nick’s criteria. The first three letters of the month are followed by the last two numbers of the year.
9. sy/m: This is the years and months for row 8. Depending on when the update was last done, the months may be off by one month.
10. Jan: This is the January 2015 anomaly for that particular data set.
11. Feb: This is the February 2015 anomaly for that particular data set, etc.
20. ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months.
21. rnk: This is the rank that each particular data set would have for 2015 without regard to error bars and assuming no further changes. Think of it as an update 50 minutes into a game. (A small sketch of how rows 20 and 21 are calculated follows the table below.)
| Source | UAH | RSS | Had4 | Sst3 | GISS |
|---|---|---|---|---|---|
| 1.14ra | 5th | 6th | 1st | 1st | 1st |
| 2.14a | 0.186 | 0.255 | 0.564 | 0.479 | 0.74 |
| 3.year | 1998 | 1998 | 2014 | 2014 | 2014 |
| 4.ano | 0.482 | 0.55 | 0.564 | 0.479 | 0.74 |
| 5.mon | Apr98 | Apr98 | Jan07 | Aug14 | Jan07 |
| 6.ano | 0.742 | 0.857 | 0.832 | 0.644 | 0.97 |
| 7.y/m | 18/6 | 18/9 | 0 | 0 | 0 |
| 8.sig | Jan93 | Apr93 | Jan01 | Oct95 | Sep04 |
| 9.sy/m | 22/10 | 22/7 | 14/9 | 20/1 | 11/2 |
| Source | UAH | RSS | Had4 | Sst3 | GISS |
| 10.Jan | 0.275 | 0.367 | 0.688 | 0.440 | 0.81 |
| 11.Feb | 0.173 | 0.325 | 0.660 | 0.406 | 0.87 |
| 12.Mar | 0.163 | 0.252 | 0.681 | 0.424 | 0.90 |
| 13.Apr | 0.085 | 0.175 | 0.656 | 0.557 | 0.73 |
| 14.May | 0.283 | 0.310 | 0.696 | 0.593 | 0.78 |
| 15.Jun | 0.331 | 0.392 | 0.730 | 0.575 | 0.77 |
| 16.Jul | 0.181 | 0.288 | 0.696 | 0.637 | 0.73 |
| 17.Aug | 0.274 | 0.390 | 0.740 | 0.665 | 0.79 |
| 18.Sep | 0.252 | 0.373 | 0.785 | 0.725 | 0.80 |
| 19.Oct | 0.427 | 0.440 | 0.811 | 0.700 | 1.04 |
| Source | UAH | RSS | Had4 | Sst3 | GISS |
| 20.ave | 0.244 | 0.331 | 0.714 | 0.570 | 0.82 |
| 21.rnk | 3rd | 3rd | 1st | 1st | 1st |
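As promised above, here is a small sketch of rows 20 (“ave”) and 21 (“rnk”) for one data set. The monthly values reuse the RSS numbers from the table; the list of previous annual averages is an illustrative, incomplete subset, and the 2010 value is approximate.

```python
# Small sketch of rows 20 ("ave") and 21 ("rnk") for the RSS column.
# Monthly values are taken from the table above; the dictionary of previous
# annual averages is an incomplete, illustrative subset (2010 is approximate).
rss_2015_months = [0.367, 0.325, 0.252, 0.175, 0.310, 0.392,
                   0.288, 0.390, 0.373, 0.440]           # Jan-Oct 2015

ave = sum(rss_2015_months) / len(rss_2015_months)        # row 20
print(f"2015 average to date: {ave:.3f}")                # about 0.331

previous_years = {1998: 0.550, 2010: 0.47, 2014: 0.255}  # year: annual anomaly

# Row 21: provisional rank of the 2015 average, ignoring error bars and
# assuming no change for the rest of the year.
rank = 1 + sum(1 for v in previous_years.values() if v > ave)
print(f"Provisional 2015 rank: {rank}")                  # 3rd with this subset
```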
If you wish to verify all of the latest anomalies, go to the following:
For UAH, version 6.0beta4 was used. Note that WFT uses version 5.6. So to verify the length of the pause on version 6.0, you need to use Nick’s program.
http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tlt/tltglhmam_6.0beta4.txt
For RSS, see: ftp://ftp.ssmi.com/msu/monthly_time_series/rss_monthly_msu_amsu_channel_tlt_anomalies_land_and_ocean_v03_3.txt
For Hadcrut4, see: http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.4.0.0.monthly_ns_avg.txt
For Hadsst3, see: http://www.cru.uea.ac.uk/cru/data/temperature/HadSST3-gl.dat
For GISS, see: http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
To see all points since January 2015 in the form of a graph, see the WFT graph below. Note that UAH version 5.6 is shown. WFT does not show version 6.0 yet. Also note that Hadcrut4.3 is shown and not Hadcrut4.4, which is why the last few months are missing for Hadcrut.

As you can see, all lines have been offset so they all start at the same place in January 2015. This makes it easy to compare January 2015 with the latest anomaly.
Appendix
In this part, we are summarizing data for each set separately.
RSS
The slope is flat since February 1997 or 18 years and 9 months. (goes to October)
For RSS: There is no statistically significant warming since April 1993: Cl from -0.033 to 1.566.
The RSS average anomaly so far for 2015 is 0.331. This ties it at 3rd place. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2014 was 0.255 and it was ranked 6th.
UAH6.0beta4
The slope is flat since May 1997 or 18 years and 6 months. (goes to October using version 6.0beta4)
For UAH: There is no statistically significant warming since January 1993: Cl from -0.018 to 1.669. (This is using version 6.0 according to Nick’s program.)
The UAH average anomaly so far for 2015 is 0.244. This would rank it at 3rd place. 1998 was the warmest at 0.482. The highest ever monthly anomaly was in April of 1998 when it reached 0.742. The anomaly in 2014 was 0.186 and it was ranked 5th.
Hadcrut4.4
The slope is not flat for any period that is worth mentioning.
For Hadcrut4: There is no statistically significant warming since January 2001: Cl from -0.048 to 1.334.
The Hadcrut4 average anomaly so far for 2015 is 0.714. This would set a new record if it stayed this way. The highest ever monthly anomaly was in January of 2007 when it reached 0.832. The anomaly in 2014 was 0.564 and this set a new record.
Hadsst3
For Hadsst3, the slope is not flat for any period that is worth mentioning. For Hadsst3: There is no statistically significant warming since October 1995: Cl from -0.001 to 2.010.
The Hadsst3 average anomaly so far for 2015 is 0.570. This would set a new record if it stayed this way. The highest monthly anomaly prior to 2015 was in August 2014, when it reached 0.644. The anomaly in 2014 was 0.479, and this set a new record.
GISS
The slope is not flat for any period that is worth mentioning.
For GISS: There is no statistically significant warming since September 2004: Cl from -0.036 to 2.172.
The GISS average anomaly so far for 2015 is 0.82. This would set a new record if it stayed this way. The highest monthly anomaly prior to 2015 was in January 2007, when it reached 0.97; a new all-time record of 1.04 was set in October 2015. The anomaly in 2014 was 0.74, and it set a new record.
Conclusion
Do you feel we are going about our climate studies in the proper manner? If not, what changes would you suggest?
UAH November Update
UAH for November came out very fast, and it showed a drop of 0.1 from October! I knew even before this that UAH could not reach second place. However, a huge upward spike in November could have made it interesting. But with a drop, reaching second place is totally out of the question. It is stuck in third.
So what happens to the length of the pause on UAH and RSS? I will assume that RSS will show a drop similar to UAH’s, or at the very least not a huge spike. If so, based on the anomalies where their present pauses start, namely February 1997 for RSS and May 1997 for UAH, I would say the pauses will probably AT LEAST stay at their present lengths, namely 18 years and 9 months for RSS and 18 years and 6 months for UAH. The only difference with the November data is that the pauses will start and end a month later.
Richard Barraclough says:
December 1, 2015 at 7:57 PM
Hello Werner,
The “Pause”, as defined by various commenters on this site and Watts Up With That, is still the same length as last month. In other words, the start date has moved forward by 1 month to June 1997.
The above is for UAH.
(RSS did not come out today.)
RSS Update for November
RSS for November came in at 0.426, a slight drop from the October value of 0.447. While it is the warmest November on record for RSS, the anomaly of 0.426 was beaten in the first 10 months of 1998 and the first 9 months of 2010. 2015 is now in third place, and there is no way it can even reach second.
The pause remains at 18 years and 9 months, however it is shifted by one month. So it is no longer from February 1997 to October 2015, but rather from March 1997 to November 2015.
Thanks Werner, Leo, Gary, and JTF!
Keep doing what you do!
It is appreciated by many.