On the scales of warming worry magnitudes – Part 1

A few weeks after my paper came out I received a quite unexpected but greatly appreciated offer from Anthony to write a summary of the paper for his blog site. The paper's title is:

Should We Worry About the Earth's Calculated Warming at 0.7°C Over the Last 100 Years When the Observed Daily Variations Over the Last 161 Years Can Be as High as 24°C?

Guest post by Darko Butina

The paper is unique and novel in its approach to man-made global warming in many respects: it is written by an experimental scientist, it is published in a journal that deals with data analysis and pattern recognition of data generated by physical instruments, it treats the Earth's atmosphere as a system where everything is local and nothing is global, and it is the first paper that looks for temperature patterns in data generated by the instrument designed for, and used by, experimental scientists since the early 1700s – the calibrated thermometer. What is also unique is that every single graph and number reported in the paper can be reproduced and validated by the reader, using data that is in the public domain and analysing that data with a simple Excel worksheet. There are two main conclusions made in the paper:

1. That global warming does not exist in the thermometer data, since it is impossible to declare one year either warmer or colder than any other year

2. That the Hockey Stick scenario does not exist in the thermometer data and therefore must be an artefact observed in a purely theoretical space of non-existing annual global temperatures

The paper is 20 pages long and analyses in great detail the daily data from a single weather station, the dataset collected at Armagh Observatory (UK) between 1844 and 2004 – one of the very few datasets in the public domain that have not been destroyed, corrupted or endlessly re-adjusted by the curators of the global thermometer data at the University of East Anglia or NASA.

Before we start to analyse this paper, a few points need to be made about the experimental sciences for my paper to be properly understood. ALL our knowledge and understanding of the physical world around us comes from analysing and learning from data that has been generated by an experiment and measured or recorded by a physical instrument. Let me demonstrate this point with a very simple example of what happens when we record air temperature with some fixed-to-ground thermometer:

[Figure: a single thermometer reading (15.1°C) and the information linked to it]

A thermometer reading of 15.1 has several links attached to it that cannot be broken: it is linked to a unique grid point, a unique date and time stamp, a unique instrument – the thermometer – and that thermometer to a unique symbol (°C). So if someone wants to analyse any temperature trends, those trends have to come from thermometer readings; it follows that if the thermometer to be used is calibrated on the Celsius scale, no datapoint can be older than 1743 (see Anders Celsius). Since we know for a fact that annual temperature ranges depend on the location of the thermometer, and since mixing different datasets is not allowed in the experimental sciences, it follows that if there are, say, 6000 weather stations (or fixed thermometers) in existence across the globe, the first step before raising an alarm would be to analyse and report the temperature patterns for every single weather station. That is what I was expecting to see when I started to look into this man-made global warming hysteria three years ago, following the revelations of the Climategate affair. But I could not find a single published paper that uses thermometer-based data. So we have a situation where the alarm has been raised, the whole world alarmed and suicidal economic policies adopted, while totally ignoring the data generated by the only instrument that has been invented to measure temperature – the thermometer. Instead, thousands of publications have been written looking for temperature trends in a purely theoretical space that does not and cannot exist, the space of annual global temperatures. Two key papers published earlier both argue and explain why the global temperature as a single number does not exist: Essex et al. (2007), using statistical arguments and written by recognized statisticians, and Kramm and Dlugi (2011), who showed from an astrophysics point of view why the Earth's atmosphere cannot be treated as a homogeneous system but should be perceived as a network of local temperature systems.

The starting point for my paper was the fact that it is impossible to have argument or ambiguity when it comes to the thermometer. If you have two readings, only one of three outcomes is possible: T2 > T1, T2 < T1 or T2 = T1. So if one wants, for some bizarre reason, to compare two annual patterns, then one year can be unequivocally declared warmer only if each daily reading of that year is larger than the corresponding daily reading of the other year:

[Figure: annual patterns in Tmax-Tmin space for the real year y1844 and the artificial year 'y2100' (y1844 + 15°C)]

The artificially created graph above shows a real year from the Armagh dataset in Tmax-Tmin space, while the year 'y2100' was the result of adding 15°C to each daily reading of y1844. My point here is that everyone seeing that graph would come to an identical conclusion – y2100 is unambiguously warmer than y1844. So my perfectly valid question was – why would anyone go to the trouble of inventing something that does not exist while ignoring the obvious source of temperature data – the thermometer? My 40 years of experience in the experimental sciences offered the most obvious answer to that question – because nothing alarming could be found in thermometer-based data. There is a quite simple rule when it comes to the interpretation of data – if more than a single conclusion can be drawn from a given dataset, it means one of two things: either the dataset is of the right kind but more data is needed to understand it, or the data is of the wrong kind. Every single graph and number in my paper can be independently reproduced and validated, and therefore the thermometer data is the right kind of data to use – we simply need more of it to fully understand the temperature patterns observed on our planet.
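For readers who want to try this rule on their own station data, the 'warmer only if every daily reading is higher' criterion takes just a few lines of Python. This is a minimal sketch, not the author's code, and the sample readings below are invented purely for illustration:

```python
# A minimal sketch (not the author's code) of the 'unambiguously warmer'
# criterion: one year beats another only if every daily reading is higher.
def compare_years(year_a, year_b):
    """year_a, year_b: equal-length sequences of daily readings (e.g. 730 values)."""
    if len(year_a) != len(year_b):
        raise ValueError("years must have the same number of readings")
    if all(a > b for a, b in zip(year_a, year_b)):
        return "A unambiguously warmer"
    if all(a < b for a, b in zip(year_a, year_b)):
        return "B unambiguously warmer"
    return "ambiguous: switchovers in both directions"

# Hypothetical readings standing in for the graph above:
y1844 = [4.2, 5.1, 3.8]                 # invented sample of daily readings
y2100 = [t + 15.0 for t in y1844]       # every day shifted up by 15 C
print(compare_years(y2100, y1844))      # -> "A unambiguously warmer"
```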

The opposite is true when we look at the calculated, not measured, data called annual global temperatures. Nobody knows where the data comes from; since the data is calculated, the only way to validate it is to use another set of calculations; it has been constantly adjusted and modified, and different trends have been generated on a daily basis using ever-changing arguments. When you dissect this very complex-looking scientific problem of man-made global warming into its basic components, what you find is that the whole concept of global warming and climate change has nothing to do with science and everything to do with a very desperate attempt to connect temperatures with the few molecules of CO2 that have been generated by burning fossil fuels, while ignoring the vast majority of CO2 molecules generated by nature itself. It must follow that if those alarming trends could not be found in the thermometer data, then that data had to be removed and new data created – the type of data that can be neither proved wrong nor right, allowing proponents of man-made global warming to generate any trend they need and enabling them to claim that they know everything about everything when it comes to our planet. But the only problem with that approach is that you cannot cheat in the experimental sciences, and slowly but steadily retired scientists like me, with a bit of free time, will start to look into this problem and use their respective expertise to critically evaluate the supposed science behind this man-made movement.

So even before I started to collect the daily data that are available in the public domain, I was almost 100% confident that I would not find any alarming trends in the thermometer data. And I was proven right.

Let us now start with the experimental part of the paper, the part where all the details of the dataset, and the dataset itself, are presented. The paper is 20 pages long and all conclusions are based on a detailed analysis of the Armagh (UK) dataset that covers the period between 1844 and 2004. The dataset can be downloaded from the Armagh Observatory website as two sets of files, Tmax and Tmin files:

http://climate.arm.ac.uk/calibrated/airtemp/tccmax1844-2004

http://climate.arm.ac.uk/calibrated/airtemp/tccmin1844-2004

Depending on the software that one wants to use to analyse the data, it is important to format all datasets in the same way. Since most commercial software expects, by default, data to be read in a row-wise manner, the reformatted Armagh dataset was created as a matrix containing 161 rows (one row for each year) and 730 columns (one column for each day or night reading):

[Figure: the reformatted Armagh dataset as a 161-row (years) by 730-column (daily readings) matrix]

By the way, all the graphs and tables from my paper are presented here as JPG images, and once I make my paper available free of charge on my own website you will be able to match all the graphs presented in this report to the original ones in the paper.

As a result, we now have the annual temperature pattern – let us call it the 'annual fingerprint' – as a 730-bit fingerprint, with the first 365 bits assigned to Tmax 1 to Tmax 365 (the Jan 1 to Dec 31 daytime readings) followed by 365 bits assigned to Tmin 1 to Tmin 365 (the Jan 1 to Dec 31 night-time readings). So the annual fingerprint space can be seen as a 161 (years) x 730 (daily readings) matrix. Looking at the table above column-wise, we have 'day fingerprints', each of them 161 bits long, representing the history of each day or night reading over the period of 161 years. Once this table is created, we need to decide what to do with the missing values and with the extra day in February in leap years. We delete that extra day in February, taking great care not to get the rest of the year out of sync. There are two options when dealing with a missing datapoint – either replace it with some calculated one or remove the whole column.

The danger of replacing a missing value with a calculated one is that we are contaminating instrumental data with theoretical data, and unless we really understand that data the safest way is to remove all columns that contain even a single missing datapoint. Once you remove all columns with missing data you end up with 649-bit annual fingerprints – 89% of the original data, i.e. a loss of 11% of the total information content of the dataset – but with the knowledge that the starting set is not contaminated by any calculated data and that all datapoints were generated by the thermometer itself.
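For readers who prefer to script these steps rather than use Excel, the reshaping and cleaning described above might look roughly like the following Python/pandas sketch. This is not the author's workflow; it assumes the Tmax and Tmin files have already been parsed into daily series indexed by date, with one entry per calendar day and NaN wherever a reading is missing:

```python
# A rough pandas sketch of the reshaping described above (not the author's
# Excel/C workflow). Assumes `tmax` and `tmin` are daily pandas Series with a
# DatetimeIndex covering 1844-2004, one entry per calendar day and NaN where a
# reading is missing.
import numpy as np
import pandas as pd

def year_vector(series: pd.Series, year: int) -> np.ndarray:
    """Return the 365 readings of one year, with 29 February dropped."""
    s = series.loc[str(year)]
    s = s[~((s.index.month == 2) & (s.index.day == 29))]
    return s.to_numpy()

def build_fingerprints(tmax: pd.Series, tmin: pd.Series) -> pd.DataFrame:
    years = range(1844, 2005)                                  # 161 years
    cols = ([f"Tmax{d:03d}" for d in range(1, 366)] +
            [f"Tmin{d:03d}" for d in range(1, 366)])           # 730 columns
    rows = {y: np.concatenate([year_vector(tmax, y), year_vector(tmin, y)])
            for y in years}
    fp = pd.DataFrame.from_dict(rows, orient="index", columns=cols)
    # Drop every day slot with even one missing reading instead of filling the
    # gap with a calculated value; for Armagh 649 of the 730 columns survive.
    return fp.dropna(axis=1, how="any")

# Usage (once tmax and tmin have been parsed from the two files above):
# fp = build_fingerprints(tmax, tmin)
```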

Now we have our table in Excel, a table containing 161 years of data where each year is a collection of 649 day-night readings, and we can ask the data the $64 million question: can we detect an unambiguous warming trend over 161 years at Armagh (UK) in the thermometer data? All we need to do is take the difference between the youngest (2004) and the oldest (1844) annual fingerprints and display it as a histogram:

[Figure 5: histogram of daily differences between the 2004 and 1844 annual fingerprints]

Let me briefly digress here to make the following point – when you analyse instrumental data you have to know the accuracy, or error levels, of the instrument used to generate the data. If we assume the accuracy of a thermometer used in the 1800s to be +/-0.5°C, that means that for two readings to be declared different, the difference between them should be larger than 1.0°C. For example, if T2 = 10.0 and T1 = 10.8 we have to declare those two readings the same, i.e. T2 = T1, since they fall within the error levels of the instrument. If T2 = 10.0 and T1 = 20.0 then the difference is real, since it is way outside the error levels of the instrument.
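Continuing the sketch above, and assuming `fp` is the matrix returned by build_fingerprints in the previous snippet, the 2004-minus-1844 comparison with this 1.0°C error threshold might be coded as follows; plotting `diff` as a bar chart reproduces the switchover picture of Figure 5:

```python
# Sketch of the 2004-minus-1844 comparison, assuming the `fp` matrix built in
# the previous snippet and an instrument error band of +/-0.5 C (so only
# differences larger than 1.0 C are treated as real).
diff = fp.loc[2004] - fp.loc[1844]            # one value per surviving day slot

warmer = (diff > 1.0).sum()                   # days where 2004 is really warmer
colder = (diff < -1.0).sum()                  # days where 2004 is really colder
within_error = len(diff) - warmer - colder    # differences that must be called equal
print(warmer, colder, within_error, round(diff.max(), 1), round(diff.min(), 1))
```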

So, what is this simple graph (Figure 5) telling us? The first thing to notice is that the year 2004 cannot be declared either warmer or colder than 1844, since every few days there is a switchover, making 2004 warmer than 1844 for a few days and then colder for a few days. The second thing to notice is that the size of those switchovers can be as large as 10°C in one direction and 8°C in the other, i.e. 18°C in total – way above the error levels of the thermometer, and therefore those switchovers are real. To make sure that those switchover patterns are not some artefact unique to those two years, I wrote a special program (in C) to systematically compare every year to every other year (161 * 160 = 25,760 comparisons), and on average each year is 50% of the time warmer and 50% of the time colder than any other year in the Armagh dataset. What makes things even more complex is that there is no obvious pattern in when the switchovers occur or in their magnitude, as can be seen when two different year pairs are plotted on the same graph:

[Figure: switchover patterns for two different year pairs plotted on the same graph]
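The author's every-year-against-every-other-year program was written in C; a rough Python equivalent of the same idea (again assuming the `fp` matrix from the earlier sketch) might look like this:

```python
# Rough Python re-expression (not the author's C program) of the systematic
# every-year-against-every-other-year comparison: 161 x 160 = 25,760 ordered
# pairs, using the `fp` matrix from the earlier sketch.
from itertools import permutations

def warmer_fractions(fp, threshold=1.0):
    """For each ordered pair of years, the fraction of day slots on which the
    first year is warmer than the second by more than the error threshold."""
    out = {}
    for y1, y2 in permutations(fp.index, 2):
        diff = fp.loc[y1] - fp.loc[y2]
        out[(y1, y2)] = (diff > threshold).mean()
    return out

fractions = warmer_fractions(fp)
# The author reports that for Armagh these fractions cluster around 50%:
# each year is warmer than any other year on roughly half of the days.
```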

So far, all I have done is plot the original data, without any adjustment and without making any prior assumptions. I did not start this exercise to either prove or disprove the existence of global warming, but to see what the actual data is telling us. And what the thermometer is telling us is that the sheer magnitude of those apparently random and chaotic switchovers is due to natural forces that we do not yet understand, and that the anti-scientific process in which all the complexity of annual temperature patterns is removed and replaced by a single number, whereupon we suddenly 'see the light', cannot be used to acquire any knowledge. Using simple logic, the following construct can be made: a dataset based on thermometer readings contains 100% of the information content when it comes to temperatures. If we reduce that 730-dimensional space to a single number, we reduce the information content of the dataset from 100% to 0% – i.e. there is no information left from which to gain any knowledge. Let us do the following question/answer exercise to compare two datasets – one that has the day-night thermometer readings for a single year and one where a single number represents a year:

Q. What is the total range of temperatures observed at Armagh?

A. The lowest temperature observed was -15.1°C on February 7, 1895, and the highest, +30.3°C, was recorded on July 10, 1895; total range 45.4°C

Q. What are the largest and the smallest natural fluctuations observed for individual days?

A. The day with the most variability is May 4 (Tmax125), with a total observed range of 23.8°C, while the day with the least variability is October 29 (Tmax302), with an observed range of 'only' 9.9°C
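These column-wise questions are easy to put to the `fp` matrix from the earlier sketch; the figures in the comments are the values the author reports for Armagh, not values computed here:

```python
# Column-wise answers to the two questions above, computed on the `fp` matrix
# from the earlier sketch; comment values are those the author reports.
total_range = fp.max().max() - fp.min().min()     # reported: 30.3 - (-15.1) = 45.4 C

per_day_range = fp.max(axis=0) - fp.min(axis=0)   # 161-year range of each day slot
most_variable = per_day_range.idxmax()            # reported: Tmax125, range 23.8 C
least_variable = per_day_range.idxmin()           # reported: Tmax302, range 9.9 C
print(total_range, most_variable, per_day_range.max(),
      least_variable, per_day_range.min())
```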

In contrast, in the annual temperature space each year is represented by a single number, a number obtained by averaging all the daily data, and there are not many questions you can ask about that single number. Actually there are none – not a single question can be asked of a number that has no physical meaning! For example, if two years have an identical annual average we don't know why they are the same, and if they have two different annual averages, we don't know why they are different. If we do the same exercise on the daily data, we know exactly which days are moving in the same direction and which days are moving in the opposite direction.
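A toy example makes the point: two completely different 'years' can share exactly the same annual average, so the average alone cannot tell them apart (the numbers below are invented purely for illustration):

```python
# Two invented "years" with the same annual mean but completely different
# daily behaviour; the single average number cannot distinguish them.
import numpy as np

flat = np.full(365, 10.0)                              # every day exactly 10 C
spiky = np.concatenate([np.full(182, 0.0),             # half the year at 0 C
                        np.full(183, 19.945)])         # half the year near 20 C
print(round(flat.mean(), 2), round(spiky.mean(), 2))   # both print 10.0
```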

Let us now ask the most obvious question – are those patterns, or rather that lack of patterns, observed at Armagh unique to the UK, i.e. are they local, or do they reflect some global pattern? Scientific logic would suggest that the same random/chaotic switchover patterns observed at Armagh should be observed across the globe, with the only difference being the size and magnitude of those switchovers, i.e. local variations. To test that, I took two annual temperature samples from two weather stations on two different continents, one in Canada and one in Australia:

Waterloo (Canada):

[Figure: year-to-year switchover pattern at Waterloo (Canada)]

Melbourne (Australia):

[Figure: year-to-year switchover pattern at Melbourne (Australia)]

Please note the difference in both the patterns and the magnitude of the switchovers.

Let me make a very clear statement here – the choice of the Waterloo and Melbourne weather stations was driven by the ease of finding weather stations with relatively easy-to-download formats, and I did not engage in cherry-picking weather stations that fit the patterns found at Armagh, as is normal practice in the man-made sciences. To prove that last point, and to challenge readers to start looking into the real measured data and stop looking into non-existing, calculated data like annual global temperatures, I will offer a modest financial reward of £100.00 (UK), from my pension, to the first person who finds a single example of a year pair where one year has every single daily thermometer reading larger than the other year. Any weather station that is not on permanent ice or sand qualifies (I don't know what to expect in those cases), and any gap between the two years is allowed. Obviously, the winner will have to give the link to the original data and contact either Anthony or myself at darkobutina@l4patterns.com to claim the award.

The way I see it, I am in a win-win situation here. If nobody can find a weather station that shows an unambiguous warming trend, and if we keep a record of all the analysed weather stations, I save the money but gain a large amount of additional information that should finally kill any notion of the man-made global warming hypothesis, since the proponents of that hypothesis would have to explain to the general public the patterns observed in the thermometer data. In strictly scientific terms, and using the null hypothesis that either all weather stations count or none does, I have already shown that those patterns are real and observed on three different continents, and therefore that the global warming trend does NOT exist in the thermometer data. On the other hand, if someone does find a clear and unambiguous warming trend in thermometer data, that work will again make the same point – all temperature patterns are local, and the ONLY way to declare that trends are global is if ALL individual weather stations show the same trends.

This concludes the Part One report, in which I have explained how the first conclusion of my paper – that global warming does not exist in the thermometer data, since it is impossible to declare one year either warmer or colder than any other year – was reached.

The second conclusion of my paper, which explains why the Hockey Stick scenario does not exist in the thermometer data, will be covered in a separate report. In the Part Two report I will introduce two different pieces of software – my own clustering algorithm and the k Nearest Neighbours (kNN) algorithm, both used in sciences like pattern recognition, datamining and machine learning – and apply them to the annual temperature patterns observed at Armagh. The overall conclusions will obviously be the same as those we have reached so far, but I will demonstrate how the observed differences between annual patterns can be quantified and how we can use those computational tools to detect 'extreme' or unusual annual temperature patterns, like the annual pattern of 1947, which is the most unusual not only at Armagh but also in the rest of the UK.

==================================================================

Dr Darko Butina is a retired scientist with 20 years of experience on the experimental side of carbon-based chemistry and 20 years in pattern recognition and datamining of experimental data. He was part of the team that designed the first effective drug for the treatment of migraine, for which the UK-based company received The Queen's Award. Twenty years on, the drug molecule Sumatriptan has improved the quality of life for millions of migraine sufferers worldwide. During the computational part of his drug-discovery career he developed the clustering algorithm dbclus, which is now a de facto standard for quantifying diversity in the world of molecular structures and has recently been applied to the thermometer-based archived data at weather stations in the UK, Canada and Australia. The forthcoming paper clearly shows what is so very wrong with the use of invented and non-existing global temperatures and why it is impossible to declare one year either warmer or colder than any other year. He is also one of the co-authors of a paper awarded the prestigious Ebert Prize as best paper of 2002 by the American Pharmaceutical Association. He is a peer reviewer for several international journals dealing with the modelling of experimental data and a member of an EU grants committee in Brussels.

115 Comments
Lance Wallace
April 15, 2013 12:15 pm

“Essex et al., in 2007, using statistical arguments and written by recognized statisticians, while Kramm and Dlugi in 2011”
Links?

April 15, 2013 12:18 pm

There is a funny paradox:
1) Earth has warmed by 0.7 deg C during the last 100 years, article says
2) but Earth has already warmed by 0.7 deg C during the 1910-1945 period
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1910/to:1945
3) Where is the modern warming then? ?:D
Always remember, that only the post-1970 warming is attributed to “greenhouse gases”. Never forget, that practically all the warming occurred before the “greenhouse gases” were invented as an issue.

April 15, 2013 12:23 pm

Brilliant.
And probably very important.

Alan Clark, Paid shill for Big Oil
April 15, 2013 12:26 pm

I’m sorry. I don’t know if you intended the humour but I simply laughed out loud when I read: “So we have situation that the alarm has been raised, the whole world alarmed, suicidal economic policies have been taken while totally ignoring the data generated by the only instrument that has been invented to measure temperature – the thermometer.”
Priceless.

April 15, 2013 12:29 pm

dataset collected at Armagh Observatory (UK) between 1844 to 2004, one of very few datasets in public domain that have not been either destroyed, corrupted or endlessly re-adjusted by the curators of the global thermometer data at East Anglia University or NASA.
But has been corrected and adjusted by other people in putting together the dataset:
“1844-1882 Data corrected for thermometer errors by ADSC and North Wall Screen/Stevenson Screen
differences by CJB (cjbmet15b) except for 7 Dec 1860-31 May 1863 when raw data corrected for
NWS/SS only
1883 data from Self Recording Thermograph of Automatic Weather Station, corrected for NWS/SS diff
1884 data corrected before entry, corrected for NWS/SS diff.
1885-1899 data corrected before entry. Exposure in Stevenson Screen from 1 Jan 1885 …
So if someone wants to analyse any temperature trends those trends have to come from thermometer readings; it follows that if thermometer to be used is calibrated using Celsius scale, no datapoint can be older than 1743,
So what? If the thermometer is calibrated to any other scale [Fahrenheit, Réaumur, etc.] the readings can be readily converted to Celsius. The main point is that the thermometer must be calibrated. As long as it is, everything is fine. BTW, I doubt that the Armagh data was measured by thermometers on the Celsius scale. Fahrenheit would be more likely [but I have not checked; have you?]

richardscourtney
April 15, 2013 12:30 pm

Darko Butina:
Thankyou.
That is science. And it is how science should be done.
Thankyou.
Richard

Lance Wallace
April 15, 2013 12:31 pm

Google search for Kramm and Dlugi 2011 turns up several discussions. Science of Doom makes a few rather dry comments:
http://scienceofdoom.com/2012/01/05/kramm-dlugi-on-dodging-the-greenhouse-bullet/

milodonharlani
April 15, 2013 12:34 pm
April 15, 2013 12:36 pm

Can we detect unambiguous warming trend over 161 years at Armagh (UK) in thermometer data? All we need to do is to take difference between the youngest (2004) and the oldest (1844) annual fingerprints and display it as a histogram
Nonsense.

Robuk
April 15, 2013 12:38 pm

I knew there were some honest scientists out there,

Joe Public
April 15, 2013 12:38 pm

So your data analysis proves beyond doubt that the £millions ‘wasted’ on limiting CO2 production in Armagh has had zero impact upon the local temperature.

bw
April 15, 2013 12:42 pm

Your methodology could be applied to other stations. There are a few others here
http://hidethedecline.eu/pages/ruti/europe/western-europe-rural-temperature-trend.php
The emphasis that missing data can NOT be "interpolated" or "homogenized" must be repeated loudly and often to those interested in the AGW issue. Quality of data, sampling and methodology are crucial to any experimental design. The surfacestations.org project just shows how easy it is to show that bad data exists. I think that there are still NOAA stations located in the path of air conditioner exhausts even today.
Other surface temperature stations may have good data but it’s best to do it right in the first place. Eg. the Antarctic stations Vostok, Halley, Amundsen-Scott and Davis are well maintained. They show zero warming since they were established.
If the US Climate Reference Network had been established in 1890 then we would have better data. If it were possible to establish a global network with methodically fixed ocean buoys and reliable recording thermometers in 1890 then we would have a better idea of what the various global climates are doing. But good instrumentation costs money.
There is no “global climate”

April 15, 2013 12:43 pm

Empirical evidence. Thank you.
Yet I fear that those who “know” will only believe evidence that matches their ideal world.
We “know” CO2 causes warming so your thermometer must be broken.
You are denying science with primitive looking at things.
You must recognise that the historical records are stored by industry-funded (fossil-fuel-funded) institutions…
Does that need a /sarc?

April 15, 2013 12:44 pm

Did the author consider how much mean annual temperature would have to increase before no overlap between temperature in different years could be expected?
Since for May 4th there is a "total observed range of 23.8C", we can conclude that an enormous warming is required (probably over 10C) before the author's test can be expected to yield a positive result. This means that this paper has zero statistical power.
A link to the paper would be appreciated.

April 15, 2013 12:46 pm

By the time I’d typed my third line of urinetaking Leif had actually jumped on a variant of number 3.
lsvalgaard at April 15, 2013 at 12:29 pm.
How embarrassing (for someone).

April 15, 2013 12:47 pm

Eight comments so far and what do we have?
‘Brilliant’
‘Nonsense’
I await the outcome of this debate to see where the balance of opinion lies.

Zeke Hausfather
April 15, 2013 12:49 pm

So I take it that the interpolated CONUS temperatures presented in Fall et al were spurious?

John Edmondson
April 15, 2013 12:50 pm

Your £100 is safe. The warming signal (be it natural or otherwise) is an order of magnitude smaller than the seasonal and daily variability.

milodonharlani
April 15, 2013 12:50 pm

http://solarscience.msfc.nasa.gov/papers/wilsorm/WilsonHathaway2006c.pdf
I hope “correction” refers only to calibrating thermometers & not to “adjustments” of the Hansen/Schmidt variety.

Rob Potter
April 15, 2013 12:53 pm

An interesting approach: People have often pointed out how much information is lost when one calculates averages of one kind or another from a data set, and yet we don’t seem to be able to avoid doing it! For example, in order to compare different sites and show that any observation is seen at more than one site, the almost ubiquitous anomaly is used. However, this is just another way of introducing an average figure and, in the process, losing a great deal of the data as described here.
One point I would like to make is that temperatures are strictly observational data as opposed to experimental data and this is where us experimental scientists get so frustrated (with both the temperature data itself and the ‘scientists’ who manipulate it)! With observational data, the gaps are always there; they can’t be filled by more observations because the time of the observation has passed. This leaves people with the problem of having to either do what is done here (drop years) or introduce a calculated value – and risk contaminating the data with a biased calculation.
A similar issue surrounds dealing with the known issues in observational time series, such as changes in methodology or drift in the measuring equipment. I don’t know if the Armagh series is unique in this respect, but from what I understand there are very very few temperature records made at the same place with the same piece of equipment. This introduces discontinuities into a data set and are commonly addressed by “adjustments” which are supposed to be value neutral, but which everyone can argue about. And, even supposing that a single piece of equipment was used at one site for the whole period, what is the potential for drift in the accuracy of the instrument? Another opportunity for bringing in a calculation to “correct” for a “known” issue, but at what cost?
So, while I applaud this approach, I fail to see that it will really change too many minds. With observational data it is very very hard to keep out some kind of correction for (accepted and agreed on) problems with data collection or manipulations in order to compare the time series to other time series. And it is these corrections upon which everything rests as they can be (and are) argued over incessantly because – as has been shown here – it is the corrections that either produce a trend or don’t, or produce a trend that correlates to another (corrected observational) times series which is then given some kind of causal status.
Oh dear, I have written a thesis instead of a comment. I do apologize to Dr Butina for hijacking his piece which I do think is worth the effort of reading.

April 15, 2013 12:58 pm

milodonharlani says:
April 15, 2013 at 12:34 pm
Dr. Svalgaard: http://climate.arm.ac.uk/publications/484.pdf
does not say which scale was actually used. But the following is a good indication that the thermometers were calibrated to the Fahrenheit scale:
From section 4.1 of http://www.arm.ac.uk/preprints/445.pdf
“This thermometer was still in use at the observatory in 1823 when Thomas Romney Robinson arrived to take up his employment as Director, a position he retained for 59 years. Robinson (1859) remarked of the thermometer It appears to have been made with great care, the freezing and boiling points are exact, and by comparison of the points within the annual range of temperature, I have not found an error greater than 0.2° (F). When the meteorological series was continued in January 1834, after the break since June 1825, the same thermometer was employed until it was broken on 24 May 1859. The series continued using a ‘Kew standard’ thermometer, which, when checked by Mr R.H. Scott from Kew in October 1890, was also found to be accurate to 0.2°F. In view of the reported accuracy of these two thermometers, most likely the only ones employed for the series, and the absence of any more detailed calibration information, we decided etc etc…”
So, what is all that emphasis on the Celsius scale? Completely irrelevant.

milodonharlani
April 15, 2013 12:59 pm

Dr. Svalgaard:
This book, “The Role of the Sun in Climate Change”, by your colleagues Hoyt & Schatten, is probably old hat to you, but was new to me. I found it following up on the first link I posted above, re. Armagh & the solar cycle. It might be of interest to readers here:
http://books.google.com/books/reader?id=EBTZ4LdSfhwC&printsec=frontcover&output=reader&source=gbs_atb&pg=GBS.PA9.w.1.0.0
Your comments would be valued & appreciated. Thanks.

milodonharlani
April 15, 2013 1:00 pm

I wondered also why the author went on at such length re. C. v. F. Conversion is just arithmetic.

milodonharlani
April 15, 2013 1:03 pm

PS: From your valuable link, I see that one of the corrections was for exposure.

Mike Bromley the Kurd (this week)
April 15, 2013 1:08 pm

Mixing of data sets? Now who would do a tom fool thing like that? Oh….wait….right….sorry….I was fantasizing there for a second…..

April 15, 2013 1:15 pm

So if one wants, for some bizarre reason, to compare two annual patterns then one year can be unequivocally declared as warmer only if each daily reading of that year is larger than each corresponding daily reading of another year:

I appreciate the argument that Dr. Butina makes that by keeping the daily min-max data, his dataset has 729 more dimensions than some arithmetically contrived single annual mean value. His point is that even if the annual means between two years are equal, the two years can be very different in the daily measures.
Two complex numbers, C1 and C2, can be said to be equal, but it is not possible to say that C1 > C2 even when C1 = a1 + b1i, C2 = a2 + b2i, a1 > a2, and b1 > b2. You can only say |C1| > |C2|. Dr. Butina is not even allowing this. Tell me why Dr. Butina has not created a strawman by creating a condition nearly impossible to meet.
If A(i)>B(i) for all i=1 to 365 we can say that A is “unequivocally” warmer than B. For what “bizarre reason” would we avoid saying A is significantly warmer than B if A(i)>B(i) for all i=1 to 365 except for i=255?
I am all in favor to retaining Tmin and Tmax and the daily ranges that will contribute to an increased statistical mean standard error. But we are not looking for unequivocal evidence in a chaotic system. We are looking for strong statistical significance that the behavior of that chaotic system has changed over time or between periods of study.

james griffin
April 15, 2013 1:16 pm

The natural temp variability since the Holocene Climatic Optimum over the last 10,000 years is
+/- 2.5C. The debate is over 0.7C…..not even close enough to cause any concern. What a waste of time and money the last 20 years has been.

milodonharlani
April 15, 2013 1:25 pm

Mr. Griffin:
Some date the Holocene Optimum from c. 9000 years BP; others from after the 8200 BP event.
http://en.wikipedia.org/wiki/File:Greenland_Gisp2_Temperature.svg
But your point is valid, whatever the precise T variation +/-.

April 15, 2013 1:26 pm

Who knew that it wasn't any cooler in the LIA. Thanks. The sun therefore does nothing.
hahaha
Ah Leif thanks for beating me to the comment about the adjustment in the observatory data.
I thought that perhaps Willis and I were the only ones to have studied that closely enough to wave the Bullshit flag on this guy.. so kudos.

April 15, 2013 1:31 pm

milodonharlani says:
April 15, 2013 at 12:59 pm
This book, “The Role of the Sun in Climate Change”, by your colleagues Hoyt & Schatten, is probably old hat to you, but was new to me.
is well worth a read [and a re-read].

Doubting Rich
April 15, 2013 1:32 pm

“… thousands of publications have been written looking for temperature trends in purely theoretical space that does not and cannot exist, the space of annual global temperatures … the Earth’s atmosphere cannot be treated as a homogeneous system but should be perceived as a network of local temperature systems, from astrophysics point of view.”
I teach a 40-hour course on meteorology and world climate aimed at people with no scientific background, with no minimum academic requirement, so this is a very basic course. This fact becomes entirely obvious during the course. In fact if the student came out of the course not knowing enough to recognise that averaging the climate is ludicrous, and recognise 9 climate zones (in a vastly simplified system, symmetrical about the ITCZ) showing 5 different climates they are likely to fail the course.
So why is it that my students have to know this, yet a climate scientist does not?

April 15, 2013 1:56 pm

As a migraine sufferer I just want to express my sincere thanks for your previous work! 🙂
“He was part of the team that designed the first effective drug for treatment of migraine […] Sumatriptan”
As to this argument I would agree with a single number used to describe "climate" over a year being meaningless – but claiming that every single day over the year would need to be warmer than every single day of another year seems to stretch things a bit. If "day", then why not "hour", "millisecond" or "three week period"? It becomes quite arbitrary.

April 15, 2013 2:22 pm

Troed Sångberg says:
April 15, 2013 at 1:56 pm
but claiming that every single day over the year would need to be warmer than every single day another seems to stretch things a bit. If “day”, then why not “hour”, “millisecond” or “three week period”? It becomes quite arbitrary.
More than arbitrary: nonsense. Why not "century", or "decade" as well? There is no doubt that Houston, TX is hotter than San Diego, CA, but every year there are days with temperatures below 25°F in Houston, and never in San Diego.

dynam01
April 15, 2013 2:26 pm

I too wish to thank Dr. Butina, both for his research here and on behalf on migraine sufferers. I have taken Treximet on and off (that’s how it works for migraine meds, unfortunately) for years and it has been the more consistently effective medication out there.

davidmhoffer
April 15, 2013 2:27 pm

I’ve been a long time critic of the notion of an average temperature that means anything and I think averaging anomalies is worse. But those methods being wrong doesn’t make this one right. If I applied this method to my bank account, I could prove that I don’t have any more money now than I did a year ago. But I do.
Besides which, GHG theory doesn’t say CO2 changes temperature, it says it changes w/m2 which is only indirectly related to temperature. All the methods in the world can’t change that fact. We need to measure and trend w/m2. Degrees just won’t do it.

Lance Wallace
April 15, 2013 2:28 pm

milodonharlani says:
April 15, 2013 at 12:31 pm
I found these:
http://www.uoguelph.ca/~rmckitri/research/globaltemp/GlobTemp.JNET.pdf
Thanks to milodon for linking the Essex & McKitrick paper on the mathematical aspects of the “global temperature” concept. An oldie (2007) but a goodie. If you haven’t read it, you are in for a treat.

jorgekafkazar
April 15, 2013 2:48 pm

The notion of a “global” temperature based on air temperatures is…what was Leif’s word? Oh, yeah. Nonsense. The atmosphere has 1/1100 the thermal capacity of the oceans, so measuring air temperatures (in °F, °C, °R, °K, or whatever) is examining the tail instead of the dog. Air temperature devices (thermometers, etc.) ignore humidity, barometric pressure, wind velocity, and precipitation. Osterizing all those temperatures taken under disparate conditions results in a meaningless fruit smoothie, never an apples-to-apples comparison.
Moreover, in looking at trends, remember that pointwise high temperatures are T⁴ heat-shedding mechanisms and are losing heat faster than cooler areas. Averaging those high temperature areas with other areas just helps us lose sight of that fact. Estimating “global” temperature by averaging regional temperatures is like estimating a city’s population by counting the number of cars on the freeways. Some of those roads lead out of town.

See - owe to Rich
April 15, 2013 2:51 pm

I’m with Stephen Rasey. And Leif. And Mosher, etc.. Just because the author doesn’t like averages or means does not imply that such concepts and measures have no validity regarding climate.
In the Armagh data 2004 does look, on average, warmer than 1844. What caused that? Is that a pattern that is repeated around the world – on average? Is that what global warming means, howsoever it may be caused?
Rich.

April 15, 2013 2:53 pm

Should We Worry About the Earth's Calculated Warming at 0.7°C Over the Last 100 Years
To put this number into perspective, the yearly variation is 3.8 C. It may not exist, but if we assume a place that is completely average during the whole time, and even if we assume it also went up 0.7 C, then the July average in 1844 would have been 3.1 C higher than the January average in 2004. See:
http://theinconvenientskeptic.com/2013/03/misunderstanding-of-the-global-temperature-anomaly/

AndyG55
April 15, 2013 2:56 pm

In Figure 5, what is the net average of the daily differences?
It looks to me that there is quite a bit more above the zero line than below.
same with the un-named red/yellow graph.

April 15, 2013 3:12 pm

Hi Darko
About 2 years ago Tony B and I were considering historic perspective of the convergence/divergence between the CET and Armagh temperatures.
Here is part of the exchange (my contribution)
http://www.vukcevic.talktalk.net/Armagh.htm
Why should Armagh be rising faster up to 1940, then keep track with the CET, only to start falling in the 1970s?
(In this part of the north Atlantic summer temperatures show no long-term rise; all the increase is in the winter time. The Armagh observatory is located on the edge of the town, but only about 30 miles from heavily industrialized Belfast.) Is it possible that up to WWII the industrialization of Northern Ireland was growing faster than in the CET area (coal-burning particles would increase condensation and cloudiness in the winter, and the cloud blanket would reduce the drop in winter temperatures)?
By the 1940s industrialization in the CET area had caught up, and for the next 30 years the two series keep in step.
But why the sudden difference from the 1970s?
In the early 1970s Northern Ireland was hit by civil unrest; the slowdown in industrial activity meant cleaner air and colder winters.
In contrast, the discovery of North Sea gas and the en masse installation of central heating in English houses probably contributed to the UHI effect, which not only prevented a slow fall but contributed to some rise in the CET during the 1970s. In the 1980s the UHI effect reached a plateau, and from then on both Armagh and the CET move at approximately the same rate. The above difference would imply that the UHI component in the CET is of the order of 0.3C.

Lot of guessing there, but it is a thought worth considering.

CC Squid
April 15, 2013 3:20 pm

“Doubting Rich says:
April 15, 2013 at 1:32 pm
I teach a 40-hour course on meteorology and world climate aimed at people with no scientific background, with no minimum academic requirement, so this is a very basic course.”
I would be interested in a course of this nature if it were offered online. One of the problems of this or any other course is that the subject matter reflects the bias of the instructor. For example, I would not take any course from Mann because my answers would have to reflect the views of the instructor in order to obtain a great grade. Is a course like this offered online and what is the cost?
If there is not an online course please list the title of the text book used for this course.
Regards,
CC

AndyG55
April 15, 2013 3:20 pm

Also, one might want to look at the number of days in either direction and then look at it in terms of a binomial probability distribution.
I suspect that either that or the average would tell a different story.

April 15, 2013 3:24 pm

Excellent, thank you. Let’s see if we can get this into a few heads. This is exactly what should boot out the nonsense computer models, if enough people only listen and learn.
I know, I know, no one has to tell me how difficult that is, but it’s happening all the same. Just slowly.

AndyG55
April 15, 2013 3:24 pm

ps, if the average difference is significantly positive and the number of days in one direction is significantly more than 182, then I think it may indicate a change in overall temperature.

davidmhoffer
April 15, 2013 3:28 pm

Suppose we measured temp just twice per day. One reading at 280 K and the other at 300 K. Average for the day would be 290 K. But instead of averaging temps, let's first convert to w/m2 via SB Law, P = 5.67*10^-8*T^4. We'd get:
280 K = 348.5 w/m2
300 K = 459.3 w/m2
(348.5 + 459.3) / 2 = 403.9 w/m2
Convert 403.9 w/m2 back to temperature via SB Law and we get…. 290.5 K
So which is the right "average" temperature for the day? 290? or 290.5?
I submit that it doesn’t matter. For the purposes of determining if CO2 increases also increase the w/m2 at earth surface, measuring temperature is ludicrous. As per jorgekafkazar says:
April 15, 2013 at 2:48 pm above, there are plenty of other factors that alter energy balance and whatever number actually is correct it means diddly squat compared to the oceans.
Air temps are like a tiny child being dragged by one hand by a large adult. It may be the thrashing and wailing of the child that draws our attention, but there is little doubt in our minds that the child’s “average” direction is to simply follow the large adult.
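For anyone who wants to verify the Stefan-Boltzmann arithmetic in the comment above, a minimal sketch (with the constant rounded to 5.67e-8 W m^-2 K^-4):

```python
# Quick numerical check of the flux-vs-temperature averaging discussed above.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4 (rounded)

def flux(t_kelvin: float) -> float:
    return SIGMA * t_kelvin ** 4

t1, t2 = 280.0, 300.0
simple_mean = (t1 + t2) / 2                          # 290.0 K
mean_flux = (flux(t1) + flux(t2)) / 2                # ~403.9 W/m2
flux_based_mean = (mean_flux / SIGMA) ** 0.25        # ~290.5 K
print(simple_mean, round(flux(t1), 1), round(flux(t2), 1),
      round(mean_flux, 1), round(flux_based_mean, 1))
```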

Jarryd Beck
April 15, 2013 3:35 pm

I’m not sure that this works. Something seems a bit fishy to me. I don’t like averaging temperatures, that seems to lose too much information. But this doesn’t seem right either.

jorgekafkazar
April 15, 2013 3:36 pm

AndyG55 says: “ps, if the average difference is significantly positive and the number of days in one direction is significantly more that 182, then I think it may indicate a change in overall temperature.”
Right, it may, but it would be ambiguous. Before I’d risk our entire civilization, I’d ask for unambiguous proof of the null hypothesis and an economic analysis not pulled out of someone’s stern.

richardscourtney
April 15, 2013 4:04 pm

Jarryd Beck:
I write to support your post at April 15, 2013 at 3:35 pm which says

I’m not sure that this works. Something seems a bit fishy to me. I don’t like averaging temperatures, that seems to lose too much information. But this doesn’t seem right either.

You have ‘hit the nail on the head’.
Average temperatures are nonsensical in theory and in practice.
In his above article, Darko Butina accepts this and suggests methods for comparing results of actual temperature measurements obtained at one place (i.e. Armagh Observatory).
The suggested methods are probably less than optimum, but they are clear in method and can be debated. Such debate of interpretations of real data and their indications is science. This contrasts with the methods being used by ‘climate science’.
The debate may decide that the methods suggested by Darko Butina are ideal, or need amendment, or merit rejection. But if the suggested methods are not ideal then that opens debate as to how the suggestions should be amended. If the suggestions merit rejection then that opens debate as to what alternative would be acceptable. And if there is no acceptable alternative then that is useful information because learning what cannot be known is always an important scientific finding.
I submit that the above article is among the most important which have been posted on WUWT. It takes the subject of climate change back to consideration of what is and what is not an indication of climate change. And that consideration is three decades overdue.
Richard

AndyG55
April 15, 2013 4:13 pm


yes, I agree completely, the whole method seems fraught with massive uncertainty.
We have warm years , we have cooler years.. Its called NATURAL climate variability.
I was just looking at the graphs from a mathematical point of view. Please don’t think, for one instance, that I believe that they could show anything worth bothering about. ! 🙂

k scott denison
April 15, 2013 4:14 pm

Mr. Mosher and Svalgaard: your points are well taken. Out of curiosity, though, if you were to take what you feel are the best observations available and simply present the data without averaging, gridding, etc., what would the results look like?

April 15, 2013 4:14 pm

vukcevic says:
April 15, 2013 at 3:12 pm
Northern Ireland is generally milder than central England. I suspect the temperature taken at Armagh Observatory is heavily influenced by the granite rock that is found all over N Ireland and by Lough Neagh, the largest lake in Ireland and the UK. On the other point you made, about Belfast being 30 miles away from Armagh etc… it is, but it's on the other side of the Lough.

Doug Proctor
April 15, 2013 4:27 pm

You refer to the difference I have termed “Procedural Certainty” vs “Representational Certainty”. The first is about the math, how it was done, what the statistical certainty is of similar results coming from a similar data management under certain randomness assumptions about the data. The second is about how well the result of all this procedural machinations correlates to what you are trying to determine, here the temperature record that would have been created using the equipment that we are using to determine the target parameter.
I have long argued that the global warming numbers could largely be an artefact of collection and adjustment, a statistical result of combining regional differences to produce an artificial global image that at present gives a “warming” profile but could easily have produced a “cooling” profile (and may yet do so) without an actual change in the total energy content of the system.
On 50% of the planet, a TOA TSI changes from 729.4 W/m2 on January 4th to 636.6 W/m2 simply as a result of orbital eccentricities. (The number of 341.5 W/m2 we are always told is the full planet, full year average. But half the world is in darkness at all times, and the orbital distance of the Earth from the Sun changes by 3.3%, meaning that the SI changes by 6.8% through the year.)
The TOA SI on the sunlit side therefore varies by 92.8 W/m2 from winter to summer, and yet the planet stays within 2C all year, and THAT is in the Northern Hemisphere that receives proportionately the lesser amount of SI.
The Earth is a giant energy redistribution system in which those like Trenberth claim they can determine a “missing” 0.58 W/m2 whole Earth power (1.16 W/m2) because they can measure exactly how this energy is distributed – despite an obvious 92.8 W/m2 that is being whipped around without our understanding. Others, like Hansen, claim that the heat redistribution system is in constant regional balance to less than that amount (for error considerations): 0.29 W/m2 whole Earth, 0.58 W/m2. If the energy redistribution is not consistent as to WHERE it goes to that level, natural thermal conductivities and capacities are going to change the resultant temperatures. Which then changes the “global” temperature.
The math employed is Procedurally sound and the outcome, certain – as a mathematical entity. The math does not necessarily (and I would say does not actually) present us with a Representationly sound or certain result with respect to what we are trying to understand, i.e. if the Earth is “heating up” as a result of anthropogenically introduced CO2.

April 15, 2013 4:39 pm

I don't agree with this idea that you cannot use an "average" … I would bet anything you like that if we thermally isolated a massive block of concrete in the ground with a Stevenson screen over it … the temperature of the block would be an average of the preceding temperatures.
Obviously we could spend a lot of money creating such a measurement device which (largely) removes the short term intra-yearly variation … or we could just use the measured temperature and calculate it.
The real problem with averaging is not the averaging technique itself, but the statistics of non homogenised noise. The problem is that most scientists are usually clueless about real world noise. and … for the select few of us who are electronics engineers … we can laugh at this ….they have this daft idea that averaging gets rid of noise!!!
Yes!! Scientists seriously naive enough to think that averaging gets rid of noise!! They haven’t a clue that most real world situations contain noise that cannot be averaged out.
In any real world situation (where the signal is changing), averaging does not increase signal to noise but DECREASES IT!!
This is because real world signals have a finite period … whereas 1/f noise has an infinite period, and so, if you average 1/f noise, eventually all you are left with is the noise!!
To use a simple example … if your signal is only present for the last 40 years, you have a finite amount of signal in the frequency range up to 8E-10 Hz, rising to a finite maximum at 0 Hz. In contrast, 1/f noise has a finite level at 8E-10 Hz (i.e. when you take a 40-year sample), but that noise increases if you take an 80-year sample, increases again for a 160-year sample, and it keeps on increasing and increasing, NOT DECREASING, the longer you average, approaching an infinite level of noise as you approach a frequency range of 0 Hz (an infinite sample).
So, the lower you make your frequency filter (i.e. the longer the period over which you sample) the higher the noise-to-signal ratio.
Usually scientists are protected from their ignorance by the simple fact that they take samples over short periods where 1/f noise is insignificant. So their ignorance doesn’t usually affect their results and they happily average their signals in the naive belief that all noise is removed by averaging.
But then you get these ignoramuses who have no idea of real-world noise and you start giving them real-world signals, not of seconds, nor hours, nor even days, to which their simplistic lab-based noise concepts are attuned, but signals covering spans of months, years, decades and …. heaven forbid … centuries or millennia. These simpletons move outwith their competence in understanding noise and (as we see from the global warming debacle) they make a complete hash of understanding what is going on, confusing natural 1/f noise with real signal.
To use a simple example … imagine a scientist trying to determine how climate affects river flow. Their concept of natural variation is that of instrumentation noise, which is easily reduced by averaging. So … they measure river flow and temperature. First they do it over 1 year. No change, then they do it over two … a small change, eventually waiting 40 years they report that the river flow has definitely changed (as well as temperature). And then conclude (whatever the temperature change) that it caused the river flow to change.
What is wrong?
The problem is that river flow is constantly changing. But worse, the change seen in any year is smaller than one expects in any decade, which is much smaller than one expects in any century, which is much, much smaller than one expects in any millennium.
The problem is that the apparently “constant” nature of the river hides the reality that the river bed is not a static channel. Each season new silt remoulds the bed. Each year, the banks erode, so that the rate and type of flow changes from year to year and like a random walk, rather than tending toward some “normal” flow, the river bed itself meanders across the landscape so that the flow tends more to a random walk than a constant unchanging entity.
On even longer timescales – even the mountains which give the static head cannot be considered static as they too erode (what do you imagine creates the silt but the rocks rubbing together … and where do the rocks come from but … the mountain). So, the river itself is constantly changing, the dynamics are constantly changing AND UNLIKE THE NOISE SCIENTISTS ARE USED TO…. REAL WORLD CHANGES CAN KEEP CHANGING IN ONE DIRECTION so that the longer one waits … the more it changes.
So, real world signals are full of natural variation in the form of a multitude of underlying trends and that variation, far from being “averaged out”, actually gets more dominant as you average it.
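A minimal numerical illustration of that point, using a random walk as a stand-in for strongly autocorrelated ("red") noise: averaging shrinks uncorrelated noise, but the mean of the walk keeps wandering as the averaging window grows (the window lengths below are arbitrary).

```python
# Averaging window experiment: the standard deviation of the window mean
# shrinks for white noise but grows for a random walk.
import numpy as np

rng = np.random.default_rng(0)
trials = 2000
for n in (40, 160, 640):                         # length of the averaging window
    white = rng.normal(size=(trials, n))         # uncorrelated noise
    walk = np.cumsum(rng.normal(size=(trials, n)), axis=1)  # random walk
    print(n,
          round(white.mean(axis=1).std(), 3),    # shrinks roughly as 1/sqrt(n)
          round(walk.mean(axis=1).std(), 1))     # grows roughly as sqrt(n/3)
```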

April 15, 2013 5:04 pm

Unfortunately Armagh Observatory was heavily affected by coal and peat burning in Armagh Town. This is from the 19th century.
In his Foreword to the Armagh Catalogue2, Robinson had some hard things to say about the general siting. The prevailing west to south-east winds, he complained, were apt to “drive smoke from thousands of chimneys from the town over the Observatory, and interfere, by heated air, for nine months out of twelve”.
A state of affairs that persisted until the 1980s. Minimum temperatures usually occur after dawn, when incoming solar radiation exceeds OLWR. Smoke reduces early morning incoming solar radiation and hence decreases minimum temperatures.
Armagh minimum and average temperatures are probably most useful as a proxy for coal and peat consumption in Armagh Town.

Txomin
April 15, 2013 5:21 pm

This is an interesting approach. I too have battled in my field for measurements before/over hypotheses. Thank you, Mr. Butina. I look forward to updates, whatever these might show.

Jim Butts
April 15, 2013 5:22 pm

Data analyses of this sort are not likely to convince anyone of anything. In fact it will do harm because it appears to be a very strained attempt to prove that no warming at all is occurring. To say that one year cannot be said to be warmer than another year unless all corresponding days are each warmer than the other is just ridiculous.
We who are deniers should be more clear about what we are denying. When I am asked if I think there has been warming I say maybe, perhaps about 0.7 Kelvin (about 0.3%) over 150 years. When the follow up question about humans causing the warming comes I say maybe, since CO2 is increasing in the atmosphere by amounts consistent with human use of fossil fuels and CO2 is a greenhouse gas which will cause warming if nothing else changes (like cloud cover for example) but the effect is rather small, a doubling of CO2 would lead to about a 1 deg K temperature increase (about 0.3%). If asked what we should do about it, I say nothing because first, a temperature increase of 1 degree over the next 100 years is nothing to get alarmed about and moreover global warming and increased CO2 in the atmosphere are both good things enhancing both plant and animal life on the planet. When they then launch into the “but what about all the extreme weather” questions I tend to give up and respond that that is all BS.

April 15, 2013 5:33 pm

Forgot to include the Armagh temperature graph.
http://junksciencearchive.com/MSU_Temps/Armagh_an.html
Note the decreased minimum temperatures as affluence increased post 1960 (= increased fuel consumption) and the abrupt increase when the 1981 N Ireland clean air act was implemented.

jaymam
April 15, 2013 5:33 pm

Could somebody please create and publish the usual graph of “global temperature” for the last 130+ years plotted with a scale of 0 to 30 degrees C, and add to it about a dozen graphs of a typical day’s temperature at various locations where lots of people actually live, around the world (i.e. not the poles)?
It should look like almost a horizontal line with a bunch of wildly swinging curves between 0 and 30 degrees.

Mike McMillan
April 15, 2013 5:38 pm

It always helps to put things in perspective.
I’ve placed Dr Mann’s frightening 1998 hockey stick on a chart of the Galva, IL mean annual max and mean annual min temperatures, all at the same scale.
http://www.rockyhigh66.org/stuff/hockey_stick_d_galva.gif

April 15, 2013 5:58 pm

Philip Bradley says:
“Armagh minimum and average temperatures are probably most useful as a proxy for coal and peat consumption in Armagh Town.”
Armagh: Average November Sunspot Number and February Minimum Temperature 1875-2012
http://thetempestspark.files.wordpress.com/2013/02/nov-ssn-v-feb-tmin-1875-20121.gif

Greg House
April 15, 2013 6:20 pm

Darko, I like your article very much, it is a heavy blow on the empty head of the warmism monster. No wonder some people have started obfuscating the matter by introducing fallacious comparisons to bank accounts and talking about radiation and CO2.
I guess, before the problem of comparisons between years there is another one: comparisons between days. Averaging Tmin and Tmax of days and then comparing the averages seems to be equally nonsensical to me. Maybe you could give it another thought.

Jarryd Beck
April 15, 2013 6:28 pm

I think the problem with averages is that we have to be careful with what we are actually averaging. An average temperature for a particular day, or even simply recording Tmin and Tmax might not take into account the actual temperature changes throughout the day. An average is saying that the average energy in the atmosphere across the whole day was some particular number, but that only works if the temperature changed uniformly across the whole day.
You don’t need to look at many days to know that this does not happen. Suppose it gets to 35C at 1pm; you might suddenly get a thunderstorm and it drops back to 20C by 2pm and stays there until the night. In that case, the 35C maximum doesn’t really represent anything meaningful. On the other hand, it might start at 18C at 6am, very quickly (by 10:30am) get to 30C, and stay there all day. Now which day would you say has been hotter?
This study suffers from the exact same problem. However I think that it suffers from another. The author states that you need every day’s temperature to be higher to consider one year hotter than another. I disagree with that. I say that the total energy that was present in the atmosphere needs to be higher to consider one year hotter than another. But how do we do that? We have to integrate every single temperature measurement throughout the whole day to get the actual energy content of that day. I doubt that there are very many weather stations with detailed enough records to be able to do that.
In conclusion, I don’t think that we have detailed enough records to be able to say many meaningful things about average temperatures of any particular year. Of course maximums and minimums still mean something; clearly Montreal gets colder than Sydney. But, I don’t know that it says as much as everyone would like us to believe.
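By way of illustration, here is a minimal Python sketch of how a time-weighted mean can differ from the usual (Tmin+Tmax)/2 shortcut; the hourly readings below are invented, purely to show the arithmetic.

```python
# Toy example: one day's hourly temperatures in deg C (invented values).
# A thunderstorm knocks the afternoon temperature down, as in the example above.
hourly = [18, 17, 17, 16, 16, 17, 19, 22, 26, 29, 31, 33,
          34, 35, 20, 20, 20, 19, 19, 19, 18, 18, 18, 18]

tmin, tmax = min(hourly), max(hourly)
minmax_mean = (tmin + tmax) / 2   # the conventional daily "average"

# Time-weighted mean: trapezoidal integration over the hourly samples,
# divided by the length of the record in hours.
area = sum((hourly[i] + hourly[i + 1]) / 2 for i in range(len(hourly) - 1))
time_weighted_mean = area / (len(hourly) - 1)

print(f"(Tmin+Tmax)/2      = {minmax_mean:.1f} C")         # 25.5 C
print(f"time-weighted mean = {time_weighted_mean:.1f} C")  # about 21.8 C
```

For a day like the thunderstorm example the two numbers differ by several degrees, which is the point: the min/max pair throws away the shape of the day.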

Greg House
April 15, 2013 6:34 pm

Dear moderators, your spam filter swallowed my comment…
[Reply: I check the spam bucket at least 15 – 20 times every day, and rescue legit comments. Your patience is appreciated. — mod.]

P.D. Caldwell
April 15, 2013 7:07 pm

Defining specific climate change with a single attribute, such as temperature, when the attribute is a composite of statistical manipulations is disingenuous. The same is true of the attribution of climate change to a single factor, such as CO2.
This paper is no less extreme or scientifically valid than Mr. Mann’s. One attribute at a single data point is just that: one attribute at one data point. Proxies are just that, proxies.
The science of climate change is not settled because man does not understand the physics and chemistry completely enough, and the extant data sets are neither fine enough nor spatially adequate to support irrefutable conclusions.
Expending mental capital expressing premature opinions delays understanding, and historically has made fools of the authors. But then, the dead have no ego.

John Francis
April 15, 2013 7:11 pm

Jarryd Beck just beat me to it. Averaging the daily high and the daily low is ludicrous really. You do have to integrate throughout the day to get something meaningful. And even if you could (which is technically but not economically feasible) you are still merely measuring the local variations, with a very spotty grid, plus dishonest scientists with a mission. The whole exercise is nonsense, particularly when expecting robust data to an accuracy of less than a degree or so.
And then for the ocean you would need to integrate over the full depth in millions of locations.

Greg House
April 15, 2013 7:41 pm

Jarryd Beck says (April 15, 2013 at 6:28 pm): “An average is saying that the average energy in the atmosphere across the whole day was some particular number, but that only works if the temperature changed uniformly across the whole day. …We have to integrate every single temperature measurement throughout the whole day to get the actual energy content of that day.”
=======================================================
This way you cannot learn anything about “energy in the atmosphere” or the “actual energy content of that day”, because the air is moving. Every time you measure the air temperature it is different air. There is such a thing as wind in nature. The temperature can decrease for different reasons: it might be clouds covering the sun, or it might be a cold wind from the north. How can all that be reasonably put together and averaged?
“Climate science” should be closed.

April 15, 2013 7:55 pm

Thanks, Darko.
The Armagh databases are very interesting.
I’ll be waiting for more from you.

NeedleFactory
April 15, 2013 8:01 pm

k scott denison asks (at 4:14 pm)
… Out of curiosity … if you were to take … the best observations available and simply present the data without averaging, grinding, etc. what do the results look like?
If one does not like averages, one could take sums. “Degree Days” are often used in heating energy calculations and agricultural forecasts. For our present problem we could, for every one of the days-of-the-year for which there is no missing data for any year, sum the degree days (above some arbitrary low base temperature, using either min- or max-temps).
Some would suggest that such yearly totals would approximate our intuitive understanding of how to compare the warmth of different years. By such sleight of hand one could argue that we are not averaging intensive properties such as temperature, but summing extensive properties such as gallons of oil needed to heat a building. Dr. Darko Butina might be pleased; but the resultant comparisons of years would be identical to those obtained by the “dubious” averaging.
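A minimal Python sketch of that degree-day bookkeeping (the base temperature and the two short series are invented):

```python
def degree_days(daily_temps, base=10.0):
    """Sum of (T - base) over the days whose reading exceeds the base.

    daily_temps holds one reading per day, e.g. all the Tmax (or all the
    Tmin) values for the calendar days with no missing data in any year.
    """
    return sum(t - base for t in daily_temps if t > base)

# Two invented, very short Tmax series for the same calendar days:
year_a = [12.0, 15.5, 9.0, 18.2, 21.0]
year_b = [13.1, 14.0, 11.5, 17.0, 22.3]

print(f"year A: {degree_days(year_a):.1f} degree days")
print(f"year B: {degree_days(year_b):.1f} degree days")
```

If the base sits below every reading and both years cover the same days, the yearly sums rank the years exactly as their day-by-day averages would, which is the sleight of hand being described.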

tokyoboy
April 15, 2013 8:04 pm

The yeary average temperature of Tokyo is ca. 6 degC higher than Boston.
Both are big and flourishing cities, though Tokyo (12 million) is much bigger.
Ergo, warming by 6 degC will never afflict Boston at least in ordinary living.

tokyoboy
April 15, 2013 8:05 pm

yearry should read yearly. Sorry.

ferdberple
April 15, 2013 8:11 pm

lsvalgaard says:
April 15, 2013 at 12:36 pm
Nonsense.
==========
That is not an argument. Specify on what grounds so we can judge. Otherwise your statement is abusive and irrelevant.

April 15, 2013 8:32 pm

In his WUWT post Darko Butina said,

The paper is long 20 pages and analyses in great details a single weather station of daily data, dataset collected at Armagh Observatory (UK) between 1844 to 2004, one of very few datasets in public domain that have not been either destroyed, corrupted or endlessly re-adjusted by the curators of the global thermometer data at East Anglia University or NASA.

I applaud the independent spirit to get as near as possible to the raw data for an analysis on a station limited to just the direct data.
– – – – – – – –
In his WUWT post Darko Butina said,

Since we know for a fact that the annual temperature ranges will depend on the location of that thermometer, and since mixing different datasets are not allowed in experimental sciences, it follows that if there are, say 6000 weather stations (or fixed thermometers) in existence and across the globe, the first step before raising an alarm would be to analyse and report temperature patterns for every single weather station. That was what I was expecting to see when I started to look into this man-made global warming hysteria three years ago, following the revelations of Climategate affair. But I could not find a single published paper that uses thermometer-based data.

Analyse and report the temperature patterns at every one of the ~6000 stations . . . . that is a refreshing idea. Do it.
John

ferdberple
April 15, 2013 8:34 pm

lsvalgaard says:
April 15, 2013 at 2:22 pm
Houston, TX is hotter than San Diego, CA, but there are every year days with temperatures below 25F in Houston, but never in San Diego.
==========
There is nothing in the methodology to suggest this sort of comparison. It is clear the author is subtracting like from like and has eliminated missing data to avoid the complications that would result otherwise.

ferdberple
April 15, 2013 8:57 pm

There is of course a weakness in the method in the limited number of years compared in calculating the differences. This would need to be expanded before confidence could be placed in the results.

ferdberple
April 15, 2013 9:05 pm

My take on the methodology is that the author is not trying to show that there might have been warming. Rather, it shows that this is not unequivocal, because there are some years in the present that cannot be distinguished from some years 100 years ago. This should not be the case if the result is unequivocal.

k scott denison
April 15, 2013 9:35 pm

NeedleFactory says:
April 15, 2013 at 8:01 pm
k scott denison asks (at 4:14 pm)
… Out of curiosity … if you were to take … the best observations available and simply present the data without averaging, grinding, etc. what do the results look like?

_____________
The issue is that the “averages” are actually not averages of data at the same locations, etc. but are geographically averaged, etc.
Always pays to start by just looking at the data, station by station.
I know that the BEST project showed something like 30% of stations showed cooling trends. Would be interesting to see what the total “temperature days” difference for the cooling and warming stations would be. Not after geographic projections and manipulations.

jeez
April 15, 2013 10:14 pm

ferdberple says:
“That is not an argument. Specify on what grounds so we can judge. Otherwise your statement is abusive and irrelevant.”
Sorry ferd, but the statement Leif quoted:
“Can we detect unambiguous warming trend over 161 years at Armagh (UK) in thermometer data? All we need to do is to take difference between the youngest (2004) and the oldest (1844) annual fingerprints and display it as a histogram.”
IS ABSOLUTELY NONSENSE! How can it be anything else? ALL WE NEED TO DO? Just two years, just compare histograms. That’s it? That’s all we need to do? No other analysis or perspective necessary? Is the level of the discussion so juvenile that you need this explained? I can explain further if you really want. Now don’t make me go ALL CAPS on you any more.

Espen
April 16, 2013 12:41 am

I don’t really understand the point of finding a pair of years where every day in year 2 is warmer than the same day in year 1. But I look forward to the cluster analysis in part two; that sounds like a very good idea!

peter azlac
April 16, 2013 2:27 am

I completely agree with your statement that we have to examine each temperature record and not some mythical global average. Global warming/cooling does not exist except as a concept. What does exist is change in climate by climate zone, hence the Koppen classification, which results in a movement of zonal boundaries, for example the boundaries of the green areas of the Sahel. And these changes can be linked to solar activity and the resulting ocean oscillations.
What I do not agree with is your statement :
“ So if one wants, for some bizarre reason, to compare two annual patterns then one year can be unequivocally declared as warmer only if each daily reading of that year is larger than each corresponding daily reading of another year:”
Warmth is an indication of heat, which is based on heat capacity, involving specific heat and energy flux, whereas temperature is only an indication of the heat flow. In agriculture, changes in annual or seasonal warmth are measured in heat units that are based on the cumulative difference between min and max daily temperatures and a specific base temperature that varies by crop. A certain annual number of heat units, sunshine hours and amount of soil moisture within the normal crop growth cycle is required. That is why we can say that for every 1 oC fall in annual average temperature the growth zone for grains moves about 170 km south, depending on altitude, aspect etc. Of course this statement should be expressed not in terms of annual average temperature but in terms of these growth parameters based on heat units etc. Alberta, Canada, will be one of the first to be affected by the coming global cooling, so I give you a reference to the use of these measures there.
http://www1.agric.gov.ab.ca/$department/deptdocs.nsf/all/sag6301
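For readers unfamiliar with heat units, here is a minimal Python sketch of the common growing-degree-day form, (Tmin + Tmax)/2 minus a crop-specific base, summed over days. The base and the week of readings are invented, not taken from the Alberta reference.

```python
def growing_degree_days(tmins, tmaxs, base=5.0):
    """Heat units: daily ((Tmin + Tmax) / 2 - base), floored at zero, summed."""
    return sum(max((lo + hi) / 2 - base, 0.0) for lo, hi in zip(tmins, tmaxs))

# An invented week of daily minima and maxima (deg C):
tmins = [3.0, 4.5, 6.0, 7.0, 5.5, 2.0, 8.0]
tmaxs = [12.0, 14.0, 16.5, 18.0, 15.0, 11.0, 19.5]

print(f"Heat units for the week: {growing_degree_days(tmins, tmaxs):.1f}")
# Two years with the same annual mean temperature can still accumulate very
# different totals, which is the point being made about crop growth above.
```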
In “climate change” we are concerned with changes in energy flux (SWR – OLR) that drive the hydrological cycle and, via pressure differences, the winds and weather that supply or deny water to climate zones, e.g. the well-known link between drought and flood in the SE USA that is tied to the ENSO cycle. Willis has shown that convective cooling keeps the SST within narrow bounds, whereas Clive Best has shown that differences in soil moisture at the same latitude have a big effect on MAT, hence the heat units calculation.
http://clivebest.com/blog/?p=3258
This means that whereas two years may have the same annual average temperature, they can be very different in terms of our ability to grow food. That is the most significant aspect of climate change, and it is ignored by the IPCC in using computer models based on global average temperature rather than zonal changes in heat units, sunshine hours, precipitation and the timing of these factors in relation to the growth cycle of crops. And of course increased atmospheric carbon dioxide, up to 1200 ppm, is known to increase crop yields and drought resistance, so we should be increasing atmospheric concentrations, not attempting to depress crop growth.
In this respect temperature and moisture measurements have to be very local – on the farms where the crops are grown. This means that the ability of the various temperature series to predict (or project) the effects of future climate change is problematic. As an example, Vukcevic refers to the problems with the temperature series at Armagh, pointing out the effects of industrialization on the results until it came into line with CET:
“ …. In the 1980’s the UHI effect reached a plateau, and onward both the Armagh and CET move at approximately same rate. The above difference would imply that the UHI component in the CET is of order of 0.3C.”
However, CET is hardly a perfect record, being based on at least 15 changes of station, with seven since 1958, and with the contribution from Ringway Airport, which is embedded in Greater Manchester and subject to heavy industry similar to Belfast, only removed in 2004. An experiment at Armagh to test for UHI effects found major differences in min and max compared with the local stations installed for the test, differences that have been put down to differences in surface roughness, as per the claims of Roger Pielke Sr.
“Mean temperature differences between Observatory (O) and the mean of the three rural (R) stations (A,C & D), February to October 1996
Means (deg.C)      O        R      O-R
tmax           14.63    14.52     0.11
tmin            6.90     6.53     0.41
The mean difference in daily maxima between the Observatory and the mean of the three rural stations is found to be 0.11oC, whilst the difference in minima is 0.41oC, with the Observatory station warmer than the mean of the rural stations in both instances.”
This effectively trashes the use of gridding with homogenisation so beloved of the CRU and GISS. Incidentally, instead of trying to find a UHI effect in these series we should be looking at the impacts of soil moisture and surface roughness; that is why we see warming and cooling stations intermixed throughout the SE of the USA, which will not be picked up by MODIS or population data as they are either natural, due to changes in the ENSO cycles, or due to changes in agriculture.
UHI study of the UK Armagh Observatory
Posted on August 26, 2010 by Anthony Watts
http://wattsupwiththat.com/2010/08/26/uhi-study-of-the-uk-armagh-observatory/
The lack of data on heat units and other factors that affect crop growth, and the experience of the UHI experiment at Armagh, tell me that relying on the official temperature series used to support the whole IPCC CAGW mantra is not only nonsense but dangerous nonsense for those who live and rely on food production in marginal climate zones such as the Sahel. Not only do we lack adequate local temperature measures, we also lack measures of surface SWR. Further, in areas where most of the SWR is received and crops are grown under irrigation, the Class A Pan Evaporation units show a decrease in evaporation over the period that atmospheric carbon dioxide has been increasing, which, along with the missing “hotspot”, undermines the IPCC claims of feedback from water vapour that are required for their CAGW myth.
http://www.science.org.au/natcoms/nc-ess/documents/nc-ess-pan-evap.pdf
Also, from a crop growth viewpoint the important point about Armagh is that Hathaway has found a link not just with NAO as per CET but with solar activity.
http://solarscience.msfc.nasa.gov/papers/wilsorm/WilsonHathaway2006c.pdf

Keitho
Editor
April 16, 2013 2:37 am

Dear Dr Darko Butina, that was a most excellent article. Clear in its simplicity and revealing in what it portrays. Thank you.

Martyn
April 16, 2013 3:26 am

Please label the axes on the graphs and use figure numbers in the graphs.
It would leave the article looking more professional.

Johan i Kanada
April 16, 2013 4:37 am

Yes, there is certainly a need to rely on actual measurements as opposed to models.
The rest is pretty much nonsense.

TLM
April 16, 2013 5:49 am

As others here have said, it is total nonsense to say that every day in a year has to be warmer than the equivalent day in a previous year or location to prove that it is warmer.
Quite often there is the odd day in the UK where it is warmer than Egypt. There is often a jokey headline in the tabloid press when that happens “London warmer than Cairo!” accompanied by a gratuitous picture of a bikini clad girl lying on the grass in Hyde Park. Of course this is totally meaningless. Anybody who has been to Cairo knows that it is “warmer” than London.
If you don’t like averages why not try counting instead? Assuming there is one meaningful temperature measurement each day, say T-Max, one approach might be to compare each day in each location and subtract T-Max in one location from T-Max in the other. You could then count the number of days in each location where T-Max was higher than the other.
You might then find that in Cairo, on 360 out of 365 days, T-Max was higher than in Brighton. I think you could then reliably say that Cairo is warmer than Brighton!
You could then do the same for T-Min and I suspect you would get a similar result.
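A minimal Python sketch of that counting approach (the two Tmax series are invented stand-ins for Cairo and Brighton, a week rather than a year):

```python
# Invented daily Tmax for the same calendar days at two sites (a short week
# here; a real comparison would run over a full year).
cairo_tmax    = [30.1, 29.5, 31.0, 17.0, 28.8, 33.2, 27.5]
brighton_tmax = [14.0, 15.2, 13.8, 19.5, 16.1, 15.0, 14.4]

cairo_warmer    = sum(a > b for a, b in zip(cairo_tmax, brighton_tmax))
brighton_warmer = sum(b > a for a, b in zip(cairo_tmax, brighton_tmax))

print(f"Cairo warmer on {cairo_warmer} of {len(cairo_tmax)} days, "
      f"Brighton warmer on {brighton_warmer}")
# Over a year a lopsided count (say 360 to 5) supports "Cairo is warmer
# than Brighton" without averaging anything, though it says nothing about
# the size of the difference.
```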
Personally I think averaging is absolutely fine – provided it is done carefully with good data. It is also essential if you are going to find out the magnitude of any differences.
I have to say, regardless of the good Doctor’s scientific background, this paper seems to discard common sense and invents spurious made-up statistical “rules” to somehow discredit the temperature record.
As my daughter would say, an “epic fail!”

Robert of Ottawa
April 16, 2013 5:50 am

That’s the amusing thing. OMG, it’ll get 1C warmer over the next 100 years.
Here in Ottawa, Canada, the nation’s capital BTW, the temperature varies 60C or more over the 6 months, January/July.
Of course it doesn’t vary at all April/October 🙂

Steve Keohane
April 16, 2013 5:54 am

While there are many complaints, I think Darko Butina has confronted a fallacy of ‘climate science’ with its own argument. Any weakness is in the original use of temperature to indicate enthalpy when it does not. Any temperature measurement sans RH% and probably atmospheric pressure cannot be used to tell if the world is warming or not. Period.

April 16, 2013 5:58 am

If all the suggestions that the differences in the temperatures measured at Armagh are artifacts of the location in relation to Belfast or Lough Neagh, or are due to industrialisation or de-industrialisation in Northern Ireland, or to the passage of the clean air act, and other such comments, are germane, then perhaps someone can explain why the average calculated annual temperature has such a convincing linear relationship with the length of the sunspot cycles.

Chuck Nolan
April 16, 2013 6:13 am

The arguments against this paper are scientific, data referenced and logical.
Not to mention completely lame.
If you want in this debate, cut the technobabble.
They don’t use it.
Hansen doesn’t say sea level rose 4cms. He says it will rise 20 meters because of the coal trains of death. Mann doesn’t say the temperature went from 15C to 15.012C he says it will increase 6C because of greedy big oil.
They have no compelling evidence to hit the people over the head with, just sound bites, stunts and staged pictures.
They cut AC to the US Capitol, for Pete’s sake.
They show lots of pix of:
Hurricanes
Floods
Displaced people
Black carbon dust on snow
Dried up river beds
Receding glaciers
Cute cuddly polar bear cubs
And the hockey stick.

None of it demonstrates CAGW, which is a term they never use.
PR people, not scientists, are controlling the BS.
They call us global warming deniers and shout that we caused all of the aforementioned problems.
Emotion is where they have sent this discussion, and emotion is what we will need to counter it.
Most laymen have heard of ice ages with woolly mammoths and warm eras with giant reptiles, so we know the climate has been here before. They never mention these periods.
We need pix of dead birds struck down by windmills, baby seals eaten alive by polar bears and old people suffering from lack of heat. Pictures that show the very poor scrounging dumps, using dung for cooking, and the filthy water and lack of sanitary conditions. Show what trying to stop prosperity does to people of color, who are the poorest among us.
I don’t think most poor people care if Al Gore’s house ends up submerged.
They don’t care about the science of tomorrow. They just want to eat, today.
It’s time to tack …. prepare to come about.
Show why they are the bad guys in no uncertain terms.
cn

CC Squid
Reply to  Chuck Nolan
April 16, 2013 10:07 am

You forgot to mention the people in the US who lost their homes due to the increase in gas prices. The middle and lower middle class could buy an inexpensive home in the outlying suburbs, but now they cannot afford to drive to and from work. The choice was between the necessities of life for their families and the mortgage.

ferdberple
April 16, 2013 6:32 am

TLM says:
April 16, 2013 at 5:49 am
As others here have said, it is total nonsense to say that every day in a year has to be warmer than the equivalent day in a previous year or location to prove that it is warmer.
==========
The author has not said that. It is a straw man argument, a logical fallacy. The author has said this is a test to see if warming is “unequivocal”, a phrase used by the IPCC and others.
What the author has shown is that there are years in the past that are indistinguishable from years in the present, so the argument that warming is unequivocal is the real nonsense.

ferdberple
April 16, 2013 6:39 am

peter azlac says:
April 16, 2013 at 2:27 am
That is the most significant aspect of climate change that is ignored by the IPCC in using computer models based on global average temperature…
===========
Agreed. Climate is not temperature. Two areas with exactly the same average temperature can have very different climates. For example, a rain forest and a desert can have the exact same average temperature, yet they could not be more different climate-wise.
By concentrating on average temperature, Climate Science is ignoring climate.

Brian Macker
April 16, 2013 6:39 am

Embarrassing analysis. Using this reasoning we would not be able to say summer was warmer than winter unless temperatures on every single day of summer exceeded those in winter. Why bother adjusting for yearly swings in temperature by lining up the graphs if you are not going to compensate for daily swings? If temperatures can swing daily by 30 degrees then this analysis will not agree that average yearly temperature has risen until things are 30 degrees warmer. Those calling this brilliant should understand that they are not equipped with the tools to do basic math let alone science.

Brian Macker
April 16, 2013 6:43 am

If you don’t like simple math like averages then maybe try sorting all the temps over the year and looking at the area between the graphs.
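Something like this minimal Python sketch (the readings are invented):

```python
# Sort each year's daily readings and compare them position by position;
# the summed gap is the "area between the graphs".
year_a = sorted([12.0, 3.0, 18.5, 7.0, 22.0, 15.0])   # invented readings
year_b = sorted([11.0, 5.0, 19.0, 9.5, 21.0, 16.5])

gaps = [b - a for a, b in zip(year_a, year_b)]
print(gaps, sum(gaps))   # net signed area between the sorted curves
# A gap that stays positive across the whole sorted range would be a far
# stronger sign of a warmer year than any single-day comparison.
```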

ferdberple
April 16, 2013 6:55 am

jeez says:
April 15, 2013 at 10:14 pm
“Can we detect unambiguous warming …
IS ABSOLUTELY NONSENSE! How can it be anything else.
============
You have ignored the word “unambiguous”.
Science is not a voting system. It doesn’t matter how many examples you find that something is true. If you find a single example that something is false, then it is false. No ifs, no ands, no buts.
Climate Science is not conducting science, it is engaged in politics when it searches for positive confirmation. Somewhere along the line the school system has failed the recent crop of scientists. They have been taught political correctness and labelled it science.
What you are seeing in this paper is old school science. The way it used to be taught when we actually had to build things that worked. When folks actually used real data and opened their eyes and used their brains to analyze the results. Before we had computers to cook the books and tell us the “right” answer.

Chuck Nolan
April 16, 2013 6:56 am

ferdberple says:
April 16, 2013 at 6:39 am
…..By concentrating on average temperature, Climate Science is ignoring climate.
———————-
Great idea to ignore climate because that’s what they do.
People need to see graphs with real temperature so they see how ridiculous alarmists are.
That way, we can all play by the same rules.
cn

April 16, 2013 6:58 am

Enthalpy is the only scientific method of comparing energy in the climate system. Temperatures are only part of the equation; the other part is the amount of heat stored in the water vapor in the atmosphere. Without knowing that, any discussion of global heating/cooling is useless. Moreover, what humans experience is the enthalpy of a location, not the temperature itself. In many areas, the “feels like” temperature is given along with the dry bulb temperature. You notice the difference between a temperature of 88 and a “feels like” temperature of 94.
It would be interesting to see some of Willis’ analysis of thunderstorms and their effects noted in the change in enthalpy, before and after the rainfall. That would certainly address the heat loss as a result of the thunderstorm. I think the magnitude of the change will surprise a lot of people. That heat engine is incredibly powerful.
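To put rough numbers on the enthalpy point, a minimal Python sketch using the standard moist-air approximation h ≈ cp*T + Lv*q (the constants are rounded and the humidity values are invented):

```python
# Rough moist-air enthalpy per kg of dry air: h ~= cp*T + Lv*q, with
# cp ~ 1.006 kJ/(kg K) for dry air and Lv ~ 2501 kJ/kg for water vapour.
def moist_enthalpy(temp_c, mixing_ratio):
    return 1.006 * temp_c + 2501.0 * mixing_ratio

# Same dry-bulb temperature, different (invented) humidity:
dry_day   = moist_enthalpy(31.0, 0.008)   # fairly dry air
humid_day = moist_enthalpy(31.0, 0.020)   # muggy air

print(f"dry day:   {dry_day:.1f} kJ/kg")
print(f"humid day: {humid_day:.1f} kJ/kg")
# Identical thermometer readings, roughly 30 kJ/kg apart in heat content,
# which is why the "feels like" temperature parts company with the dry bulb.
```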

Climate agnostic
April 16, 2013 7:15 am

The best and maybe only way to “measure” climate change is to observe nature. Glaciers melting, species advancing into new areas or retreating to old ones and forest lines moving upwards or downwards on mountains. Thus nature is a reliable thermometer.

Chuck Nolan
April 16, 2013 7:18 am

You can’t make a talking point out of Enthalpy.
You can’t make a sound bite out of Enthalpy.
You can’t make a sound bullet out of Enthalpy.
That’s why they don’t try.
cn

ferdberple
April 16, 2013 7:47 am

Interesting paper. Explains why Climate Science under-estimates the probability of extreme events being natural.
“Fractal systems extend over many scales and so cannot be characterized by a single characteristic average number (Liebovitch and Scheurle, 2000). ”
http://arxiv.org/ftp/arxiv/papers/0805/0805.3426.pdf
Fractal Fluctuations and Statistical Normal Distribution
A. M. Selvam
Deputy Director (Retired)
Indian Institute of Tropical Meteorology, Pune 411 008, India

Theo Goodwin
April 16, 2013 8:17 am

I applaud Darko Butina’s efforts to create a framework that takes thermometer readings as the data for claims about surface temperatures in climate science. His product is less than satisfying but might be improved.
The motivation for his product is clear. Anomalies do not report observations and were designed to make calculation of trends simple and easy. Thus climate science takes trends as its ultimate evidence. A science that rests on trends will never be accepted among the hard sciences. Using trends, climate science will lock itself in the prison occupied by economics.
Some have objected to Butina’s project that the necessary observations, thermometer readings, are not available for all needed times and places and that we must make do with what we have. But what we have, trends, provide no real connection to the world. The best thing for the future of climate science would be to create a regime for collecting data in the world. Whether that regime uses thermometer measurements or something else must be debated. Now is the proper time to undertake discussion of such a regime because climate science is in its infancy.

April 16, 2013 10:07 am

I too think homogenizing temperatures is not a valid use of the data; spatial temperatures are not linear, so linearizing them should be a non-starter.
But I’ve gone about looking at real temperatures in a different fashion. I see the real issue as a hypothesized loss of cooling due to CO2. The Earth cools at night; the question is whether there is a loss of nightly cooling, and whether there is any evidence that it cools less at night now than in the past.
The short answer is no.
I use t-min and t-max, and create a daily rising temp (t-max – t-min), but how much does it cool? For that I take today’s t-max and subtract tomorrow’s t-min to get a falling temp. Rising temp minus falling temp gives a difference; this is the same as a daily anomaly (global temp chart, Map of stations). This number is useful for analysis: it can be averaged across all the stations in an area to remove the effects of weather, and when the area is isolated to a single hemisphere you can see the progression of temperature as the ratio of day to night changes (60 years of NH Diff * 100).
From daily diff, you can look at the slope of the change of temp each day (Summer to fall, Fall to summer).
And if you plot the change of slope of the change of daily temp as the ratio of day to night changes you get this.
So, while there is no trend or really any significant change in nightly cooling (~18F/day), and no trend in the average annual diff (it’s both positive and negative), there is a trend in daily diff as the ratio of day/night changes that seems to align with a ~1970 to ~2002 cycle. It’s only one half of a cycle, so who knows if it’s really a cycle, and since it’s present as a change in daily warming and cooling it almost looks like it’s from an orbital wobble.
But whatever it is, it was recorded in the temperature data, waiting to be found, and it doesn’t appear to be related to an increase of co2.
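A minimal Python sketch of the rising/falling bookkeeping just described (the station readings are invented):

```python
# Invented consecutive days of (tmin, tmax) for one station, deg C.
days = [(10.0, 22.0), (12.0, 25.0), (11.5, 24.0), (9.0, 21.0)]

daily_diff = []
for today, tomorrow in zip(days, days[1:]):
    rising  = today[1] - today[0]      # today's tmax minus today's tmin
    falling = today[1] - tomorrow[0]   # today's tmax minus tomorrow's tmin
    daily_diff.append(rising - falling)

print(daily_diff)
# Note rising - falling reduces to (tomorrow's tmin - today's tmin), i.e. it
# is positive when the following night stays warmer; averaging it over many
# stations in a region is the step described as removing the weather.
```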

April 16, 2013 3:05 pm

The people commenting on enthalpy have the core problem identified. I asked Dr. Mueller about this and his response was less than comforting: some hand waving and “trust me”. Ha.
Ignoring enthalpy causes no end of idiocy. You cannot “average” temperatures across time much less geography without going through enthalpy. If you add in the error caused by not using enthalpy you get a bubble of space to draw infinite flat lines through.
Beware the BEST data download. Most of the 4.5GB is the error file, most of which is 0s.

tobias smit
April 16, 2013 9:49 pm

I have a Stevenson screen at home ( beside an orchard and no other “heat or cold” sources)
Can some one help me here , we have been taking measurements and recording them for about 20 years. I understand that the merc thermometer temperature gives me the high reading but that instrument cannot give me a true low reading in a 24 hr period , we have as well the alcohol (min-max) thermometer that gives the true low reading in said 24 hr period. so now I have 2 instruments. And then there is the the time of day each reading takes place. If I take a reading at 5 am is that the coldest reading? No in my opinion ( 7 am might be colder) and if I read the merc at 3 pm is that the hottest reading or is it 2 hrs later?? Or do they then self correct the next day (after resetting them) if I take the readings at even more different time slots? I think the errors would compound and at least are questionable… HELP ! ( BTW we do try to take measurements as close as possible in the AM and PM), thanks

April 17, 2013 8:33 am

A few very brief comments from the author of this report and the paper. I have sent Anthony the second part of the report, which deals with the main part of the analysis of the Armagh dataset, using clustering and kNN algorithms to quantify differences between the 730-bit annual fingerprints.

Another point to make is that some readers have either misread or misinterpreted the importance of the number and magnitude of switch-overs when comparing any two years. I started by comparing 2004 vs 1844, but went on to say that I have written a special program to compare every year with every other year in the dataset, which means doing 161×160 comparisons, and which clearly shows that the switch-over pattern is the norm. Let me give you some numbers: 2004 is on 399 occasions warmer and on 250 occasions colder than 1844, with a maximum difference in one direction of 10.9C and of 8.8C in the other. At Waterloo, 2009 is on 239 occasions warmer and 375 occasions colder than 1998, with a total range of switch-over of 38.5C, while at Melbourne, 2009 was 219 times warmer and 146 times colder than 1998, with a total range of 43.2C (21.7C in one direction and 21.5C in the other). That to me means two things: the switch-over is happening every few days, and coupled with the sheer magnitude of those switch-overs it is impossible to declare one year either warmer or colder, if you look at the thermometer data and are not playing some silly numbers game. Since the same patterns have been observed on two different continents, using any scientifically based logic one would have to come to the conclusion that those patterns should be found at all other weather stations that record temperatures on a daily basis.

Following the standard practices in experimental sciences, I have asked readers to be skeptical and to prove me wrong, not by expressing their opinions on global warming or annual temperatures, but by actually looking into the thermometer data. I even offered an award for the first person who proves me wrong. And can I emphasize again: it is not me who is claiming that there is unequivocal global warming; that is the official line of the man-made global warming community. All I am saying is that I cannot find either warming or cooling in the thermometer data, and that nobody has bothered to look into the thermometer data before me and report that work. I hope that things will become much clearer once Anthony publishes my Part 2 report. Darko Butina
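For anyone who wants to follow the switch-over counting, here is a minimal sketch in Python (not the actual program) of the comparison for one pair of years; the short arrays are invented stand-ins for the 730-value annual fingerprints, not the actual Armagh data.

```python
# Invented stand-ins for two annual fingerprints; a real run would use the
# 730 values (365 Tmax + 365 Tmin) per year from the station file.
year_1844 = [4.0, 5.5, 3.0, 6.0, 7.5, 2.0]
year_2004 = [5.0, 4.0, 6.5, 6.0, 9.0, 1.0]

diffs = [b - a for a, b in zip(year_1844, year_2004)]
warmer   = sum(d > 0 for d in diffs)   # readings where 2004 was warmer
colder   = sum(d < 0 for d in diffs)   # readings where 2004 was colder
max_up   = max(diffs)                  # largest swing in one direction
max_down = min(diffs)                  # largest swing in the other

print(warmer, colder, max_up, max_down)
# Repeating this for every ordered pair of years (161 x 160 comparisons for
# Armagh) yields the switch-over statistics quoted above.
```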

April 17, 2013 3:18 pm

Dr. Butina, it was not clear from the original article that “unequivocal” was a claim made by the IPCC, nor that your exceedingly stringent criterion is the requirement for “unequivocal”. If your Part 1 had made the linkage between “unequivocal” and the IPCC, it would have strengthened the point.
I’m still not sure that Ta(i)>Tb(i) for i=1 to 730 for year a and year b is a necessary condition for “unequivocal” i.e. “leaving no doubt.” No doubt it would be a sufficient condition for unequivocal, but I do not see it as a statistically necessary one.

Chuck Nolan
April 17, 2013 6:35 pm

“So even before I started to collect daily data that are available in public domain, I was almost 100% confident that I will not find any alarming trends in thermometer data. And I was proven right.”
————————————
I thought one was not supposed to start investigations with a bias.
Another “gut feeling” in science.
wtf over?

Chuck Nolan
April 17, 2013 6:42 pm

During his computational side of drug discovery
——————————
I don’t ever recall having a computational side during my drug discoveries.
/sarc
cn

Chuck Nolan
April 17, 2013 7:04 pm

Brian Macker says:
April 16, 2013 at 6:39 am
Embarrassing analysis. ………………………..Those calling this brilliant should understand that they are not equipped with the tools to do basic math let alone science.
—————————————————————
Just who do you think the CAGW propaganda is designed to reach?
You? No way.
The layman and his emotions are the target.
Few people would actually sit and think about it and, using their ability to reason, decide it’s a good idea to stop the poor from having electricity and clean water.
I don’t think the statue of liberty will be submerged, do you? And if it was heading there, which would take a long time, I doubt it would be there or we could stop it.
The paper says alarmists are not presenting science.
imo They produce propaganda.
The game is afoot and we lag.
cn

Mike Rossander
April 17, 2013 7:31 pm

As much as I like this article, I cannot agree with the basic premise. To say that one year is warmer than another if and only if “one year has every single daily thermometer readings larger than another year” is absurd. We say that something is warmer when it has more heat energy in it. We can make that statement either at a point in time or over a period of time. The example of point-in-time would be to say that boiling water is warmer (has more heat energy) than ice. This, incidentally, is what the thermometer measures. When expressed over a period, you take the integral of the heat at each point in time. The fact that the thermometer cannot directly measure the integral does not invalidate the physical reality. A block of ice over the period of a day is still colder (has less heat energy) than the same mass of boiling water.
The challenge is calculating the integral while the temperature is changing. But it is only a challenge, not an impossibility. Which system contains more heat energy during the day: a kilogram of water held at 0 C for one hour then heated to 99 C for the next 23 hours, or a kilogram of water held at 1 C for the full 24 hours? By the author’s methodology, those two scenarios cannot even be compared, yet even the most cursory analysis clearly shows that the first scenario has an “unequivocally” greater quantity of heat energy within the defined period.
Now, I will freely admit that evaluating the temperatures of the two scenarios in hourly units provides more useful information than the simplistic daily average. We should never throw away information unnecessarily. But to go so far beyond that as to define “warmer” as requiring every single reading to be warmer flies in the face of both common sense and the meaning of the concept of heat.
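The two scenarios above can be checked with a couple of lines of Python; nothing is assumed beyond the stated temperatures and durations.

```python
# Scenario A: 1 kg of water at 0 C for 1 hour, then 99 C for 23 hours.
# Scenario B: 1 kg of water at 1 C for the full 24 hours.
# The time-weighted mean temperature is the integral divided by the period.
mean_a = (0 * 1 + 99 * 23) / 24   # about 94.9 C
mean_b = (1 * 24) / 24            # 1.0 C

print(f"Scenario A time-weighted mean: {mean_a:.1f} C")
print(f"Scenario B time-weighted mean: {mean_b:.1f} C")
# With the same mass and specific heat, heat content over the period scales
# with these means, so scenario A is warmer over the day even though its
# first hour is colder than every hour of scenario B.
```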

April 18, 2013 7:06 am

milodonharlani says:
April 15, 2013 at 1:00 pm
I wondered also why the author went on at such length re. C. v. F. Conversion is just arithmetic.
—————————-
In theory yes, but go back to the idea of temperature being +/- 0.5 degrees because of the accuracy of reading a thermometer. +/- 0.5degF is only just over half the range of +/- 0.5degC. And if the reading is 78degF (+/- 0.5degF), will you round that to 26degC or insist on 25.5555degC? That’s almost 1degF of difference just in the rounding. So it’s not easy to convert a reading in F into one in C and preserve the *intention* of the reading (i.e. 78degF +/- 0.5degF), because it would not have been 26degC +/- 0.5degC even if the two thermometers had read the same. Assuming the accuracy of the article, it’s likely the human reader would have said it’s 26degC rather than 25.5degC, and certainly not 25.555degC. So we generate differences (of the same order of magnitude as AGW!) just from thinking about the conversion from F to C. And we still have this in US temperature records even now, so it’s not just a historical issue.
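The rounding point can be made concrete with a small Python sketch (the 78 degF reading is the example above; the rest is plain arithmetic):

```python
def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

reading_f = 78.0               # recorded as 78 degF +/- 0.5 degF
exact_c   = f_to_c(reading_f)  # 25.555... degC
rounded_c = round(exact_c)     # 26 degC if stored to whole degrees

print(f"exact: {exact_c:.3f} C   rounded: {rounded_c} C   "
      f"rounding error: {abs(rounded_c - exact_c) * 9 / 5:.2f} F")
# About 0.8 degF appears from the rounding alone, and the +/-0.5 degF reading
# band becomes a +/-0.28 degC band that no longer lines up with whole-degree
# Celsius gradations.
```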
On a wider point, this paper assumes it’s ok to average max and min temps to get a single daily temperature. I’d love to see the graphs recreated looking at all the max and all the min temps to see if there’s more can be learned from that.

Reply to  Peter Ward
April 18, 2013 8:07 am

Peter Ward says:
April 18, 2013 at 7:06 am

On a wider point, this paper assumes it’s ok to average max and min temps to get a single daily temperature. I’d love to see the graphs recreated looking at all the max and all the min temps to see if there’s more can be learned from that.

While I can’t say whether this is okay or not, it is what NCDC does to generate their Mean temp in their Global Summary of Days data set: they take the mean of min and max.

Jim Butts
April 18, 2013 7:33 am

Following this crazy logic, temperature itself is not a valid concept, since it is, for example, the average of all the energies of the molecules of a gas.

Brian Macker
April 18, 2013 2:53 pm

Ferd,
“Rather, it shows that this is not unequivocal, because there are some years in the present that cannot be distinguished from some years 100 years ago. This should not be the case if the result is unequivocal.”
Not true. Random variations (at one spot on the planet) could have cycles longer than a year. If the predominant wind direction were to vary, then warm air off the Atlantic might be reduced during one year. That doesn’t mean the extra heat doesn’t end up somewhere else. This article is just silly on so many levels. Why insist that every day be warmer? Why not insist that the warming is not unequivocal until the minimum temperature recorded in the last year is greater than the maximum temperature of the starting year? Why not insist that to be unequivocal the coldest day of winter of the end year has to be warmer than the hottest day of summer in the starting year? He seems to think that only the axis of the earth affects temperature variation and not wind and a whole host of other factors, because he has attempted to adjust for that but nothing else.

Brian Macker
April 18, 2013 2:57 pm

Chuck Nolan,
“They produce propaganda. The game is afoot and we lag.”
We? WE? What is this “we” business, paleface? I’m not interested in producing propaganda to fit some preconceived result. Apparently you think this article is propaganda, which is a far worse interpretation than I would give, because it implies malice against the truth (aka lying). I just think that he is confused.