A few weeks after my paper came out I received a quite unexpected but greatly appreciated offer from Anthony to write a summary of the paper for his blog site. The paper's title is:
Should We Worry About the Earth's Calculated Warming at 0.7°C Over the Last 100 Years When the Observed Daily Variations Over the Last 161 Years Can Be as High as 24°C?
Guest post by Darko Butina
The paper is unique and novel in its approach to man-made global warming in many respects: it is written by an experimental scientist, it is published in a journal that deals with data analysis and pattern recognition of data generated by a physical instrument, it treats the Earth's atmosphere as a system where everything is local and nothing is global, and it is the first paper that looks for temperature patterns in data generated by the instrument designed for and used by experimental scientists since the early 1700s – the calibrated thermometer. What is also unique is that every single graph and number reported in the paper can be reproduced and validated by the reader, using data that is in the public domain and analysing it with a simple Excel worksheet. There are two main conclusions made in the paper:
1. That global warming does not exist in the thermometer data, since it is impossible to declare one year either warmer or colder than any other year
2. That the Hockey Stick scenario does not exist in the thermometer data and therefore must be an artefact of a purely theoretical space of non-existing annual global temperatures
The paper is 20 pages long and analyses in great detail the daily data from a single weather station, the dataset collected at Armagh Observatory (UK) between 1844 and 2004 – one of the very few datasets in the public domain that have not been destroyed, corrupted or endlessly re-adjusted by the curators of the global thermometer data at the University of East Anglia or NASA.
Before we start to analyse this paper, a few points need to be made about the experimental sciences for my paper to be properly understood. ALL our knowledge and understanding of the physical world around us comes from analysing and learning from data that has been generated by an experiment and measured or recorded by a physical instrument. Let me demonstrate this point with a very simple example of what happens when we record air temperature with some fixed-to-ground thermometer:
A thermometer reading of 15.1 has several links attached to it that cannot be broken: it is linked to a unique grid point, a unique date and time stamp, and a unique instrument – the thermometer – and that thermometer to a unique symbol (°C). So if someone wants to analyse temperature trends, those trends have to come from thermometer readings; it follows that if the thermometer used is calibrated on the Celsius scale, no datapoint can be older than 1743 (follow the link to Anders Celsius). Since we know for a fact that annual temperature ranges depend on the location of the thermometer, and since mixing different datasets is not allowed in the experimental sciences, it follows that if there are, say, 6000 weather stations (or fixed thermometers) in existence across the globe, the first step before raising an alarm would be to analyse and report the temperature patterns for every single weather station. That is what I was expecting to see when I started to look into this man-made global warming hysteria three years ago, following the revelations of the Climategate affair. But I could not find a single published paper that uses thermometer-based data. So we have a situation in which the alarm has been raised, the whole world alarmed and suicidal economic policies adopted, while totally ignoring the data generated by the only instrument that has been invented to measure temperature – the thermometer. Instead, thousands of publications have been written looking for temperature trends in a purely theoretical space that does not and cannot exist: the space of annual global temperatures. Two key papers published earlier argue and explain why a global temperature as a single number does not exist: Essex et al. (2007), using statistical arguments and written by recognized statisticians, and Kramm and Dlugi (2011), who showed from an astrophysics point of view why the Earth's atmosphere cannot be treated as a homogeneous system but should be perceived as a network of local temperature systems.
The starting point for my paper was the fact that it is impossible to have arguments and ambiguity when it comes to a thermometer. If you have two readings, only one of three outcomes is possible: T2>T1, T2<T1 or T2=T1. So if one wants, for some bizarre reason, to compare two annual patterns, then one year can be unequivocally declared warmer only if each daily reading of that year is larger than the corresponding daily reading of the other year:
The artificially created graph above has a real year in Tmax-Tmin space from the Armagh dataset, while the year 'y2100' was the result of adding 15C to each daily reading of y1844. My point here is that everyone seeing that graph would come to an identical conclusion – y2100 is unambiguously warmer than y1844. So my perfectly valid question was – why would anyone go to the trouble of inventing something that does not exist while ignoring the obvious source of temperature data – the thermometer? My 40 years of experience in the experimental sciences offered the most obvious answer to that question – because nothing alarming could be found in thermometer-based data. There is a quite simple rule when it comes to interpretation of data – if more than a single conclusion can be drawn from a given dataset, it means one of two things: either the dataset is of the right kind but more data is needed to understand it, or the data is of the wrong kind. Every single graph and number found in my paper can be independently reproduced and validated; therefore the thermometer data is the right kind of data to use, but we need more of it to fully understand the temperature patterns observed on our planet.
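To make this comparison rule concrete, here is a minimal Python sketch (not the author's own program; the arrays below are placeholders, not Armagh readings) of the 'strictly warmer' test – the same test that the £100 challenge later in this post asks readers to apply to real station data.

```python
# Minimal sketch of the rule above: year A is unambiguously warmer than year B
# only if every daily reading of A exceeds the corresponding reading of B.
# The arrays are placeholders, not real Armagh readings.
import numpy as np

def strictly_warmer(year_a, year_b):
    a = np.asarray(year_a, dtype=float)
    b = np.asarray(year_b, dtype=float)
    return bool(np.all(a > b))

rng = np.random.default_rng(0)
y1844 = rng.uniform(-5.0, 25.0, size=730)   # stand-in for a real annual fingerprint
y2100 = y1844 + 15.0                        # the artificial year from the graph above

print(strictly_warmer(y2100, y1844))        # True: every single day is 15C warmer
# For any two real years, the switchovers discussed below make this test fail.
```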
The opposite is true when we look at the calculated, not measured, data called annual global temperatures. Nobody knows where the data comes from; since the data is calculated, the only way to validate it is to use another set of calculations; it has been constantly adjusted and modified, and different trends have been generated on a daily basis using ever-changing arguments. When you dissect this very complex-looking scientific problem of man-made global warming into its basic components, what you find is that the whole concept of global warming and climate change has nothing to do with science but everything to do with a very desperate attempt to connect temperatures with the few molecules of CO2 that have been generated by burning fossil fuels, while ignoring the vast majority of CO2 molecules generated by nature itself. It must follow that if those alarming trends could not be found in thermometer data, then that data had to be removed and new data created – a type of data that can be proved neither wrong nor right, allowing proponents of man-made global warming to generate any trend they need and enabling them to claim that they know everything about everything when it comes to our planet. The only problem with that approach is that you cannot cheat in the experimental sciences, and slowly but steadily retired scientists like me, with a bit of free time, will start to look into this problem and use their respective expertise to critically evaluate the supposed science behind this man-made movement.
So even before I started to collect the daily data available in the public domain, I was almost 100% confident that I would not find any alarming trends in the thermometer data. And I was proven right.
Let us now start with the experimental part of the paper, the part where all the details of the dataset, and the dataset itself, are presented. All conclusions are based on a detailed analysis of the Armagh (UK) dataset covering the period between 1844 and 2004. The dataset can be downloaded from the Armagh Observatory website as two sets of files, Tmax and Tmin:
http://climate.arm.ac.uk/calibrated/airtemp/tccmax1844-2004
http://climate.arm.ac.uk/calibrated/airtemp/tccmin1844-2004
Whatever software one wants to use to analyse the data, it is important to format all datasets in the same way. Since most commercial software expects by default that data is read row-wise, the reformatted Armagh dataset was created as a matrix containing 161 rows (1 row for each year) and 730 columns (1 column for each day or night reading):
BTW, all the graphs and tables from my paper are presented as JPG images, and once I make my paper available free of charge on my own website you will be able to match all the graphs presented in this report to the originals in the paper.
As a result, we now have an annual temperature pattern – let us call it an 'annual fingerprint' – as a 730-bit fingerprint with the first 365 bits assigned to Tmax 1 to Tmax 365 (day-time readings for days 1 to 365, Jan 1 to Dec 31) followed by 365 bits assigned to Tmin 1 to Tmin 365 (the corresponding night-time readings). So the annual fingerprint space can be seen as a 161 (years) x 730 (daily readings) matrix. Looking at the table above column-wise, we have 'day fingerprints', each of them 161 bits long, representing the history of each day-night reading over the period of 161 years. Once this table is created, we need to decide what to do with the missing values and with the extra day in February in leap years. We delete that extra day in February, but with great care not to get the rest of the year out of sync. There are two options when dealing with a missing datapoint – either replace it with a calculated one or remove the whole column.
The danger of replacing a missing value with a calculated one is that we contaminate instrumental data with theoretical data, and unless we really understand that data the safest way is to remove every column that contains even a single missing datapoint. Once you remove all columns with missing data you end up with 649-bit annual fingerprints – 89% of the original data, i.e. a loss of 11% of the total information content of the dataset – but with the knowledge that the starting set is not contaminated by any calculated data and that every datapoint was generated by the thermometer itself.
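As a rough illustration of this preparation step (a sketch only: the exact parsing of the Armagh Tmax and Tmin files is not shown, and the arrays below are placeholders), the matrix construction and the removal of incomplete columns might look like this in Python:

```python
# A minimal sketch, assuming the Tmax and Tmin files have already been parsed
# into two (161, 365) arrays, one row per year, with Feb 29 already dropped.
import numpy as np

rng = np.random.default_rng(0)
tmax = rng.uniform(5.0, 20.0, size=(161, 365))    # placeholder for tccmax1844-2004
tmin = rng.uniform(-5.0, 10.0, size=(161, 365))   # placeholder for tccmin1844-2004
tmax[3, 10] = np.nan                               # pretend one reading is missing

# 730-column annual fingerprints: Tmax days 1..365 followed by Tmin days 1..365
fingerprints = np.hstack([tmax, tmin])             # shape (161, 730)

# Remove every column that contains even a single missing value, so that only
# genuine thermometer readings remain (no calculated infill).
complete = ~np.isnan(fingerprints).any(axis=0)
clean = fingerprints[:, complete]                  # 649 columns for the real Armagh data
print(clean.shape)
```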
Now we have our table in Excel, a table containing 161 years of data where each year is a collection of 649 day-night readings, and we can ask the data the 64-million-dollar question: can we detect an unambiguous warming trend over 161 years at Armagh (UK) in the thermometer data? All we need to do is take the difference between the youngest (2004) and the oldest (1844) annual fingerprints and display it as a histogram:
Let me briefly digress here to make the following point – when you analyse instrumental data you have to know the accuracy, or error levels, of the instrument used to generate the data. If we assume the accuracy of a thermometer used in the 1800s to be +/-0.5C, then for two readings to be declared different, the difference between them should be larger than 1.0C. For example, if T2=10.0 and T1=10.8 we have to declare those two readings the same, i.e. T2=T1, since they fall within the error levels of the instrument. If T2=10.0 and T1=20.0 then the difference is real, since it is way outside the error levels of the instrument.
So, what is this simple graph (Figure 5) telling us? The first thing to notice is that the year 2004 cannot be declared either warmer or colder than 1844, since every few days a switchover occurs, making 2004 a few days warmer, then a few days colder, than 1844. The second thing to notice is that the size of those switchovers can be as large as 10C in one direction and 8C in the other, i.e. 18C in total – way above the error levels of the thermometer, so the switchovers are real. To make sure that these switchover patterns are not artefacts unique to those two years, I wrote a special program (in C) to systematically compare every year to every other year (161 * 160 = 25,760 comparisons); on average, each year is 50% of the time warmer and 50% of the time colder than any other year in the Armagh dataset. What makes things even more complex is that there is no obvious pattern to when the switchovers occur or to their magnitude, as can be seen when two different year pairs are plotted on the same graph:
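The author's program was written in C; the following Python sketch is only a re-creation of the idea (the demo matrix is a placeholder), counting, for every ordered pair of years, on how many days one year is warmer, colder, or the same within the assumed +/-0.5C instrument error.

```python
# Re-creation of the all-pairs comparison idea (not the author's C code):
# differences of 1.0C or less are treated as "the same" given +/-0.5C error.
import numpy as np
from itertools import permutations

def switchover_stats(clean, error=0.5):
    """clean: (n_years, n_days) matrix with no missing values."""
    results = []
    for i, j in permutations(range(clean.shape[0]), 2):
        diff = clean[i] - clean[j]
        warmer = int(np.sum(diff > 2 * error))    # year i warmer, beyond error
        colder = int(np.sum(diff < -2 * error))   # year i colder, beyond error
        same = diff.size - warmer - colder
        results.append((i, j, warmer, colder, same))
    return results

# Placeholder demo; with the Armagh matrix this loops over 161*160 = 25,760
# ordered pairs, and on average each year comes out roughly half warmer and
# half colder than any other.
demo = np.random.default_rng(1).uniform(-5.0, 25.0, size=(5, 649))
print(switchover_stats(demo)[0])
```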
So far, all I have done is plot the original data, without any adjustment and without making any prior assumptions. I did not start this exercise to prove or disprove the existence of global warming, but to see what the actual data is telling us. And what the thermometer is telling us is that the sheer magnitude of those apparently random and chaotic switchovers is due to natural forces that we do not yet understand, and that the anti-scientific process in which all the complexity of annual temperature patterns is removed and replaced by a single number – and suddenly we 'see the light' – cannot be used to acquire any knowledge. Using simple logic, the following construct can be made: a dataset based on thermometer readings contains 100% of the information content when it comes to temperatures. If we reduce that 730-dimensional space to a single number, we reduce the information content of the dataset from 100% to 0% – there is no information left from which to gain any knowledge. Let us do the following question-and-answer exercise to compare two datasets – one that has day-night thermometer readings for a single year and one where one number represents one year.
Q. What is total range of temperatures observed in Armagh?
A. The lowest temperature observed was -15.1C on February 7, 1895, and the highest temperature, +30.3C, was recorded on July 10, 1895; total range 45.4C.
Q. What is the largest and the smallest natural fluctuations observed for individual days?
A. The day with the most variability is May 4 (Tmax125), with a total observed range of 23.8C, while the day with the least variability is October 29 (Tmax302), with an observed range of 'only' 9.9C.
In contrast, each year is presented in the annual temperature space as a single number, obtained by averaging all the daily data, and there are not many questions you can ask about that single number. Actually there are none – not a single question can be asked of a number that has no physical meaning! For example, if two years have identical annual averages we don't know why they are the same, and if they have different annual averages we don't know why they are different. If we do the same exercise in the daily data, we know exactly which days are moving in the same direction and which days are moving in opposite directions.
Let us now ask the most obvious question – are those patterns, or rather that lack of patterns, observed at Armagh unique to the UK, i.e. are they local, or do they reflect some global pattern? Scientific logic would suggest that the same random, chaotic switchover patterns observed at Armagh should be observed across the globe, the only difference being the size and magnitude of the switchovers, i.e. local variations. To test that, I took two annual temperature samples from weather stations on two different continents, one in Canada and one in Australia:
Waterloo (Canada):
Melbourne (Australia):
Please note the difference in both the patterns and the magnitude of the switchovers.
Let me make a very clear statement here – the choice of the Waterloo and Melbourne weather stations was driven by the ease of finding stations with relatively easy-to-download formats; I did not engage in cherry-picking weather stations that fit the patterns found at Armagh, as is normal practice in the man-made sciences. To prove that last point, and to challenge readers to start looking into the real measured data and stop looking at non-existing, calculated data like annual global temperatures, I will offer a modest financial reward of £100.00 (UK), from my pension, to the first person who finds a single example of a year pair where one year has every single daily thermometer reading larger than the corresponding reading of another year. Any weather station that is not on permanent ice or sand qualifies (I don't know what to expect in those cases), with any gap between the two years. Obviously, the winner will have to give the link to the original data and contact either Anthony or myself at darkobutina@l4patterns.com to claim the award.
The way I see it, I am in a win-win situation here. If nobody can find a weather station that shows an unambiguous warming trend, and if we keep a record of all the analysed weather stations, I save the money but gain a large amount of additional information that should finally kill any notion of the man-made global warming hypothesis, since the proponents of that hypothesis would have to explain to the general public the patterns observed in the thermometer data. In strictly scientific terms, and using the null hypothesis that either all weather stations count or none does, I have already shown that those patterns are real and observed on three different continents, and therefore that a global warming trend does NOT exist in the thermometer data. On the other hand, if someone does find a clear and unambiguous warming trend in thermometer data, that work will again make the same point – all temperature patterns are local, and the ONLY way to declare a trend global is if ALL individual weather stations show the same trend.
This concludes the Part One report, in which I explained how the first conclusion of my paper – "that global warming does not exist in the thermometer data, since it is impossible to declare one year either warmer or colder than any other year" – was reached.
The second conclusion of my paper, which explains why the Hockey Stick scenario does not exist in thermometer data, will be covered in a separate report. In the Part Two report I will introduce two different pieces of software – my own clustering algorithm and the k Nearest Neighbours algorithm, or kNN, both used in fields like pattern recognition, datamining and machine learning – and apply them to the annual temperature patterns observed at Armagh. The overall conclusions will obviously be the same as those reached so far, but I will demonstrate how the observed differences between annual patterns can be quantified and how we can use those computational tools to detect 'extreme' or unusual annual temperature patterns, like the annual pattern of 1947, which is the most unusual not only at Armagh but also in the rest of the UK.
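As a taste of the kind of calculation Part Two will involve (this is only an illustrative sketch, not the author's dbclus or kNN software, and the data is a placeholder), one can rank years by the distance to their nearest neighbours in fingerprint space; an unusual year such as 1947 would be expected to have no close neighbours.

```python
# Illustrative kNN-style outlier ranking of annual fingerprints (not the
# author's code): a year with a large mean distance to its k nearest
# neighbours has an unusual annual pattern.
import numpy as np

def knn_outlier_scores(clean, k=3):
    # pairwise Euclidean distances between annual fingerprints
    d = np.sqrt(((clean[:, None, :] - clean[None, :, :]) ** 2).sum(axis=2))
    np.fill_diagonal(d, np.inf)              # ignore self-distances
    nearest = np.sort(d, axis=1)[:, :k]      # k smallest distances for each year
    return nearest.mean(axis=1)              # high score = unusual year

demo = np.random.default_rng(2).uniform(-5.0, 25.0, size=(10, 649))  # placeholder years
scores = knn_outlier_scores(demo)
print(np.argsort(scores)[::-1][:3])          # indices of the most unusual years
```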
==================================================================
Dr Darko Butina is a retired scientist with 20 years of experience on the experimental side of carbon-based chemistry and 20 years in pattern recognition and datamining of experimental data. He was part of the team that designed the first effective drug for the treatment of migraine, for which the UK-based company received The Queen's Award. Twenty years on, the drug molecule Sumatriptan has improved the quality of life for millions of migraine sufferers worldwide. During the computational side of his drug discovery career he developed the clustering algorithm dbclus, now a de facto standard for quantifying diversity in the world of molecular structures and recently applied to the thermometer-based archived data at weather stations in the UK, Canada and Australia. The forthcoming paper clearly shows what is so very wrong with the use of invented and non-existing global temperatures and why it is impossible to declare one year either warmer or colder than any other year. He is also one of the co-authors of a paper awarded the prestigious Ebert Prize as best paper of 2002 by the American Pharmaceutical Association. He is a peer reviewer for several international journals dealing with the modelling of experimental data and a member of an EU grants committee in Brussels.
@jorge
Yes, I agree completely, the whole method seems fraught with massive uncertainty.
We have warm years, we have cooler years. It's called NATURAL climate variability.
I was just looking at the graphs from a mathematical point of view. Please don't think, for one instant, that I believe they show anything worth bothering about! 🙂
Mr. Mosher and Svalgaard: your points are well taken. Out of curiosity, though, if you were to take what you feel are the best observations available and simply present the data without averaging, grinding, etc., what do the results look like?
vukcevic says:
April 15, 2013 at 3:12 pm
Northern Ireland is generally milder than central England. I suspect the temperature taken at Armagh Observatory is heavily influenced by the granite rock found all over N Ireland and by Lough Neagh, the largest lake in Ireland and the UK. As for the other point you made, about Belfast being 30 miles away from Armagh: it is, but it's on the other side of the Lough.
You refer to the difference I have termed "Procedural Certainty" vs "Representational Certainty". The first is about the math: how it was done, and what the statistical certainty is of similar results coming from similar data management under certain randomness assumptions about the data. The second is about how well the result of all these procedural machinations correlates to what you are trying to determine – here, the temperature record that would have been created using the equipment that we are using to determine the target parameter.
I have long argued that the global warming numbers could largely be an artefact of collection and adjustment, a statistical result of combining regional differences to produce an artificial global image that at present gives a “warming” profile but could easily have produced a “cooling” profile (and may yet do so) without an actual change in the total energy content of the system.
On 50% of the planet, the TOA TSI changes from 729.4 W/m2 on January 4th to 636.6 W/m2 simply as a result of orbital eccentricity. (The 341.5 W/m2 number we are always told about is the full-planet, full-year average. But half the world is in darkness at all times, and the orbital distance of the Earth from the Sun changes by 3.3%, meaning that the SI changes by 6.8% through the year.)
The TOA SI on the sunlit side therefore varies by 92.8 W/m2 from winter to summer, and yet the planet stays within 2C all year, and THAT is in the Northern Hemisphere that receives proportionately the lesser amount of SI.
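A quick check of the inverse-square arithmetic behind the 3.3% and 6.8% figures quoted above (assuming an orbital eccentricity of about 0.0167, which is not stated in the comment):

```python
# Inverse-square check: how much the Sun-Earth distance and the irradiance
# change between perihelion and aphelion, assuming eccentricity e = 0.0167.
e = 0.0167
distance_change = (1 + e) / (1 - e) - 1        # aphelion vs perihelion distance
flux_change = ((1 + e) / (1 - e)) ** 2 - 1     # irradiance scales as 1/r^2
print(f"distance varies by {distance_change:.1%}")   # ~3.4%, cf. the 3.3% quoted
print(f"irradiance varies by {flux_change:.1%}")     # ~6.9%, cf. the 6.8% quoted
```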
The Earth is a giant energy redistribution system in which those like Trenberth claim they can determine a “missing” 0.58 W/m2 whole Earth power (1.16 W/m2) because they can measure exactly how this energy is distributed – despite an obvious 92.8 W/m2 that is being whipped around without our understanding. Others, like Hansen, claim that the heat redistribution system is in constant regional balance to less than that amount (for error considerations): 0.29 W/m2 whole Earth, 0.58 W/m2. If the energy redistribution is not consistent as to WHERE it goes to that level, natural thermal conductivities and capacities are going to change the resultant temperatures. Which then changes the “global” temperature.
The math employed is Procedurally sound and the outcome certain – as a mathematical entity. The math does not necessarily (and I would say does not actually) present us with a Representationally sound or certain result with respect to what we are trying to understand, i.e. whether the Earth is "heating up" as a result of anthropogenically introduced CO2.
I don't agree with this idea that you cannot use an "average" … I would bet anything you like that if we thermally isolated a massive block of concrete in the ground with a Stevenson screen over it, the temperature of the block would be an average of the preceding temperatures.
Obviously we could spend a lot of money creating such a measurement device which (largely) removes the short term intra-yearly variation … or we could just use the measured temperature and calculate it.
The real problem with averaging is not the averaging technique itself, but the statistics of non-homogenised noise. The problem is that most scientists are usually clueless about real-world noise. And … for the select few of us who are electronics engineers … we can laugh at this … they have this daft idea that averaging gets rid of noise!!!
Yes!! Scientists seriously naive enough to think that averaging gets rid of noise!! They haven’t a clue that most real world situations contain noise that cannot be averaged out.
In any real-world situation (where the signal is changing), averaging does not increase the signal-to-noise ratio but DECREASES it!!
This is because real-world signals have a finite period … whereas 1/f noise has an infinite period, and so, if you average 1/f noise, eventually all you are left with is the noise!!
To use a simple example … if your signal is only present for the last 40 years, you have a finite amount of signal in the frequency range down to about 8E-10 Hz, rising to a finite maximum at 0 Hz. In contrast, 1/f noise has a finite level at 8E-10 Hz (i.e. when you take a 40-year sample), but that noise increases if you take an 80-year sample, increases again for a 160-year sample, and keeps on increasing, NOT DECREASING, the longer you average, approaching an infinite level of noise as the frequency range approaches 0 Hz (an infinite sample).
So, the lower you make your frequency cut-off (i.e. the longer the period over which you sample), the higher the noise-to-signal ratio.
Usually scientists are protected from their ignorance by the simple fact that they take samples over short periods where 1/f noise is insignificant. So their ignorance doesn’t usually affect their results and they happily average their signals in the naive belief that all noise is removed by averaging.
But when you get these ignoramuses who have no idea of real-world noise, and start giving them real-world signals, not of seconds, hours or even days – to which their simplistic lab-based noise concepts are attuned – but signals covering spans of months, years, decades and … heaven forbid … centuries or millennia, these simpletons move outwith their competence in understanding noise and (as we see from the global warming debacle) make a complete hash of understanding what is going on, confusing natural 1/f noise with real signal.
To use a simple example … imagine a scientist trying to determine how climate affects river flow. Their concept of natural variation is that of instrumentation noise, which is easily reduced by averaging. So … they measure river flow and temperature. First they do it over one year: no change. Then over two: a small change. Eventually, after waiting 40 years, they report that the river flow has definitely changed (as well as the temperature), and then conclude (whatever the temperature change) that it caused the river flow to change.
What is wrong?
The problem is that river flow is constantly changing. Worse, the change seen in any year is smaller than one expects in any decade, which is much smaller than one expects in any century, which is much, much smaller than one expects in any millennium.
The problem is that the apparently "constant" nature of the river hides the reality that the river bed is not a static channel. Each season new silt remoulds the bed. Each year the banks erode, so that the rate and type of flow changes from year to year, and, rather than tending toward some "normal" flow, the river bed itself meanders across the landscape, so that the flow behaves more like a random walk than a constant, unchanging entity.
On even longer timescales, even the mountains which give the static head cannot be considered static, as they too erode (what do you imagine creates the silt but the rocks rubbing together … and where do the rocks come from but … the mountain). So the river itself is constantly changing, the dynamics are constantly changing, AND UNLIKE THE NOISE SCIENTISTS ARE USED TO, REAL-WORLD CHANGES CAN KEEP CHANGING IN ONE DIRECTION, so that the longer one waits … the more it changes.
So real-world signals are full of natural variation in the form of a multitude of underlying trends, and that variation, far from being "averaged out", actually becomes more dominant the longer you average.
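A small simulation illustrating this point (it uses a random walk as a simple stand-in for non-stationary "real-world" noise; a random walk has a 1/f^2 spectrum rather than 1/f, but it shows the same qualitative behaviour):

```python
# Averaging shrinks instrument-like (white) noise but not drifting,
# river-bed-like noise: the spread of the long-term average keeps growing.
import numpy as np

rng = np.random.default_rng(0)

def spread_of_averages(make_noise, length, trials=2000):
    """Standard deviation of the mean of `length` samples, over many trials."""
    return np.std([make_noise(length).mean() for _ in range(trials)])

white = lambda n: rng.normal(size=n)             # instrument-style noise
walk = lambda n: np.cumsum(rng.normal(size=n))   # slowly drifting "natural" noise

for n in (40, 160, 640):
    print(n, round(spread_of_averages(white, n), 3), round(spread_of_averages(walk, n), 1))
# The white-noise column shrinks as the record lengthens; the random-walk
# column grows, so averaging longer does not converge on a "true" value.
```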
Unfortunately, Armagh Observatory was heavily affected by coal and peat burning in Armagh Town. This is from the 19th century:
In his Foreword to the Armagh Catalogue, Robinson had some hard things to say about the general siting. The prevailing west to south-east winds, he complained, were apt to "drive smoke from thousands of chimneys from the town over the Observatory, and interfere, by heated air, for nine months out of twelve".
A state of affairs that persisted until the 1980s. Minimum temperatures usually occur after dawn, when incoming solar radiation exceeds OLWR. Smoke reduces early morning incoming solar radiation and hence decreases minimum temperatures.
Armagh minimum and average temperatures are probably most useful as a proxy for coal and peat consumption in Armagh Town.
This is an interesting approach. I too have battled in my field for measurements before/over hypotheses. Thank you, Mr. Butina. I look forward to updates, whatever these might show.
Data analyses of this sort are not likely to convince anyone of anything. In fact they will do harm, because this appears to be a very strained attempt to prove that no warming at all is occurring. To say that one year cannot be called warmer than another unless all corresponding days are each warmer is just ridiculous.
We who are deniers should be more clear about what we are denying. When I am asked if I think there has been warming I say maybe, perhaps about 0.7 Kelvin (about 0.3%) over 150 years. When the follow up question about humans causing the warming comes I say maybe, since CO2 is increasing in the atmosphere by amounts consistent with human use of fossil fuels and CO2 is a greenhouse gas which will cause warming if nothing else changes (like cloud cover for example) but the effect is rather small, a doubling of CO2 would lead to about a 1 deg K temperature increase (about 0.3%). If asked what we should do about it, I say nothing because first, a temperature increase of 1 degree over the next 100 years is nothing to get alarmed about and moreover global warming and increased CO2 in the atmosphere are both good things enhancing both plant and animal life on the planet. When they then launch into the “but what about all the extreme weather” questions I tend to give up and respond that that is all BS.
Forgot to include the Armagh temperature graph.
http://junksciencearchive.com/MSU_Temps/Armagh_an.html
Note the decreased minimum temperatures as affluence increased post 1960 (= increased fuel consumption) and the abrupt increase when the 1981 N Ireland clean air act was implemented.
Could somebody please create and publish the usual graph of "global temperature" for the last 130+ years, plotted on a scale of 0 to 30 degrees C, and add to it about a dozen graphs of a typical day's temperature at various locations around the world where lots of people actually live (i.e. not the poles)?
It should look like almost a horizontal line with a bunch of wildly swinging curves between 0 and 30 degrees.
It always helps to put things in perspective.
I’ve placed Dr Mann’s frightening 1998 hockey stick on a chart of the Galva, IL mean annual max and mean annual min temperatures, all at the same scale.
http://www.rockyhigh66.org/stuff/hockey_stick_d_galva.gif
Philip Bradley says:
“Armagh minimum and average temperatures are probably most useful as a proxy for coal and peat consumption in Armagh Town.”
Armagh: Average November Sunspot Number and February Minimum Temperature 1875-2012
http://thetempestspark.files.wordpress.com/2013/02/nov-ssn-v-feb-tmin-1875-20121.gif
Darko, I like your article very much, it is a heavy blow on the empty head of the warmism monster. No wonder some people have started obfuscating the matter by introducing fallacious comparisons to bank accounts and talking about radiation and CO2.
I guess, before the problem of comparisons between years, there is another one: comparisons between days. Averaging the Tmin and Tmax of a day and then comparing the averages seems equally nonsensical to me. Maybe you could give it another thought.
I think the problem with averages is that we have to be careful with what we are actually averaging. An average temperature for a particular day, or even simply recording Tmin and Tmax might not take into account the actual temperature changes throughout the day. An average is saying that the average energy in the atmosphere across the whole day was some particular number, but that only works if the temperature changed uniformly across the whole day.
You don’t need to look at many days to know that this does not happen. Suppose that it gets to 35C at 1pm, you might suddenly get a thunderstorm and it drops back to 20C by 2pm until the night. In that case, the 35C maximum doesn’t really represent anything meaningful. On the other hand, it might start at 18C at 6am and very quickly (by 10:30am) get to 30, and stay there all day. Now which day would you say has been hotter?
This study suffers from the exact same problem. However I think that it suffers from another. The author states that you need every day’s temperature to be higher to consider one year hotter than another. I disagree with that. I say that the total energy that was present in the atmosphere needs to be higher to consider one year hotter than another. But how do we do that? We have to integrate every single temperature measurement throughout the whole day to get the actual energy content of that day. I doubt that there are very many weather stations with detailed enough records to be able to do that.
In conclusion, I don’t think that we have detailed enough records to be able to say many meaningful things about average temperatures of any particular year. Of course maximums and minimums still mean something; clearly Montreal gets colder than Sydney. But, I don’t know that it says as much as everyone would like us to believe.
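A small illustration of this point (the hourly values below are made up to mimic the thunderstorm example above, not taken from any station):

```python
# The (Tmin+Tmax)/2 "daily mean" and a time-weighted mean can disagree:
# a day that reaches 35C at 1pm and is knocked back to 20C by a storm.
import numpy as np

hours = np.arange(24)
day = np.where(hours <= 13,
               20.0 + 15.0 * np.sin(np.pi * hours / 26),  # warms to ~35C by 1pm
               20.0)                                        # storm holds it at 20C

minmax_mean = (day.min() + day.max()) / 2      # what Tmin/Tmax averaging gives
hourly_mean = day.mean()                       # crude time-weighted mean

print(round(minmax_mean, 1), round(hourly_mean, 1))   # 27.5 vs roughly 25
```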
Dear moderators, your spam filter swallowed my comment…
[Reply: I check the spam bucket at least 15 – 20 times every day, and rescue legit comments. Your patience is appreciated. — mod.]
Defining specific climate change with a single attribute, such as temperature, when the attribute is a composite of statistical manipulations is disingenuous. The same is true of the attribution of climate change to a single factor, such as CO2.
This paper is no less extreme or scientifically valid than Mr. Mann’s. One attribute at a single data point is just that: one attribute at one data point. Proxies are just that, proxies.
The science of climate change is not settled because man does not understand the physics and chemistry completely enough, and the extant data sets are neither fine-grained enough nor spatially adequate to support irrefutable conclusions.
Expending mental capital expressing premature opinions delays understanding. And historically has made fools of the authors. But then, the dead have no ego.
Jarryd Beck just beat me to it. Averaging the daily high and the daily low is ludicrous really. You do have to integrate throughout the day to get something meaningful. And even if you could (which is technically but not economically feasible) you are still merely measuring the local variations, with a very spotty grid, plus dishonest scientists with a mission. The whole exercise is nonsense, particularly when expecting robust data to an accuracy of less than a degree or so.
And then for the ocean you would need to integrate over the full depth in millions of locations.
Jarryd Beck says (April 15, 2013 at 6:28 pm): “An average is saying that the average energy in the atmosphere across the whole day was some particular number, but that only works if the temperature changed uniformly across the whole day. …We have to integrate every single temperature measurement throughout the whole day to get the actual energy content of that day.”
=======================================================
This way you cannot learn anything about "energy in the atmosphere" or the "actual energy content of that day", because the air is moving. Every time you measure the air temperature, it is different air. There is such a thing as wind in nature. The temperature can decrease for different reasons: it might be clouds covering the sun, or it might be a cold wind from the north. How can all that be reasonably put together and averaged?
“Climate science” should be closed.
Thanks, Darko.
The Armagh databases are very interesting.
I’ll be waiting for more from you.
k scott denison asks (at 4:14 pm)
… Out of curiosity … if you were to take … the best observations available and simply present the data without averaging, grinding, etc. what do the results look like?
If one does not like averages, one could take sums. "Degree days" are often used in heating-energy calculations and agricultural forecasts. For our present problem we could, for every day-of-the-year for which there is no missing data in any year, sum the degree days (relative to some arbitrary low base temperature, and using either min- or max-temps).
Some would suggest that such yearly totals would approximate our intuitive understanding of how to compare the warmth of different years. By such sleight of hand one could argue that we are not averaging intensive properties such as temperature, but summing extensive properties such as gallons of oil needed to heat a building. Dr. Darko Butina might be pleased; but the resultant comparisons of years would be identical to those obtained by the “dubious” averaging.
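A minimal sketch of that degree-day calculation (the 10C base temperature and the placeholder data are illustrative assumptions, not from the comment):

```python
# Degree days per year: sum of how far each day's reading sits above a base
# temperature. With a base below every reading, this equals n*(mean - base),
# which is why the ranking of years matches the "dubious" average.
import numpy as np

def annual_degree_days(daily_temps, base=10.0):
    t = np.asarray(daily_temps, dtype=float)
    return float(np.maximum(t - base, 0.0).sum())

demo_years = np.random.default_rng(3).uniform(-5.0, 25.0, size=(5, 649))  # placeholder
print([round(annual_degree_days(y), 1) for y in demo_years])
```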
The yearly average temperature of Tokyo is ca. 6 degC higher than that of Boston.
Both are big and flourishing cities, though Tokyo (12 million) is much bigger.
Ergo, warming by 6 degC will never afflict Boston at least in ordinary living.
lsvalgaard says:
April 15, 2013 at 12:36 pm
Nonsense.
==========
That is not an argument. Specify on what grounds so we can judge. Otherwise your statement is abusive and irrelevant.
In his WUWT post Darko Butina said,
I applaud the independent spirit to get as near as possible to the raw data for an analysis on a station limited to just the direct data.
– – – – – – – –
In his WUWT post Darko Butina said,
Analyse and report the temperature patterns at every one of the ~6000 stations . . . . that is a refreshing idea. Do it.
John
lsvalgaard says:
April 15, 2013 at 2:22 pm
Houston, TX is hotter than San Diego, CA, yet every year there are days with temperatures below 25F in Houston, but never in San Diego.
==========
There is nothing in the methodology to suggest this sort of comparison. It is clear the author is subtracting like from like and has eliminated missing data to avoid the complications that would result otherwise.