On the scales of warming worry magnitudes – Part 1

A few weeks after my paper came out, I received a quite unexpected but greatly appreciated offer from Anthony to write a summary of the paper for his blog. The paper's title is:

Should We Worry About the Earth's Calculated Warming at 0.7°C Over the Last 100 Years When the Observed Daily Variations Over the Last 161 Years Can Be as High as 24°C?

Guest post by Darko Butina

The paper is unique and novel in its approach to man-made global warming in many respects: it is written by an experimental scientist, it is published in a journal that deals with data analysis and pattern recognition of data generated by a physical instrument, it treats the Earth's atmosphere as a system where everything is local and nothing is global, and it is the first paper to look for temperature patterns in data generated by the instrument designed to measure temperature and used by experimental scientists since the early 1700s – the calibrated thermometer. What is also unique is that every single graph and number reported in the paper can be reproduced and validated by the reader, using data that is in the public domain and analysing that data with a simple Excel worksheet. There are two main conclusions made in the paper:

1. That global warming does not exist in thermometer data, since it is impossible to declare one year either warmer or colder than any other year

2. That the Hockey Stick scenario does not exist in thermometer data, and therefore it must be an artefact observed in the purely theoretical space of non-existing annual global temperatures

The paper is 20 pages long and analyses in great detail the daily data of a single weather station, the dataset collected at Armagh Observatory (UK) between 1844 and 2004 – one of the very few datasets in the public domain that have not been destroyed, corrupted or endlessly re-adjusted by the curators of the global thermometer data at the University of East Anglia or NASA.

Before we start to analyse this paper, a few points need to be made about the experimental sciences so that my paper can be properly understood. ALL our knowledge and understanding of the physical world around us comes from analysing and learning from data that has been generated by an experiment and measured or recorded by a physical instrument. Let me demonstrate this point with a very simple example of what happens when we record air temperature with a fixed-to-ground thermometer:

[Figure: a thermometer reading of 15.1°C together with its unbreakable links – grid point, date and time stamp, instrument and unit]

A thermometer reading of 15.1 has several links attached to it that cannot be broken: it is linked to a unique grid point, a unique date and time stamp, and a unique instrument – the thermometer – which in turn is linked to a unique symbol (°C). So if someone wants to analyse any temperature trends, those trends have to come from thermometer readings; it follows that if the thermometer used is calibrated on the Celsius scale, no datapoint can be older than 1743 (see Anders Celsius). Since we know for a fact that annual temperature ranges depend on the location of the thermometer, and since mixing different datasets is not allowed in the experimental sciences, it follows that if there are, say, 6,000 weather stations (fixed thermometers) in existence across the globe, the first step before raising an alarm would be to analyse and report the temperature patterns of every single weather station.

That is what I expected to see when I started to look into this man-made global warming hysteria three years ago, following the revelations of the Climategate affair. But I could not find a single published paper that uses thermometer-based data. So we have a situation in which the alarm has been raised, the whole world alarmed and suicidal economic policies adopted, while the data generated by the only instrument invented to measure temperature – the thermometer – is totally ignored. Instead, thousands of publications have been written looking for temperature trends in a purely theoretical space that does not and cannot exist: the space of annual global temperatures. Two key papers published earlier both argue and explain why a global temperature as a single number does not exist: Essex et al. (2007), written by recognized statisticians using statistical arguments, and Kramm and Dlugi (2011), who showed from an astrophysical point of view why the Earth's atmosphere cannot be treated as a homogeneous system but should be perceived as a network of local temperature systems.

The starting point for my paper was the fact that it is impossible to have argument or ambiguity when it comes to the thermometer. If you have two readings, only one outcome is possible: T2>T1, T2<T1 or T2=T1. So if one wants, for some bizarre reason, to compare two annual patterns, then one year can be unequivocally declared warmer only if each daily reading of that year is larger than each corresponding daily reading of the other year:

[Figure: a real year from the Armagh dataset in Tmax-Tmin space plotted against the artificial year 'y2100', created by adding 15C to each daily reading of y1844]

The artificially created graph above shows a real year in Tmax-Tmin space from the Armagh dataset, while the year 'y2100' is the result of adding 15C to each daily reading of y1844. My point here is that everyone seeing that graph would come to an identical conclusion – y2100 is unambiguously warmer than y1844. So my perfectly valid question was: why would anyone go to the trouble of inventing something that does not exist while ignoring the obvious source of temperature data – the thermometer? My 40 years of experience in the experimental sciences offered the most obvious answer to that question – because nothing alarming could be found in thermometer-based data. There is a quite simple rule when it comes to the interpretation of data: if more than a single conclusion can be drawn from a given dataset, it means one of two things – either the dataset is of the right kind but more data is needed to understand it, or the data is of the wrong kind. Every single graph and number found in my paper can be independently reproduced and validated; therefore the thermometer data is the right kind of data to use, but we need more of it to fully understand the temperature patterns observed on our planet.
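This "unambiguously warmer" test is simple enough to express in a few lines of C. The sketch below is an illustration of the rule, not the paper's original program, and the 5-day fingerprints are toy values:

```c
/* Minimal sketch (not the paper's original program): year A is
   unambiguously warmer than year B only if every corresponding
   daily reading of A exceeds that of B. */
#include <stdio.h>

int unambiguously_warmer(const double a[], const double b[], int n)
{
    for (int i = 0; i < n; i++)
        if (a[i] <= b[i])
            return 0;  /* a single switchover breaks the claim */
    return 1;
}

int main(void)
{
    /* toy 5-day fingerprints for illustration only */
    double y1844[] = { 4.2,  5.1,  3.8,  6.0,  5.5};
    double y2100[] = {19.2, 20.1, 18.8, 21.0, 20.5};  /* y1844 + 15C */

    printf("y2100 unambiguously warmer than y1844: %s\n",
           unambiguously_warmer(y2100, y1844, 5) ? "yes" : "no");
    return 0;
}
```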

The opposite is true when we look at the calculated, not measured, data called annual global temperatures. Nobody knows where the data comes from; since the data is calculated, the only way to validate it is with another set of calculations; and it has been constantly adjusted and modified, with different trends generated on a daily basis using ever-changing arguments. When you dissect this very complex-looking scientific problem of man-made global warming into its basic components, what you find is that the whole concept of global warming and climate change has nothing to do with science and everything to do with a very desperate attempt to connect temperatures with the few molecules of CO2 generated by burning fossil fuels, while ignoring the vast majority of CO2 molecules generated by nature itself. It must follow that if those alarming trends could not be found in thermometer data, then that data had to be removed and new data created – a type of data that can be proved neither right nor wrong, allowing the proponents of man-made global warming to generate any trend they need and enabling them to claim that they know everything about everything when it comes to our planet. The only problem with that approach is that you cannot cheat in the experimental sciences: slowly but steadily, retired scientists like me, with a bit of free time, will start to look into the problem and use their respective expertise to critically evaluate the supposed science behind this man-made movement.

So even before I started to collect the daily data available in the public domain, I was almost 100% confident that I would not find any alarming trends in thermometer data. And I was proven right.

Let us now start with the experimental part of the paper, the part where all the details of the dataset, and the dataset itself, are presented. All conclusions are based on a detailed analysis of the Armagh (UK) dataset, which covers the period between 1844 and 2004. The dataset can be downloaded from the Armagh Observatory website as two sets of files, Tmax and Tmin:

http://climate.arm.ac.uk/calibrated/airtemp/tccmax1844-2004

http://climate.arm.ac.uk/calibrated/airtemp/tccmin1844-2004

Depending on the software one wants to use to analyse the data, it is important to format all datasets in the same way. Since commercial software by default expects data to be read in a row-wise manner, the reformatted Armagh dataset was created as a matrix containing 161 rows (one row for each year) and 730 columns (one column for each day- and night-time reading):

[Table: the reformatted Armagh matrix – 161 rows (years 1844-2004) by 730 columns (365 Tmax readings followed by 365 Tmin readings)]
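For readers who prefer code to Excel, one possible layout of that matrix is sketched below in C. The indexing convention (row = year - 1844, Tmax block first), the helper names and the toy values are my assumptions, not the paper's actual code:

```c
/* Sketch of the row-wise layout described above: one row per year,
   365 Tmax columns followed by 365 Tmin columns. */
#include <stdio.h>

#define YEARS 161          /* 1844..2004                  */
#define DAYS  365          /* leap day removed            */
#define COLS  (2 * DAYS)   /* Tmax block, then Tmin block */

static double fp[YEARS][COLS];  /* the annual-fingerprint matrix */

void set_tmax(int year, int day, double t) { fp[year - 1844][day] = t; }
void set_tmin(int year, int day, double t) { fp[year - 1844][DAYS + day] = t; }

int main(void)
{
    set_tmax(1844, 0, 4.2);   /* Jan 1 1844 daytime reading (toy value)    */
    set_tmin(1844, 0, -1.3);  /* Jan 1 1844 night-time reading (toy value) */
    printf("row 0: Tmax1 = %.1f, Tmin1 = %.1f\n", fp[0][0], fp[0][DAYS]);
    return 0;
}
```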

By the way, all the graphs and tables from my paper are presented here as JPG images; once I make my paper available free of charge on my own website, you will be able to match the graphs presented in this report to the original ones in the paper.

As a result, we now have an annual temperature pattern – let us call it an 'annual fingerprint' – as a 730-bit fingerprint, with the first 365 bits assigned to Tmax1 to Tmax365 (the Jan 1 to Dec 31 daytime readings), followed by 365 bits assigned to Tmin1 to Tmin365 (the Jan 1 to Dec 31 night-time readings). The annual fingerprint space can thus be seen as a 161 (years) x 730 (daily readings) matrix. Looking at the table above column-wise, we have 'day fingerprints', each of them 161 bits long, representing the history of each day-night reading over a period of 161 years. Once this table is created, we need to decide what to do with the missing values and with the extra day in February in leap years. We delete that extra day in February, taking great care not to throw the rest of the year out of sync. There are two options when dealing with a missing datapoint: either replace it with some calculated value or remove the whole column.

The danger of replacing a missing value with a calculated one is that we contaminate instrumental data with theoretical data; unless we really understand that data, the safest approach is to remove every column that contains even a single missing datapoint. Once all columns with missing data are removed, you end up with 649-bit annual fingerprints – 89% of the original data, i.e. a loss of 11% of the total information content of the dataset – but with the knowledge that the starting set is not contaminated by any calculated data and that every datapoint was generated by the thermometer itself.
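A minimal sketch of that column-removal step is shown below, assuming missing readings are encoded as NaN; the function name and the toy matrix are mine, not the paper's:

```c
/* Drop every column that contains at least one missing value (NAN),
   shifting the surviving columns left so that all rows stay in sync.
   Returns the number of columns kept (649 of 730 for Armagh). */
#include <math.h>
#include <stdio.h>

int drop_missing_columns(int rows, int cols, double m[rows][cols])
{
    int kept = 0;
    for (int c = 0; c < cols; c++) {
        int complete = 1;
        for (int r = 0; r < rows; r++)
            if (isnan(m[r][c])) { complete = 0; break; }
        if (complete) {
            for (int r = 0; r < rows; r++)
                m[r][kept] = m[r][c];
            kept++;
        }
    }
    return kept;
}

int main(void)
{
    double toy[2][4] = {{1.0, NAN, 3.0, 4.0},
                        {5.0, 6.0, 7.0, 8.0}};
    printf("columns kept: %d\n", drop_missing_columns(2, 4, toy));  /* 3 */
    return 0;
}
```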

Now we have our table in Excel, containing 161 years of data where each year is a collection of 649 day-night readings, and we can ask the data the 64-million-dollar question: can we detect an unambiguous warming trend over 161 years at Armagh (UK) in the thermometer data? All we need to do is take the difference between the youngest (2004) and the oldest (1844) annual fingerprints and display it as a histogram:

[Figure 5: histogram of the daily differences between the 2004 and 1844 annual fingerprints at Armagh]

Let me briefly digress here to make the following point: when you analyse instrumental data, you have to know the accuracy, or error levels, of the instrument used to generate the data. If we assume the accuracy of a thermometer used in the 1800s to be +/-0.5C, then for two readings to be declared different, the difference between them should be larger than 1.0C. For example, if T2=10.0 and T1=10.8, we have to declare those two readings the same, i.e. T2=T1, since they fall within the error levels of the instrument. If T2=10.0 and T1=20.0, then the difference is real, since it is way outside the error levels of the instrument.
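In code, that rule is a one-line test; the sketch below is my illustration of it, not the paper's program:

```c
/* Two readings are declared different only if they differ by more than
   the combined instrument error of 2 x 0.5C = 1.0C. */
#include <math.h>
#include <stdio.h>

int readings_differ(double t1, double t2)
{
    return fabs(t2 - t1) > 1.0;
}

int main(void)
{
    printf("10.0 vs 10.8: %s\n", readings_differ(10.0, 10.8) ? "different" : "same");  /* same      */
    printf("10.0 vs 20.0: %s\n", readings_differ(10.0, 20.0) ? "different" : "same");  /* different */
    return 0;
}
```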

So, what is this simple graph (Figure 5) telling us? The first thing to notice is that the year 2004 can be declared neither warmer nor colder than 1844, since every few days a switchover occurs, making 2004 warmer than 1844 for a few days, then colder for a few days. The second thing to notice is that the size of those switchovers can be as large as 10C in one direction and 8C in the other – 18C in total, way above the error levels of the thermometer, and therefore those switchovers are real. To make sure that the switchover patterns are not some artefact unique to those two years, I wrote a special program (in C) to systematically compare every year to every other year (161 x 160 = 25,760 comparisons); on average, each year is warmer 50% of the time, and colder 50% of the time, than any other year in the Armagh dataset. What makes things even more complex is that there is no obvious pattern in when the switchovers occur or in their magnitude, as can be seen when two different year pairs are plotted on the same graph:

[Figure: daily differences for two different year pairs from the Armagh dataset plotted on the same graph]
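The core of that every-year-against-every-year comparison can be sketched as below. This is my reconstruction of the counting logic, not the author's original C program, and the fingerprints shown are toy values:

```c
/* For one ordered pair of years, count on how many days the first year
   is warmer and on how many it is colder than the second. */
#include <stdio.h>

static void compare_pair(const double *a, const double *b, int days,
                         int *warmer, int *colder)
{
    *warmer = *colder = 0;
    for (int d = 0; d < days; d++) {
        if (a[d] > b[d]) (*warmer)++;
        else if (a[d] < b[d]) (*colder)++;
    }
}

int main(void)
{
    /* toy 5-day fingerprints; the real run compares all 161 x 160
       ordered pairs of 649-day fingerprints */
    double y1844[] = {12.0, 14.5,  9.0, 11.2, 13.3};
    double y2004[] = {13.1, 12.0, 10.5, 10.0, 14.0};
    int w, c;

    compare_pair(y2004, y1844, 5, &w, &c);
    printf("2004 vs 1844: %d days warmer, %d days colder\n", w, c);
    return 0;
}
```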

So far, all I have done is plot the original data, without any adjustment and without making any prior assumptions. I did not start this exercise to either prove or disprove the existence of global warming, but to see what the actual data is telling us. And what the thermometer is telling us is that the sheer magnitude of those apparently random and chaotic switchovers is due to natural forces that we do not yet understand, and that the anti-scientific process in which all the complexity of annual temperature patterns is removed and replaced by a single number, whereupon we suddenly 'see the light', cannot be used to acquire any knowledge. Using simple logic, the following construct can be made: a dataset based on thermometer readings contains 100% of the information content when it comes to temperatures; if we reduce that 730-dimensional space to a single number, we reduce the information content of the dataset from 100% to 0% – there is no information left from which to gain any knowledge. Let us do the following question-and-answer exercise to compare two datasets – one that has the day-night thermometer readings for a single year, and one where one number represents one year:

Q. What is total range of temperatures observed in Armagh?

A. The lowest temperature observed was -15.1C, on February 7 1895; the highest, +30.3C, was recorded on July 10 1895. Total range: 45.4C

Q. What are the largest and smallest natural fluctuations observed for individual days?

A. The day with the most variability is May 4 (Tmax125), with a total observed range of 23.8C, while the day with the least variability is October 29 (Tmax302), with an observed range of 'only' 9.9C
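Both answers fall out of a single pass over the relevant column of the matrix. A sketch follows; the function name and toy data are my own illustration:

```c
/* Observed range (max - min) of one 161-bit "day fingerprint", i.e. one
   column of the year-by-day matrix, over the 161 years of record. */
#include <stdio.h>

#define YEARS 161

double day_range(const double day[YEARS])
{
    double lo = day[0], hi = day[0];
    for (int y = 1; y < YEARS; y++) {
        if (day[y] < lo) lo = day[y];
        if (day[y] > hi) hi = day[y];
    }
    return hi - lo;  /* 23.8C for Tmax125, 9.9C for Tmax302 in the paper */
}

int main(void)
{
    double day[YEARS];
    for (int y = 0; y < YEARS; y++)  /* toy data for illustration */
        day[y] = 10.0 + (y % 7);
    printf("observed range: %.1fC\n", day_range(day));  /* 6.0C */
    return 0;
}
```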

In contrast, in the annual temperature space each year is represented by a single number, obtained by averaging all the daily data, and there are not many questions you can ask about that single number. Actually, there are none – not a single question can be asked of a number that has no physical meaning! For example, if two years have identical annual averages, we don't know why they are the same, and if they have different annual averages, we don't know why they are different. If we do the same exercise with the daily data, we know exactly which days are moving in the same direction and which days are moving in opposite directions.

Let us now ask the most obvious question: are those patterns, or rather that lack of patterns, observed at Armagh unique to the UK – i.e. are they local, or do they reflect some global pattern? Scientific logic would suggest that the same random, chaotic switchover patterns observed at Armagh should be observed across the globe, the only difference being the size and magnitude of the switchovers, i.e. local variations. To test this, I took two annual temperature samples from weather stations on two different continents, one in Canada and one in Australia:

Waterloo (Canada):

[Figure: daily switchover pattern for a year pair at Waterloo (Canada)]

Melbourne (Australia):

[Figure: daily switchover pattern for a year pair at Melbourne (Australia)]

Please note the difference in both the patterns and the magnitudes of the switchovers.

Let me make a very clear statement here: the choice of the Waterloo and Melbourne weather stations was driven by the ease of finding stations with relatively easy-to-download formats; I did not engage in cherry-picking weather stations that fit the patterns found at Armagh, as is normal practice in the man-made sciences. To prove that last point, and to challenge readers to start looking into real measured data and stop looking into non-existing, calculated data like annual global temperatures, I offer a modest financial reward of £100.00 (UK), from my own pension, to the first person who finds a single example of a year pair in which one year has every single daily thermometer reading larger than the other year. Any weather station that is not on permanent ice or sand qualifies (I don't know what to expect in those cases), with any gap between the two years. Obviously, the winner will have to give the link to the original data and contact either Anthony or myself at darkobutina@l4patterns.com to claim the award.

The way I see it, I am in a win-win situation. If nobody can find a weather station that shows an unambiguous warming trend, and if we keep a record of all the analysed weather stations, I keep my money but we gain a large amount of additional information that should finally kill any notion of the man-made global warming hypothesis, since its proponents would have to explain to the general public the patterns observed in thermometer data. In strictly scientific terms, using the null hypothesis that either all weather stations count or none does, I have already shown that those patterns are real and observed on three different continents, and therefore that a global warming trend does NOT exist in thermometer data. On the other hand, if someone does find a clear and unambiguous warming trend in thermometer data, that work will again make the same point – all temperature patterns are local, and the ONLY way to declare a trend global is if ALL individual weather stations show the same trend.

This concludes Part One of this report, in which I have explained how the first conclusion of my paper – that global warming does not exist in thermometer data, since it is impossible to declare one year either warmer or colder than any other year – was reached.

The second conclusion of my paper, which explains why the Hockey Stick scenario does not exist in thermometer data, will be covered in a separate report. In Part Two I will introduce two different pieces of software – my own clustering algorithm and the k-Nearest Neighbours (kNN) algorithm, both used in fields like pattern recognition, datamining and machine learning – and apply them to the annual temperature patterns observed at Armagh. The overall conclusions will obviously be the same as those reached so far, but I will demonstrate how the observed differences between annual patterns can be quantified, and how those computational tools can be used to detect 'extreme' or unusual annual temperature patterns, like the annual pattern of 1947, which is the most unusual not only at Armagh but in the rest of the UK as well.

==================================================================

Dr Darko Butina is a retired scientist with 20 years of experience on the experimental side of carbon-based chemistry and 20 years in pattern recognition and datamining of experimental data. He was part of the team that designed the first effective drug for the treatment of migraine, for which the UK-based company received The Queen's Award. Twenty years on, the drug molecule sumatriptan has improved the quality of life of millions of migraine sufferers worldwide. During the computational side of his drug discovery career he developed the clustering algorithm dbclus, now a de facto standard for quantifying diversity in the world of molecular structures and recently applied to the thermometer-based data archived at weather stations in the UK, Canada and Australia. The forthcoming paper clearly shows what is so very wrong with the use of invented, non-existing global temperatures and why it is impossible to declare one year either warmer or colder than any other year. He is also a co-author of the paper awarded the prestigious Ebert Prize, as best paper of 2002, by the American Pharmaceutical Association. He is a peer reviewer for several international journals dealing with the modelling of experimental data and a member of an EU grants committee in Brussels.

115 Comments
April 16, 2013 10:07 am

I too think homogenizing temperatures is not a valid use of the data; spatial temperatures are not linear, so linearizing them should be a non-starter.
But I've gone about looking at real temperatures in a different fashion. I see the real issue as a hypothesized loss of cooling due to CO2. The Earth cools at night; the question is whether there is a loss of nightly cooling, and whether there is any evidence that it cools less at night now than in the past.
The short answer is no.
I use t-min and t-max, and create a daily rising temp (t-max – t-min), but how much does it cool? For that I take today's t-max and subtract tomorrow's t-min to get a falling temp. Rising temp minus falling temp gives a difference, which is the same as a daily anomaly (global temp chart, Map of stations). This number is useful for analysis: it can be averaged across all the stations in an area to remove the effects of weather, and when the area is restricted to a single hemisphere you can see the progression of temperature as the ratio of day to night changes (60 years of NH Diff * 100).
From daily diff, you can look at the slope of the change of temp each day (Summer to fall, Fall to summer).
And if you plot the change of slope of the change of daily temp as the ratio of day to night changes you get this.
So, while there is no trend or really any significant change in nightly cooling (~18F/day), and no trend in the average annual diff (it's both positive and negative), there is a trend in daily diff as the ratio of day/night changes, one that seems to align with a ~1970 to ~2002 cycle. It's only one half of a cycle, so who knows if it's really a cycle, and since it shows up as a change in daily warming and cooling it almost looks like it comes from an orbital wobble.
But whatever it is, it was recorded in the temperature data, waiting to be found, and it doesn't appear to be related to an increase of CO2.
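A minimal sketch of the rising/falling arithmetic described above (toy values; not the commenter's actual code):

```c
/* rising  = today's t-max - today's t-min      (daytime warming)
   falling = today's t-max - tomorrow's t-min   (overnight cooling)
   diff    = rising - falling                   (the daily "anomaly") */
#include <stdio.h>

int main(void)
{
    /* toy daily series in degF, for illustration only */
    double tmax[] = {60.0, 62.0, 58.0, 61.0};
    double tmin[] = {40.0, 43.0, 39.0, 41.0};
    int n = 4;

    for (int i = 0; i < n - 1; i++) {
        double rising  = tmax[i] - tmin[i];
        double falling = tmax[i] - tmin[i + 1];
        double diff    = rising - falling;
        printf("day %d: rising %.1f, falling %.1f, diff %.1f\n",
               i, rising, falling, diff);
    }
    return 0;
}
```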

April 16, 2013 3:05 pm

The people commenting on enthalpy have the core problem identified. I asked Dr. Mueller about this and his response was less than comforting – some hand-waving and "trust me". Ha.
Ignoring enthalpy causes no end of idiocy. You cannot "average" temperatures across time, much less geography, without going through enthalpy. If you add in the error caused by not using enthalpy, you get a bubble of space through which infinite flat lines can be drawn.
Beware the BEST data download: most of the 4.5GB is the error file, and most of that is 0s…

tobias smit
April 16, 2013 9:49 pm

I have a Stevenson screen at home (beside an orchard, with no other "heat or cold" sources).
Can someone help me here? We have been taking measurements and recording them for about 20 years. I understand that the mercury thermometer gives me the high reading, but that instrument cannot give me a true low reading in a 24-hour period; we also have the alcohol (min-max) thermometer that gives the true low reading in said 24-hour period, so now I have two instruments. And then there is the time of day each reading takes place. If I take a reading at 5 am, is that the coldest reading? Not in my opinion (7 am might be colder); and if I read the mercury at 3 pm, is that the hottest reading, or is it 2 hours later? Or do they self-correct the next day (after resetting them) if I take the readings at even more different time slots? I think the errors would compound and are at least questionable… HELP! (BTW, we do try to take the measurements as close as possible to the same times in the AM and PM.) Thanks

April 17, 2013 8:33 am

A few very brief comments from the author of this report and the paper. I have sent Anthony the second part of the report, which deals with the main part of the analysis of the Armagh dataset, using clustering and kNN algorithms to quantify the differences between 730-bit annual fingerprints.

Another point to make is that some readers have either misread or misinterpreted the importance of the number and magnitude of switchovers when comparing any two years. I started by comparing 2004 vs 1844, but went on to say that I wrote a special program to compare every year with every other year in the dataset, i.e. 161 x 160 comparisons, which clearly shows that the switchover pattern is the norm. Let me give you some numbers: 2004 is on 399 occasions warmer and on 250 occasions colder than 1844, with a maximum difference of 10.9C in one direction and 8.8C in the other; at Waterloo, 2009 is on 239 occasions warmer and 375 occasions colder than 1998, with a total switchover range of 38.5C; while at Melbourne, 2009 was 219 times warmer and 146 times colder than 1998, with a total range of 43.2C (21.7C in one direction and 21.5C in the other). That, to me, means two things: the switchover happens every few days, and coupled with the sheer magnitude of those switchovers it is impossible to declare one year either warmer or colder – if you look at the thermometer data and are not playing some silly numbers game. Since the same patterns have been observed on two other continents, any scientifically based logic leads to the conclusion that those patterns should be found at every other weather station that records temperatures daily.

Following standard practice in the experimental sciences, I have asked readers to be skeptical and to prove me wrong – not by expressing their opinions on global warming or annual temperatures, but by actually looking into thermometer data. I even offered an award for the first person to prove me wrong. And may I emphasize again: it is not me claiming that there is unequivocal global warming – that is the official line of the man-made global warming community. All I am saying is that I cannot find either warming or cooling in thermometer data, and that nobody before me has bothered to look into thermometer data and report that work. I hope that things will become much clearer once Anthony publishes my Part 2 report. Darko Butina

April 17, 2013 3:18 pm

Dr. Butina, it was not clear from the original article that "unequivocal" was a claim made by the IPCC, nor that your exceedingly stringent criterion is the requirement for "unequivocal". If your Part 1 had made the linkage between "unequivocal" and the IPCC, it would have strengthened the point.
I'm still not sure that Ta(i)>Tb(i) for i=1 to 730, for year a and year b, is a necessary condition for "unequivocal", i.e. "leaving no doubt". No doubt it would be a sufficient condition for unequivocal, but I do not see it as a statistically necessary one.

Chuck Nolan
April 17, 2013 6:35 pm

“So even before I started to collect the daily data available in the public domain, I was almost 100% confident that I would not find any alarming trends in thermometer data. And I was proven right.”
————————————
I thought one was not supposed to start investigations with a bias.
Another “gut feeling” in science.
wtf over?

Chuck Nolan
April 17, 2013 6:42 pm

During the computational side of his drug discovery career
——————————
I don’t ever recall having a computational side during my drug discoveries.
/sarc
cn

Chuck Nolan
April 17, 2013 7:04 pm

Brian Macker says:
April 16, 2013 at 6:39 am
Embarrassing analysis. […] Those calling this brilliant should understand that they are not equipped with the tools to do basic math let alone science.
—————————————————————
Just who do you think the CAGW propaganda is designed to reach?
You? No way.
The layman and his emotions are the target.
Few people are stupid enough to actually sit and think about it and, using their ability to reason, decide it's a good idea to stop the poor from having electricity and clean water.
I don't think the Statue of Liberty will be submerged, do you? And if it were heading that way, which would take a long time, I doubt it would get there – or we could stop it.
The paper says alarmists are not presenting science.
imo they produce propaganda.
The game is afoot and we lag.
cn

Mike Rossander
April 17, 2013 7:31 pm

As much as I like this article, I cannot agree with the basic premise. To say that one year is warmer than another if and only if "one year has every single daily thermometer reading larger than another year" is absurd. We say that something is warmer when it has more heat energy in it. We can make that statement either at a point in time or over a period of time. The point-in-time example would be to say that boiling water is warmer (has more heat energy) than ice. This, incidentally, is what the thermometer measures. When expressed over a period, you take the integral of the heat at each point in time. The fact that the thermometer cannot directly measure the integral does not invalidate the physical reality. A block of ice over the period of a day is still colder (has less heat energy) than the same mass of boiling water.
The challenge is calculating the integral when the temperature is changing. But it is only a challenge, not an impossibility. Which system contains more heat energy during the day – a kilogram of water held at 0 C for one hour and then at 99 C for the next 23 hours, or a kilogram of water held at 1 C for the full 24 hours? By the author's methodology, those two scenarios cannot even be compared, yet even the most cursory analysis clearly shows that the first scenario has an "unequivocally" greater quantity of heat energy over the defined period.
Now, I will freely admit that evaluating the temperatures of the two scenarios in hourly units provides more useful information than the simplistic daily average. We should never throw away information unnecessarily. But to go so far beyond that as to define "warmer" as requiring every single reading to be warmer flies in the face of both common sense and the meaning of the concept of heat.
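The commenter's two scenarios can be checked with one line of arithmetic, a time-weighted mean; this sketch of the comparison is an illustration, not anyone's actual code:

```c
/* Time-weighted mean temperature of the commenter's two scenarios:
   1 h at 0 C followed by 23 h at 99 C, versus 24 h at 1 C. For a fixed
   mass of water, heat content scales with this mean. */
#include <stdio.h>

int main(void)
{
    double mean1 = (1.0 * 0.0 + 23.0 * 99.0) / 24.0;  /* ~94.9 C */
    double mean2 = (24.0 * 1.0) / 24.0;               /*   1.0 C */

    printf("scenario 1: %.1f C, scenario 2: %.1f C\n", mean1, mean2);
    return 0;
}
```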

April 18, 2013 7:06 am

milodonharlani says:
April 15, 2013 at 1:00 pm
I wondered also why the author went on at such length re. C. v. F. Conversion is just arithmetic.
—————————-
In theory yes, but go back to the idea of a temperature reading being +/-0.5 degrees because of the accuracy of reading a thermometer. +/-0.5degF is only just over half the range of +/-0.5degC. And if the reading is 78degF (+/-0.5degF), will you round that to 26degC or insist on 25.5555degC? That's almost 1degF of difference just in the rounding. So it's not easy to convert a reading in F into one in C and preserve the *intention* of the reading (i.e. 78degF +/-0.5degF), because it would not have been 26degC +/-0.5degC even if the two thermometers had read the same. Assuming the accuracy described in the article, it's likely the human reader would have said 26degC rather than 25.5degC, and certainly not 25.555degC. So we generate differences (of the same order of magnitude as AGW!) just from the conversion from F to C. And don't we still have this in US temperature records even now? So it's not even just a historical issue.
On a wider point, this paper assumes it's ok to average max and min temps to get a single daily temperature. I'd love to see the graphs recreated looking at all the max and all the min temps, to see if there's more to be learned from that.
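The rounding shift the commenter describes is easy to quantify; here is a worked version of the 78degF example, with the figures following directly from the conversion formula:

```c
/* 78 degF +/- 0.5 degF converted to Celsius, then rounded the way a
   human reader would likely record it. */
#include <stdio.h>

int main(void)
{
    double f = 78.0;
    double c = (f - 32.0) * 5.0 / 9.0;     /* 25.5556 degC, exact   */
    double err_c = 0.5 * 5.0 / 9.0;        /* +/-0.2778 degC        */
    double recorded = 26.0;                /* likely recorded value */

    printf("exact %.4f +/- %.4f degC; recorded %.1f degC; shift %.2f degC\n",
           c, err_c, recorded, recorded - c);  /* shift ~0.44 C, ~0.8 F */
    return 0;
}
```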

Reply to  Peter Ward
April 18, 2013 8:07 am

Peter Ward says:
April 18, 2013 at 7:06 am

On a wider point, this paper assumes it's ok to average max and min temps to get a single daily temperature. I'd love to see the graphs recreated looking at all the max and all the min temps, to see if there's more to be learned from that.

While I can't say whether this is okay or not, it is what NCDC does to generate the mean temp in their Global Summary of Days dataset: they take the mean of min and max.

Jim Butts
April 18, 2013 7:33 am

Following this crazy logic, temperature itself is not a valid concept, since it is, for example, the average of all the energies of the molecules of a gas.

Brian Macker
April 18, 2013 2:53 pm

Ferd,
” Rather to shows that this is not unequivocal, because there are some years in the present that cannot be distinguished from some years 100 years ago. This should not be the case if the result is unequivocal.”
Not true. Random variations (at one spot on the planet) could have cycles longer than a year. If the predominant wind direction were to vary, then warm air off the Atlantic might be reduced during one year. That doesn't mean the extra heat doesn't end up somewhere else. This article is just silly on so many levels. Why insist that every day be warmer? Why not insist that the warming is not unequivocal until the minimum temperature recorded in the last year is greater than the maximum temperature of the starting year? Why not insist that, to be unequivocal, the coldest day of winter in the end year has to be warmer than the hottest day of summer in the starting year? He seems to think that only the axis of the earth affects temperature variation, and not wind and a whole host of other factors, because he has attempted to adjust for that but nothing else.

Brian Macker
April 18, 2013 2:57 pm

Chuck Nolan,
“They produce propaganda. The game is afoot and we lag.”
We? WE? What is this "we" shit, paleface? I'm not interested in producing propaganda to fit some preconceived result. Apparently you think this article is propaganda, which is a far worse interpretation than I would give, because it shows malice against the truth (aka lying). I just think that he is confused.
