GISS & METAR – dial "M" for missing minus signs: it's worse than we thought

Here’s a story about how one missing letter, an M, can wreck a whole month’s worth of climate data. It is one of the longest posts ever made on WUWT; I spent almost my entire Saturday on it. I think it might also be one of the most important, because it demonstrates a serious weakness in surface data reporting.

In my last post, we talked about a curious temperature anomaly that Jean S. found in the March GISS data and posted at Climate Audit:

The anomaly over Finland has an interesting signature to it, and the correction that GISS posted on their website confirms something I’ve been looking at for a few months.

The data shown between 4/13 and 4/15 were based on data downloaded on 4/12 and included some station reports from Finland in which the minus sign may have been dropped.

With some work I started back in late December and continued through January, and with GISS putting its stamp of approval on “missing minus signs,” I can now demonstrate that missing minus signs aren’t just an odd event: they happen with regularity, and the effect is quite pronounced when they do. This goes to the very heart of data-gathering integrity and is rooted in simple human error. The fault lies not with GISS (though they now need a new quality control feature) but mostly with NOAA/NCDC, which manages the GHCN and also needs better quality control. The error originates at the airport, likely with a guy sitting in the control tower. Readers who are pilots will understand this when they see what I’m talking about.

I’ve seen this error happen all over the world. Please read on and be patient, there is a lot of minutiae that must be discussed to properly frame the issue. I have to start at the very bottom of the climate data food-chain and work upwards.

First, a discussion about the root of the error and the differences between the surface and satellite datasets. I should mention that in the satellite image from NASA’s Earth Observations (NEO), we don’t see the same error as we see in the GISTEMP map above.

NASA NEO March 1-31 2010 day satellite measured temp anomaly – click for larger image

Why? Better sensors, maybe, but mostly it has to do with a different data-gathering methodology. In the surface data sets, including land and ocean data, almost every datapoint is touched by a human hand; even data from automated airport sensors sometimes gets transcribed manually (often in third-world and technologically stunted countries). In the surface data, thousands of sensors are spread across the globe: many different designs, many different exposures, many different people with different standards of measurement and reporting. The precision, accuracy, and calibration of the vast surface network varies, especially when we have a broad mix of instrumentation types. For example, in the US Historical Climatology Network the equipment varies significantly.

In satellite data, the measurement is made at a single point with one sensor type, the Microwave Sounding Unit on the satellite, calibrated to a precision source on board. Redundant precision platinum resistance thermometers (PRTs) are carried on the satellite radiometers; the PRTs are individually calibrated in a laboratory before being installed in the instruments. The satellite data is automatically measured and transmitted. In contrast to the surface temperature record, no human hands touch the data gathering or data reporting process. Satellite data generation is far more homogeneous than the mish-mash of surface data.

I think it would be safe to say that the chances of human error in raw surface data are at least an order of magnitude greater (if not several) than error in raw satellite data. Post measurement processing is another issue, but for the purposes of this essay, I’m focusing only on raw data gathering and transmittal.

As mentioned in the recently updated compendium of issues with the surface temperature data by Joe D’Aleo and myself, there has been a move in the Global Historical Climatology Network (GHCN) to rely more and more on airports for climate data. This, in my opinion, is a huge mistake, because airports add problems of their own on top of the issues documented there.

E.M. Smith, aka “Chiefio,” reports that in GISS (which uses GHCN) worldwide, there has been a wholesale migration toward airport weather data as a climatic data source. In an email sent to me on Jan 20, 2010, he says:

Look at:

which has a fairly good description of the problems in the data; we have a global report for GHCN as of that August data. There is more detail in the link, but I think you care about “now”:

Percentage of sites that are AIRPORTS NOW, by decade of record

Year   S.P   S.C   S.T   S.W   EQ.   N.W   N.T   N.C   N.P  Total
1909   0.0  42.0  15.1  28.2  29.2  36.7  22.8  33.3  44.4  25.4
1919   0.0  36.4  12.8  23.5  25.1  37.7  20.9  35.0  39.8  24.1
1929   0.0  37.0  11.9  27.4  27.7  32.7  20.4  35.9  56.4  24.1
1939   0.0  43.9  17.6  32.0  33.8  29.1  20.2  36.2  51.0  25.1
1949   0.0  32.3  24.4  37.6  44.4  31.8  23.3  39.3  60.9  29.1
1959   0.0  24.0  35.0  50.0  59.4  39.4  30.9  41.0  62.9  37.3
1969   0.0  18.1  39.3  53.2  63.2  40.2  31.4  41.1  61.5  39.0
1979   0.0  17.9  39.1  52.0  64.2  40.7  28.8  41.1  62.3  37.7
1989   0.0  20.7  41.5  52.5  67.8  41.9  29.1  40.8  64.9  37.7
1999   0.0  21.0  53.5  57.4  68.0  53.0  32.6  49.0  59.0  41.6
2009   0.0  17.9  74.0  64.7  66.5  51.5  30.2  45.4  57.3  41.0
This is by major climate latitude band; the total is 41% for the globe (and rising daily 😉).
Also in:

I do breakouts by continent and by some countries. For the USA, I further do a specific with/without USHCN comparison (the older version, not the USHCN.v2 put in 15Nov09) and find, for COUNTRY CODE: 425:

But it masks the rather astounding effect of deletions in GHCN without the USHCN set added in:
LATpct: 2006  3.7 18.3 29.5 33.2 14.4  0.0  0.4  0.3  0.1  0.1 100.0
AIRpct:       1.3  4.0  6.3  6.7  3.2  0.0  0.4  0.3  0.1  0.1  22.4
LATpct: 2007  8.2 17.2 28.4 26.9 11.2  0.0  3.7  3.0  0.7  0.7 100.0
AIRpct:       8.2 15.7 27.6 23.1  9.0  0.0  3.7  3.0  0.7  0.7  91.8
LATpct: 2008  8.8 16.9 28.7 26.5 11.0  0.0  3.7  2.9  0.7  0.7 100.0
AIRpct:       8.8 15.4 27.9 22.8  8.8  0.0  3.7  2.9  0.7  0.7  91.9
LATpct: 2009  8.1 17.8 28.1 26.7 11.1  0.0  3.7  3.0  0.7  0.7 100.0
AIRpct:       8.1 16.3 27.4 23.0  8.9  0.0  3.7  3.0  0.7  0.7  91.9
DLaPct: 2009  4.3 18.4 29.5 32.5 13.6  0.0  0.7  0.9  0.2  0.1 100.0
DArPct:       2.1  5.7  8.8  8.9  3.7  0.0  0.6  0.8  0.2  0.1  30.7

So in the year 2009, almost 92% of the USA’s stations in GHCN are airports.

So clearly, airports make up a significant portion of the climate data.

On the issue of airports as climate stations: obvious problems with siting, UHI, failing ASOS instrumentation, and conflicting missions (aviation safety vs. climate) aside, I’m going to focus on one other thing unique to airports: METAR.

What is METAR you ask? Well in my opinion, a government invented mess.

When I was a private pilot (I had to give up flying due to worsening hearing loss – tower controllers talk like auctioneers on the radio, and one day I got the active runway backwards and found myself head-on to traffic; I decided then that I was a danger to myself and others), I learned to read SA reports from airports all over the country. SA reports were manually coded teletype reports sent hourly worldwide so that pilots could know what the weather was at their airport destinations. They were also used by the NWS to plot synoptic weather maps. Some readers may remember the Alden Weatherfax maps hung up at FAA Flight Service Stations, filled with hundreds of plotted airport station SA (surface aviation) reports.

The SA reports were easy to visually decode right off the teletype printout:

From page 115 of the book "Weather" By Paul E. Lehr, R. Will Burnett, Herbert S. Zim, Harry McNaught - click for source image

Note that in the example above, temperature and dewpoint are clearly delineated by slashes. Also, when a minus temperature occurs, such as -10 degrees Fahrenheit, it was reported as “-10”. Hang on to that, it is important.

The SA method originated with airmen and teletype machines in the 1920s and lasted well into the 1990s. But like anything these days, government stepped in and decided it could do better. You can thank the United Nations, the French, and the World Meteorological Organization (WMO) for this one. SA reports were replaced by METAR in 1996.

From Wikipedia’s section on METAR

METAR reports typically come from airports or permanent weather observation stations. Reports are typically generated once an hour; if conditions change significantly, however, they can be updated in special reports called SPECIs. Some reports are encoded by automated airport weather stations located at airports, military bases, and other sites. Some locations still use augmented observations, which are recorded by digital sensors, encoded via software, and then reviewed by certified weather observers or forecasters prior to being transmitted. Observations may also be taken by trained observers or forecasters who manually observe and encode their observations prior to transmission.


The METAR format was introduced 1 January 1968 internationally and has been modified a number of times since. North American countries continued to use a Surface Aviation Observation (SAO) for current weather conditions until 1 June 1996, when this report was replaced with an approved variant of the METAR agreed upon in a 1989 Geneva agreement. The World Meteorological Organization‘s (WMO) publication No. 782 “Aerodrome Reports and Forecasts” contains the base METAR code as adopted by the WMO member countries.[1]


The name METAR is commonly believed to have its origins in the French phrase message d’observation météorologique pour l’aviation régulière (“Aviation routine weather observation message” or “report”) and would therefore be a contraction of MÉTéorologique Aviation Régulière. The United States Federal Aviation Administration (FAA) lays down the definition in its publication the Aeronautical Information Manual as aviation routine weather report[2] while the international authority for the code form, the WMO, holds the definition to be aerodrome routine meteorological report. The National Oceanic and Atmospheric Administration (part of the United States Department of Commerce) and the United Kingdom‘s Met Office both employ the definition used by the FAA. METAR is also known as Meteorological Terminal Aviation Routine Weather Report or Meteorological Aviation Report.

I’ve always thought METAR coding was a step backwards, for reasons I’ll discuss shortly.

But first, quick! Spot the temperature and dewpoint in this METAR report:

The following is an example METAR from Burgas Airport in Burgas, Bulgaria, and was taken on 4 February 2005 at 16:00 Coordinated Universal Time (UTC).

METAR LBBG 041600Z 12003MPS 310V290 1400 R04/P1500N R22/P1500U +SN BKN022 OVC050 M04/M07 Q1020 NOSIG 9949//91=

Could you read this and know what the weather is in Burgas? I can, but only because I’ve looked at hundreds of them over the past few months, and I still have to pick through the report to find it. The reason is that METAR is a variable-field reporting format: data isn’t always in the same position.

In the report above, the temperature and dewpoint are: M04/M07

M04/M07 indicates the temperature is −4 °C (25 °F) and the dewpoint is −7 °C (19 °F). An M in front of the number indicates that the temperature/dew point is below zero (0) Celsius.
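To make the convention concrete, here is a minimal Python sketch (my own illustration, not any agency’s code) of how a well-behaved decoder should treat the temperature/dewpoint group:

```python
import re

def parse_metar_temps(group):
    """Parse a METAR temperature/dewpoint group such as 'M04/M07'.

    An 'M' prefix means the value is below zero Celsius.
    Returns (temperature_c, dewpoint_c)."""
    m = re.fullmatch(r"(M?)(\d{2})/(M?)(\d{2})", group)
    if m is None:
        raise ValueError("not a valid temperature group: %r" % group)
    temp = int(m.group(2)) * (-1 if m.group(1) == "M" else 1)
    dew = int(m.group(4)) * (-1 if m.group(3) == "M" else 1)
    return temp, dew

print(parse_metar_temps("M04/M07"))  # (-4, -7)
```

A strict parser like this rejects a stray minus sign outright instead of silently mangling it, which, as we’ll see, is exactly where the trouble starts.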

Notice also that the entire METAR report is visually more complex. This is fine if computers are doing the coding, but many METAR reports are still hand-coded by technicians at airports, and thus begins the introduction of human error into the climate data process. Complexity is not a good thing when manual labor is involved, as it increases the likelihood of error.

Here is where METAR coding departs from normal numeric convention. SA reports did not have this problem.

In the METAR report above, instead of using the normal way we treat and write negative numbers, some policy wonk decided that we’ll use the letter “M” to report a negative number. Only a bureaucrat could think like this.

So instead of a below-zero Centigrade temperature and dewpoint looking like this:

-04/-07

in the “new and improved” METAR coding, it looks like this:

M04/M07
OK, not a problem, you say? Well, I beg to differ, because it forces technicians who manually code METAR reports for transmission to do something they would not do anywhere else: write down an “M” instead of a minus sign. Using an M is totally counter-intuitive and against basic math training, and it increases the likelihood of error.

It gets worse. Let’s say the technician makes a boo-boo and puts a minus sign instead of an “M” in front of the numbers for temperature/dewpoint. You’d think this would be alright, and the system would correctly interpret it, right?

Let’s put the METAR report from Burgas Airport into an online METAR decoder.

Here’s the report with that easy-to-make mistake, using a minus sign instead of M for the temperature.

METAR LBBG 041600Z 12003MPS 310V290 1400 R04/P1500N R22/P1500U +SN BKN022 OVC050 -04/M07 Q1020 NOSIG 9949//91=

The output from the online METAR decoder reads:

Hey, look at that: the temperature is 39°F (3.8°C). Minus signs are discarded in METAR decoding. Note that the decoded temperature comes out the same positive value if the “M” is simply missing, as in 04/M07.

If it had been decoded correctly we would have gotten:

(-4) degrees Celsius = 24.8 degrees Fahrenheit

A whole 14.2 degrees F difference!
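We can reproduce the decoder’s misbehavior with a short Python sketch. This is my guess at what such decoders do internally, not the actual code of any particular site: the parser matches only an optional “M” prefix, so a minus sign is simply skipped over and the value comes out positive.

```python
import re

def sloppy_decode_temp(group):
    # Matches only an optional 'M' before the digits, so any other
    # prefix character (such as a stray '-') is silently skipped.
    m = re.search(r"(M?)(\d{2})/(M?)(\d{2})", group)
    sign = -1 if m.group(1) == "M" else 1
    return sign * int(m.group(2))

print(sloppy_decode_temp("M04/M07"))  # -4: coded correctly
print(sloppy_decode_temp("-04/M07"))  #  4: the minus sign is ignored
print(sloppy_decode_temp("04/M07"))   #  4: missing 'M', same result
```

One dropped character at -4°C and the record is off by about 14°F; the colder it is, the bigger the error.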

Reference for METAR decoding:

Also note that METAR data has no year-stamp component, so the METAR decoder has no way of knowing this was a report from 2005, not 2010. Since each METAR report is essentially disposable within 24 hours, this presents no problem for pilots; they don’t care. But if you are tracking climate over years using METAR data, not having a year stamp increases the likelihood of error.
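To see why the missing year matters, here is a small Python sketch (illustrative only): the day/time group encodes just day-of-month and UTC time, so the year and month must come from outside the report.

```python
import re
from datetime import datetime, timezone

def parse_metar_time(group, year, month):
    """Decode a METAR day/time group like '041600Z'.

    The group carries only day-of-month, hour, and minute (UTC);
    the year and month must be supplied from context."""
    m = re.fullmatch(r"(\d{2})(\d{2})(\d{2})Z", group)
    day, hour, minute = (int(x) for x in m.groups())
    return datetime(year, month, day, hour, minute, tzinfo=timezone.utc)

# The same group decodes to timestamps five years apart:
print(parse_metar_time("041600Z", 2005, 2))  # 2005-02-04 16:00:00+00:00
print(parse_metar_time("041600Z", 2010, 2))  # 2010-02-04 16:00:00+00:00
```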

Also, the temperature error itself in this case has no bearing on a pilot’s decision to take off or land. Unless they are worried about density altitude on hot, humid days, the temperature is a throwaway datum. Pilots are mostly concerned about winds, sky conditions, and barometer (altimeter setting). In fact, cool/cold days are far better for aviators; see AOPA’s Why Airplanes Like Cool Days Better.

My point here is this:

If a pilot or tower controller sees an erroneous METAR report like this:

METAR LBBG 041600Z 12003MPS 310V290 1400 R04/P1500N R22/P1500U +SN BKN022 OVC050 -04/M07 Q1020 NOSIG 9949//91=

Or this:

METAR LBBG 041600Z 12003MPS 310V290 1400 R04/P1500N R22/P1500U +SN BKN022 OVC050 04/M07 Q1020 NOSIG 9949//91=

Pilots/controllers/dispatchers aren’t likely to care, since current temperature and dewpoint are not important to them at these cooler temperatures. They also aren’t likely to call up the tower and holler at the technician to say “Hey, the temperature is wrong!”. Further, since the METAR report may be reissued sometime within the hour if somebody DOES spot the error, problem solved.

The point is that updates/corrections to METAR data may not be logged for climate purposes, since they are likely to be seen as duplicate reports because of the hourly timestamp.

So, in the case of M’s and minus signs, the propensity exists for erroneous METAR reports to not get corrected and to stay logged in the system, eventually finding their way into the climate database if that airport happens to be part of GISS, CRU, or GHCN datasets.

Maddeningly, even when egregious errors in aviation weather data are pointed out and even acknowledged by the reporting agency, NOAA keeps them in the climate record, as was demonstrated last year at Honolulu International Airport, HI, when a string of new high temperature records was set by a faulty ASOS reporting station. NOAA declined to fix the issue in the records:

NOAA: FUBAR high temp/climate records from faulty sensor to remain in place at Honolulu

The key sentence from that story from KITV-TV:

The National Weather Service said that is not significant enough to throw out the data and recent records.

Hmmm, look at another nearby station and compare the differences. You be the judge.

Does NOAA consider this a climate reporting station? Yes according to NCDC MMS database, it is part of the “A” network, designated for climate:

Clearly, NOAA simply doesn’t seem to care that erroneous records find their way into the climatic database.

OK back to the METAR issue.

The problem with METAR reporting errors is worldwide; I’ve found many examples easily in my spare time. Let’s take, for example, a station in Mirnvy, Russia. It is in Siberia at 62.5° N, 113.9° E, has an airport, is part of GHCN, and reports in METAR format.

Weather Underground logs and plots METAR reports worldwide, and these METAR reports are from their database on November 11th, 2009.

It shows a clear error in the 12:30PM (330Z) and 1 PM (400Z) METAR report for that day:

UERR 010330Z    22005G08MPS 9999 -SN 21/M23 Q1026 NOSIG RMK QFE738 24450245

UERR 010400Z    22005G08MPS 9999 -SN SCT100 OVC200 20/M22 Q1025 NOSIG RMK QFE737 24450245

UERR 010430Z    21005G08MPS 4000 -SN SCT100 OVC200 M20/M22 Q1024 NOSIG RMK QFE737 24450245

UERR 010430Z    21005G08MPS 4000 -SN SCT100 OVC200 M20/M22 Q1024 NOSIG RMK QFE737 24450245

UERR 010500Z    21005G08MPS 4000 -SN SCT100 OVC200 20/M22 Q1023 NOSIG RMK QFE736 24450245

Note the missing “M” at 12:30PM (330Z) and 1PM (400Z). It happens again at 2PM (500Z). Of course it isn’t very noticeable looking at the raw METAR reports, but like the GISS plot of Finland, it stands out like a sore thumb when plotted visually, thanks to Weather Underground:

Mirnvy, Russia

The effect of the missing “M” is plotted above; coincidentally, the spike pattern looks like an “M”.

Put those METAR reports into an online METAR decoder and you get 70°F for 12:30PM and 68°F for 1PM.

What do you think a 70°F spike like this will do to monthly averaged climate data in a place where the temperature stays mostly below freezing the entire month?
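As a back-of-the-envelope answer (my own toy numbers, not Mirnvy’s actual record): suppose a station sits at -21°C for every hourly report in a 30-day month, and exactly one report loses its “M”.

```python
# Toy example: a month of hourly reports at a steady -21 C,
# with a single reading sign-flipped to +21 C by a dropped 'M'.
hours = 30 * 24
readings = [-21.0] * hours
clean_mean = sum(readings) / hours   # -21.0

readings[360] = +21.0                # one missing 'M'
dirty_mean = sum(readings) / hours

print(round(dirty_mean - clean_mean, 3))  # 0.058 C of spurious warming
```

And that is from one bad hourly report. If the monthly value is instead built from daily max/min pairs, the spike sets the daily maximum and the distortion is far larger, as the Yakutsk numbers show.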

Does NOAA log METAR data from Mirnvy Russia (ICAO code UERR)?

Yes, they do. Plus many other METAR reporting stations discussed here.

Does NCDC classify it as a climate station?

Yep, it is part of the “A” network, which means it either directly reports climate data and/or is used to adjust data at other stations, such as GHCN stations.

List of GHCN stations:

It is not, however, part of GHCN. But there are plenty of stations with this error that are part of GHCN. Yakutsk, Russia, also in Siberia, is part of GHCN and has a METAR reporting error. Here’s an example of what one off-coded hourly reading will do to the climate database.

The city of Yakutsk, one of the coldest cities on earth, reported a high of 79˚F on November 14th with a low of -23˚F.

Weather Underground seems to have done some quality control on the METAR reports, but the erroneous high temp remains in the daily and monthly reports:

A month later it happened again, reporting a high of 93˚F on December 14th with a low of -34˚F.

And the erroneous 93°F high temp remains in both the daily and monthly reports, but it has been removed from the METAR record, so I can’t show you the missing “M” I observed back in January. I wish I had made a screen capture of the page.

When the daily average was calculated with that error in place, this was the result:

The average for the day, 30˚F, was some 67˚F above normal, pushing the anomaly for the month of December from 3.6˚F above normal to 5.9˚F above normal… quite a shift!
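The quoted shift is easy to sanity-check: one day averaging 67°F above normal moves a 31-day monthly mean by roughly 67/31 degrees.

```python
# One day 67 F above normal, spread over a 31-day December average.
shift = 67.0 / 31
print(round(shift, 1))  # 2.2, close to the 5.9 - 3.6 = 2.3 shift quoted
```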

More examples:

Here’s an example of a properly coded METAR report from Nizhnevartovsk, Russia, for December 11, 2009, where the data itself is wrong. I’m thinking the temperature was supposed to be M30 but came out M13; the dewpoint value M16 is also erroneous.

Nizhnevartovsk, Russia Dec 11, 2009

METAR USNN 111230Z 00000MPS P6000 SCT026 OVC066 M27/M29 Q1014 NOSIG RMK QFE755 SC062

METAR USNN 111300Z 12005G08MPS P6000 SCT066 OVC200 M13/M16 Q1035 NOSIG RMK QFE772 SC063

METAR USNN 111330Z 12005G08MPS P6000 SCT066 OVC200 M13/M16 Q1035 NOSIG RMK QFE772 SC063

METAR USNN 111400Z 00000MPS P6000 SCT020 OVC066 M28/M29 Q1014 NOSIG RMK QFE755 SC065

And it was not a one-time occurrence, happening again on Dec 25th as shown in the monthly graph:

Nizhnevartovsk, Russia, December 2009

The daily graph and METAR reports show it happened at about the same time (1300Z) and in the same way (M27, then M13), perhaps pointing to the same technician on duty making the same habitual mistake again. Maybe too much vodka, having to work the Christmas night shift?

Nizhnevartovsk, Russia Dec 25, 2009

METAR USNN 251230Z 11006MPS 2200 -SN SCT014 OVC066 M27/M30 Q1015 NOSIG RMK QFE757 SC055

METAR USNN 251300Z 35002MPS 6000 -SN SCT015 OVC066 M13/M15 Q1010 NOSIG RMK QFE752 SC03

METAR USNN 251330Z 12006MPS 4100 -SN SCT015 OVC066 M27/M29 Q1014 NOSIG RMK QFE756 SC055

It did not initially appear to be in the GHCN list or on the GISS list, but I’ve found that some of the names on Weather Underground differ from the place names in the GHCN and GISS lists. It turns out that if you search Weather Underground for the station ALEKSANDROVSKOE, it will point you to the data from Nizhnevartovsk. ALEKSANDROVSKOE is a GHCN/GISS station.

I found other instances of METAR errors for that station; this one was quite pronounced on Jan 16th, 2009, lasting for 7 hours before it was corrected.

Nizhnevartovsk, Russia Jan 16, 2009

Here are the METAR reports:

METAR USNN 151800Z 23002MPS P6000 BKN066 OVC200 M22/M24 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 151830Z 23002MPS 2900 -SHSN SCT020CB OVC066 22/M23 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 151900Z 23002MPS 2100 -SHSN SCT019CB OVC066 21/M23 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 152000Z 24001MPS 5000 -SHSN SCT022CB OVC066 21/M22 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 152030Z 24002MPS 4300 -SHSN SCT020CB OVC066 21/M22 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 152100Z 24002MPS 6000 -SHSN SCT018CB OVC066 20/M22 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 152130Z 25002MPS P6000 SCT020CB OVC066 20/M22 Q1009 NOSIG RMK QFE751 SC038

METAR USNN 152200Z 28002MPS P6000 SCT022CB OVC066 20/M22 Q1009 NOSIG RMK QFE752 SC038

METAR USNN 152300Z 27003MPS P6000 -SHSN SCT016CB OVC066 M19/M21 Q1010 NOSIG RMK QFE752 SC038

The monthly report shows the event:

Nizhnevartovsk, Russia, January 2009

It happened twice on Feb 2nd, 2007, and with a space added between the M and the 09 in the 0300Z report, it is a clear case of human error:

METAR USNN 020100Z 11010G15MPS 0500 R03/1200 +SN +BLSN VV002 M09/M11 Q1003 TEMPO 0400 +SN +BLSN VV002 RMK QFE748 QWW060 MOD ICE MOD TURB S

METAR USNN 020200Z 12009G14MPS 0500 R03/1200 +SN +BLSN VV002 M09/M10 Q1001 TEMPO 0400 +SN +BLSN VV002 RMK QFE747 QWW060 MOD ICE MOD TURB S

METAR USNN 020300Z 11008G13MPS 1100 R03/1200 SN +BLSN BKN004 OVC066 M 09/M10 Q1000 NOSIG RMK QFE745 QRD120 MOD ICE MOD TURB SC045





The monthly data shows the double peak:

I’m sure many more can be found; I invite readers to have a look for themselves by searching for such events at Weather Underground.

It is not just Russia that has METAR reporting errors

Lest you think this a fault of Russia exclusively, it also happens at other northern hemisphere Arctic sites and in Antarctica.

Svalbard, Oct 2, 2008

METAR ENSB 020550Z 13012KT 6000 -SN FEW010 SCT015 BKN030 M04/M06 Q1013 TEMPO 4000 SN BKN012

METAR ENSB 020650Z 14013KT 9000 -SN FEW010 SCT018 BKN040 03/M06 Q1013 TEMPO 4000 SN BKN012

METAR ENSB 020750Z 15011KT 9999 -SN FEW015 SCT025 BKN040 M03/M07 Q1013 TEMPO 4000 SN BKN012

Eureka, Northwest Territories, Canada, March 3, 2007

It hit 109.4°F (43°C) there on March 3rd, 2007, according to this METAR report. Eureka is the northernmost GHCN station remaining for Canada. Its temperature gets interpolated into adjacent grid cells.

CWEU 031600Z 14004KT 15SM FEW065 BKN120 M43/M45 A2999 RMK ST1AS2 VIA YQD SLP150

CWEU 031700Z 15005KT 10SM FEW065 BKN012 43/46 A3000 RMK SF1AS1 VIA YQD SLP163

Decoded: 11:00 AM 109.4 °F 114.8 °F 100% 30.01 in 10.0 miles SSE 5.8 mph - Mostly Cloudy

CWEU 031800Z 11003KT 15SM FEW050 FEW065 OVC130 M43/M46 A3001 RMK SF2SC1AS1 VIA YQD SLP164

In the cases below, for Antarctic stations Dome C and Nico, the reports (the AAXX groups are SYNOP code rather than METAR) seem to have all sorts of format issues, and I’m not even sure where the error occurs, except that Weather Underground reports a spike just like the ones we see in Russia.

Dome C station Dec 9, 2009

AAXX 0900/ 89828 46/// ///// 11255 36514 4//// 5//// 90010

AAXX 0901/ 89828 46/// ///// 10091 36514 4//// 5////

AAXX 09014 89828 46/// /1604 11225 36480 4//// 5//// 9014

Nico Station,  University of Wisconsin Dec 9, 2009

AAXX 0920/ 89799 46/// ///// 11261 4//// 5//// 92030

AAXX 0920/ 89799 46/// ///// 11103 4//// 5//// 92040

AAXX 0921/ 89799 46/// ///// 11270 4//// 5////

Amundsen-Scott Station, Jan 14th, 2003

Here’s generally properly formatted METAR data, but note where the technician coded an extra space. Oops!

NZSP 131350Z GRID36007KT 9999 IC SCT020 BKN060 M31/ A2918 RMK SDG/HDG

NZSP 131450Z GRID36007KT 9999 IC FEW010 FEW020 SCT035 SCT050 M3 1/ A2918 RM K SDG/HDG

NZSP 131550Z GRID10008KT 9999 IC BCFG FEW010 SCT020 BKN050 M31/ A2919 RMK VIS E 2400 BCFG E SDG/HDG

And I’m sure there are many more METAR coding errors yet to be discovered. What you see above is just a sampling from a few likely candidates I looked at over a couple of hours.

Missing M’s – Instant Polar Amplification?

It has been said that the global warming signature will show up at the poles first. Polar Amplification is defined as:

“Polar amplification (greater temperature increases in the Arctic compared to the earth as a whole) is a result of the collective effect of these feedbacks and other processes. It does not apply to the Antarctic, because the Southern Ocean acts as a heat sink. It is common to see it stated that ‘climate models generally predict amplified warming in polar regions,’ e.g. Doran et al. However, climate models predict amplified warming for the Arctic but only modest warming for Antarctica.”

Interestingly, the METAR coding error has its greatest magnitude at the poles, because the difference from a missing minus sign becomes larger as the temperature grows colder. Eureka is a great example, going from -43°C to +43°C (-45.4°F to 109.4°F) with one missing “M”.

You wouldn’t notice METAR coding errors at the equator, because the temperature never gets below 0°C, so nobody ever has to code an “M”. In middle latitudes you might see it happen, but it is much more seasonal and the difference is not as great.

For example:

M05/M08 miscoded as 05/M08 brings the temperature from -5°C to +5°C, but in a place like Boston, Chicago, or Denver, a +5°C temperature could easily occur in any winter month in which a -5°C temperature occurred. So the error slips into the noise of “weather,” likely never to be noticed. But it does bump up the temperature average a little for the month if uncorrected.

But in the Arctic and Antarctic, a missing M on an M20/M25 METAR report makes a 40°C difference when -20°C becomes +20°C. And it doesn’t seem likely that we’d see a winter month in Siberia or Antarctica that would normally hit +20°C, so the error does not get lost in the “weather” noise but becomes a strong signal if uncorrected.
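Put simply, a dropped “M” turns -|T| into +|T|, so the size of the error is always twice the true temperature’s magnitude. A quick sketch:

```python
def sign_flip_error_c(true_temp_c):
    # Reading -|T| as +|T| is an error of 2*|T| degrees Celsius.
    return 2 * abs(true_temp_c)

print(sign_flip_error_c(-5))   # 10: mid-latitude, lost in weather noise
print(sign_flip_error_c(-20))  # 40: Siberia/Antarctica, a glaring spike
print(sign_flip_error_c(-43))  # 86: a Eureka-sized extreme
```

This is why the artifact mimics polar amplification: the colder the station, the bigger the spurious warming.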

Confirmation bias, the expectation of seeing polar amplification, may be one reason why nobody seems to have pointed this out until now. Plus, the organizations that present surface-derived climate data, such as GISS and CRU, only seem to deal in monthly and yearly averages. Daily or hourly data is not presented that I am aware of, so if errors occur at those time scales, they go unnoticed. Obviously GISS didn’t notice the recent Finland error, even though it was glaringly obvious once plotted.

With NASA GISS admitting that missing minus signs contributed to the hot anomaly over Finland in March, and with the many METAR coding error events I’ve demonstrated on opposite sides of the globe, it seems reasonable to conclude that our METAR data from cold places might very well be systemically corrupted with instances of coding errors.

The data shown between 4/13 and 4/15 were based on data downloaded on 4/12 and included some station reports from Finland in which the minus sign may have been dropped.


That darned missing M, or an extra space, or even writing “-” when you mean “M” (which is counterintuitive to basic math training), all appear to be factors in the human error contributing to data errors in our global surface temperature database. To determine just how much of a problem this is, a comprehensive bottom-up review of all the data, from source to product, is needed. This needs to start with NOAA/NCDC, as they are ultimately responsible for data quality control.
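A first-pass check would not even be hard to build. Here is a minimal sketch (my own illustration, not NOAA’s or GISS’s actual procedure) that flags hour-to-hour jumps of the size a dropped “M” produces:

```python
def flag_sign_flip_suspects(temps_c, max_jump=15.0):
    """Return indices of hourly readings that jump implausibly from
    the previous reading -- the signature a dropped 'M' leaves in a
    cold-climate record. Flagged values would then go to a human
    (or an automated sign-flip test) for review."""
    suspects = []
    for i in range(1, len(temps_c)):
        if abs(temps_c[i] - temps_c[i - 1]) > max_jump:
            suspects.append(i)
    return suspects

# A Mirnvy-style sequence: -21, -21 C, then sign-flipped hours.
print(flag_sign_flip_suspects([-21, -21, 21, 20, -20, 20]))  # [2, 4, 5]
```

The 15°C-per-hour threshold is an assumption on my part; real QC would tune it per station and season.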

It has been said that “humans cause global warming”. I think a more accurate statement would be “human error causes global warming”.

Note: In this post I’ve demonstrated the errors. In a later post, I hope to do some data analysis with the numbers provided to see how much effect these errors actually have. Of course anyone who wants to do this is welcome to leave links to graphics and tables. -Anthony

See these weather underground sites:
Yakutsk (Jakutsk)
and this one is particularly interesting, because it shows a clear error in the 12:30PM and 1PM METAR reports for that day:
12:00 PM -5.8 °F -9.4 °F 84% 30.33 in – SW 11.2 mph 17.9 mph N/A   Clear
UERR 010300Z 22005G08MPS CAVOK M21/M23 Q1027 NOSIG RMK QFE738 24450245
12:30 PM 69.8 °F -9.4 °F 4% 30.30 in 6.2 miles SW 11.2 mph 17.9 mph N/A   Unknown

UERR 010330Z 22005G08MPS 9999 -SN 21/M23 Q1026 NOSIG RMK QFE738 24450245

1:00 PM 68.0 °F -7.6 °F 5% 30.27 in 6.2 miles SW 11.2 mph 17.9 mph N/A   Unknown

UERR 010400Z 22005G08MPS 9999 -SN SCT100 OVC200 20/M22 Q1025 NOSIG RMK QFE737 24450245

1:30 PM -4.0 °F -7.6 °F 85% 30.24 in 2.5 miles SSW 11.2 mph 17.9 mph N/A Snow Light Snow

UERR 010430Z 21005G08MPS 4000 -SN SCT100 OVC200 M20/M22 Q1024 NOSIG RMK QFE737 24450245

Note the missing “M” on the 12:30 and 1PM reports: 21/M23
Put that in this online METAR decoder:
and you get 70F for 12:30PM and 68F for 1PM: man-made global warming, thanks to a hand-coded teletype report.
Pilots will know it’s wrong and disregard it; they mostly worry about baro pressure/altimeter and winds. Temps on the ground are never as extreme as what aircraft experience in the air.
No incentive to correct this… it’s not a big deal to aviation.
Here is what I think is going on:
1) Russian METAR is hand-coded at airports and thus prone to error. Put in a minus sign instead of an M, or code M+space+temp/dp, and you get the same result. Example:
UERR 010330Z 22005G08MPS 9999 -SN -21/M23 Q1026 NOSIG RMK QFE738 24450245 gives 70F in the online decoder above; later systems may strip the minus sign as an invalid character in the report, which is why we may not see it, or they just forget to add the “M”. Either way, all we need is one of these per month.
2) Or character formatting (western/Cyrillic) may contribute to missing or badly formatted characters in automated decoding.
Either way, there’s our spurious Russian warming, and why we seem to have a permanent red spot there.
Here’s what one off reading will do. The city of Yakutsk, one of the coldest cities on earth, reported a high of 94˚F on December 14th with a low of -35˚F. The average for the day, 30˚F, was some 67˚F above normal, pushing the anomaly for the month of December from 3.6˚F above normal to 5.9˚F above normal… quite a shift!
It also happens in Antarctica:
Dome C station
Nico University of Wisconsin
Amundsen-Scott Station
Here is the list of sites with GHCN WMO numbers per this list:
WMO number    Station Name            Day/Month/Year of error
24266         Verhojansk              13/11/2009
23955         ALEKSANDROVSKOE         11/12/2009, also 16/01/2009 and 02/02/2007

Ken Smith

Fascinating post. Regarding the UHI effect, I have a question I hope someone can address.
Is it plausible that UHI might be increased by a _local_ greenhouse effect? What I mean is this: if carbon dioxide and other greenhouse gases are trapping heat, might they be trapping it to a greater extent at the locations where large amounts of those gases are emitted? Like urban areas, and airports particularly?
Or does the greenhouse effect (whatever one may reckon its extent to be) only operate at upper atmosphere levels and over massive regional or global scales?


Notice how the errors as shown ALWAYS result in higher temps.

Tom in Florida

It’s GIGO all over again!


Two wrongs (bad data plus bad models) did not make a right!

Thanks, Anthony. Great post (with lots of graphics). This was enlightening.
REPLY: Coming from Mr. “Gobs of Graphics” himself, that is quite a compliment. Thanks -A

I noticed in February in Colorado that NCDC/NOAA was consistently showing above average temperatures along the Front Range – while Accuweather stats, the unusually prolonged frozen lakes, and practically everyone I talked to indicated that it was a very cold February. Accuweather showed it 2-4 degrees below normal.

A cultural/social effect may also be at work. When I was the US Representative to the Soviet Union on the Protection of the Environment [included weather forecasting] back in 1976 I noticed, both from verbal interchange and from the evening news on TV that in the middle of winter when the temperature was typically -30C, the Russians never mentioned the minus sign; they would simply state “it’s 30 today”.

Anthony, thanks for your hard work on this! Excellent analysis, I’ve seen such errors of omission/coding in epidemiological studies of pathogen counts with similar results. Garbage in, garbage out!
With the powerful neural network and AI tools available, you’d think that someone who was serious about the integrity of their data would subject it to some level of peer review by statistical process control. I don’t think it would be hard at all.
And yet, the AGW train keeps on rolling down the tracks, with a Senate cap & trade/energy bill in the works. It will be interesting.


That’s right,Anthony,accusations and extrapolation first,then ‘some data analysis’ later. Better pass this information along to the world’s glaciers. How’s the US surface stations project coming along?
REPLY: Nick, dial down your anger, then look at the hard data errors presented. Glaciers are mostly proxies for precipitation, so they don’t really care anyway. Surface project is coming along fine, a paper is being prepared. Thanks for asking. Of course if you just want to ignore what is presented, and rant, I suggest ClimateProgress, where ranting is an art form. -A


wonderful work.

Keith Minto

Excellent report, Anthony.
As Ric Werme said earlier, temperatures in the form of degrees Kelvin would solve the problem of negative temperatures.

Paul Vaughan

I have had the experience of finding a weather station with all negative (degrees Celsius – so below freezing) temperatures SET TO ZERO deg C.
When I consulted officials, they admitted serious data quality issues for the main sites (the ones most valuable to me) at which I was looking.
They also thanked me for pointing out 200cm (over 6 feet) of snow that came and went in a single day. They guessed it was probably 2cm (under 1 inch) and changed it to that.
They didn’t seem too surprised to hear that I also noticed amounts like 400cm to 500cm of snow appearing, disappearing, appearing again, then disappearing, etc. The response was something like, “Yeah, the quality was so bad at that site – we almost shut it down.”
I encouraged them to keep the sites running since the sites were essential to my research. I’d much rather deal with bad data than no data.

Gary from Chicagoland

Wow, outstanding work Anthony! This shows how important raw climate data is for the proper scientific method to work. You just showed us how a few hours of hard work from you uncovered a possible big flaw in the way climate data is recorded, and how important it is for this data to be corrected. Once the data is verified, then it should be compared to the theory, and if it is in conflict, then the theory gets modified not the valid data. Climategate showed how the correction clause was not achieved as the valid data was truncated to “hide the decline”. We need more eyes to find these temperature errors, and more open minded climate scientists and politicians to accept a modified global warming theory to better match the measured data!

Leif Svalgaard (19:15:23) :
Dr. Leif:
Thank you for that. I was immediately reminded of the work of Benjamin Lee Whorf, an “amateur” anthropological linguistics enthusiast whose day job was Fire Safety Engineer for the Hartford Insurance Company. He noted that an awful lot of industrial fires started in “empty rooms”…. specifically, the “empty drum room”. What better place to sneak a smoke than in the “empty drum room”…. no danger there, right?
Gotta wonder how many foreign visitors who thought they knew Russian froze to death when they dressed for unexpectedly balmy weather.

Neel Rodgers

While you have found a number of sites that have this error, I think there may be a slight exaggeration of how easy it is to do this. Does it create a significant jump in temps? Yes. But that is why there should be some quality control of the data at the site. Every day, many of these places send 24-48 observations (depending on hourly versus half-hourly reporting). And there is a huge number of these stations all over the world. It is kind of like saying “An airplane crashed, so flying isn’t safe.”
As someone who deals with METAR day in and day out, I can say these errors aren’t quite as rampant as this leads one to believe.
Also, when CORs (corrections to a wrong observation) are sent, it effectively overwrites the incorrect observation in the databases.
And lastly, could this be more of a problem of relying too much on computers as opposed to human interaction? In the old code, a space could just as easily be placed in the temp, or the minus sign could be forgotten. Human observers would catch and fix this, computers would not. And thus we would be in the same position we are now.

Sam Hall

“The data shown between 4/13 and 4/15 were based on data downloaded on 4/12 and included some station reports from Finland in which the minus sign may have been dropped.”
How do you download data on the day before it happens?

John Blake

What would it take to reinstate the old system, or edit METAR to minimize error-prone transcriptions plus render the reporting exercise intelligible without abstruse bureaucratic nomenclature?
Really shouldn’t be that complicated to produce a halfway decent reprise of basic weather conditions, transcribed and reported in a format no more complex than various financial listings. Try omitting minus signs in trading data, and watch regulatory authorities go stark bonkers.

Great job, you covered nearly all the nuances I’d be likely to think of.
This is sort of the ideal error – it’s more likely to make a temperature read too high, it’s more likely to affect recent temperatures than past ones, and being a warming error is less likely to attract the attention of people expecting to see warming.
If I had time, it would be fun to play with software to look for the problem in the records, even your +/- 5 error in temperate latitudes could be identified by looking at the time, sky cover, wind direction change, and even station history.


It can only lead to warmer temperature for the simple reason that it’s much much much likelier to forget/mistype an M than to add one where there isn’t supposed to be one !

Terry Jackson

Nice work. The motto is “Garbage in Gospel out”.

R. Gates

Excellent post. I’m sure (in fact, quite sure) that there will be many folks looking over your investigation very closely in the next few days…and of course, the bottom line of all this will be…
What difference or effect will this all make in the actual global temperature anomalies?
This was a great piece of work, and seriously you should at the very least be getting a big fat pay check from several agencies for doing such a thorough job of independent quality control work…
REPLY: Thank you sincerely. There’s always the tip jar 😉 – Anthony

Steve Koch

It would not be difficult to program a comparison of satellite temps vs surface sensor temps. When they are sufficiently different for a particular location/time, it can be flagged by the program. Probably it would be possible to run historic data through this comparison program to find past mistakes.
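The comparison Steve describes could be sketched in a few lines. This is only a toy illustration under the assumption that co-located surface and satellite series exist; the function name, threshold, and sample values are mine:

```python
def flag_divergent(surface, satellite, threshold=10.0):
    """Return (index, surface, satellite) tuples where the surface
    reading diverges from the co-located satellite estimate by more
    than `threshold` degrees C -- a crude screen for sign-flip errors."""
    return [
        (i, s, t) for i, (s, t) in enumerate(zip(surface, satellite))
        if abs(s - t) > threshold
    ]

surface_c   = [-21.0, -20.5, 21.0, -19.8]   # third value lost its minus sign
satellite_c = [-20.0, -21.0, -20.5, -20.0]
print(flag_divergent(surface_c, satellite_c))  # [(2, 21.0, -20.5)]
```

A sign-flipped reading diverges from the satellite estimate by roughly twice its magnitude, so even a generous threshold catches it while leaving ordinary instrument scatter alone.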
Given the mess that is surface data, the papers based on this data should be rewritten based on satellite temps.

Keith Minto (19:25:30) :

Excellent report, Anthony.
As Ric Werme said earlier, temperatures in the form of degrees Kelvin would solve the problem of negative temperatures.

Careful – that was a bit of a rant. (And an opportunity to say “Kelvins” instead of “degrees Kelvin” 🙂 .) For data in something like a METAR, expecting one of us Fahrenheit drooling Americans to accurately report in Kelvins would likely lead to all sorts of errors if any transcription is involved. Worse, it wouldn’t be immediately apparent, but it might have a better chance of being caught eventually, as the errors could be quite impressive.

Impressive. You should write a book…


The map was online between 4/13 and 4/15, showing the March average using data available on April 12th.
If only high and low temperatures are kept for the day, then averaged to get the daily temperature, then averaged with the 30 other days of the month to get the monthly temperature, the impact can be quite big in cold regions.
Assume average temperature is -15 all month.
Assume just one mistake per month.
You get a monthly average of -14.5 instead of -15.
The thing is, to get the magnitude needed to reproduce Finland’s error (where temperatures are near zero, so each sign flip has less impact), you would basically need the sign wrong almost all month.
It would be interesting to check if they have a new employee feeding in the data…or maybe an intern… someone who wrote in minus signs all the time instead of M…. who knows ?
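The averaging arithmetic above can be sketched in a few lines (a toy illustration, assuming the daily mean is the average of one high and one low reading):

```python
# A month of steady -15 C: daily mean = (high + low) / 2,
# monthly mean = average of the daily means.
days = 30
true_high, true_low = -15.0, -15.0

daily_means = [(true_high + true_low) / 2] * days
print(sum(daily_means) / days)        # -15.0

# One dropped minus sign flips a single day's high from -15 to +15,
# so that day's mean becomes (+15 - 15) / 2 = 0 C.
corrupted = daily_means[:]
corrupted[0] = (15.0 + true_low) / 2
print(sum(corrupted) / days)          # -14.5
```

One bad character per month thus adds half a degree to the monthly mean of a -15 °C station, which is on the order of the entire claimed warming trend.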

This is an artifact of how I/O is normally done in Fortran. Many scientists program in Fortran because that is how they have always done it. Older versions of Fortran require fixed positions in data files, which allows for fast I/O but makes parsing prone to errors like Anthony has uncovered.
But who cares how fast file I/O occurs? The science world is locked into a poor programming paradigm which affects everyone’s productivity and accuracy.

Wow. Who knew? And if I read your article right, since the records are discarded, you can’t go back year by year to correct them?
Reminded me of all the data corruption errors we used to get in the 80’s and 90’s converting data between different cpu architectures because people didn’t understand byte ordering differences (big endian processors vs little endian processors). Now I am certain when the climate centres converted their computer systems over time they would have caught that one. Certainly someone would have noticed 45 being changed to 54? Right? They caught that, right?
REPLY: I don’t know if NOAA archives all original METAR reports. If they don’t, they are in a pickle then. Maybe the FAA does. I don’t know that these things have been caught, because climate folks deal with monthly data, not hourly or daily. So if the data is automatically averaged as it comes in, chances are they’ll look at a month’s worth of data from a station and say “seems warm there”. If you are expecting it, like with polar amplification, then it may go unnoticed.
GISS missed an apparently identical error in Finland, and that probably would not have been caught if it were not a country that had a separate met service with a publication saying the country had been below normal in March 2009- Anthony


Neel Rodgers (19:34:22) :
While you have found a number of sites that have this error, I think there may be a slight exaggeration of how easy it is to do this. Does it create a significant jump in temps? Yes. But that is why there should be some quality control of the data at the site. Every day, many of these places send 24-48 observations (depending on hourly versus half-hourly reporting).

And yet, the errors persist…


Nick (19:21:26) :
That’s right,Anthony,accusations and extrapolation first,then ’some data analysis’ later. Better pass this information along to the world’s glaciers. How’s the US surface stations project coming along?
If people can’t understand what Anthony has written here in context then they aren’t going to understand anything anyway. This is quite interesting and I look forward to the real data crunching to find out what effect this is actually having. What I do love is how you make the extrapolation that glaciers are melting in your tongue in cheek manner after your disdain to the ‘accusations and extrapolation first’ of Anthony. Do we get to see your number crunching ever? I believe not.
We have however, seen the Arctic reach very close to normal ice levels with old ice being the major contributor. As far as deviation goes, it certainly deviated in the right direction, towards a large upswing recovering back to ~2003 levels. If there is anything you would like to actually contribute in the way of information about glaciers melting we’ll all be here.. waiting.

Kirk W. Hanneman

Anthony, this is both fascinating and deeply disturbing. As you say, there needs to be some serious quality control done before any other analysis can be done with such a dataset, much less conclusions drawn. But the saving grace is that at least we have billions budgeted for climate research that can be used to fix these sorts of egregious and pervasive errors, right? We need to know urgently if the high temperatures consistently reported in places like Siberia are correct, or if they are significantly influenced by these mistakes.

Gerald Machnee

I think there is also an additional problem. The USA is using Fahrenheit while most countries are using Celsius.
A number of years ago the reports were quality controlled by humans. When they retired they were not replaced. Now we see what the computers are doing.


Judith Curry, in comments at Bishop Hill and at Pielke Fils, is saying that the whole temperature and paleo records need to be reworked. I think she’s tired of the foolishness, and I think there are a lot of other people who are tired, too.

Rob Dawg

Lord Kelvin smiles.


Nick (19:21:26) :
That’s right,Anthony,accusations and extrapolation first,then ’some data analysis’ later. Better pass this information along to the world’s glaciers. How’s the US surface stations project coming along?
Bravo! Spoken like a true Post Normal Scientist, Nick. You need to ask for a raise.


Very good find!
I am a professional (bio)statistician of more than 50 years’ experience. I was worried from the beginning of my interest in so-called AGW about the way data were being handled and discussed. I have been a daily (if not hourly) visitor to your site for about two and a half years. I feel in my gut that you may have found a major source of the confusion which arises when one tries to follow the arguments for increasing temperatures. This may be true not only for how the supposed global “rise” in temperature is occurring, but also for how the arguments have never reached firm conclusions, even among the AGW supporters. If the data are subject to (possibly) random influences which have a major effect, then the foundation will never be solid and one can never feel truly grounded in the conclusions. As an aside, in my early career I was part of a study to determine the sources of error in data that had been computerized. We found that more than 90% of data errors arose during transcription of data from one form to another (e.g., writing down a reading from an instrument, copying data from one paper form to another, recoding data, etc.). Surprisingly few errors were made during keypunch data entry. I am eager to read the responses to this post. Thanks for this information.


Leaves little doubt about starting over.
Now we know how bad it can get when you have grid points depending on a single station, and that station has METAR data errors.
I have a pretty good idea of how a single day can mess up a month, as I do a lot of station graphs from daily data.


Once the Finland error was corrected, what was the effect on the average monthly temp for Finland?

REPLY: I don’t know if NOAA archives all original METAR reports. If they don’t, they are in a pickle then. Maybe the FAA does. I don’t know that these things have been caught, because climate folks deal with monthly data, not hourly or daily.>>
If that error crept in undetected for so long I would have to believe there are others, and if the original records no longer exist, then there is no way to know. I was being a bit facetious on the big endian/little endian thing, but I would have to believe they caught that one; the total of the errors introduced would be so enormous it would be unlikely to go undetected. I think. I hope.
But there are other issues and given the amount of lost data, undocumented code, even their backup systems don’t sound like they were properly maintained. Remember when computer viruses first started appearing? Most of them were instantly noticeable so people had to deal with them. There was one very nasty little monster named “Ripper”. It would sit on your computer and do mostly nothing. Every 1024th write to the hard drive it would swap two bytes, and then do nothing again. Word documents would suddenly have spelling errors. Spread sheet formulas would develop errors. But table data? No one would notice for a long long time. I cleaned up a mess at a research facility once where the researcher had 10 years of clinical trial data corrupted, and no way to recover it because there was no backup system and the paper records had been trashed as the data was entered.
Given the gaping hole you just exposed, and the clear lack of professional standards when it came to not just writing and documenting code, but managing the integrity of the data itself, I can’t help thinking that it is all junk and we need to start with the paper records themselves. Again.

Michael Larkin

Terrific post, Anthony. Every new wrinkle we hear about just makes it more and more obvious that temperature data is suspect.
Terry Jackson (19:54:00) :
The motto is “Garbage in Gospel out”.

Jim Arndt

Here is global warming for you. I think the ice sheets will melt soon. This is for real.
Vostok, Antarctica (106.87 °E)
Tuesday Night
Overcast. Low: -103 °F . Wind West 17 mph . Windchill: -148 °F .
Overcast. High: -81 °F . Wind West 17 mph . Windchill: -139 °F .
Wednesday Night
Overcast. Low: -110 °F . Wind SW 17 mph . Windchill: -155 °F .
Partly Cloudy. High: -77 °F . Wind WSW 20 mph . Windchill: -158 °F .
Thursday Night
Overcast. Low: -104 °F . Wind SSW 6 mph . Windchill: -133 °F .


There may be a tendency for recorders to forget to place minus signs in front of numbers. I believe this is called a systematic error. It could cause an upward bias in recorded levels but should not affect trend or change in levels unless there was an increasing tendency to forget to use the minus sign.
Switching from requiring a minus sign to requiring an “M” in front of the value to indicate it is negative could cause a one-time step up in a trend, if people are more likely to forget an M than a minus sign. But after that it should not affect the trend, unless again there was an increasing tendency to forget to use the M.
Perhaps the answer to addressing this systematic error is simply to require a sign, plus or minus, in front of each recorded value.

Mike McMillan

Sam Hall (19:35:58) :
How do you download data on the day before it happens?

Use climate models.
This seems like a fertile area for research, for those of us less technologically inclined. Just download a station’s data, plot it somehow, and look for spikes.

Tim F

I have seen this effect myself. Some years ago when I was in the AF we were flying through the Med in Jan or Feb and we asked the weather briefer to pull up the METAR for home–Grand Forks, ND. He read us the observation and when we asked how cold it was he said that “it looks like the mean temperature is 20 degrees.” We had to tell him that the letter M meant a negative temp–part of daily life in ND, not so much in Crete.
Good piece of detective work Mr. Watts.


Has anybody noticed that in mid 1966 the US started rounding negative temperatures hotter? From the Wikipedia article on rounding:

In a guideline issued in mid-1966,[8] the U.S. Office of the Federal Coordinator for Meteorology determined that weather data should be rounded to the nearest round number, with the “round half up” tie-breaking rule. For example, 1.5 rounded to integer should become 2, and −1.5 should become −1. Prior to that date, the tie-breaking rule was “round half away from zero”.
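The difference between the two tie-breaking rules is easy to demonstrate (a toy sketch; the function names are mine). Note that for negative half-degree readings, the post-1966 rule rounds toward the warm side:

```python
import math

def round_half_up(x):
    """Post-1966 rule ('round half up'): 1.5 -> 2, but -1.5 -> -1."""
    return math.floor(x + 0.5)

def round_half_away_from_zero(x):
    """Pre-1966 rule ('round half away from zero'): 1.5 -> 2, -1.5 -> -2."""
    return math.copysign(math.floor(abs(x) + 0.5), x)

for t in (1.5, -1.5, -2.5):
    print(t, round_half_up(t), round_half_away_from_zero(t))
```

For positive readings the two rules agree; every negative half-degree tie, however, is now recorded half a degree warmer than it would have been before the change.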

Michael Larkin

PS – Anthony: $50 on its way to you via the tip jar – wish it could be more. Hope it’s a useful contribution to your upcoming trip.


An outstanding piece of scientific forensic investigation, Anthony. If the archived reports are available, it won’t be difficult to develop some code to correct the record.
Looking at the March GISS map, it’s obvious NASA doesn’t pay any attention to the material it’s producing. The Finland data is obviously wrong.
More importantly, this post just highlights the appalling state of our temperature record. Our policy makers would do well to take note.

JRR Canada

I agree with Kirk. The deeper we get to look, the sadder this science seems. Is it because these people are all government employees? I mean, the errors are so pathetic it’s like they just do not care: no repercussions for poor work, no consequences for mistakes. Let’s save billions; if their work cannot be trusted, stop funding.


Great work.
Reported thermometer readings matter. Instrument bias and uncertainty matter. Missing minus signs matter. They matter because we are attempting to measure a temperature rise allegedly caused by man-made CO2 and said temperature rise is conjectured to threaten to destroy the world. In this debate, thermometer readings matter.
The conjecture that world-wide AGW is from man-made CO2 is weak due to a paucity of observable evidence. Melting ice caps? What melting ice caps? Rising sea levels? Really? Disappearing glaciers? So what? They cover only a fraction of Earth’s surface and though they shrink and grow, are hardly disappearing (much to the embarrassment of the IPCC). Ifs, coulds, maybes. On and on.
Isn’t this all about (global) catastrophic warming? If it’s really getting catastrophically hotter here, show me the thermometer readings. They will show distinct, unmistakable catastrophic heating if the readings are any good; and if it is really getting catastrophically hotter, the thermometers will show it’s getting catastrophically hotter. Period.
What we actually see in surface thermometer temperature records is inconsistency, instrument bias, human error, and this engineer has had it up to his keester with wishful thinking, madness of crowds, with frauds and charlatans showing up to the debate with a suitcase full of political baggage, advocacy, and empty rhetoric. All we need in this debate is the temperature readings.
Kindly, Anthony, you and others have shown me the thermometers. I’m sure the analysis to follow will be interesting, but to me not as interesting as this post. I’ve seen enough to know that any global catastrophic warming from man-made CO2 is either being smothered by negative feedback within the climate system itself, and/or is so minute as to be within the measurement error and of little consequence in comparison to world destruction from AGW.
What is really fascinating to me is how a weak conjecture like AGW has grown into a mass delusion driving drastic political policy change despite a complete lack of observable results/catastrophes.


I’m having the same problem, sort of, using the drifting buoys in the Arctic. Missing minus signs are screwing up the daily averages. All of them; none seem to be exempt.
As you can see, the temp goes from -5 to 5, and then back to -5. You can also export this page into Excel easily.
Missing minus signs- been there, doing that.

Cement a friend

Great post, Anthony. Please get it published in a peer-reviewed paper so the AGW crowd cannot dispute it.
Ken Smith (18:56:17): The answer about a local effect from CO2 is no. There is plenty of evidence about very high local CO2 levels. I have personally measured over 1000 ppm around a furnace exhaust stack. No one has been able to demonstrate a change to local temperatures resulting from high CO2. W. Kreutz (1941) made measurements of CO2 at four heights in association with climate data over 1.5 years. The data in his paper show that solar radiation leads local air temperature, which in turn leads CO2 levels. I have noted some criticisms of Kreutz, but it would appear that these critics a) did not read the paper, b) cannot understand (technical) German, or c) deliberately ignored parts of the paper and his intent. The paper shows that he, from time to time, during the 1.5 year period made CO2 readings at 1 or 2 hr intervals, and his figure 1 (24 Sep 1940 to 28 Sep 1940) shows an example of a daily measurement. In this example lower temperatures are associated with higher CO2.
However, clouds obviously have a local effect. Humidity and wind chill have a personal temperature feel effect. Convective heat transfer is an important effect for local near surface air temperatures. This is particularly noticeable near the sea. We have a thermometer in the car (under the bonnet near the front) giving outside air temperature.
We noted on Friday night around 6PM (at sunset) the temperature where we live (bush and 10 km from the coast) at 22C (slight wind from west) while the temperature at the coast opposite a well known beach was 26C (slight wind from east over the sea)
Besides the effect of wrong data inclusion shown above and UHI effects, it has been pointed out on this site that the GHCN database also includes incorrect data splicing from a number of sites now being included as airports, e.g. the Darwin data. Most airports did not exist before 1942, yet they have spliced data. Then many airports have been substantially upgraded in the last 20 years, causing UHI effects and maybe splicing. This is shown by Chiefio’s dT/dt analysis method.
Anthony, I welcome your blog and regularly look at this site when I look at the internet. Great posts from Willis, Dr Roy Spencer, and many others. Thanks for your efforts