Watts Up With Nuuk?

NEW UPDATES POSTED BELOW

As regular readers know, I have more photographs and charts of weather stations on my computer than I have pictures of my family. A sad commentary to be sure, but necessary for what I do here.

Steve Goddard points out this NASA GISS graph of the Annual Mean Temperature data at Godthab Nuuk Lufthavn (Nuuk Airport) in Greenland. It has an odd discontinuity:

Source data is here

The interesting thing about that end discontinuity is that it is an artifact of incomplete data. In the link to source data above, GISS provides the Annual Mean Temperature (metANN) in the data before the year 2010 is even complete:

Yet, GISS plots it here and displays it to the public anyway. You can’t plot an annual value before the year is finished. This is flat wrong.

But even more interesting is what you get when you plot and compare the GISS “raw” and “homogenized” data sets for Nuuk; my plot is below:

Looking at the data from 1900 to 2008, where there are no missing years of data, we see no trend whatsoever. When we plot the homogenized data, we see a positive artificial trend of 0.74°C from 1900 to 2007, about 0.7°C per century.

When you look at the GISS-plotted map of trends with 250 km smoothing, using that homogenized data and the GISS standard 1951-1980 baseline, you can see Nuuk is assigned an orange block of 0.5 to 1°C trend.

Source for map here

So it seems clear that, at least for Nuuk, Greenland, the GISS-assigned temperature trend is largely artificial. Given that the station is at an airport, and that Nuuk has gone through steady growth, the adjustment applied by GISS is, in my opinion, inverted.

The Wikipedia entry for Nuuk states:

With 15,469 inhabitants as of 2010, Nuuk is the fastest-growing town in Greenland, with migrants from the smaller towns and settlements reinforcing the trend. Together with Tasiilaq, it is the only town in the Sermersooq municipality exhibiting stable growth patterns over the last two decades. The population increased by over a quarter relative to the 1990 levels, and by nearly 16 percent relative to the 2000 levels.

Nuuk population dynamics

Nuuk population growth dynamics in the last two decades. Source: Statistics Greenland

Instead of adjusting the past downwards, as we see GISS do with this station, the population increase suggests that if adjustments must be applied, they should logically cool the present. After all, with the addition of modern aviation and a growing population, both the expenditure of energy in the region and the alteration of natural surface cover increase.

The Nuuk airport is small but modern; here’s a view on approach:

Nuuk Airport, Aug 11, 2007, photo by Lars Perkins via Picasa web album

Closer views reveal what very well could be the Stevenson Screen of the GHCN weather station:

Nuuk Airport looking southwest. Image: Panoramio via Google Earth

Here’s another view:

Nuuk Airport looking northwest. Image: Panoramio via Google Earth

The Stevenson Screen appears to be elevated so that it does not get covered with snow, which of course is a big problem in places like this. I’m hoping readers can help crowdsource additional photos and/or verification of the weather station placement.

[UPDATE: Crowdsourcing worked; the weather station placement is confirmed. This is clearly a Stevenson Screen in the photo below:

Nuuk Airport, Stevenson Screen. Image from Webshots - click to enlarge

Thanks to WUWT reader “DD More” for finding this photo that definitively places the weather station. ]

Back to the data. One of the curiosities I noted in the GISS record here was that, in recent times, a fair number of months of data are missing.

I picked one month to look at, January 2008, at Weather Underground, to see if it had data. I was surprised to find just a short patch of data graphed around January 20th, 2008:

But even more surprising was that when I looked at the data for that period, all the other data for the month (wind speed, wind direction, and barometric pressure) were intact and plotted for the entire month:

I checked the next missing month on WU, Feb 2008, and noticed a similar curiosity: a speck of temperature and dew point data for one day:

But like January 2008, the other values for other sensors were intact and plotted for the whole month:

This is odd, especially for an airport where aviation safety is of prime importance. I just couldn’t imagine they’d leave a faulty sensor in place for two months.

When I switched the Weather Underground page to display days, rather than the month summary, I was surprised to find that there was apparently no faulty temperature sensor at all, and that the temperature data and METAR reports were fully intact. Here’s January 2nd, 2008 from Weather Underground, which showed up as having missing temperature in the monthly WU report for January, but as you can see there’s daily data:

But like we saw on the monthly presentation, the temperature data was not plotted for that day, but the other sensors were:

I did spot checks of other dates in January and February of 2008, and found the same thing: the daily METAR reports were there, but the data was not plotted on graphs in Weather Underground. The Nuuk data and plots for the next few months at Weather Underground have similar problems, as you can see here:

But they gradually get better. Strange. It acts like a sensor malfunction, but the METAR data is there for those months and seems reasonably correct.

Since WU makes these web page reports “on the fly” from stored METAR reports in a database, to me this implies some sort of data formatting problem that prevents the graph from plotting it. It also prevents the WU monthly summary from displaying the correct monthly high, low, and average temperatures. Clearly what they have for January 2008 is wrong, as I found many temperatures lower than the monthly minimum of 19°F they report for that month; for example, 8°F on January 17th, 2008.

So what’s going on here?

  • There’s no sensor failure.
  • We have intact hourly METAR reports (the World Meteorological Organization standard for reporting hourly weather data) for January and February 2008.
  • We have erroneous/incomplete presentations of monthly data on both Weather Underground and NASA GISS for the two months (January and February 2008) I examined.

What could be the cause?

WUWT readers may recall these stories, where I examined the impacts of failures of the METAR reporting system:

GISS & METAR – dial “M” for missing minus signs: it’s worse than we thought

Dial “M” for mangled – Wikipedia and Environment Canada caught with temperature data errors.

These had to do with missing “M’s” (for minus temperatures) in the coded reports, causing cold temperatures like -25°C to become warm temperatures of +25°C, which can really screw up a monthly average temperature with one single bad report.
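To make that failure mode concrete, here is a minimal sketch (my own illustration, not actual GISS or Weather Underground code) of how losing that single leading character flips the sign of a cold reading:

```python
# Minimal illustration of the missing-"M" failure mode (illustrative only).
# In METAR, the temperature group uses a leading "M" to mark below-zero values,
# so "M25" means -25C. If the "M" is dropped somewhere along the way, a naive
# decoder happily returns +25C.

def decode_temp(token):
    """Decode a single METAR temperature token such as 'M25' or '25' to degrees C."""
    if token.startswith("M"):      # "M" marks a sub-zero value
        return -int(token[1:])
    return int(token)

print(decode_temp("M25"))   # -25  -> correct cold reading
print(decode_temp("25"))    #  25  -> same observation with the "M" lost: now +25C
```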

Following my hunch that I’m seeing another variation of the same METAR coding problem, I decided to have a look at that patch of graphed data that appeared on WU on January 19th-20th 2008 to see what was different about it compared to the rest of the month.

I looked at the METAR data for formatting issues, and ran samples of the data from the times it plotted correctly on WU graphs, versus the times it did not. I ran the METAR reports through two different online METAR decoders:

http://www.wx-now.com/Weather/MetarDecode.aspx

http://heras-gilsanz.com/manuel/METAR-Decoder.html

Nothing stood out from the decoder tests I did. The only thing I can see is that some of the METAR reports seem to have extra characters, like /// or 9999. Take these two samples: the first one didn’t plot data on WU, but the second one, an hour later on January 19th, did:

METAR BGGH 191950Z VRB05G28KT 2000 -SN DRSN SCT014 BKN018 BKN024 M01/M04 Q0989

METAR BGGH 192050Z 10007KT 050V190 9999 SCT040 BKN053 BKN060 M00/M06 Q0988

I ran both of these (and many others from other days in Jan/Feb) through decoders, and they decoded correctly. However, that still leaves the question of why Weather Underground’s METAR decoder for graph plotting isn’t decoding them correctly for most of Jan/Feb 2008, and leaves the broader question of why GISS data is missing for these months too.
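For the curious, here is a rough sketch of how the temperature group in those two reports can be checked programmatically (my own illustration, not Weather Underground’s parser): split each report into space-delimited groups and find the temperature/dew point group by its shape rather than its position, so extra groups like 9999 or /// don’t matter.

```python
import re

# The two BGGH samples from above: the first did not plot on WU, the second did.
reports = [
    "METAR BGGH 191950Z VRB05G28KT 2000 -SN DRSN SCT014 BKN018 BKN024 M01/M04 Q0989",
    "METAR BGGH 192050Z 10007KT 050V190 9999 SCT040 BKN053 BKN060 M00/M06 Q0988",
]

# The temperature/dew point group looks like "M01/M04" or "12/08"; match it by shape.
TEMP_GROUP = re.compile(r"^(M?\d{2})/(M?\d{2})$")

def celsius(token):
    # A leading "M" marks a below-zero value.
    return -int(token[1:]) if token.startswith("M") else int(token)

for report in reports:
    groups = report.split()
    for g in groups:
        match = TEMP_GROUP.match(g)
        if match:
            print(groups[2], "temp:", celsius(match.group(1)),
                  "dew point:", celsius(match.group(2)))
            break

# Prints:
# 191950Z temp: -1 dew point: -4
# 192050Z temp: 0 dew point: -6
```

Both decode cleanly here, which is consistent with what the online decoders showed.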

Now here’s the really interesting part.

We have missing data in both Weather Underground and GISS for January and February of 2008. In the case of GISS, they use the CLIMAT reports (yes, no “e”) to gather GHCN data for inclusion into GISS and final collation into their data set for adjustment and public dissemination.

The last time I raised this issue with GISS, I was told that the METAR reports didn’t affect GISS at all because they never got numerically connected to the CLIMAT reports. I beg to differ this time.

Here’s where we can look up the CLIMAT reports, at OGIMET:

http://www.ogimet.com/gclimat.phtml.en

Here’s what the one for January 2008 looks like:

Note the Nuuk airport is not listed in January 2008

Here’s the decoded report for the same month, also missing Nuuk airport:

Here’s February 2008, also missing Nuuk, but now with another airport added, Mittarfik:

And finally March 2008, where Nuuk appears, highlighted in yellow:

The -8.1°C value of the CLIMAT report agrees with the Weather Underground report, and all the METAR reports are there for March, but the WU plotting program still can’t resolve the METAR data except on certain days.

I can’t say for certain why two months of CLIMAT data are missing from OGIMET, why the same two months are missing from GISS, or why Weather Underground can only graph a few hours of data in each of those months, but I have a pretty good idea of what might be going on. I think the WMO-created METAR reporting format itself is at fault.

What is METAR, you ask? Well, in my opinion, a government-invented mess.

When I was a private pilot (which I had to give up due to worsening hearing loss – tower controllers talk like auctioneers on the radio, and one day I got the active runway backwards and found myself head-on to traffic; I decided then I was a danger to myself and others), I learned to read SA reports from airports all over the country. SA reports were manually coded teletype reports sent hourly worldwide so that pilots could know what the weather was at their destination airports. They were also used by the NWS to plot synoptic weather maps. Some readers may remember the Alden Weatherfax maps hung up at FAA Flight Service Stations, which were filled with hundreds of plotted airport SA (surface aviation) reports.

The SA reports were easy to visually decode right off the teletype printout:

From page 115 of the book “Weather” By Paul E. Lehr, R. Will Burnett, Herbert S. Zim, Harry McNaught – click for source image

Note that in the example above, temperature and dewpoint are clearly delineated by slashes. Also, when a minus temperature occurred, such as -10 degrees Fahrenheit, it was reported as “-10”, not with an “M”.

The SA method originated with airmen and teletype machines in the 1920s and lasted well into the 1990s. But like anything these days, government stepped in and decided it could do it better. You can thank the United Nations, the French, and the World Meteorological Organization (WMO) for this one. SA reports were finally replaced by METAR in 1996, well into the computer age, even though METAR was designed in 1968.

From Wikipedia’s section on METAR

METAR reports typically come from airports or permanent weather observation stations. Reports are typically generated once an hour; if conditions change significantly, however, they can be updated in special reports called SPECIs. Some reports are encoded by automated airport weather stations located at airports, military bases, and other sites. Some locations still use augmented observations, which are recorded by digital sensors, encoded via software, and then reviewed by certified weather observers or forecasters prior to being transmitted. Observations may also be taken by trained observers or forecasters who manually observe and encode their observations prior to transmission.

History

The METAR format was introduced 1 January 1968 internationally and has been modified a number of times since. North American countries continued to use a Surface Aviation Observation (SAO) for current weather conditions until 1 June 1996, when this report was replaced with an approved variant of the METAR agreed upon in a 1989 Geneva agreement. The World Meteorological Organization‘s (WMO) publication No. 782 “Aerodrome Reports and Forecasts” contains the base METAR code as adopted by the WMO member countries.[1]

Naming

The name METAR is commonly believed to have its origins in the French phrase message d’observation météorologique pour l’aviation régulière (“Aviation routine weather observation message” or “report”) and would therefore be a contraction of MÉTéorologique Aviation Régulière. The United States Federal Aviation Administration (FAA) lays down the definition in its publication the Aeronautical Information Manual as aviation routine weather report[2] while the international authority for the code form, the WMO, holds the definition to be aerodrome routine meteorological report. The National Oceanic and Atmospheric Administration (part of the United States Department of Commerce) and the United Kingdom‘s Met Office both employ the definition used by the FAA. METAR is also known as Meteorological Terminal Aviation Routine Weather Report or Meteorological Aviation Report.

I’ve always thought METAR coding was a step backwards.

Here is what I think is happening

METAR was designed at a time when teletype machines ran newsrooms and airport control towers worldwide. At that time, the NOAA weatherwire used 5-bit Baudot code (rather than 8-bit ASCII), transmitting at 56.8 bits per second on a current-loop circuit. METAR was designed with one thing in mind: economy of data transmission.

The variable format worked great at reducing the number of characters sent over a teletype, but what was a strength for that technology is now a weakness for today’s.

The major weakness of METAR these days is the variable length and variable positioning of its fields. If data is missing, the sending operator can simply leave the field out altogether. Humans trained in METAR decoding can interpret this; computers, however, are only as good as the programming they are endowed with.

Characters that change position or type, fields that shift or that may appear in one transmission and not the next, combined with human errors in coding, can make for a complex decoding problem. The permutations are numerous enough that it takes some fairly intensive programming to handle all the possibilities.
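To make the shifting-field problem concrete, here is a toy example (mine, not any agency’s code) using the two BGGH reports shown earlier. The visibility group moves from the fifth position to the sixth because the second report carries an optional wind-variation group, so a decoder that assumes fixed positions silently reads the wrong field:

```python
# Toy example of why positional assumptions fail on METAR (illustrative only).
# Report 2 inserts the optional wind-variation group "050V190" after the wind group,
# pushing every later group one position to the right.

rpt1 = "METAR BGGH 191950Z VRB05G28KT 2000 -SN DRSN SCT014 BKN018 BKN024 M01/M04 Q0989".split()
rpt2 = "METAR BGGH 192050Z 10007KT 050V190 9999 SCT040 BKN053 BKN060 M00/M06 Q0988".split()

# A naive decoder that expects visibility to always be the fifth group:
print("rpt1 visibility?", rpt1[4])   # 2000     -> happens to be right
print("rpt2 visibility?", rpt2[4])   # 050V190  -> wrong: that is the wind-variation group

# A tolerant decoder has to recognize each group by its shape instead, which is
# exactly the kind of per-programmer pattern matching discussed above.
```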

Of course in 1968, missed characters or fields on a teletype paper report did little more than aggravate somebody trying to read it. In today’s technological world, METAR reports never make it to paper; they get transmitted from computer to computer. Decoding on one computer system can easily differ from another, mainly because METAR decoding tends to be a task left to an individual programmer rather than a well-tested, off-the-shelf, universally accepted format and software package. I’ve seen all sorts of METAR decoding programs, from ancient COBOL to FORTRAN, LISP, BASIC, Pascal, and C. Each program was written differently, each was written to a spec designed in 1968 for teletype transmission, and each may run on a different OS, in a different language, under a different programmer. Making all of that handle the nuances of human-coded reports, which can contain all sorts of variances and errors, with shifting field presence and placement, is a tall order.

That being said, NOAA made a free METAR decoder package available many years ago at this URL:

http://www.nws.noaa.gov/oso/metardcd.shtml

That has disappeared now, but a private company is distributing the same package here:

http://limulus.net/mdsplib/

This caveat on the limulus web page should be a red flag to any programmer:

Known Issues

  • Horrible function naming.
  • Will probably break your program.
  • Horrible variable naming.

I’ve never used either package, but I can say this: errors have a way of staying around for years. If there was an error in this code that originated at NWS, it may or may not be fixed in the various custom applications that are based on it.

Clearly Weather Underground has issues with some portion of its code that is supposed to plot METAR data. Coincidentally, many of the same months where their plotting routine fails are also months with missing CLIMAT reports.

And on this page at the National Center for Atmospheric Research (NCAR/UCAR), we have reports of the METAR package failing in odd ways, discarding reports:

>On Tue, 2 Mar 2004, David Larson wrote:
>>Robb,
>>I've been chasing down a problem that seems to cause perfectly good reports to be discarded by the perl metar decoder. There is a comment in the 2.4.4 decoder that reads "reports appended together wrongly", the code in this area takes the first line as the report to process, and discards the next line.

Here’s another:

On Fri, 12 Sep 2003, Unidata Support wrote:
> ------- Forwarded Message
> >To: support-decoders@xxxxxxxxxxxxxxxx
> >From: David Larson <davidl@xxxxxxxxxxxxxxxxxx>
> >Subject: perl metar decoder -- parsing visibility / altimeter wrong
> >Organization: UCAR/Unidata
> >Keywords: 200309122047.h8CKldLd027998
>
> The decoder seems to mistake the altimeter value for visibility in many non-US METARs.

So my point is this: METAR is fragile. Even at the world’s premier climate research center it seems to have problems handling worldwide reports, yet METAR appears to be the backbone for all airport-based temperature reports in the world, which get collated into GHCN for climate purposes. I think the people who handle these systems need to re-evaluate and test their METAR code.

Even better, let’s dump METAR in favor of a more modern format, one without the variable fields and variable field placement that force the code to decode not just the numbers and letters but also the configuration of the report itself.

In today’s high-speed data age, saving a few characters, a holdover from teletype days, is a wasted effort.
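As a purely hypothetical illustration (not a proposal from any agency, and the field names are my own invention), here is what the second BGGH observation above might look like in a fixed, self-describing structure, where nothing depends on field order or field presence:

```python
# Hypothetical structured equivalent of
# "METAR BGGH 192050Z 10007KT 050V190 9999 SCT040 BKN053 BKN060 M00/M06 Q0988".
# Every field is named, units are explicit, and missing data would be an explicit
# null rather than an absent or shifted group.
observation = {
    "station": "BGGH",
    "time_utc": "2008-01-19T20:50Z",
    "wind": {"direction_deg": 100, "speed_kt": 7,
             "variable_from_deg": 50, "variable_to_deg": 190},
    "visibility_m": 9999,
    "temperature_c": -0.0,   # "M00" = minus zero, i.e. just below freezing
    "dew_point_c": -6.0,
    "pressure_hpa": 988,
    "clouds": [
        {"cover": "SCT", "base_ft": 4000},
        {"cover": "BKN", "base_ft": 5300},
        {"cover": "BKN", "base_ft": 6000},
    ],
}
```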

The broader point is: our reporting system for climate data is a mess of entropy on so many levels.

UPDATE:

The missing data can be found at the homepage of DMI, the Danish Meteorological Institute: http://www.dmi.dk/dmi/index/gronland/vejrarkiv-gl.htm (“vælg by” means “choose city”). Choose Nuuk to get the monthly numbers back to January 2000.

Thanks for that. The January 2008 data is available, and is plotted below from DMI’s web page.

So now the question is, if the data is available, and a prestigious organization like DMI can decode it, plot it, and create a monthly average for it, why can’t NCDC’s Peterson (who is the curator of GHCN) decode it and present it? Why can’t Gavin at GISS do some QC to find and fix missing data?

Is the answer simply “they don’t care enough?” It sure seems so.

UPDATE: 10/4/10 Dr. Jeff Masters of Weather Underground weighs in. He’s fixed the problem on his end:

Hello Anthony, thanks for bringing this up; this was indeed an error in our METAR parsing, which was based on old NOAA code:

/* Organization: W/OSO242 – GRAPHICS AND DISPLAY SECTION */
/* Date: 14 Sep 1994 */
/* Language: C/370 */

What was happening was our graphing software was using an older version of this code, which apparently had a bug in its handling of temperatures below freezing. The graphs for all of our METAR stations were thus refusing to plot any temperatures below freezing. We were using a newer version of the code, which does not have the bug, to generate our hourly obs tables, which had the correct below freezing temperatures. Other organizations using the old buggy METAR decoding software may also be having problems correctly handling below freezing temperatures at METAR stations. This would not be the case for stations sending data using the synoptic code; our data plots for Nuuk using the WMO ID 04250 data, instead of the BGGH METAR data, were fine. In any case, we’ve fixed the graphing bug, thanks for the help.

REPLY: Hello Dr. Masters.

It is my pleasure to help, and thank you for responding here. One wonders if similar problems exist in parsing METAR for generation of CLIMAT reports. – Anthony



144 Comments
October 3, 2010 10:26 am

Rui Sousa says:
October 3, 2010 at 8:17 am
Is there any graphical representation of flow of climate information, available anywhere?
For example:
Airports -> METAR -> GHCN -> CRU
-> GISS
$$$$$$$$
all the info is out there if somebody wants to take the time to draw the chart

Benjamin
October 3, 2010 10:35 am

I have a question that I haven’t seen asked…
Many stations can’t be used to measure local, let alone global temperatures. But if they’re no good for that, then what _are_ they good for? I mean, really, it either works or it doesn’t, right? Okay, so who pays for these things to not work as advertised?

Dr T G Watkins
October 3, 2010 10:50 am

Lost in admiration for your skill and tenacity. Brilliant.
Why didn’t your ‘M’ post some time ago produce a flurry of activity and corrections?
No doubt Fred Pearce will pick this up and write another article for the Sunday Mail.

deepslope
October 3, 2010 10:56 am

Anthony, thank you for this extremely thorough investigation and analysis! So far, I have not been able to do more than an initial scan.
It caught my attention because I have had the fortune of spending quite a few days in Nuuk during the past eleven months. Toward more and more autonomy from Denmark, Greenland is enjoying the attention and tourist traffic that go along with the many facets of the “melting scares”. However, its more and more self-confident and pragmatic population is keenly looking toward new resource development, from offshore oil and gas (compare with Greenpeace action at the beginning of September…) to Rare Earth minerals and cyclically changing fisheries.
As an aside, there are decent temperature records from Southern Greenland, showing a steady decline in recent years. I will dig them up.

Doug in Seattle
October 3, 2010 10:58 am

Is the answer simply “they don’t care enough?”

What is more obvious, based on what I see with GISS, GHCN, HADCRU and other products of the “Climate Illuminati” is that they care very much, but only when it supports their cause.
A bureau of standards for climate data, run independently of the climate science community, is badly needed. But even this must be held to rigorous, transparent standards if it is to be trusted.

Dave
October 3, 2010 11:11 am

Note to moderators/writers:
I’m just nit-picking here, but sometimes it can be a little hard to tell where an article has shifted from commentary to quotes, or from update back to original text. It would clarify things for your readers if, for example, the updates in this article were closed off with ‘End of Update’ or some such, where they are inserted in the middle of the article. There was a post recently concerning NASA/GISS, where it was easy to miss the switch from Anthony’s commentary to the article being commented on.
My apologies if this is something you are aware of, but consider the effort in fixing to be better spent elsewhere (or not at all!) – that’s your decision to make, not mine – but in case you’re not aware of it, if no-one says anything, you’ll never know.
Please don’t take this post as anything except the most minor constructive criticism. Peace, love, home-cooking and good weather to you all 🙂

crosspatch
October 3, 2010 11:14 am

This problem of GISS not being able to obtain data while the data are available by other means has been going on for years (Steve McIntyre has made note of it in the past).
It seems to be a matter of priority. GISS’s priority seems to be getting the desired results. As long as the results they get are consistent with their stated hypothesis, they will see no reason to dig into it and might resist any such digging as a “waste of time and resources” since clearly the results are “correct” and match the hypothesis.
Missing values allow them to calculate things. The more missing values they have, the more they get to calculate. These calculations tend to bias things more toward their hypothetical result and so are not really seen as a “bad thing”.
What needs to happen is for someone to go through and plug in all those missing values from all those stations from around the world and pump that through their software and see what comes of it but they know nobody has the time or resources to do that.
There needs to be an open database of climate data that all have access to. The problem is that the people who sell climate data are not going to provide it for free. I was surprised at how much it cost to obtain temperature archives. People are charging a small fortune for those data.
This is the perfect niche opportunity for some well-respected institution of higher learning who has no dog in the climate hunt one way or the other (which would rule out the UC system and most Ivy League schools). They could assemble a climate data base and make it available for other institutions to use. They would have no interest in the outcome, their only interest would be in the quality of the data in the database.
Probably never happen.

JDN
October 3, 2010 11:16 am

Anthony:
The coding languages you listed are not as important as whether they used Regular Expressions to parse the data or whether they tried to code some sort of pattern recognition themselves. Modern languages all have access to a Regular Expression Library (sometimes with predictive pattern recognition), BUT old-time coders working from an old code base typically don’t use it. I think this is the problem you’re experiencing.
More modern languages like Perl or Python have this facility built in. You may want to contact http://www.perlmonks.org to see if they will look at the problem for you. Alternatively, I could do it myself if you want to supply me with exemplars of all the weird output formats and the “rules”. I’ve done a few projects like this, and it usually takes a few weeks of successive approximations to get it right. The fact that a whole team might be required for C or COBOL should not deter you.
REPLY: What we might have here is an opportunity to contribute a truly “robust” decoder that can be integrated into a wide variety of systems. Intriguing idea. – Anthony
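A minimal sketch of the approach JDN describes, purely illustrative and not tied to any existing decoder: a single pattern with named, optional groups absorbs the field-presence variations that trip up hand-rolled positional code.

```python
import re

# Illustrative only: one pattern with named, optional groups for a handful of fields.
# A real METAR has many more groups, but the principle is the same.
METAR_RE = re.compile(
    r"METAR\s+(?P<station>[A-Z]{4})\s+(?P<time>\d{6}Z)\s+"
    r"(?P<wind>(?:VRB|\d{3})\d{2}(?:G\d{2})?KT)\s+"
    r"(?:(?P<wind_var>\d{3}V\d{3})\s+)?"      # optional wind-variation group
    r"(?P<visibility>\d{4})\s+"
    r".*?"                                     # weather and cloud groups vary freely
    r"(?P<temp>M?\d{2})/(?P<dew>M?\d{2})\s+"
    r"Q(?P<qnh>\d{4})"
)

for rpt in [
    "METAR BGGH 191950Z VRB05G28KT 2000 -SN DRSN SCT014 BKN018 BKN024 M01/M04 Q0989",
    "METAR BGGH 192050Z 10007KT 050V190 9999 SCT040 BKN053 BKN060 M00/M06 Q0988",
]:
    m = METAR_RE.match(rpt)
    print(m.group("station"), m.group("visibility"), m.group("temp"), m.group("dew"))

# BGGH 2000 M01 M04
# BGGH 9999 M00 M06
```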

kuhnkat
October 3, 2010 11:22 am

Why fix the code when it will only be used to disprove your theory???
Remember GISS and CRUT do a GREAT job of infilling data that is missing. It works better the more missing data they have, at least for them!!

LearDog
October 3, 2010 11:22 am

Is it feasible to “crowd source” similar QC of data used for GISS and UCR analyses? The workflow you’ve used has proven robust – the challenge lies in correcting the data. Mosh and Chiefio are up to their elbows in analysis of the data, but is there a way to ‘cook book’ this for the rest of us a la Surfacestations? Or are all of these stations a unique mystery, a finite series of ‘one offs’?

October 3, 2010 11:47 am

kuhnkat says:
October 3, 2010 at 11:22 am
Why fix the code when it will only be used to disprove your theory???
Remember GISS and CRUT do a GREAT job of infilling data that is missing. It works better the more missing data they have, at least for them!!
$$$$$$$$$$$
“Infilling” has no noticeable effect on the answer. I use no infilling whatsoever in my approach and the answer is the same. The reason for this is mathematically obvious.
1 X 3 (X = missing)
Average them.
Now infill:
1, 2, 3
Average them.
The problem with “infilling” and extrapolation is NOT that it gives you the wrong answer. The problem is it doesn’t “carry forward” the uncertainty in the infilling operation. In the first example you have two data points. In the second example, you do NOT have three data points; you have two data points and the model (with error) of the third. So if you’re calculating a standard deviation, you don’t get to pretend (in the second example) that you magically have a third data point, although people do.
To recap: infilling and extrapolation (averaging) don’t generally bias your estimate; if you think they do, it’s very easy to prove. What infilling and extrapolation do is give you a false sense of certainty (smaller error bars) unless you account for it.
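A small numeric illustration of that point, using made-up values: infilling leaves the mean untouched, but a naive standard error shrinks because the infilled value gets counted as an independent observation.

```python
import statistics

observed = [1.0, 3.0]        # two real observations with a gap between them
infilled = [1.0, 2.0, 3.0]   # the same record with the gap interpolated

print(statistics.mean(observed))   # 2.0 -> infilling does not change the estimate
print(statistics.mean(infilled))   # 2.0

# But a naive standard error treats the infilled value as a third independent
# observation, shrinking the apparent uncertainty:
print(statistics.stdev(observed) / len(observed) ** 0.5)   # 1.0
print(statistics.stdev(infilled) / len(infilled) ** 0.5)   # ~0.58
```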

October 3, 2010 12:05 pm

LearDog says:
October 3, 2010 at 11:22 am
Is it feasible to “crowd source” similar QC of data used for GISS and UCR analyses? The workflow you’ve used has proven robust – the challenge lies in correcting the data. Mosh and Chiefio are up to their elbows in analysis of the data, but is there a way to ‘cook book’ this for the rest of us ala Surfacestaions? Or are all of these stations a unique mystery, a finite series of ‘one offs’?
########
Ah ya. There are a whole host of crowd-sourced projects that would be great. I know of one guy who is working all by himself on reconciling the airport locations. He doesn’t post or spend his time commenting; he just plugs away at a known problem. As I’ve said a bunch of times, there are many known problems, many problems that do not get attention because folks get distracted by the wrong issues. It’s getting a bit wearying stating it all over again. When it comes to problems that will have an impact on the final numbers (they have a chance too) the problems are:
1. the accuracy of metadata.
2. the uncertainty of adjustments.
3. reconciliation of data sources.
The problems are not:
1. using anomalies
2. dropping stations
3. UHI adjustments ( its the input to the adjustment, not the math itself)
4. extrapolation
Which means that the real problems ( lets call them issues ) are not with GISS, but rather with all the stuff they ingest. Now GISS makes a nice target, but technically, if you want to make a difference in understanding the numbers, focus on NOAA. And from my perspective, it’s a bit annoying to have so much energy focused on the wrong area.
So, my wishlist:
1. the data flow diagram everybody has been too lazy to draw… clickable with links
to the underlying documents and files. I’ll contribute all my R code to read files and docs. ( but I refuse to scrape web pages, drives me batty )
2. A mash up of the station inventories. Again, I’ll contribute the R code. to pull metadata.
3. On its own damn server.
I’m gunna refuse to write any html because it sucks and drives me crazy.

IanG
October 3, 2010 12:09 pm

I have a, probably naive, question regarding METAR. METAR reports an integer value for the temperature, so the actual temperature value will be rounded. Aircraft pilots need to know when the temperature is so hot that they need to take special action for takeoff. For safety reasons, would not the practice be for all temperature reports in METAR to be rounded up, e.g. 30.1 degrees C reported as 31 degrees C?
Regards,
Ian

October 3, 2010 12:12 pm

OH, and I refuse to write any regular expressions because they too drive me batty:
pat <- "\\.([^.]+)$"
just to grab the file extension! And that took way too long to figure out.
Anthony, there are a number of tools you can use to develop the regular expressions to decode the stream. In any case I can test it.

Bill Hunter
October 3, 2010 12:16 pm

Usually a careful audit of the data will reveal whether laziness and incompetence occurs in a biased fashion. People may tend to fix only what they want to fix.

Hank Hancock
October 3, 2010 12:56 pm

Before the days when programmers could assign the value NaN (Not a Number) to a numerical data type, it was common to assign an unreasonable value like 999 to indicate missing data. If we could go back and just change the sign of all missing or erred data entries, making them a negative value instead of positive, I wonder if there would be a significant lowering of temperature in the record? Following the many articles at WUWT, one thing stands out for me. It seems that when errors in data occur, it always results in hotter temperatures.

Matt Dawes
October 3, 2010 1:06 pm

Are you sure it is an artifact of incomplete data?
Or an artifice of incomplete data.
Regards

crosspatch
October 3, 2010 1:33 pm

“Which means that the real problems ( lets call them issues ) are not with GISS, but rather with all the stuff they ingest. ”
Exactly. Which is why I would like to see the “missing” values plugged in and the whole thing run through the mill again to see what pops out the other side. But there are some subjective pieces, such as deciding which way to adjust the values of a station for UHI.
But “liking to see” something and actually having the time and other resources needed to do it is another thing entirely.

October 3, 2010 1:44 pm

Nuuk summer temperatures are still significantly lower than in the 1930s. Here is a graph of July average temperature: http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJulJulI188020090900110MR43104250000x

October 3, 2010 1:53 pm

Dya think they are grappling with this at Exeter? Didn’t they ask for issues from outside the group? Is it too late? Maybe, Anthony, you should send them this critique. I believe, as Lucy does, that the mule work in reviewing the thousands of stations is going to have to devolve on volunteers outside the parish. Perhaps the Nuuk data problem should be sent directly to the Senate or Congressional committee that gave NCAR/NOAA their marching orders on the data quality issue so they will be forced to deal with it. Otherwise, we will soon have the fourth whitewash of the year. “Yeah we found a few issues but a statistical evaluation of the data set shows that the effect on the trends is insignificant.”

October 3, 2010 2:03 pm

When I saw this article I thought: OK, how about September 2009 then?
In September 2009 the coldest September temperature registered was measured on Greenland, -46.1 C – on Summit, the ice sheet:
http://www.klimadebat.dk/forum/vedhaeftninger/summit8799.jpg
http://www.klimadebat.dk/forum/vedhaeftninger/summitminus47b.jpg
http://www.dmi.dk/dmi/iskold_september_paa_toppen_af_groenland
This event is obviously likely to have been accompanied by a severely cold September 2009 on Greenland.
So did GISS manage to publish September 2009 data for Nuuk?
http://wattsupwiththat.files.wordpress.com/2010/10/nuuk_giss_datatable.png
Nope. The only month missing for 2009 is… September.

John
October 3, 2010 2:22 pm

Anthony congratulations on yet another outstanding piece of work.
You say “I don’t think there’s active manipulation going on here, only simple incompetence, magnified over years.”
Maybe that’s true, but you have to wonder at the fact that all the “incompetence” that gets uncovered always seems to be in the same direction…
When have we ever seen an “error” which showed LESS warming?

ShrNfr
October 3, 2010 3:14 pm

Very simple answer really, they took the thermometer inside so that they could have more light to read it 😉

Alvin
October 3, 2010 3:22 pm

If you check out WU’s blogs and such you may change your opinion of their agenda.

Louis Hissink
October 3, 2010 3:43 pm

I have to agree with Anthony – simple plain incompetence at work in government, and I’ve previously opined that the AGW hypothesis seems very much similar – the result of scientific incompetence, mostly due to the physical sciences being influenced by the imprecision of the social sciences and their consensus beliefs.
What prompts this view is the obvious sincerity many AGW advocates have concerning CO2 emissions – it’s the outcome, in American terms, of the Pelosis and Reids of this world doing science – and what Feynman called Cargo-Cult science: the science you get when you aren’t doing science.
And it’s the well-meant policies of the well-meaning that seem to cause most of humanity’s misery over the centuries.