Uncovered: decades-old government report showing climate data was inadequate and unfit for purpose

IPCC Knew from Start Climate Data Inadequate

Climate monitoring weather station at the University of Arizona, Tucson, measuring temperature in the university parking lot in front of the Atmospheric Science Building. The station has since been closed. 2007 photo by Warren Meyer.

Guest opinion: Dr. Tim Ball

In 1999, the National Research Council, the research arm of the National Academy of Sciences, released a study expressing concern about the accuracy of the data used in the debate over climate change. The study found

“Deficiencies in the accuracy, quality and continuity of the records,” that “place serious limitations on the confidence that can be placed in the research results.”

The people who reached these conclusions, and their affiliations at the time, were as follows.

———————-

  • THOMAS R. KARL (Chair), National Climatic Data Center, Asheville, North Carolina
  • ROBERT E. DICKINSON, University of Arizona, Tucson
  • MAURICE BLACKMON, National Center for Atmospheric Research, Boulder, Colorado
  • BERT BOLIN, University of Stockholm, Sweden
  • JEFF DOZIER, University of California, Santa Barbara
  • WILLIAM P. ELLIOTT, NOAA/Air Resources Laboratory, Silver Spring, Maryland
  • JAMES GIRAYTYS, Certified Consulting Meteorologist, Winchester, Virginia
  • RICHARD E. HALLGREN, American Meteorological Society, Washington, D.C.
  • JAMES E. HANSEN, NASA/Goddard Institute for Space Studies, New York, New York
  • SYDNEY LEVITUS, NOAA/National Oceanic Data Center, Silver Spring, Maryland
  • GORDON MCBEAN, Environment Canada, Downsview, Ontario
  • GERALD MEEHL, National Center for Atmospheric Research, Boulder, Colorado
  • PHILIP E. MERILEES, Naval Research Laboratory, Monterey, California
  • ROBERTA BALSTAD MILLER, CIESIN, Columbia University, Palisades, New York
  • ROBERT G. QUAYLE, NOAA/National Climatic Data Center, Asheville, North Carolina
  • S. ICHTIAQUE RASOOL, University of New Hampshire, Durham
  • STEVEN W. RUNNING, University of Montana, Missoula
  • EDWARD S. SARACHIK, University of Washington, Seattle
  • WILLIAM H. SCHLESINGER, Duke University, Durham, North Carolina
  • KARL E. TAYLOR, Lawrence Livermore National Laboratory, Livermore, California
  • ANNE M. THOMPSON, NASA/Goddard Space Flight Center, Greenbelt, Maryland
  • Ex Officio Members
  • W. LAWRENCE GATES, Lawrence Livermore National Laboratory, Livermore, California
  • DOUGLAS G. MARTINSON, Lamont-Doherty Earth Observatory, Columbia University, Palisades, New York
  • SOROOSH SOROOSHIAN, University of Arizona, Tucson
  • PETER J. WEBSTER, University of Colorado, Boulder

—————————

These are prominent names, many of them central to the Intergovernmental Panel on Climate Change (IPCC) and the promotion of its ideas. For example, Gordon McBean chaired the 1985 meeting in Villach, Austria, that led to the formation of the IPCC. Bert Bolin was appointed the IPCC’s first chairman, with Sir John Houghton heading its scientific working group. Thomas Karl and James Hansen were two dominant figures in the control and manipulation of the data right up to their recent retirements.

Karl chaired the study, so he knew better than anyone that achieving the result they wanted, a steadily rising temperature over the 120-plus years of instrumental record, was made easier by the inadequacy of the data. They ignored the fact that this inadequacy negated the viability of the work they planned and carried out. For example, the extent, density, and continuity of the data are completely inadequate as the basis for a mathematical computer model of global climate. In short, they knew they would have to create, make up, or modify data even to approximate a result. The trouble is that the data were so inadequate that even with these adjustments the results could not approximate reality.

Despite this, the IPCC committed to the surface data even as decisions made elsewhere reduced the number of stations and thus further limited coverage. There were two main reasons for the reduction: the increasing diversion of funds away from data collection toward global warming research and expensive computer models, and the anticipation of weather data from satellites. NASA GISS produced a plot (Figure 1) to show what was happening. I expanded each graph to show the important details (Figures 2, 3, 4).


Figure 1

Commendably, they upgraded the data, but that only emphasizes the anomalies.


Figure 2

Important points:

· There are no stations with over 130 years of record.

· There are approximately 300 stations with about 120 years of record.

· Virtually all the stations with over 100 years of record are in the eastern US or western Europe.


Figure 3

Important points:

· First significant decrease after 1960, anticipating satellite replacement.

· Second decrease around 1995, associated with the switch of funding from data collection to global warming research and the reduction in stations used.

· Figure 2 shows a maximum of approximately 7,200 stations, but Figure 3 shows approximately 5,500.


Figure 4

Important points:

· Despite the reduction in the number of stations, coverage declines only slightly. That is scientifically not possible.

· Currently, 20 percent of the Northern Hemisphere and 25 percent of the Southern Hemisphere have no coverage.

· Quality of coverage is critical but highly variable, as Thomas Karl notes in the foreword: “Unlike sciences where strict laboratory controls are the rule, climate researchers have to rely on observations collected in different countries and using differing instruments.” Remember, the US coverage is the best, yet only 7.9% of its stations are accurate to < 1°C. Here is an example, from the preface to the 1951-1980 Canadian Climate Normals, of what Karl is describing: “No hourly data exists in the digital archive before 1953, the averages appearing in this volume have been derived from all ‘hourly’ observations, at the selected hours, for the period 1953 to 1980, inclusive. The reader should note that many stations have fewer than the 28 years of record in the complete averaging.”

Although he did not contribute to the study, Kevin Trenberth commented on its release:

“It’s very clear we do not have a climate observing system… This may be a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”

Despite this, Trenberth created an energy balance model that is central to all greenhouse-effect climate claims.

The National Research Council study focussed on the inadequacies of the instrumental record. It appeared shortly after H. H. Lamb released his autobiography (1997), in which he expanded on the larger limitations facing climate research. He wrote that he established the Climatic Research Unit because,

“…it was clear that the first and greatest need was to establish the facts of the past record of the natural climate in times before any side effects of human activities could well be important.”

So, in 1999 the people involved in the creation and control of the IPCC knew that the data covering the 120 years of human activity were inadequate. They also knew that the data necessary to show the extent of any human impact were equally inadequate. That didn’t stop them; it simply meant they knew they had to create the data they needed, and how to do it.

The IPCC could maintain the story, and the data supporting its claim of human-caused global warming (AGW), only if it controlled all the data sources. This all started to unravel after 2000.

1. The satellite data, confirmed by balloon records, were reaching a length they could no longer ignore. By 2007 the IPCC included the following comment in the Fourth Assessment Report (AR4):

“New analyses of balloon-borne and satellite measurements of lower- and mid-tropospheric temperature show warming rates that are similar to those of the surface temperature record and are consistent within their respective uncertainties, largely reconciling a discrepancy noted in the TAR.”

2. Severely cold winters and increased snowfall captured public attention as many cartoons attested.


3. The pause, or hiatus, passed the 17 years Benjamin Santer said were required before the pattern was even worth considering.

4. The gap between increasing atmospheric CO2 levels and actual temperature continued to grow.

But none of this stopped the search for proof. Thomas Karl created a record and wrote a paper with Tom Peterson claiming, using data and methods with serious limitations, that the hiatus did not occur. After the misuse was exposed, the co-authors refused a congressional subpoena seeking the data and methods used.

Karl and others involved in the deception that is AGW knew from the start, and better than anyone, the severe limitations of the instrumental data set. They likely knew from Lamb’s work the limitations of the historic record. Despite this, or rather because of it, they oversaw the building of computer models, wrote papers, produced ‘official’ reports, and convinced the world that it faced impending doom through runaway global warming. Their work, based on data they knew from the start to be inadequate, achieved near-universal acceptance. Of course, they also knew better than most how to manipulate and select data to make points supporting their false hypothesis, at least until the satellite data achieved scientific status; even then, they maintained the deception. They have proved themselves charter members of the post-fact society. The latest example was triggered by panic over Trump’s withdrawal from the Paris Climate Agreement and analysed by Tony Heller.

115 Comments
TDBraun
August 12, 2017 3:06 pm

“Uncovered”? Why is this report something that is “uncovered”? This implies it was “covered” before now — hidden or lost or forgotten. What is new about it now?

Brad
August 12, 2017 4:07 pm

I seem to remember reading here about how many cold-weather stations were lost when the USSR collapsed? Something like 5,000 reporting stations?

BallBounces
August 12, 2017 5:11 pm

One gets the sense reading early IPCC reports that the IPCC started out as real scientists doing real science, but then it got hijacked by advocacy scientists doing for-hire advocacy science. (Didn’t one well-known advocacy scientist have a business card that read “Have hockey stick – will travel”?)

Mark T
Reply to  BallBounces
August 12, 2017 10:35 pm

Nah. What we ended up with was planned from the beginning. They simply couldn’t start out as complete alarmist advocates without tipping everyone off. This is all part of Maurice Strong’s (and every other communist pretending to care about the environment) playbook.

knr
August 12, 2017 5:23 pm

If you cannot measure it, you cannot ‘know’ it, but you can still guess it. And no amount of models thrown at it will change that.
Oddly, considering how important such measurements are, the resources put into them are a bit of a joke.

bw
Reply to  knr
August 12, 2017 5:53 pm

Even if you can measure something, that does not mean you understand it.
At the least you need a reliable quantification, for example a concentration with a defined range of error.
The current claim that “global average surface temperature” has a reliable value is simply false.
I’ve never seen a surface temperature plot include any indication of relative error.
The only global temperature with a clear scientific methodology started in 1979 with the satellite data.
But even that is not truly global. Also, the microwave sounder data are not surface measurements; they can’t be, as the atmosphere is totally opaque to IR below 1000 meters.

Dan Sage
Reply to  bw
August 14, 2017 2:03 am

“they can’t be as the atmosphere is totally opaque to IR below 1000 meters” Can you educate me, if that is still possible, as to what you are referring to? I hadn’t heard that before, and I am extremely curious about it. Is it all IR, or just what the satellite sensors are looking at? Please excuse my ignorance.

Neo
August 12, 2017 5:25 pm

Of course, you declare the data deficient so you can “fix” it.

Ted
August 12, 2017 5:55 pm

“Despite reduction in number of stations, coverage only declines slightly. That is scientifically not possible.”
Dr. Ball, it is possible, because removing two of three stations with greatly overlapping coverage results in only a small decrease. The trick is how large an area is considered covered by a single station: an incredible radius of 1,200 km, or 745 miles (Figure 1c). Thus each station provides coverage for about 4.5 million square km (~1.7 million sq mi).
By this standard, a station just outside Winnipeg provides ‘coverage’ for everywhere from Calgary to Chicago and from Kansas City to the border with Nunavut on Hudson Bay. In US terms, the weather in Chicago, New York, Washington, DC, Miami, and Dallas, and all the space in between, could all be (accurately) covered by a single station in Atlanta. With such a large area per station, you could theoretically cover all the land of the earth with fewer than 50 stations.
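A quick sketch of the arithmetic behind this comment: the 1,200 km radius is the coverage convention cited above, while the total land area is an assumed round figure (~149 million sq km) used only for illustration.

```python
import math

# The 1,200 km "coverage" radius cited in the comment above; the land
# area is an assumed approximate figure for illustration.
RADIUS_KM = 1200.0
LAND_AREA_KM2 = 149e6

# Area "covered" by one station under the 1,200 km convention.
area_per_station_km2 = math.pi * RADIUS_KM ** 2

# Number of such circles whose combined area equals all land on Earth
# (ignoring overlap, so this is a lower bound for an actual covering).
circles_needed = math.ceil(LAND_AREA_KM2 / area_per_station_km2)

print(f"Area per station: {area_per_station_km2 / 1e6:.2f} million sq km")
print(f"Circles needed to match total land area: {circles_needed}")
```

This gives roughly 4.52 million sq km per station and about 33 circles. Since circles cannot tile a sphere without overlap, a real covering would need somewhat more stations than that, but still comfortably under the 50 mentioned above.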

hunter
Reply to  Ted
August 12, 2017 9:32 pm

Ted, you just proved the point that it might be possible but it would not be honest.

Ted
Reply to  hunter
August 12, 2017 10:14 pm

“Intellectually dishonest,” yes; straight dishonest, no. I agree it’s BS, but since they labeled it and used that as the convention, it’s more unethical and/or unjustified than purely dishonest.

hunter
Reply to  hunter
August 13, 2017 4:58 am

Good point.
“Intellectually dishonest” is less inflammatory.

Jer0me
August 12, 2017 8:05 pm

Come on!
Where are the usual trolls and drive-by undefended comments? I wanna see some torn apart 🙂

The Reverend Badger
Reply to  Jer0me
August 13, 2017 10:30 am

It’s nice and warm now, they have all gone outside to enjoy the weather. They will be back when it gets colder.

MangoChutney
August 13, 2017 3:23 am

#HansenKnew #KarlKnew #IPCCKnew

MarkW
August 13, 2017 12:03 pm

“Currently 20 percent of the Northern Hemisphere and 25 percent of Southern Hemisphere has no coverage.”
Surely you are talking about land surface here. The oceans cover a lot more of the surface than that, and they are effectively uncovered as well. (Areas outside the trade lanes are uncovered.)

August 13, 2017 12:04 pm

The article has photo of a temperature station in the middle of a parking lot.
I guess we are supposed to laugh and make fun of those crazy warmunists.
Not so.
The temperature in a parking lot is very important and we should demand more thermometers in parking lots.
In fact, the station should have an external LED temperature readout so the people parking in the lot would know how much to leave their car windows open (so their car interior is not too hot when they get back to their car).
If that was done, a temperature station would have some value.
Meanwhile, surface temperature stations currently have no value in the era of weather satellites,
and should be abandoned or bulldozed to the ground.
Also:
This is another good article in a long series of good articles by Dr. Ball
— no other writer here is so consistent.

Russell Johnson
August 13, 2017 7:33 pm

Thanks, Dr. Ball: another dagger to the heart of CAGW. The warmist religion is a false faith supported only by hyperbole.