How not to measure temperature, part 74

Sometimes, words fail me in describing the absolute disregard shown in the placement of official NOAA climate monitoring sites. For example, this one in Clarinda, Iowa, submitted by surfacestations volunteer Eric Gamberg:

Click for larger image

The MMTS temperature sensor is the short pole next to the half pickup truck.

For those of you who don’t know, this station is located at the wastewater treatment plant there. I’ve written many times about the placement of stations at WWTPs being a bad idea due to the localized heat bubble created by all the effluent coming through. The effect is especially noticeable in winter. Often you’ll see steam/water vapor in the air around these sites in winter, and more than one COOP observer has told our volunteers that snow sometimes does not stick to the ground at WWTPs.

The larger pole appears to be a gas burnoff torch for excess methane. I can’t say how often it is activated (note the automatic ignitor circuit on the pole), but I can tell you that putting an official NOAA climate thermometer within a few feet of such a device is one of the worst examples of thoughtless station placement on the part of NOAA I’ve ever seen. Here is an example of a methane burn-off device at another WWTP:

[Image: 020806_methane_flare_pipes (click for larger image)]

We’ll probably never know what the true temperature is in Clarinda, because untangling a measurement mess like this is next to impossible. On how many days were Tmin and/or Tmax affected at this location by gas burnoff, and by what magnitude? We shouldn’t have to ask these questions.

And, adding insult to stupidity, the GISTEMP Homogenization adjustment makes the trend go positive, especially in recent years:

[Animation: clarinda_ia_temp_anim (click image if animation does not start automatically)]

According to the NCDC MMS database for this station, the MMTS was installed on October 1, 1985. Who knows what the data would have looked like if somebody had thought through the placement. Whether or not the temperature sensor has been significantly affected by this placement is not the issue; the violation of basic common-sense siting guidelines, which brings the data into question, is. Anything worth measuring using our public tax dollars is worth measuring correctly.

Dr. Hansen and Mr. Karl – welcome, feast your eyes on the source of your data. You might want to think about changing this description on the NCDC website for USHCN:

The United States Historical Climatology Network (USHCN) is a high quality, moderate-sized data set of daily and monthly records of basic meteorological variables from over 1000 observing stations across the 48 contiguous United States.

I suggest to NCDC that “high quality” doesn’t really apply in the description anymore.

I really could use some help, especially in Texas, Oklahoma, Alabama, Mississippi, and Arkansas, to get the USHCN nationwide climate network survey completed. If you have a digital camera and can follow some simple instructions, why not visit www.surfacestations.org and sign up as a volunteer surveyor? If you can’t help that way, donations to help fund trips such as these that I’ve been doing are greatly appreciated.

UPDATE 11/20 4:20 PM PST: Some commenters, such as Krysten Byrnes and Steve, have suggested that the blink comparator above is wrong because the scale on the left changes in offset. I realize that may create some confusion. A couple of clarifications are needed to address that.

First, these graphs are generated by the GISTEMP database, not by me. I simply copied both from the GISTEMP website into my animation program. This includes the scale offset, which is part of the difference in the original GISTEMP-generated images. You can do the same thing by visiting here: http://data.giss.nasa.gov/gistemp/station_data/  and putting Clarinda in the search box. Use the pulldown menu to select whichever data set you want. The above uses the “combined sources” and the “after homogeneity adjustment” data sets.

Second, what is important to note here is that the slope of the trend changes as a result of the adjustment applied by GISS. It becomes more positive in the “homogenized” data set.

Third, in the “homogenized” data set the past has been cooled and the present made warmer, making the slope more positive over the timeline. Here is the Clarinda GISTEMP homogenized data plot overlaid on the “raw” data plot. Again, these are the original, unmodified GISTEMP-generated graphs, combined with a simple cut-and-paste using a transparent background technique:

[Graph: clarinda_giss_compare-520 (click graph for full-sized image)]

Note how the hinge point appears around 1980, where the data appear to match. Note also how the divergence between the two data sets increases in either direction from this hinge point.
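
For readers who want to check the numbers rather than the images, here is a minimal sketch (in Python) of the same comparison done numerically. The file names and the two-column year/temperature layout are assumptions standing in for data saved by hand from the GISTEMP station-data page linked above; they are not an actual GISS download format:

```python
import numpy as np

# Hypothetical two-column text files (year, annual mean temperature in deg C),
# saved by hand from the GISTEMP station-data page for Clarinda, IA:
# one "combined sources" (raw) series and one "after homogeneity adjustment" series.
raw = np.loadtxt("clarinda_raw.txt")          # placeholder file name
adj = np.loadtxt("clarinda_homogenized.txt")  # placeholder file name

# Keep only the years present in both series so the comparison is like-for-like.
years = np.intersect1d(raw[:, 0], adj[:, 0])
raw_t = np.array([raw[raw[:, 0] == y, 1][0] for y in years])
adj_t = np.array([adj[adj[:, 0] == y, 1][0] for y in years])

# Least-squares trend (deg C per century) for each version of the record,
# plus the year-by-year adjustment (homogenized minus raw).
raw_slope = np.polyfit(years, raw_t, 1)[0] * 100
adj_slope = np.polyfit(years, adj_t, 1)[0] * 100

print(f"raw trend:         {raw_slope:+.2f} C/century")
print(f"homogenized trend: {adj_slope:+.2f} C/century")
for y, d in zip(years, adj_t - raw_t):
    print(f"  {int(y)}: adjustment {d:+.2f} C")
```

If the homogenized slope comes out larger than the raw slope, that is the effect described above, and the per-year differences show where the hinge sits.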


96 Responses to How not to measure temperature, part 74

  1. hereticfringe says:

    Hey, I have a spot next to my barbeque grill in the backyard where a temperature station could be installed. I’m sure that Hansen would be thrilled to have it put there!

  2. Pierre Gosselin says:

    I suggest the following description for Dr. Hansen and “Dr.” Karl to consider:

    “The United States Historical Climatology Network (USHCN) is a ship-shod, positive-biased data set of daily and monthly records of easy-to-manipulate meteorological variables from over 1000 poorly sited observing stations across the 48 contiguous getting hotter United States.”

  3. Steven Hill says:

    Don’t worry, they have tweaked the reading for that station to allow for that burner. (wink)

  4. Harold Ambler says:

    Pierre Gosselin: (08:11:04): “The United States Historical Climatology Network (USHCN) is a ship-shod, positive-biased data set of daily and monthly records of easy-to-manipulate meteorological variables from over 1000 poorly sited observing stations across the 48 contiguous getting hotter United States.”

    What is actually very hard to take is the knowledge that the network is almost certainly the best on Earth, as Bob Carter has observed.

    Why is GISS data ramping up when declining temperatures worldwide are on the verge of shifting the debate? Coincidence? Second warmest October in history? Absolutely, except for …. no.

    We need to start a 12-step program for warmaholics. Step 1: We admitted we were powerless over temperature, that our lies had become unmanageable.

    When should the ENSO number for October be released? Is it published anywhere sooner than here: http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/ensoyears.shtml ?

  5. Phillip Bratby says:

    Every point before about 1905 has been adjusted down, 1905 to about 1935 is unadjusted and after about 1935 every point has been adjusted upwards! Any real reason for this?

  6. TerryC says:

    Would it be possible to adjust the temperature axes of the two graphs to the same scale? The animation makes the temperature appear to increase in recent years but if you correct for scale shift, the change appears to actually be a marked decrease in earlier temperatures. Recent temperatures appear to be minimally changed or even very slightly adjusted downwards.

  7. Bill in Vigo says:

    It is now my humble opinion that there will never be anything useful to come from the USHCN other than to be a useful tool for the political powers that be to control the American people. And now with the financial crisis the control will become almost total. We may be witnessing the end of the “Land of the Free”.

    What is happening now is not conducive to the old American belief in home rule. God help us.

    Bill Derryberry

  8. MG says:

    I would worry about the sludge pools more than the methane fire, which is also problematic. The positive bias could be HUGE at a site like this on a cold, calm winter night.

  9. Tom Carter says:

    Has anyone performed an analysis of USA temperature trends (1950-2008?) using the untainted, unadjusted temperature data available from ACCURATE weather stations (those not directly affected by manmade heat sources such as paved roads and buildings)?

    Given the pathetic general lack of use of scientific method in climate data analysis and reporting by NOAA and NASA, this would be very useful information.

  10. Every time I read one of these reports, I think it could never get any worse, yet, there is always yet another example that just makes me shake my head. Anthony, I am still hoping to get to some of the sites in North Dakota soon to fill in that part of your map.

  11. jerry says:

    Anthony,

    where can a person buy a Stevenson screen and all the guts that go into the box? The reason I am asking is that my work is located in the Republic of Georgia, and we have access to open rolling grasslands with nothing around on our property, and we also have lots of people on the payroll. If I knew how to get one and what the cost would be, I might be able to have one installed and done properly and start recording data. We are located in the boonies between the Caucasus and the Lesser Caucasus mountain ranges.

  12. Frank Mosher says:

    Looking at the UAH data since 1997, the extreme anomalies were 4/98 (warm) and 1/2000 (cool). Until those values are exceeded, it would appear we are observing “noise”. Doesn’t a trend require higher or lower values to constitute a trend? A doctor would not hospitalize a patient with a temp of 98.7 F or, conversely, 98.5 F. Anthony, this is a great web site, and I marvel at your and your readers’ knowledge.

  13. Steve Keohane says:

    Just another S.N.A.F.U.

  14. Jim Pfefferle says:

    “… I can tell you that putting an official NOAA climate thermometer within a few feet of such a device is one of the worst examples of thoughtless station placement on the part of NOAA I’ve ever seen.”

    I would contend that much thought and planning went into the careful placement of this station.

  15. Jeff Alberts says:

    I’d also be interested in finding out where to get a weather station I can place on my open 2 acres in Western Washington State…

  16. Drew Latta says:

    Well, the methane burn off thing is interesting, but it might never be activated. A lot of these WWTPs use excess methane from sludge digestion to heat the digester and other processes. Depending on the temperature their digester runs at it might be heated most or all of the time, and they might hardly ever flare the methane. Or they could almost always flare, which is what happens in my city because they get a lot of industrial wastewater that ends up having contaminants that would destroy a heating system. You’d have to talk to the operator.

  17. Mike C says:

    Like some other posters pointed out, the scale on the left is confusing, but if you put the numbers on a spreadsheet, there are some problems. In this instance, Hansen cooled the past until the early ’90s, when he starts taking the cooling away, which has the effect of warming the station in the present. And the artificial cooling in the past is significant: between 0.8 and 1.2 degrees C. This is a GISS lights = 24 station; it should be adjusted the other way around.
    Oddly enough, USHCN has very few adjustments to this station despite equipment changes and location changes since 1984.

  18. crosspatch says:

    Every point before about 1905 has been adjusted down, 1905 to about 1935 is unadjusted and after about 1935 every point has been adjusted upwards! Any real reason for this?

    That was noted on Climate Audit some time ago. It is as if the thinking is: if you can’t make the present warmer, you can always make the past cooler and give the impression of a warming trend. Which is what GISS is really all about, not so much showing current absolute warming as showing a trend over time. Hansen’s goal is to show a given rate of warming to validate his climate models. I believe there should be a clear separation of responsibility between those who create the models/forecasts/predictions and those who produce actual climate data.

    GISS has a built-in conflict of interest and puts Hansen in a double-bind if the climate models are incorrect. If he massages the climate data to match the model he appears incompetent and if he doesn’t massage the data and it doesn’t match the model, he appears incompetent in his model creation. He has positioned himself in such a way that they must match (because remember, the science is “settled”) or the entire argument comes down around them and their credibility approaches zero.

    One supposed reason for making the adjustments is to compensate for station moves over time. If a station has moved from a cooler location to a warmer location, temperatures are “adjusted” in order to give a consistent result. At least that is the theory. But in reality what you will find is pretty much what is noticed with this station … older temperatures are adjusted downward, newer temperatures adjusted upward.
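
    As a purely illustrative toy sketch of that pattern (made-up numbers, not GISS’s actual adjustments): cooling the early years and warming the recent ones turns a trendless record into a warming one.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2009)

    # Synthetic "raw" record with no underlying trend, just year-to-year noise.
    raw = 10.0 + rng.normal(0, 0.5, years.size)

    # Made-up adjustment in the pattern described above: older years cooled,
    # recent years warmed, hinged around 1980 (illustration only).
    adjustment = np.interp(years, [1900, 1980, 2008], [-0.7, 0.0, 0.3])
    adjusted = raw + adjustment

    trend = lambda t: np.polyfit(years, t, 1)[0] * 100  # deg C per century
    print(f"raw trend:      {trend(raw):+.2f} C/century")
    print(f"adjusted trend: {trend(adjusted):+.2f} C/century")
    ```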

  19. Gary says:

    What’s the “nightlights” factor on a WWTP flamethrower?

  20. George E. Smith says:

    “” TerryC (08:36:51) :

    Would it be possible to adjust the temperature axes of the two graphs to the same scale. The animation makes the temperature appear to increase in recent years but if you correct for scale shift, the change appears to actually be a marked decrease in earlier temperatures. Recent temperatures appear to be minimally changed or even very slightly adjusted downwards. “”

    Terry if you look closely at those scales you will see that one goes from 6.5-11.5, while the other goes from 7.5 to 12.5, so both scales are exactly the same 5degree range with just a 1 degree offset. You can pick the offset to be anything you like, and that will change the location of the stationary point on the two graphs.

    What Anthony has done is the right way to do it. If the scales were the same, the whole graph would be jumping up and down by one degree and you couldn’t discern any relative shifts at all. Blink comparators depend on something (or most things) staying constant.
    For example, in the search for planetary or comet objects, a star blink comparator will leave the stars stationary, and only objects in relative motion with respect to the stars will move, and stand out like a sore thumb.

    This is a very clever demonstration, Anthony, and the whole project is laudable. The whole network looks like a giant Rube Goldberg to me. How anyone can call that science is beyond me; well, the computer climate models do put it to shame, but even in high school you learn that one of the biggest sources of experimental error is “instrumentation error”, where the measuring instrument is accurate but what it is responding to is NOT the intended measurement parameter. This is also a problem in process control, when such readings are used in feedback loops to control some process. If the sensor is not measuring the intended variable, you can’t control it.

  21. P Folkens says:

    Appearance is everything. This cooling of past numbers to create the impression of a steeper rise in temperatures is not new. In the earliest presentation of the IPCC scenarios regarding a rise in sea level, the original data graph went across the page and the height was less than the width. The scale to measure sea level increase was expressed in tenths of a meter. In subsequent presentations of the scenarios and sea level rise the graph was compressed laterally such that the width was now shorter than the height. This created a more dramatic appearing rise in the slope. They also changed the measurement to centimeters. As we all know, 64 cm is a larger number than 0.64 m. (Larger number yes, but larger quantity? Some dolts think so. I present this change in the presentation in my lectures and I am astonished at the number of people who do not realize that 64 cm is the same as 0.64 m.)

  22. Paul H says:

    Thought you might be interested, GISS related article at

    http://www.theregister.co.uk/2008/11/19/nasa_giss_cockup_catalog/

    REPLY: It is an essay by our good friend. EVAN JONES – Go Ev!

    Anthony

  23. How not to measure temperature is one of my favorite reads on the web!! :)

    Anthony..thanks for your work..it doesn’t go unnoticed.

    http://www.cookevilleweatherguy.com

  24. Michael J. Bentley says:

    Crosspatch,

    I think you give Hansen too much leeway when you say he’s in a double bind. I remember the picture of the donkey between two bales of hay starving because it doesn’t know which way to move – in a psych class I think.

    Hansen got where he is under his own steam. He’s the one who chooses to spout off like he does. He is attempting to shout down those who question his results even now. He appears to be going up a creek without a paddle at the moment, at least among those who watch the data with a critical eye. The problem is that he still has the microphone.

    I hope someone finds the off switch pretty soon, because most folks on the AGW side don’t have a clue about science (and slept through what you get in school anyway) and have a suspension of disbelief when those four magic letters are mentioned N-A-S-A.

    Mike

  25. J. Peden says:

    Continues to be….Mindboggling!

    And, imo, “the science is settled” = presuming that which is to be proven, which is not only logically fallacious but also the very antithesis of doing science. But that’s all the ipcc Climate Scientists have been doing the whole time.

    Also, at the Clarinda site the presence of that half-pickup and the large storage tank is worrisome. I don’t think the people at Clarinda are really even aware that they are supposed to be trying to measure Global temperature. And the USHCN doesn’t seem to actually care about it either.

    Oh, it could be worse. Like this USFS Fire Weather station that had the local weatherman scratching his head every night exclaiming “This cannot be right, folks” when referring to the daytime high of a mountain community exceeding that of Redding, CA, one of the hottest places in California. Anyone driving up out of that city to the hills immediately feels the temp drop 10 degrees. After stomping around on 3 separate trips, I finally found the sensor. Surrounded by masses of metal and itself encased in about 5 pounds of aluminum.
    After explaining to the amazed USFS personnel that I had found the location of the weather station that nobody had seen in 10 years, I got a call from a tech who was going to replace the sensor, since they deemed it was acting erratically. The tech brought a replacement sensor, not from a supply of fresh ones, but from another failed station. The story read like a budgetary problem. And so the 100-year history of one of the few rural mountain communities having a solid weather-gathering record is tainted with 7 years of suspect data making it suddenly hotter than one of the warmest cities on the West Coast.

    REPLY: Was this the sensor in Mineral, CA? – Anthony

  27. M White says:

    UK to auction carbon permits

    http://news.bbc.co.uk/1/hi/sci/tech/7736005.stm

    “But the UK led a revolt against the Commission’s plan and refuses to ring-fence the carbon dividend. A government spokesman said: “We are committed to reducing carbon emissions as our climate change legislation proves, but we do not hypothecate [specify the intended use of] revenues.”

    Perhaps it’s the treasury that sites weather stations?

  28. crosspatch says:

    What I would support would be a tax on emissions from coal seam fires. These fires are allowed to burn because the cost of putting them out is higher than the value of the coal being burned. If those fires were taxed to such an extent that it would be cheaper to put them out than to allow them to burn, then global CO2 emissions could be greatly reduced. And by greatly, I mean the fires in only four countries amount to all the emissions of all the automobiles on the planet.

  29. Tom Jefferson says:

    It is Not “Historical” Data for Temperature and GASES

    It IS “HISTERICAL” Data for Temperature and GASES

    Get Real.

    But this is quite amusing.

    Except for the government control and wasting $40 Trillion.

    The Earth is Billions of years old.

    You are all arguing over “TRENDS” based upon an infinitely short stretch of time.

    Basing any decisions on such a nano data sample is moronic.

    Acknowledging that GISS, Gore, Hansen, et al have been incompetent and fraudulent in data sampling, collection and decimation is material. – They ALL should be PERP walked.

  30. anna v says:

    OT, but interesting BBC article:

    http://news.bbc.co.uk/2/hi/science/nature/7733509.stm

    Under-ice flood speeds up glacier
    By Jonathan Amos
    Science reporter, BBC News

    Byrd Glacier (Nasa/Landsat)
    The Byrd Glacier is about 135km long and 24km wide

    Great floods beneath the Antarctic ice sheet can now be linked directly to the speed at which that ice moves towards the ocean, scientists say.

    Diffidently the truth is acknowledged:

    Scientists have known about subglacial lakes in Antarctica for half a century. There are more than 150; and the biggest, Lake Vostok, is the size of Lake Ontario in North America.

    Despite being capped by, in some cases, several kilometres of ice, the lakes’ contents stay liquid because of warm spots in the underlying rock.

    It was always thought, however, that these lakes were stagnant bodies, containing waters that were perhaps unaltered for millions of years.

    Warm spots, not to be named as volcanic activity.

    But also encouragingly:

    Cause and effect

    It should be stressed the events seen at Byrd are not of themselves climate-related. The lakes probably flood and drain on a regular basis that has nothing to do with atmospheric or ocean warming.

    But:
    However, the scientists say the mechanisms involved need to be understood so the knowledge can be applied to those ice masses which are being exposed to warmer temperatures, such as in Greenland.

  31. hyonmin says:

    Obama is in agreement with Hansen and Gore. We have not done enough and we need change. We need to roll back CO2 by 80% in the year 2050. Forget physical reality only political reality counts. Great work, who would have believed such a crazy placement but then who would have believed in the current U.S. death wish.

  32. Roy says:

    I lived four years in Tucson… in my younger years. I kept in shape by jogging around the city park in the central part of town. There on the corner of 22nd Street was a sponsored weather station. I jogged by it over 1,000 times in four years. The temp gauge was about four feet away from the AC unit on the side of this RV trailer. The entire arrangement was surrounded by concrete and pavement. The gauge had to be showing a temp that was three degrees hotter than reality. Twenty years have passed… and I doubt anyone ever changed anything.

  33. Steve says:

    Umm… about the y axis on the chart… could we get a gif that compares the two data sets with a range of 6.5-12.5? That way the visual comparison will be valid. I agree with TerryC; we need to avoid this kind of thing. This is what the AGW crazy guys do. We have to behave better than them.

  34. An Inquirer says:

    Tom Carter (08:53:13) :

    “Has anyone performed an analysis of USA temperature trends (1950-2008?) using the untainted, unadjusted, temperature data available from ACCURATE weather stations . . .”

    This issue has been discussed before. See http://wattsupwiththat.com/2008/11/13/gistemps-gavin-schmidt-credits-wuwt-community-with-spotting-the-error/
    John V compared the anomalies of CRN 1&2 stations with the GISTEMP anomalies and came to the conclusion that for some reason, the adjustment algorithm in GISS produces the same outcome as quality stations do. A few notes on this. First, John V has been very open about his data, procedures, and potential pitfalls. Second, I do not know if anyone has verified or duplicated John V’s work which should not be difficult for a modeler to do. Third, pitfalls include lack of geographic dispersion, U.S. focus, and small sample size — on the sample size issue, I understand that John V initially used CRN 1&2 stations which numbered only 17 at the time and that sample size does not produce statistically valid results; so he then included CRN 3 stations, bring the sample size to 50, with the same results.
    Fourth, given that GISS procedures are so bizzare, lacking quality control, and producing counter-intuitive results, John V’s conclusions are unexpected, and I would love to see an audit on John V’s work. In contrast to GISS folks, it seems that John V would welcome an audit.

  35. JoeH says:

    Every time I read this, “The United States Historical Climatology Network (USHCN) is a high quality, moderate-sized data set of daily and monthly records of basic meteorological variables from over 1000 observing stations across the 48 contiguous United States,” or Anthony posts a site review, I am prompted to ask how many for real high quality sites are required to evaluate the surface temperature trends in the U.S.? Is it 100, 13, 86, 133, 167? Do 1,000 sites, even if they were perfectly sited and maintained, give us a better answer than 500 or 200 sites of the same quality?

    Anyone? Anyone? Buehler?

  36. B Kerr says:

    M White
    “Perhaps its the treasury that sites weather stations?”

    Where are the UK weather stations?
    Have you any idea where they are?

    I downloaded a list from MET OFFICE.
    The list had more weather stations in Lerwick than there are in the USA.

    Where are all the UK sites?
    Can anyone help?

  37. Curious says:

    The gas flare here is not the only issue. The cylindrical tank to the left is an anaerobic digester. These are typically heated to 95 degrees F using the biogas generated by the digestion process. As this gas is a free byproduct, and would be flared if not used to heat the digester, these concrete walled tanks are typically not insulated. The tank here looks to be about 500,000 gallons, and in my part of the world (northern Illinois) on the design coldest day, heat loss from a tank like this would be around 300,000 to 400,000 Btu per hour. That is quite a bit of influence on the surrounding area.
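
    For what it’s worth, here is a rough back-of-the-envelope version of that estimate. The tank geometry and heat-transfer coefficient below are assumed illustrative values, not the actual Clarinda plant specs:

    ```python
    import math

    # Assumed geometry for a ~500,000 gallon cylindrical digester (illustrative guesses).
    volume_gal = 500_000
    volume_ft3 = volume_gal / 7.48            # gallons to cubic feet
    diameter_ft = 65.0                        # assumed diameter
    radius_ft = diameter_ft / 2
    height_ft = volume_ft3 / (math.pi * radius_ft**2)

    wall_area = math.pi * diameter_ft * height_ft   # exposed wall, sq ft
    roof_area = math.pi * radius_ft**2              # roof, sq ft
    area_ft2 = wall_area + roof_area

    # Assumed overall heat-transfer coefficient for uninsulated concrete, Btu/(hr*ft2*F).
    U = 0.5
    digester_temp_F = 95.0
    design_cold_F = -10.0                     # assumed design cold day, northern Illinois

    q_btu_hr = U * area_ft2 * (digester_temp_F - design_cold_F)
    print(f"tank height ~ {height_ft:.0f} ft, exposed area ~ {area_ft2:.0f} sq ft")
    print(f"estimated heat loss ~ {q_btu_hr:,.0f} Btu/hr")
    ```

    With these assumed inputs the estimate lands in the same 300,000 to 400,000 Btu per hour range described above.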

  38. Ray Reynolds says:

    I don’t know if lawns require watering in Clarinda, Iowa, but some of these sites are, like this one, out in the middle of a lawn and under an automatic sprinkler system.

    I would guess the average annual temperature is near mainline water temp… well, ’cept when they have the flame thrower stoked up.

  39. Philip_B says:

    , I am prompted to ask how many for real high quality sites are required to evaluate the surface temperature trends in the U.S.?

    60 ‘randomly’ located stations would be sufficient to determine the temperature trend for the whole of the USA or any area for that matter.

    BTW, the main problem with the current network is that none of the stations is ‘randomly’ located. It doesn’t matter how many non-randomly sited stations you have if you don’t know what biases the non-random siting introduces. Which is the surface stations problem in a nutshell.
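
    A toy Monte Carlo sketch of that sampling argument (synthetic data with assumed noise levels, and deliberately ignoring the non-random-siting bias raised above): draw 60 stations at random from a simulated field with a known trend and see how tightly the sample recovers it.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_years, n_stations, true_trend = 50, 5000, 0.02   # deg C per year (assumed)

    years = np.arange(n_years)
    # Simulated field: one common trend plus independent station offsets and
    # year-to-year weather noise. No spatial correlation or siting bias, which is
    # exactly the idealization that non-random siting would break.
    offsets = rng.normal(0.0, 2.0, n_stations)
    noise = rng.normal(0.0, 0.8, (n_stations, n_years))
    temps = true_trend * years + offsets[:, None] + noise

    def sampled_trend(k):
        idx = rng.choice(n_stations, size=k, replace=False)   # random station sample
        return np.polyfit(years, temps[idx].mean(axis=0), 1)[0]

    estimates = [sampled_trend(60) for _ in range(1000)]
    print(f"true trend:          {true_trend:.3f} C/yr")
    print(f"60-station estimate: {np.mean(estimates):.3f} +/- {np.std(estimates):.3f} C/yr")
    ```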

  40. Ed Scott says:

    All accurate temperatures are computer generated, using GISS algorerithms.

    Heritage Foundation Blasts Obama’s Climate Message

    “Few challenges facing America and the world are more urgent than combating climate change,” Obama said. “Many of you are working to confront this challenge. But too often, Washington has failed to show the same kind of leadership. That will change when I take office.”

    http://www.newsmax.com/insidecover/Heritage_obama_climate/2008/11/19/153043.html

  41. Leon Brozyna says:

    @George E. Smith (10:19:58)

    Good points on that business of blink comparators. What we’re seeing in that image is not so much about temperature trends but about types of adjustments. I suppose if someone was really curious, they could go back to the source and do their own temperature comparison.

    And extra kudos to those volunteers who document sites located at wastewater treatment plants. Some days those facilities can be quite aromatic.

  42. George E. Smith says:

    “” Steve (12:36:28) :

    Umm…about the y axis on the chart…. could be get a gif that compares the two data sets with a range of 6.5-12.5? That way the visual comparison will be valid. I agree with TerryC, we need to avoid this kind of thing. This is what the AGW crazy guys do. We have to behave better than them. “”

    Steve, I have to disagree. What is missing is the underlying assumption that the vertical scale is actually a temperature scale; it isn’t, it’s an “anomaly” scale, and the units just happen to be in degrees. The numbers are quite arbitrary since they refer to some baseline that itself was never measured accurately in terms of something real like the internationally recognised SI units of temperature.

    So you can’t believe that the numbers are real, other than relative to the same plot; the scale difference is nothing more than a readjustment of some baseline.

  43. steven mosher says:

    Mike C, the lights=24 datum is NOT used to adjust data. That is the brightness index, which is not used in any processing step. The datum you want to look for is the unlit/dim/bright field. Unlit stations (regardless of population) are used to adjust dim and bright stations (regardless of population).

  44. evanjones says:

    GISS. Where homogenization meets pasteurization.

    The .gif that keeps on giving.

  45. CyberZombie says:

    GISS. Where homogenization meets pasteurization.

    The .gif that keeps on giving.

    hehe

  46. Chris says:

    The temperature anomaly map for Oct 2008 from UAH is out.

    http://climate.uah.edu/oct2008.htm

    Interesting that it’s got the hotspot in western Russia, but no equivalent to the massive dark blotch in Siberia.

    c.f.

    http://wattsupwiththat.files.wordpress.com/2008/11/gistemp_after_october_correction.gif

  47. Mike Bryant says:

    Chris,
    There is nothing alarming about the UAH anomaly map. I wonder how NASA would respond to questions about this disparity?
    Mike

  48. Graeme Rodaughan says:

    I think that “crosspatch (10:08:21) :” raises an interesting point wrt the double bind for Hansen.

    If true, he will be strongly motivated to continue his behaviour. The problem for Hansen is that the real world appears to have stopped warming and may even be entering a cooling phase of unknown extent. If the cooling continues, the disparity between his data/models/pronouncements and people’s experience of cold weather will begin to be too wide – and people will begin to notice.

    If Hansen’s credibility drops as a consequence, so will the credibility of GISS data, which is the foundation of a lot of the AGW meme.

    I would not be surprised, if AGW does get exposed, that Hansen will become the patsy/fall guy of the politically well-connected AGW proponents, who will simply claim that they were “deceived” by Hansen’s “scandalous mendacity” – Hansen will be crucified.

    The stakes are very high for someone playing Hansen’s game.

  49. Graeme Rodaughan says:

    If Hansen’s head ever rolls – so will those who work for him.

    REPLY: He’d be more useful to society if he recognized the limits of the surface system and produced a product that was not fraught with so many issues, and took more external influences into account. As it stands now he uses questionable data and methods as a soapbox.- Anthony

  50. Anthony – Is anyone doing surveys of the quality of met stations outside the US. If the UK hasn’t then I am sure we could get it done given that we talk of little else in the UK. Weather and climate vane would be a good community to get involved .. unless that is it has been done already?

    REPLY: No it hasn’t been done, and given how labor intensive the USHCN has been I haven’t the time to start one. – Anthony

  51. Hank McCard says:

    hhhh

  52. Chris says:

    Mike,

    I’m sure it’s set alarm bells ringing :)

    By the way, genuine question here (not trying to be funny) – does anyone know why NASA doesn’t have its name on a satellite-derived global temperature record, considering it provides the satellites? I mean, everyone’s heard of NASA as in NASA-GISS, but how many people have heard of RSS or UAH? Why isn’t there a NASA-GISS and a NASA-SAT, for example? Why are the MSUs used to derive the satellite records carried on satellites called NOAA, yet NOAA is synonymous with yet another (warm-biased?) surface record? I’m probably being ignorant here and missing something stupid, but I thought I’d let someone else work it out for me.

    Cheers,

    Chris

  53. Thought I would take a look at my local weather station record which is and has been at Kew Gardens in west London in GISSTemp. For some reason GISS stopped using this station in 1980. It’s still there in the same place. I will ask them why they stopped sending data in.

  54. Robert Wood says:

    M White (11:34:42)

    Truly insane. What happens if no one makes a bid?

  55. Mike C says:

    steven mosher (15:13:05)

    Mr. Mosher, what you refer to is the different classification that Hansen created. The brightness index is very much used in determining how much to adjust the station for low-frequency variation. I would refer you to Hansen et al. 2001 to correct yourself. Nevertheless, since this is an R2C station (R = rural and 2 = dim being the only relevant parts here, due to its location), there is no reason that it got an upward adjustment when it should have been adjusted downward.

  56. Mike C says:

    Pardon me as I take my foot out of my mouth. The brightness index is used to determine the PNB which Mr. Mosher referred to. So the brightness index creates the classification and is not involved in weighting.
    That technicality out of the way, the point still remains the same: this station should have received an adjustment that cooled the station; instead it got an adjustment that warmed the station.

  57. Nick Yates says:

    Anthony,

    This is off topic but may shatter the illusions of anyone who thought that an ETS is not just an excuse for more taxes.

    http://business.timesonline.co.uk/tol/business/industry_sectors/natural_resources/article5192585.ece

  58. Patrick Henry says:

    Here is how the hottest October in history looked to a satellite.
    http://climate.uah.edu/oct2008.htm

  59. crosspatch says:

    If the cooling continues the disparity between his data/models/pronouncements and peoples experience of cold weather will begin to be too wide – and people will begin to notice.

    Unless, that is, the greatest “anomaly” that creates the overall “warming” happens to occur in an area where few people live and those that do live there generally have little contact with the outside world. Like, say, Siberia and far Northern Canada …

  60. MattN says:

    “And, adding insult to stupidity, the GISTEMP Homogenization adjustment makes the trend go positive, especially in recent years:”

    I do hope I see the day that James Hansen is exposed for the charlatan he is.

  61. Graeme Rodaughan says:

    @Patrick H,

    “Southern Hemisphere: +0.07 C (about 0.13° Fahrenheit) above 20-year average for October.”

    Sure is A-GLOBAL-W – That global signal in the SH sure is a big one…

    Thanks for the link

  62. Steve Huntwork says:

    Anthony:

    My daughter lives in Arkansas and I have been recruiting her to obtain the surface station data.

    It has been very frustrating finding the information for her on the http://www.SurfaceStations.com website, when I knew darn well that a KML file for Google Earth was available.

    http://wattsupwiththat.com/2008/11/06/surfacestations-ushcn-ratings-in-google-earth/

    Anthony, could you please make this KML file VERY EASY TO OBTAIN on your primary website?

    When I work hard recruiting people to help, it would be rather nice to make it easy for them.

    REPLY: I thought it was, here is what I’ve had posted for over a week, right at the top:

    NEWS Updated 11/06/2008

    Google Earth Station Rating Map now available – download here.
    Sincere thanks to Gary Boden for this contribution!

  63. melissa says:

    Nice post!

  64. Graeme Rodaughan says:

    @crosspatch (19:03:46) :

    I’m hoping that the average citizen on Main Street USA is shoveling snow off their driveway this winter while being told that it’s the warmest January on record…

    It’s going to be interesting to see how the disconnect unfolds.

  65. Steve Huntwork says:

    I know that the KML file was at the top!

    “Google Earth Station Rating Map now available”

    This KML file can locate the surface stations closest to people’s homes, and this should be VERY OBVIOUS!

    This resource is way too important to waste because of a bad title on the website.

  66. Steve Huntwork says:

    Change the title to:

    Download this Google Earth KML file to locate the closest weather reporting station to your home.

    Yikes, I already knew about this file, but still had to search for it.

  67. George E. Smith says:

    “” Philip_B (14:51:46) :

    , I am prompted to ask how many for real high quality sites are required to evaluate the surface temperature trends in the U.S.?

    60 ‘randomly’ located stations would be sufficient to determine the temperature trend for the whole of the USA or any area for that matter.

    BTW, the main problem with the current network is that none of the stations is ‘randomly’ located. It doesn’t matter how many non-random sited stations you have if you don’t know what biases the non-random siting introduces. Which is the surface stations problem in a nutshell. “”

    I’m puzzled; what scientific basis is there for “adjusting” any of these measurements at any of these stations?

    The whole idea of measuring things in the real universe is to find out what they really are. One man’s “adjustments” are another man’s data falsification.

    The whole problem with the whole concept of GISSTemp, and your “randomly” sited stations, is that the resulting measurement is improperly assigned to areas that it does not truly represent (the UHI problem); and then the GCMers compound the felony in that they don’t use those numbers properly to evaluate the resulting energy losses via EM radiation et al; so there is no way that you can get any valid climate information on the big question (are we heating or cooling?), and once you homogenise the data from a bunch of stations you can’t determine energy transport from one point to another either.

    The whole idea of NOAA’s isothermal planet in dynamic equilibrium, with mysterious GHG or other perturbing “Forcings” driving the system up or down in total energy, is simply preposterous.

    It may be a “Climate model”; but it isn’t a model of ANY planet which human beings are interested in.

    The UHI problem would completely disappear, if the measured temperature was assigned to the correct surrounding area, and was correctly modelled in the energy transfer equations.

    So heat islands get warmer in the daytime than the original pasture did; so what? During that same midday sun, those heat islands are radiating like crazy, and doing more than their fair share to cool the planet.

    Just look how fast the ground cools in a UHI after sundown.

    George

    PS And I’ll repeat my assertion that the global sampling network and procedure violates the Nyquist sampling theorem, by many orders of magnitude in the spatial sampling, and by integer factors for the time sampling, so there is no way that such a “random” sampling process can recover the true average surface temperature of the earth; let alone at some nebulous five feet or so above the ground (which can be minus several hundred meters to over + 8000 meters altitude.) Some system!

    According to NOAA, the solar constant (incoming solar flux) is 342 W/m^2.
    Actually it is 1368 W/m^2, which is roughly 4 times what NOAA puts out on their website as comprising the earth’s energy budget; and moreover, in that picture, the sun puts that same 342 Watts/m^2 down on the south pole in the middle of the Antarctic winter night, and on every other place on earth. Total balderdash!

  68. Retired Engineer says:

    The sensor looks to be far away from any buildings, at least compared to other stations. I assume (dangerous) that it is an MMTS type sensor, which Anthony says seem to have cable length limitations. Am I missing something here ?

    REPLY: Yes you are missing something, but not your fault. The surveyor did not include pictures of the office for this site but it is nearby. See this aerial photo of the site.

    The specified length limit for the MMTS cable is 1/4 mile according to NOAA tech specs. But it seldom, if ever gets extended that far. The reason is quite simple.

    The NWS is responsible for installing these. The NWS WSFO that services the area has a position called “COOP Manager”. One of the jobs of that person is to install these. Typically they have a day. They have to drive to the site, choose a location, dig a trench, lay the cable, bring it into the building, install the readout unit, test the system, then train the observer, all in that day.

    Since they typically have only hand and garden tools, such as a spade shovel and pickaxe, and seldom are given budget or time to rent a motorized trencher, they often have to take shortcuts. Simple surface barriers such as sidewalks, driveways, gravel and asphalt roads often can’t be trenched around or under with such basic tools and time limits. Hence the MMTS almost without exception ends up closer to the observer residences and offices than the Stevenson Screen, which could be placed anywhere. Of course, such proximity has been shown to create a positive bias in the temperature record. Such step changes may not be caught and corrected for.

    A perfect example of this problem is this photo from Bainbridge Georgia, which shows where the Stevenson Screen original location is and where the new MMTS was placed due to these limitations

    http://gallery.surfacestations.org/main.php?g2_itemId=5645&g2_imageViewsIndex=1

    They simply couldn’t get the wire under the road.

    Hope that explains it – Anthony

  69. Mark says:

    Here are some videos of the recent Governors Global Climate summit.
    http://www.uctv.tv/climate/videos.asp

    Some of them are kind of annoying to watch.

    Note that the most recent videos get bumped as newer ones are added to the list.

  70. Geo says:

    We all know the hoary acronym GIGO (garbage in, garbage out). But for this project on weather stations to be really useful, it becomes necessary to go to the next step: to compile the percentage of really egregious sitings and show that the already-understood concept of the Urban Heat Island has actually been significantly understated up to now, even though the AGW crowd claims to have already corrected for it.

    Otherwise, the axe grinders on the other side will just continue to sniff that individual anecdotal sites may be poorly placed, but that their cumulative effect is “non-material” in the beloved big picture…

  71. evanjones says:

    Do 1,000 sites, even if they were perfectly sited and maintained, give us a better answer than 500 or 200 sites of the same quality?

    The new NOAA/CRN network is either started or about to be up and running. They have 83 stations online so far as I could figure out. They are a complete network, for the most part extremely well sited, with 24-hour automated data collection (no T-Min/T-Max, no TOBS) and all data raw. No adjustments required.

    If this is indeed so and if NOAA would give CRN some face time we might actually find out what’s going on around here!

    I am waiting for more word on this.

  72. Retroproxy says:

    Slightly off topic, but it’s another example of the nonsense behind AGW, I actually heard Arnold Schwarzenegger blame the recent wildfires in Southern California on global warming. Then we just learned that a bonfire set by college students caused one of the fires. But the other ones must have been caused by global warming!

  73. J. Peden says:

    The cylindrical tank to the left is an anaerobic digester. These are typically heated to 95 degrees F using the biogas generated by the digestion process. As this gas is a free byproduct, and would be flared if not used to heat the digester, these concrete walled tanks are typically not insulated.

    Geez, I didn’t know they were heated, but it sure makes sense. I was envisioning only sewage composting w/o heat added, which produces some heat itself, along with a lot of gas – that’s one reason why you have a “stack” on your home drain system venting above the roof, and “traps” in the drains. Once one of my traps dried out and I nearly blew up my house, before I realized what had really happened to produce that special gassy smell.

  74. evanjones says:

    It is an essay by our good friend. EVAN JONES – Go Ev!

    Thanks, Anthony!

  75. Katherine says:

    Drew Latta wrote:

    Well, the methane burn off thing is interesting, but it might never be activated. A lot of these WWTPs use excess methane from sludge digestion to heat the digester and other processes. Depending on the temperature their digester runs at it might be heated most or all of the time, and they might hardly ever flare the methane.

    Given the discoloration on the top of the burnoff torch, I think “never” is out of the question.

  76. M White says:

    B Kerr (14:12:35) :

    Where are all the UK sites?
    Can anyone help?

    You may like to check this page out

    http://badc.nerc.ac.uk/data/ukmo-midas/

    Scroll down to “Find your weather station” and download
    Alternatively goto

    http://badc.nerc.ac.uk/googlemap/midas_googlemap.cgi

  77. Steven H says:

    With regard to sampling errors and possible misreadings, the following CNN report contains testimonials from Inuit concerning the “decline” in Polar Bear numbers as being a sampling error on the part of briefly visiting researchers:

    http://edition.cnn.com/video/#/video/tech/2008/11/19/intv.kangaroo.dna.graves.cnn

    Obviously, being hunters, they may be biased, but the testimony at least seems sincere and heartfelt.

  78. Steven H says:

    OFFLIST- TO THE MODERATOR:

    In my recent post I mistakenly added the wrong link. The correct one for the Inuit dispute of polar bear decline, at least in their area, was:

    http://edition.cnn.com/video/#/video/tech/2008/11/20/mcginty.can.polar.bears.pt.3.itn

    If you do post the comment, please do correct the link!

  79. Philip_B says:

    George E. Smith (20:06:47), you raise a number of points.

    1. There is a different rationale for each adjustment, and some are better documented than others.

    The basic problem is we cannot go back and redo temperature measurements from the past. Therefore, in order to determine the temperature in the past, the existing temperature measurements (at the existing sites) must be used and any deficiencies ‘fixed’ by adjustments. (A rationale I don’t agree with. See below)

    2. You then get into gridded temperature averages.

    Determining the average temperature and trend of an area by random sampling, and by weighting and adjusting measurements across the area, are different and, as far as I am concerned, mutually exclusive methods.

    Random sampling by definition deals with any and all biases that affect station siting and hence you need to know nothing about the biases. Weighting and adjusting assumes you know what the biases are in order to weight and adjust for them.

    3. I agree with you that the Forcings model is invalid and the GCMs are based on invalid premises.

    BTW, the way to approximate random siting is to use only sites that are remote from human influences. What I call pristine locations. Stations on remote islands are especially good candidates because the locations of remote islands are close to ‘random’ for the purposes of determining the Earth’s climate.

  80. Mike Bryant says:

    Great article Evan!

  81. George M says:

    To: Ray Reynolds (14:33:12) and others.
    I wouldn’t worry about sprinkler systems. Look at that fire hose lying in the grass. See how the near end looks like it was just disconnected from the adjacent hydrant and dropped in the grass, after the other end was used to water the grass, the MMTS, and everything else in the area? Ever tried to hold a fire hose single-handedly? Why else would it not be rolled up neatly like the one in the lower left corner of the photo if it wasn’t used regularly? And that sure is lush green grass. I wonder where they get their fertilizer?

  82. LG Harris says:

    I can think of about a hundred ways that a site could produce anomalous warmer readings. Is there any way that a site could produce anomalous cooler readings?

    REPLY: Fast growing bushes/trees near the sensor could produce shade (we’ve seen some of those too), The sensor could be relocated to shade areas, the sensor could malfunction.

    Water in the form of a lawn sprinkler near the area has the effect of reducing Tmax if run during the day, but the increased humidity at night will elevate Tmin.

    There’s not a lot of cool biases. – Anthony

  83. Grant Hodges says:

    OT headsup: “WASHINGTON (Reuters) – Much of the United States can anticipate a mild winter, with warmer-than-normal weather forecast from the Rocky Mountains to the Appalachian Mountains through February, government forecasters said on Thursday.

    The National Oceanic and Atmospheric Administration said the greatest chance for above normal temperatures was expected in Missouri, eastern Kansas, Oklahoma and Arkansas, with a lesser probability extending into southern Wisconsin, western Ohio and Texas.

    The agency’s winter forecast also predicted an equal chance for temperatures to be normal, above normal or below normal on the East Coast and in the western United States, extending into Montana and northern Minnesota.

    The forecast could be good news for the Northeast and the Midwest, which are the largest users of heating oil and natural gas, respectively, in the United States.
    http://www.reuters.com/article/domesticNews/idUSTRE4AJ4V020081120
    Of course, right now here in Indiana we are running 5-10 degrees below normal. I have changed sides. I am PRAYING for global warming.

    Grant

  84. George E. Smith says:

    I don’t know exactly how Dr. James Hansen computes GISStemp anomalies or even why he does that.
    But determining the mean surface temperature of the earth is inherently very simple. You put some thermometers in the ground (solid and liquid ground), and then you read them all at the same time, multiply each by the area representing each thermometer, add them up, then divide by the total surface area of the earth to get the mean. You then repeat the process periodically to update the temperature, which will change because the sun moves to a different place over the earth. So a complete year of records would seem to fairly represent the global average for climate purposes (whatever climate purposes there may be in even doing this).
    Now the SI units of length and time, are the Metre, and the second, so I recommend you put one thermometer at the center of each square meter of the surface, and you read them all at the same time, once per second. How easy is that?
    Well that’s a lot of thermometers, and if you calculate how fast the sun is moving, I think at the equator it moves about 500 meters per second or thereabouts, so it hops over a lot of thermometers. So maybe it is more suitable to place one thermometer at the center of each 1000 metre square on the surface rather than each metre. Now the sun is in each square for about two successive readings, and this has the advantage of reducing the number of thermometers by a factor of a million; very good for the budget.

    Well that is still one hell of a lot of thermometers, and people don’t tend to notice how much the temperature changes in just one second, so maybe we don’t need to read it that often.

    So how can we determine how often we really need to read each thermometer, and how many do we really need.

    What I am hearing from these discussions is that the climatology community thinks it is quite sufficient to read the thermometers twice a day, or just get the max and min readings each day, and not worry about when you actually read them. Well, now you never would actually know what the mean temperature was at any moment, because you never ever have a true current set of readings. They also seem to think that a thousand or 10,000 or even 100,000 thermometers globally are more than enough.

    And if you distribute them randomly, they think you get a more believable result. Nobody seems to realize that one perfectly reasonable result of a completely random positioning would be if every thermometer ended up in exactly the same place; that is no more unlikely than any other random placement.

    What we are talking about here is a continuous function of basically two variables; one is time, and the other is spatial location, and it isn’t possible to measure that function at every possible place and time; so we have to go to a SAMPLED DATA SYSTEM. We have to sample this two variable function, in time and space; and we have to do it in such a way that we don’t lose any important information.

    The theory of sampled data systems, which apparently climatologists know absolutely nothing about, is a well studied branch of mathematics, and the modern world wouldn’t work without it. Our entire global communications networks and hardware depend on sampled data systems, and the communications engineering community lives and breathes sampled data theory.
    Sampled data system theory says that it is possible to sample a continuous function in such a way that the continuous function can be completely reconstructed from the recorded samples; PROVIDED YOU COMPLY WITH CERTAIN RULES.

    Now when I say, the continuous function can be completely reconstructed, that means that you will be able to determine the value of that continuous function, not only for the specific points at which it was sampled; but for ANY other point in the continuous function; there is NO loss of information.

    So WHAT ARE THE RULES? Well, first of all, the continuous function must be “band limited”. That means it may only contain signal frequencies (cyclic changes) less than some maximum frequency called the bandwidth of the signal. For our climate puzzle, those frequencies would be cyclic changes in time and also in space.
    For example, if we wanted to sample say an audio frequency continuous signal from say a symphony orchestra, we might limit the bandwidth of the signal to 20 kHz, in the belief that any signals at higher frequencies would be inaudible, and so eliminating them would result in no loss to the music listener.
    The second, and most important, rule says that the band limited signal must be sampled at a rate (frequency) that is at least twice the frequency of the band limit. So our orchestra signal would have to be sampled at a rate of at least 40 kHz.
    Now, the samples don’t have to be uniformly spaced along the independent variable axis; but if they are not, then you still have to be sure that at least one sample is taken during each and every half cycle of the highest signal frequency (the band limit). So that is what is wrong with random sample placement; it does not allow you to take fewer samples; in fact it demands that you take more samples, because the samples can never be further apart than half the wavelength of the highest frequency of signal change; but they can be closer, and that means more samples.
    So the most economical sampling regimen is to take uniformly spaced samples.

    If you do that, SDS theory says you can completely recover the original continuous function from just those discrete samples. In the communications industry the gaps between those regular samples don’t go to waste; they simply put samples of a completely different signal or thousands of different signals, into the gaps, and then sort them out to distribute to wherever each signal was supposed to go.

    Now the problem of high fidelity reconstruction is itself quite complicated; but it is a well understood discipline.

    The rule that says you need (at least) one sample per half cycle of the highest frequency in the band limited signal is called the Nyquist criterion, after the scientist (Bell Labs, I seem to recall) who first announced it.

    So what happens if you don’t comply with the Nyquist criterion in your sampling regimen? In other words, what if you sample a signal at the Nyquist rate, but your signal contains information at higher frequencies than half that rate?

    If the band limit is B and you sample at a 2B rate, suppose you have a signal at a frequency B+b, outside the band limit you designed for.
    After you sample that signal at a rate 2B, you will find on reconstruction that you now have a completely fictitious (noise) signal at a new frequency of B-b. It is total garbage, and since its frequency is B-b, it is inside your signal bandwidth; so it is impossible to remove without throwing away useful real signal information.
    Suppose your errant out of band signal is at a frequency of 2B, the same as the sampling rate, and twice the band limit.
    Well now you have an erroneous noise signal at a frequency of B-B, which is zero; and the zero frequency signal is in fact the average value of your recovered signal.
    So you only have to violate Nyquist’s criterion by a factor of two, and you can no longer correctly compute the average value of your recovered continuous function; it is permanently corrupted by “aliasing” noise.
    All practical sampled data signal processing systems include an “anti-aliasing filter” which limits the bandwidth of the signals to be processed to less than half the system sampling rate.
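
    To make that aliasing arithmetic concrete, here is a small Python sketch; the band limit, the offset b, and the record length are arbitrary numbers picked only for the illustration. A tone at B+b shows up at B-b after sampling at the 2B rate, and a tone at 2B collapses to zero frequency and shifts the recovered average.

    import numpy as np

    B = 50.0      # intended band limit (Hz)
    fs = 2 * B    # sample at exactly the 2B rate
    b = 7.0       # how far the errant signal sits above the band limit
    t = np.arange(0, 4.0, 1 / fs)   # 4 seconds of uniform samples

    # A tone at B + b = 57 Hz, outside the band limit...
    x = np.cos(2 * np.pi * (B + b) * t)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    peak = freqs[np.argmax(np.abs(np.fft.rfft(x)))]
    print(peak)            # 43.0 Hz, i.e. B - b: a fictitious in-band signal

    # ...and a tone at 2B (the sampling rate itself) aliases to zero frequency,
    # so it appears as a spurious shift in the average of the recovered signal.
    y = 0.5 * np.cos(2 * np.pi * 2 * B * t)
    print(np.mean(y))      # about 0.5 instead of 0: the mean is corrupted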

    Movies, television, hi-fi CDs, and digital cameras are all examples of sampled data systems, and many of them exhibit aliasing noise. In the movie or TV horse opera, the wagon wheels go backwards because the frame repetition rate is too low to correctly sample the moving wheel spoke images.

    So now would somebody like to try and convince me that GISStemp actually derives anything vaguely akin to the global mean temperature of any climatically useful portion of the planet’s surface (five feet over the weber or wherever)?

    It is total garbage; and even if the sampling system was correct, the result you might obtain has absolutely no scientific validity or meaning whatsoever, as far as the global climate is concerned.
    Hansen’s religious ritual, if rigidly applied, determines occasional new values of a quantity called the GISStemp anomaly, which has no scientific connection to anything else, including global climate.

    George

  85. PaulM says:

    B Kerr, Colin Aldridge, M White
    For anyone interested in discussing UK sites I have set up a thread at
    http://www.climateaudit.org/phpBB3/viewforum.php?f=6
    The locations of the sites GISS uses are interesting.

  86. Tom Jefferson says:

    The obviously incompetent and fraudulent temperature sampling, manipulation, and publication are unacceptable.

    We need a program to STOP this COUP and restore faith and trust in science.

    OT? Science is supposed to be a pure pursuit of the TRUTH… do we have anything like that now?

    Also, if you are trying to control the masses… revising history is imperative to support your goals.

    Temperature Data, Ocean Water Levels, Glacial Masses, etc., are being sampled for such an insignificant slice of time that NO ONE, especially a real SCIENTIST, can be even remotely honest in presenting any of this as anything more than an academic exercise.

    “Retroproxy (22:00:55) :

    Slightly off topic, but it’s another example of the nonsense behind AGW, I actually heard Arnold Schwarzenegger blame the recent wildfires in Southern California on global warming. Then we just learned that a bonfire set by college students caused one of the fires. But the other ones must have been caused by global warming!”

    Arnold Schwarzenegger chose to ignore L.A.’s Historical Santa Ana Winds AND the Historical Fires they support – at least 10,000 years.

    The Science of CO2 migration in ICE CORES prevents ANY honest presentation of that data.

    The Science of SOILS along the coastlines does not seem well understood, and the sinking (subsidence) of soils is not included in calculations.

  87. Old Coach says:

    George E. Smith

    Nobody seems to realize that one perfectly reasonable result of a completely random positioning would be if every thermometer ended up in exactly the same place. That is no more unlikely than any other random placement.

    The above statement is false, although the rest of your arguments are logical. Garbage in = garbage out, as we say in the land of data collection. We seem to be spending a lot of time trying to optimize our computer climate models based on suspect data.

  88. Old Coach says:

    Correction:
    That second paragraph should not be in italics. italics on italics off…

  89. To Paul M re UK sites

    Thanks. I am registering and will contribute.

  90. JohnD says:

    “…one of the worst examples of thoughtless station placement…”

    The assumption of “thoughtless station placement” may be as misplaced as the station itself.

  91. TheGreenBaron says:

    You know, the idiotic placement of some sensors aside, one thing I have considered extensively is this: the US Navy, Royal Navy, and former USSR’s military have been making fairly accurate measurements of weather, temperature, and polar ice density since at least the ’50s.

    Why has neither side of the ‘debate’ published any findings that are based on this? As far as I can tell, no one has viewed this data extensively on either side of the aisle, and one would think that the eternal naval fixation with weather would produce some findings one way or the other.

    BTW: Personally, I find the evidence of man-made global warming somewhat compelling, though not concrete. Trying to draw ridicule to the idea based on the fact that some weather sensors are poorly placed does not mean that every weather sensor is poorly placed. There is one not far from my parents’ house that is in an open field with nothing but hay for company. Does that make it more or less accurate?

    Further, yes, we can alter the planet in large-scale and drastic ways, and the idea that our atmosphere might react in a similar manner to other planets when the same situation is applied is only common sense. Given that we can observe the effects of large amounts of carbon dioxide gas on other worlds (Venus, Mars), would it not stand to reason that increases in our own levels might have similar results?

    I doubt that any argument of mine will convince people of my views, as I’ve seen science and cynicism confused on several occasions (I might with glee recall the man in the Royal Society who so loudly proclaimed that no amount of evidence would convince him that the supernatural was real). While debunking and fresh views are a requirement for good science, there comes a point where it becomes like the scientists in the employ of the tobacco companies trying to convince people that smoking is healthy and cancer is good for you.

  92. sod says:

    i was told that hot air tends to RISE. did anything change recently?

  93. George E. Smith says:

    Well, GreenBaron, just what is your evidence that large amounts of carbon dioxide on Venus and Mars are having effects? By the way, what are those effects that “we can observe”?
    Venus is about 2/3 of the earth’s distance from the sun, so it receives about 1.9 times as intense sunlight, and it has only 90% of the earth’s gravity, yet its atmospheric pressure is many times that of earth. It clearly isn’t composed of the same gases as earth has. Its surface temperature is so high that the CO2 absorption mechanism is quite different from what happens on earth.
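
    (As a quick check of that intensity figure, assuming Venus sits at roughly 0.72 of the earth’s distance from the sun and sunlight simply falls off as the inverse square of distance, a couple of lines of Python give the same number:)

    # Inverse-square check: Venus at ~0.72 AU versus earth at 1 AU.
    d_venus_au = 0.72
    print(round((1.0 / d_venus_au) ** 2, 2))   # ~1.93, consistent with "about 1.9 times"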

    CO2 is quite transparent to visible light, yet we don’t see the Venus surface, so those clouds that completely cover Venus all the way to the ground are clearly not CO2, and they can’t be water either, because the temperature is too high for any liquid or solid water to exist.

    I could go on and on; but I am sure you can google Venus as easily as I could, and find out the properties of that planet.

    We have absolutely no historic data for either Mars or Venus, that would show that CO2 has had any historic effect on either planet’s climate.

    If you find the evidence for man made global warming somewhat compelling, perhaps you could enlighten the rest of us who see no evidence whatsoever; let alone anything compelling.

    ALL of the peer-reviewed evidence relating earth’s global surface temperatures and the atmospheric CO2 concentration shows unequivocally that it is the warming or cooling that produces the increased or decreased CO2, and not the other way round.

    You see the wolf and the lamb drinking in the river, and you blame the lamb for the disturbed muddy water; even though it is the wolf that is upstream.

    How compelling is that kind of argument?

    George

  94. Tom Jefferson says:

    Yes, SOD, hot air still rises… at least on earth without microclimate-related downdrafts.

    sod (07:34:01) :

    i was told that hot air tends to RISE. did anything change recently?
    ________________________________________
    Also, George stated it very well indeed…

    George E. Smith (08:38:15) :

    Well, GreenBaron, just what is your evidence that large amounts of carbon dioxide on Venus and Mars are having effects? By the way, what are those effects that “we can observe”?
    Venus is about 2/3 of the earth’s distance from the sun, so it receives about 1.9 times as intense sunlight, and it has only 90% of the earth’s gravity, yet its atmospheric pressure is many times that of earth. It clearly isn’t composed of the same gases as earth has. Its surface temperature is so high that the CO2 absorption mechanism is quite different from what happens on earth.

    CO2 is quite transparent to visible light, yet we don’t see the Venus surface, so those clouds that completely cover Venus all the way to the ground are clearly not CO2, and they can’t be water either, because the temperature is too high for any liquid or solid water to exist.

    I could go on and on; but I am sure you can google Venus as easily as I could, and find out the properties of that planet.

    We have absolutely no historic data for either Mars or Venus, that would show that CO2 has had any historic effect on either planet’s climate.

    If you find the evidence for man made global warming somewhat compelling, perhaps you could enlighten the rest of us who see no evidence whatsoever; let alone anything compelling.

    ALL of the peer-reviewed evidence relating earth’s global surface temperatures and the atmospheric CO2 concentration shows unequivocally that it is the warming or cooling that produces the increased or decreased CO2, and not the other way round.

    You see the wolf and the lamb drinking in the river, and you blame the lamb for the disturbed muddy water; even though it is the wolf that is upstream.

    How compelling is that kind of argument?

    George
    _____________________________________

    I would like to add that “compelling” is quite easy for those unwilling to observe and review with an open mind.

    Tom

  95. Steve M. says:

    Yet another case of “make the past cooler, to make the present seem warmer.” I don’t suppose that GISS has ever released the algorithm used to “homogenize” the data.
