Weather Channel Founder John Coleman: There is no significant man-made global warming at this time

John Coleman, the founder of The Weather Channel, has written an open letter in which he claims the theory of anthropogenic climate change is no longer scientifically credible. So far The Express, a major British newspaper, and the American news service WND have provided favorable coverage of the letter. The Express article has also been linked by the Drudge Report, giving it wide exposure. The full text of the letter is as follows:
_______________________________________

Dear UCLA Hammer Forum officials,

There is no significant man-made global warming at this time, there has been none in the past and there is no reason to fear any in the future. Efforts to prove the theory that carbon dioxide is a significant “greenhouse” gas and pollutant causing significant warming or weather effects have failed. There has been no warming over 18 years. William Happer, Ph.D., Princeton University, Richard Lindzen, Ph.D., Massachusetts Institute of Technology, Willie Soon, Ph.D., Harvard Smithsonian Observatory, John Christy, Ph.D., University of Alabama and 9,000 other Ph.D. scientists all agree with my opening two sentences.

Yet at your October 23 Hammer Forum on Climate Change you have scheduled as your only speakers two people who continue to present the failed science as though it is the final and complete story on global warming/climate change. This is a major mistake.

I urge you to re-examine your plan. It is important to have those who attend know that there is no climate crisis. The ocean is not rising significantly. The polar ice is increasing, not melting away. Polar Bears are increasing in number. Heat waves have actually diminished, not increased. There is not an uptick in the number or strength of storms (in fact storms are diminishing). I have studied this topic seriously for years. It has become a political and environment agenda item, but the science is not valid.

I am the founder of The Weather Channel and a winner of the American Meteorological Society honor as Broadcast Meteorologist of the Year. I am not a wacko flat Earther. Nor am I a “paid shill” (as has been claimed) of the Koch Brothers. I am a serious Professional. I am strongly urging you to reconsider your plan.

I can be reached at 858-xxx-xxxx (redacted by Anthony) and will be pleased to discuss this matter with you and answer questions. I will be happy to provide links to all of the points I have made in this email. As a quick scientific reference you may wish to look at the website of the Non-governmental Panel on Climate Change. http://climatechangereconsidered.org/

My best regards,
John Coleman

A copy of this email has been supplied to The LA Times, KCBS/KTLA and NBC4 Los Angeles

(h/t to Eric Worrall for the reminder. John sent me the text of the letter two days ago)

In The Express article they add:

Climate expert William Happer, from Princeton University, supported Mr Coleman’s claims.

He added: “No chemical compound in the atmosphere has a worse reputation than CO2, thanks to the single-minded demonization of this natural and essential atmospheric gas by advocates of government control and energy production.

“The incredible list of supposed horrors that increasing carbon dioxide will bring the world is pure belief disguised as science.”

In 2010 a high-level inquiry by the InterAcademy Council found there was “little evidence” to support the IPCC’s claims about global warming.

It also said the panel had purposely emphasised the negative impacts of climate change and made “substantive findings” based on little proof.

166 thoughts on “Weather Channel Founder John Coleman: There is no significant man-made global warming at this time”

  1. Here’s proof, and these are only surface measurements.

    The warming we’ve seen in temp series is due to the actual processing of the data.

    • Mark Elliot, when making his debut with TWC, was forced to make a statement that he had converted from the science of “it’s not man-made global warming” to the dark side of CAGW… over the air! It was an awkward moment; the look on his face told the story: “I’m getting paid for this.” I was a fan of TWC until John Coleman “retired” from it. Now it is a political arm of the worst kind.

      Good for Mr. Coleman to continue to speak out. Same with Joe Bastardi et al with WeatherBell and others who keep the path of honesty and integrity in their science.

      • Thom commented

        Micro: Surface temperature of what (US/World?). Also link so I can post. Thanks.

        95 million surface station records, from around the globe, from the NCDC Global Summary of Days dataset.
        This graph is fresh from this morning, where I added average temps to min and max. For this graph, just right-click and copy the URL link; the data (a previous version, without average temps) is at the URL in my name, and you can follow the Science 2.0 link for more wordy stuff.
        http://www.science20.com/virtual_worlds

      • milodonharlani commented

        Before or after adjustments?

        NCDC says here ftp://ftp.ncdc.noaa.gov/pub/data/gsod/readme.txt they do some QA

        As for quality control (QC), the input data undergo extensive
        automated QC to correctly ‘decode’ as much of the synoptic data as
        possible, and to eliminate many of the random errors found in the
        original data. Then, these data are QC’ed further as the summary of
        day data are derived. However, we expect that a very small % of the
        errors will remain in the summary of day data.

        I have found some bad temps that I filter out (above 199F and below -199F, so I presume they are actually bad). I do no adjustments.

    • Thanks. Always wondered why we were never shown a minimum temperature map. We are in a cooling trend, are we not? At first cursory look, it seems the warmer periods follow a colder minimum. So the world compensates for a maximum trough by overheating for a period.

      • Yup. Climate is cyclic on various time scales. On the multidecadal scale during the secular Modern Warm Period, there was an initial warming, pro-trend cycle in the mid-19th century, followed by a counter-trend cooling in the late 19th to early 20th century, then a pro-trend warming in the 1920s to ’40s, followed by a counter-trend cooling from the ’40s to ’70s, then the recent pro-trend cycle of c. 1977-96 (coincident with the warm phase of the PDO), followed by the present counter-cyclic cooling trend.

        Similar cyclic fluctuations have been observed before, during & since the hot Holocene Climatic Optimum & in the secular, centennial scale cooling of the LIA, secular warming of the Medieval WP, secular cooling of the Dark Ages CP, secular warming of the Roman WP, secular cooling of the Greek Dark Ages CP, secular warming of the Minoan WP, etc. The same pattern emerges from paleoclimatic records of previous interglacials & glacials as well.

      • Jack commented

        Thanks. Always wondered why we were never shown a minimum temperature map. We are in a cooling trend, are we not? At first cursory look, it seems the warmer periods follow a colder minimum. So the world compensates for a maximum trough by overheating for a period.

        A global view distorts the minimum temp trends (which is another way they’re lying to everyone).
        What looks to be happening is that max temps seem to be “flat”, while min temps fluctuate regionally.


        Also note, I don’t use a baseline, I start at 0

      • Man Bearpig commented

        Interesting chart. In 1972/3 the minimum is greater than the max. And again in 1948-1949.

        I hadn’t paid much attention to that. But remember, this is an “Anomaly” graph.
        In 72/73, for some reason, the number of surface records dropped, then rebounded. The 40’s weren’t a lot better.

        Normally I don’t even include the 40’s.

    • Yep, GIGO, aided and abetted by ailing algore-ithms that “Harry Readme” was forced to port/maintain/etc.

      I guess, in other words, if the only tool you have is a Hammer (conference), every problem looks like a nail… er, CO2-induced globull warming…

      Good on John Coleman for standing up to the warmistas and bringing out true science!

      • Jeff commented

        Yep, GIGO

        The data is lacking, but it’s more than enough to disprove AGW, if you look at the actual data and don’t butcher it.

        It’s also interesting that GISS and BEST know about the work I’ve done, and while Mosh has been quick to tell me it’s wrong, they have to know exactly the same thing I’m showing, and they still keep shoveling $hit out to the public.

        To be clear, I subtract yesterday’s temp (min/max and now average) from today’s, station by station, then average these values together, after defining an area to examine and selecting the stations in that area that meet the sampling requirements I specify (samples per year, and number of years that have at least that many samples; these were 240 samples per year for at least 10 years). Nothing more. I’ve posted my code, and the data it generates that I use to create graphs.

        That’s it.
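The procedure just described can be sketched in a few lines of Python (a minimal illustration, not the author’s actual code; the station names and temperatures are hypothetical):

```python
# Sketch of the day-over-day differencing described above:
# for each station, subtract yesterday's temp from today's,
# then average all the differences together.
# Station names and temperatures below are hypothetical.

def day_over_day_diffs(series):
    """Return today-minus-yesterday differences for one station."""
    return [today - yesterday for yesterday, today in zip(series, series[1:])]

stations = {
    "STN_A": [50.0, 52.0, 51.0, 53.0],   # daily mean temps, deg F
    "STN_B": [60.0, 59.0, 61.0, 60.5],
}

all_diffs = []
for temps in stations.values():
    all_diffs.extend(day_over_day_diffs(temps))

average_diff = sum(all_diffs) / len(all_diffs)   # average of all six differences
print(round(average_diff, 4))
```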

      • Doesn’t subtracting yesterday’s temp. mean you are looking at change day-to-day? Changeableness is not an issue; cumulative change is. You are differentiating rather than integrating.
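Brian H’s distinction can be made concrete: day-to-day differences telescope when summed, so their sum is just the net change from the first day to the last (the series below is hypothetical):

```python
# Day-over-day differences "telescope": summing them gives
# last - first, so only the cumulative change survives summation,
# even though each daily difference is mostly weather noise.
temps = [50.0, 57.0, 48.0, 55.0, 51.0, 53.5]  # hypothetical daily temps

diffs = [b - a for a, b in zip(temps, temps[1:])]
print(sum(diffs))            # equals temps[-1] - temps[0]
print(temps[-1] - temps[0])
```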

      • Brian H commented

        Doesn’t subtracting yesterday’s temp. mean you are looking at change day-to-day? Changeableness is not an issue; cumulative change is. You are differentiating rather than integrating.

        I’m going to try to do this adequately, but with the least amount of typing that I can.

        15+ years ago I was “skeptical” about CAGW. A few years later I started doing astrophotography with a digital camera.

        Part of that process was to capture the sensor noise due to temperature-dependent leakage currents in the photodiode that’s the basis of the camera I was using. You did that by taking an exposure of the same length as your image sub, but with the lens cap on while the camera was at the same temp. Since that impacted imaging time by 100%, I started using a library of darks, but you had to log temps so you could find a matching dark frame at the same temp. After a while of this, it dawned on me just how much temps dropped when the Sun set, through the CO2 blanket (W.E. :) ). 10-20F from sunset to midnight was possible. It seemed ludicrous that a clear sky that could lose 2-3 degrees per hour was able to cause “unprecedented warming”. I decided to go in search of nightly rate-of-change info to see if I could find a loss of nightly cooling in the temperature record. I was planning on finding millions of clear-sky nighttime cooling temps from around the world and binning them into similar conditions one year to another. What I found was the GSoD data set with min/max temps. So it wasn’t an hourly data set, but one based on a 24-hour period instead of an hour.
        I set about creating a set of yesterday’s rising temps, and last night’s falling temps, for every station, for every pair of days (yesterday and today). But that difference is the same as today’s min minus yesterday’s min. So, a rate of change on a daily basis; but on an annual basis the warming ups should cancel the cooling downs, and the remainder is the yearly warming when it’s summed, and a measure of the rate of change when averaged. With max temps, while not what I was originally planning on looking at, I was amazed at how small the average of the differences was (which, remember, is a sum of diffs / count of samples).
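The identity claimed here, that yesterday’s rise minus last night’s fall equals today’s min minus yesterday’s min, checks out algebraically; a toy example with hypothetical temperatures:

```python
# Checking the identity described above:
# (yesterday's rise) - (last night's fall) == today's min - yesterday's min.
# All values below are hypothetical.
min_yesterday, max_yesterday = 48.0, 66.0
min_today = 51.0

rising = max_yesterday - min_yesterday    # warming during yesterday
falling = max_yesterday - min_today       # cooling overnight into today
print(rising - falling)                   # 3.0
print(min_today - min_yesterday)          # 3.0
```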
        Because of the way I did this, you can easily switch between Sums and Averages.
        The 74 year average of average temp difference is -0.00035F, the sum of 95 million samples is -0.0257. If you divide the Sum by the number of years you get the average, and so on.

        Now, I also build a daily value report, same basic process: average the difference of all of the stations in an area by day, a daily rate of change. If the area is large enough you can average out weather. If that area is, say, a range of latitude, in 10-degree bands for all longitudes, in the extratropics you get a sine wave that is the rate of daily change. Since there is then a uniform period, and the period of daylight changes throughout the year, you can see how much warming there is as the ratio of day to night changes: about the middle of the year this average changes from being positive to negative, basically a rate of change, from which you can get a slope.

        I did this for the US (as it’s the most sampled location on the planet) for each year, this slope does have a slight trend.

        But it looks like near the end of the data there might be an inflection point where it’s changing direction, about the time the pause started. So while max temps day over day for all of the included stations are near zero (-0.00035F), it is possible that the integral of the day’s temps is a little larger than it used to be. But this might just go up and down, for instance if the warming is due to an increase in warm water vapor from the oceans moving over land and taking longer to cool off. But this is getting into attribution, and I just don’t know.

    • As a side comment, I can see when the files on sourceforge get downloaded, and they tell me where in the world I presume the ip address is from.
      Someone in Spain has been busy today, whoever that is, if you don’t mind, raise your hand (or shoot me an email, which you should be able to do from SF).

      • Mi,

        If you’ve taken data from the NCDC and differentiated the day-by-day temps to get a measure of the change, your data should essentially be flat, because you’ve decreased the signal-to-noise ratio. Anyone studying the changes in global average temperature knows that you’re looking for a very small effect, so you shouldn’t be dropping the SNR in your data. If you do that you end up with flat graphs.
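The signal-to-noise point can be demonstrated numerically (hypothetical white noise, not temperature data): first-differencing an uncorrelated series roughly doubles its variance, which is one way differencing can hurt SNR.

```python
# First-differencing white noise roughly doubles its variance:
# Var(x[t] - x[t-1]) = 2 * Var(x) for independent noise, so the
# noise floor rises while a slow trend contributes almost nothing
# to each difference.
import random
import statistics

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(100_000)]
diffs = [b - a for a, b in zip(noise, noise[1:])]

print(round(statistics.variance(noise), 2))   # close to 1.0
print(round(statistics.variance(diffs), 2))   # close to 2.0
```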

        BTW, there was a much simpler analysis done here that looked at record low temps vs record high temps for nighttime measurements for the same NCDC dataset:

        “The study also found that the two-to-one ratio across the country as a whole could be attributed more to a comparatively small number of record lows than to a large number of record highs. This indicates that much of the nation’s warming is occurring at night, when temperatures are dipping less often to record lows. This finding is consistent with years of climate model research showing that higher overnight lows should be expected with climate change.”

        http://www.climatecentral.org/blogs/record-warm-nighttime-temperatures-a-closer-look

        –CG

      • c grier commented on

        If you’ve taken data from the NCDC and differentiated the day-by-day temps to get a measure of the change, your data should essentially be flat, because you’ve decreased the signal-to-noise ratio. Anyone studying the changes in global average temperature knows that you’re looking for a very small effect, so you shouldn’t be dropping the SNR in your data. If you do that you end up with flat graphs.

        Isn’t cancelling out all of the noise exactly what you’d want to do to detect a very small signal?
        That said, I believe what I do offers great precision: is there any retention of yesterday’s warming? And the answer is no, based on 95 million measurements.

        BTW, there was a much simpler analysis done here that looked at record low temps vs record high temps for nighttime measurements for the same NCDC dataset:
        “The study also found that the two-to-one ratio across the country as a whole could be attributed more to a comparatively small number of record lows than to a large number of record highs. This indicates that much of the nation’s warming is occurring at night, when temperatures are dipping less often to record lows. This finding is consistent with years of climate model research showing that higher overnight lows should be expected with climate change.”

        This isn’t the same; that is all based on absolute temperatures, while my procedure is all relative. But these yearly graphs are only part of what you can look at; there’s also the rate of change in temps as the length of day changes throughout the year, which you can compare to previous years. This is the crux of the climate change issue, and it’s eminently available in the data we have, not the data we’d like to have.

        The problem as I see it is that it shows the effect of Co2 is so small it’s barely measurable, it’s certainly nothing to worry about. What good is that?

        Let me start by saying that I’m not an expert. But no, you don’t want to use differentiation to try and reduce the noise. When looking for a small effect you need to integrate the data over time. Brian H was on the right track. By doing a “diff” analysis you are filtering the data and coming out with noise; all the signal is gone. If you want me to find a citation for this I’m sure there’s one on a statistics website somewhere.

        If you look at the first graph you posted on Oct. 23 at 9:43 you’ll see what I mean. The Max temp anomaly and the Average Temp Anomaly are almost *exactly the same*. This means that the high averages are being “cleaned up” in a different way than the low averages. (Or that the Max and Ave calculations are not doing what they are supposed to do) The Min. Temp Anomaly (whatever that really means) is showing big spikes because of bad data points in the data set.

        To put it a different way, truly random noise is hard to find, but eventually it should return to zero – if it is summed over a sufficiently long period. If you start looking for signals inside of noise you’ll be surprised what you find, or what you think you’ve found:
        http://noosphere.princeton.edu/

        I guess what I’m saying is that your specific findings should be reproducible by other teams, or using other data sets. So if there is a significance to the odd behavior in the early 70s, based on your analysis, it should show up when comparing to other work. Have you found any other studies by skeptics showing similar flat results over the same periods?

        –CG

      • c grier
        October 29, 2014 at 5:05 pm

        Let me start by saying that I’m not an expert. But no, you don’t want to use differentiation to try and reduce the noise. When looking for a small effect you need to integrate the data over time. Brian H was on the right track. By doing a “diff” analysis you are filtering the data and coming out with noise; all the signal is gone. If you want me to find a citation for this I’m sure there’s one on a statistics website somewhere.

        Then think of it as an anomaly. It’s basically daily sampling with the daily 24-hr solar cycle removed. What’s left is the seasonal signal + the weather’s signal + CO2 warming + any other underlying trends.
        Then it’s all integrated together.
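The decomposition described here can be illustrated with a toy series (hypothetical numbers: a small linear trend plus a 365-day seasonal sine). Averaged over whole years, the day-over-day differences cancel the cycle and leave the trend:

```python
# If a daily series is (slow trend + seasonal cycle), the
# day-over-day differences averaged over whole years leave
# just the trend: the seasonal ups and downs cancel.
import math

trend_per_day = 0.0001          # hypothetical underlying warming, deg/day
days = 365 * 10                 # ten whole "years" of 365 days

temps = [trend_per_day * d + 10.0 * math.sin(2 * math.pi * d / 365)
         for d in range(days + 1)]
diffs = [b - a for a, b in zip(temps, temps[1:])]
mean_diff = sum(diffs) / len(diffs)
print(round(mean_diff, 6))      # recovers ~0.0001 per day
```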

        If you look at the first graph you posted on Oct. 23 at 9:43 you’ll see what I mean. The Max temp anomaly and the Average Temp Anomaly are almost *exactly the same*. This means that the high averages are being “cleaned up” in a different way than the low averages. (Or that the Max and Ave calculations are not doing what they are supposed to do) The Min. Temp Anomaly (whatever that really means) is showing big spikes because of bad data points in the data set.

        Except they are not, and that min temp data is the basis of the average value, as it’s calculated from the min and max values.
        Again, what I was digging for is any sign of a reduction in nighttime cooling. But calculating a diurnal swing based on a single calendar day (which is what they’re doing) totally misses the point; what matters is how much temps went up yesterday and how much they fell that night, which requires data from 2 calendar days. The day-over-day min temp difference is this value.

        To put it a different way, truly random noise is hard to find, but eventually it should return to zero – if it is summed over a sufficiently long period. If you start looking for signals inside of noise you’ll be surprised what you find, or what you think you’ve found:

        A long period of time, or a lot of samples: I’m working with over 120 million samples; for most of the last 30 or so years there are 2 to 3 million samples per year.

        I guess what I’m saying is that your specific findings should be reproducible by other teams, or using other data sets. So if there is a significance to the odd behavior in the early 70s, based on your analysis, it should show up when comparing to other work. Have you found any other studies by skeptics showing similar flat results over the same periods?

        The early 70’s is an artifact of a very large reduction in samples.

        Other teams could probably reproduce this if they tried. The only real caveat is I don’t infill for missing samples and un-sampled areas. For instance, there is no arctic amplification in the actual surface measurements, so the only place it is coming from is the temp model being used; it’s made up. Now, there might be arctic amplification, but it’s not in the measurements we have.

        So, I don’t think your observation is correct. But I’m not an expert either, so if there is an expert who can either help make this better, or who can articulate some fatal flaw that I haven’t thought of over the last 5 or so years I’ve been working on this, please speak up. I’ve been writing on this topic for a while; did you look through much of that, or get any of the data from SourceForge?

      • If you’d like to take this off line, I’ll be happy to do that. You’ve obviously spent a lot of time running these numbers, so I’m sure you want to know if there’s something that’s gone missing.

        I think we’re talking past each other a bit, as you don’t seem to be checking my points against your data, or you don’t think I understand how you calculated the temperature changes. From the very top-most graph you posted on this page the Average and Max lines look nearly identical. That must mean there’s a problem in the graph or in how you are manipulating the data. Can you explain why those lines are essentially the same?

        But what you are really missing is that the very very small changes to the maximum rate of cooling during the night are not present in the data after you do your number crunching.

        To find small effects it is necessary to look for places where they are most apparent. In the case of your theory that would be an observed change in maximum cooling during the night periods. Looking at other cooling periods won’t show small effects as well. Take for instance the chart on your website that shows the temperature variations throughout a typical day. https://wattsupwiththat.files.wordpress.com/2013/05/clip_image00211.jpg

        Let’s use that graph: find the maximum cooling rate during the night, which looks like the rate at about 9:00 (if we can manage to filter out the noisy signals around 6:00pm). That’s your most significant number, and the place you are most likely to find that very very small influence of CO2. But note that in the NCDC data you use there are only two numbers per day – the whole day gets averaged so you have one up slope and one down slope. How can you ever pick out that very very small signal that was present at 9:00pm? The data you are looking for is swamped by other factors like clouds, rain, cold fronts, snow, and probably some man made effect on occasion.

        I tried to come up with an analogy, but this is the best I could do: Suppose you wanted to find out if my new car was having some kind of problem that was robbing power from the engine, and creating a very small, but very real deficit in the ability to accelerate. We know how far it takes for me to drive to work, so we should be able to tell if the car is slightly less powerful if we look at the average speed in the morning and the average speed in the afternoon. If we know these, and we know the distance, we can integrate to get the time. It should take a longer time if the engine is slightly less powerful, right? Obviously this approach would be silly, since having two data points simply isn’t enough data to allow us to integrate the numbers and get meaningful time measurements. You could do it, but your analysis would just have random values in the output, since there is sometimes traffic, or the car can run out of gas, or I end up stopping at every red light.

        You had a good thought about looking for rate-of-change in the data, but what you failed to do was to check that the numbers your algorithm produced were not overwhelmed by noise in the original data-set. I do think your graphs of Summer Slope and Winter Slope are very interesting, and might be showing some kind of 60 year cycle. Note also that the signal in those calculations is several orders of magnitude larger than CO2 warming.

      • c grier commented

        If you’d like to take this off line, I’ll be happy to do that.

        Here’s fine, until someone says otherwise I guess.

        You’ve obviously spent a lot of time running these numbers, so I’m sure you want to know if there’s something that’s gone missing.
        I think we’re talking past each other a bit, as you don’t seem to be checking my points against your data

        I pretty much have the code memorized, and I’ve tested it a lot. I did find a mistake a year ago: I thought I was looking at min diff, but was actually looking at max diff. I’ve been even more careful reviewing it since then. But I checked it again this morning for you :)
        There are 2 areas that do the work. During the read, for each station, I read all of that station’s data files in order and construct the differences by saving the prior day’s temp record, so that when I insert the data row I can calculate the difference values.

        ymxtemp - ymntemp,
        ymxtemp - to_number(trim(SubStr(file_line,111,6))),
        ydate,
        to_number(trim(SubStr(file_line,103,6))) - ymxtemp,
        ymxtemp,
        trim(SubStr(file_line,111,6)) - ymntemp,
        (ymxtemp - ymntemp) - (ymxtemp - to_number(trim(SubStr(file_line,111,6)))),
        ymntemp,
        trim(SubStr(file_line,25,6)) - ytemp,
        ytemp);
        ydate := to_timestamp(SubStr(file_line,15,8),'YYYYMMDD');
        ymxtemp := trim(SubStr(file_line,103,6));
        ymntemp := trim(SubStr(file_line,111,6));
        ytemp := trim(SubStr(file_line,25,6));
        else
        ydate := to_timestamp(SubStr(file_line,15,8),'YYYYMMDD') + 1 ;
        ymxtemp := trim(SubStr(file_line,103,6));
        ymntemp := trim(SubStr(file_line,111,6));
        ytemp := trim(SubStr(file_line,25,6));

        end if;

        else
        if ymxtemp > -199
        and trim(SubStr(file_line,111,6)) > -199
        and trim(SubStr(file_line,103,6)) > -199
        then
        ydate := to_timestamp(SubStr(file_line,15,8),'YYYYMMDD');
        ymxtemp := trim(SubStr(file_line,103,6));
        ymntemp := trim(SubStr(file_line,111,6));
        ytemp := trim(SubStr(file_line,25,6));
        else
        ydate := to_timestamp(SubStr(file_line,15,8),'YYYYMMDD') + 1 ;
        ymxtemp := trim(SubStr(file_line,103,6));
        ymntemp := trim(SubStr(file_line,111,6));
        ytemp := trim(SubStr(file_line,25,6));

        end if;
        end if;

        The y variables are “yesterday”; ymntemp is yesterday’s min temp. I check each min and max temp to make sure it’s in the range of ±199 F; if both aren’t valid, I skip them and go to the next record. Mean temp is calculated from min and max; they are identical for all records (i.e. mean temp = (min + max)/2). In the data record, temp (mean temp) is the 6 characters starting at position 25, max is at 103, and min is at 111. This line

        trim(SubStr(file_line,25,6)) - ytemp

        creates the Average temp diff, today’s temp – yesterday’s temp. This is the only code that creates this value.
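The field layout described here can be sketched in Python (a hypothetical illustration, not the author’s actual PL/SQL; the sample line is synthetic, padded only so the stated 1-based offsets line up, while real GSOD records carry more fields):

```python
# Sketch of the fixed-width field extraction described above.
# SubStr(file_line, pos, 6) in the original is 1-based, so the
# Python slice below starts at pos - 1. The sample line is synthetic:
# mean temp at position 25, max at 103, min at 111, six chars each.

def field(line, pos, width=6):
    """Extract a 1-based fixed-width numeric field."""
    return float(line[pos - 1:pos - 1 + width])

sample = " " * 24 + "  55.3" + " " * 72 + "  68.1" + "  " + "  47.9"

mean_t = field(sample, 25)
max_t = field(sample, 103)
min_t = field(sample, 111)
# The +/-199 F sanity filter mentioned in the comment:
valid = all(-199 < t < 199 for t in (mean_t, max_t, min_t))
print(mean_t, max_t, min_t, valid)
```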

        Then, I average them together here.

        when Type_ = 'Y'
        then
        Selectfield := 'avg(max_temp) as MaxTemp,
        avg(min_temp) as MinTemp,
        avg(rising_temp_diff) as Rising,
        avg(falling_temp_diff) as Falling,
        avg(Diff) * 365.25 as YrDiff,
        avg(Diff) as Diff,
        var_pop(Diff) as V_Diff,
        0.141 * count(Diff) * 2 / power(count(Diff) * 2,2) as Diff_Error,
        avg(case when temp > -199 and temp < 199 then temp else null end) as Temp,
        0.316 * count(temp) / power(count(temp),2) as Temp_Error,
        avg(MNDiff) as MNDiff,
        avg(MXDiff) as MXDiff,
        avg(AVDiff) as AVDiff,
        sum(MNDiff) as MNSum,
        sum(MXDiff) as MXSum,
        sum(AVDiff) as AVSum,
        avg(case when dewpoint < 9999 then dewpoint else null end) as DewPoint,
        avg(DewPt_to_RelHumidity(temp,dewpoint)) as RelH,
        avg(case when SEA_LEVEL_PRESSURE < 9999 then SEA_LEVEL_PRESSURE else null end) as SeaLevelPressure,
        avg(case when STATION_PRESSURE < 9999 then STATION_PRESSURE else null end) as StationPressure,
        avg(case when PRECIP < 99 then PRECIP else null end) * 365.25 as Rain,
        Count(temp) as Sample';
        Selectlist := 'avg(maxtemp) as MaxTemp,
        avg(mintemp) as MinTemp,
        avg(rising) as Rising,
        avg(falling) as Falling,
        avg(YrDiff) * 100 as YrDiff,
        avg(Diff) as Diff,
        var_pop(V_Diff) as V_Diff,
        avg(Diff_Error) as Diff_Error,
        avg(temp) as Temp,
        avg(Temp_Error) as Temp_Error,
        avg(MNDiff) as MNDiff,
        avg(MXDiff) as MXDiff,
        avg(AVDiff) as AVDiff,
        sum(MNDiff) as MNSum,
        sum(MXDiff) as MXSum,
        sum(AVDiff) as AVSum,
        avg(dewpoint) as DewPoint,
        avg(relh) as relh,
        avg(SEALEVELPRESSURE) as SeaLevelPressure,
        avg(STATIONPRESSURE) as StationPressure,
        avg(Rain) * 365.25 as Rain,
        sum(sample) as Sample';
        Grouplist := 'year';
        Insertlist := 'MaxTemp,
        MinTemp,
        Rising,
        Falling,
        YrDiff,
        Diff,
        V_Diff,
        Diff_Error,
        Temp,
        Temp_Error,
        MnDiff,
        MxDiff,
        AvDiff,
        MnSum,
        MxSum,
        AvSum,
        DewPoint,
        relh,
        SeaLevelPressure,
        StationPressure,
        Rain,
        Sample';
        InGrouplist := '''9999'' as year';
        GroupBy := '' ;

        If it’s a Y yearly report, I create a table of these values, and group them year, I do this on a set of stations I select by lat, lon and date.
        A single line makes the data in that chart.

        avg(AVDiff) as AVDiff

        This, BTW, is the same as how min and max temps are processed. I also do some different things trying to get an idea of error bars, but I don’t know what I’m doing so I don’t “publish” those. But this is the same source data that the core of all of the temp series is made from; there just aren’t that many stations. But if any of them are correct, which I think they are, they are all correct, or at least a faithful representation of the measured value. You’re welcome to get the code yourself; just follow the url in my name.
        So, I am comfortable that graph represents the data. I was surprised when I saw it; I don’t particularly understand it, same with the regional swings in min, but that is what the average of those values is, and that I’m sure about. This is why I give Mosh such a hard time: these are the fracking measurements. What everyone else does with them looks completely different; I have to presume it’s because they decompose the temp by location, and then use what they have to fill in everywhere they don’t have a value. What they do is so complicated they have to make up fake data to try and test it. What I’m doing a patient 4th grader could do on paper. What they have yet to figure out is that the more complicated something like this is, the more creative they have to be in coming up with ways that it might go wrong, and they have to make specific tests for them, but they can only test what they think of. All I’ve got to do is check some simple math. I also presume I know less about how the value might be wrong than the people who recorded it, and I don’t try to fix it.

        But what you are really missing is that the very very small changes to the maximum rate of cooling during the night are not present in the data after you do your number crunching.

        Yes, had I found hourly data instead of min/max data I would have. But I accepted the longer integration time of min and max, and fundamentally they are doing the same thing. But hourly data would have its own difficulties: how do you know if the swing in temps is the storm, or the clear skies after the passage of a cold front? Early on I spent a lot of time pondering just these differences, and decided what I was doing was the superior analysis. Plus, I can turn the Sun off and see its impact; no one else does that. I do the same processing, just aggregated daily. You can see how much a handful of seconds changes the surface temp, with all of the other things that clutter up this measurement averaged out.

        Oh, perfect analogy: from mph through the traps, and the weight of the car, you can tell how much hp it has, and how quick you should be able to go through the qtr mile, with the only real variable being tire spin, which artificially raises mph some. I love how an 8,000+ HP car can break 5-10 feet from the starting line, and still go quicker than really fast cars.
        That’s how you tell if you have a performance problem.

      • I think we’re still talking past each other. I do understand your program, and how you crunched the numbers. What I don’t think is happening is that meaningful data is coming out the other side. If you admit you don’t really know what all the output means, then you might want to look for help from someone with specific knowledge.

        You didn’t explain why the Max Anomaly and Average Anomaly are the same on the top graph. That looks really strange. Can you help me understand?

        I hate to get all technical, but there are techniques for reducing noise in a large dataset, particularly the “outliers” that seem to be showing in your graphs:

        In such cases it’s common to use the “interquartile range” (IQR), defined as the difference between the upper and lower quartiles, instead of the standard deviation, because the interquartile range is not affected by a few outliers. For a normal distribution, the interquartile range is equal to 1.34896 times the standard deviation. A quick way to check the distribution of a large set of random numbers is to compute both the standard deviation and the interquartile range. If the IQR is roughly 1.35 times the standard deviation, the distribution is probably normal; if the standard deviation is much larger than IQR/1.34896, the data set probably contains outliers, and the standard deviation without the outliers can be better estimated by dividing the interquartile range by 1.34896.
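        A minimal sketch of that check, using only the Python standard library; the sample sizes and the injected outlier values below are illustrative assumptions, not numbers from the dataset under discussion:

```python
import random
import statistics

def robust_sigma(data):
    """Estimate sigma from the interquartile range: IQR / 1.34896."""
    q1, _, q3 = statistics.quantiles(data, n=4)  # lower, median, upper quartile
    return (q3 - q1) / 1.34896

random.seed(42)
clean = [random.gauss(0.0, 1.0) for _ in range(10_000)]
dirty = clean + [100.0] * 50  # inject a few gross outliers

# On clean normal data the two estimates agree (both ~1.0)...
print(statistics.stdev(clean), robust_sigma(clean))
# ...but a handful of outliers wrecks the standard deviation
# while barely moving the IQR-based estimate.
print(statistics.stdev(dirty), robust_sigma(dirty))
```

        The design point is exactly the one in the comment: quartiles depend only on the ordering of the bulk of the data, so a few wild values cannot drag them far, whereas the standard deviation squares every deviation and is dominated by the outliers.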

        Your numbers obviously have a large amount of noise. You have not mentioned noise ANYWHERE in your response to my posts. It is the central problem. Anyone who’s done basic signal analysis knows that you can have a signal present, but if there’s too much noise the signal gets lost. If you don’t think the day to

        You said:

        Yes, had I found hourly data instead of min/max data I would have. But I accepted the longer integration time of min and max, and fundamentally they are doing the same thing. But hourly data would have its own difficulties: how do you know if the swing in temps is the storm, or the clear skies after the passage of a cold front? Early on I spent a lot of time pondering just these differences, and decided what I was doing was the superior analysis

        You actually think that less data is better than more? Are you serious? You are looking for a vanishingly small effect here. Remember that the temperature, on average, is changing by 1-2 deg per CENTURY. You think that you can pull that out of a dataset whose numbers are dominated by clouds, rain, snow, air movement, humidity, and barometric pressure? When I say more, I mean orders of magnitude greater.
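        The point about a small trend buried in large day-to-day noise can be made concrete with a toy simulation. All the numbers here (a 1.5 °C/century trend and 4 °C of daily weather noise) are illustrative assumptions, not measurements:

```python
import math
import random

def fit_slope(xs, ys):
    """Ordinary least-squares slope and its standard error."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    resid = [y - ybar - slope * (x - xbar) for x, y in zip(xs, ys)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, se

random.seed(1)
TREND = 1.5 / 36_500   # 1.5 °C per century, expressed in °C per day
NOISE = 4.0            # day-to-day weather scatter, °C (assumed)

def simulate(days):
    xs = list(range(days))
    ys = [TREND * x + random.gauss(0.0, NOISE) for x in xs]
    return fit_slope(xs, ys)

# A century of daily data: the trend emerges from the noise.
slope, se = simulate(36_500)
print(slope * 36_500, "+/-", se * 36_500)  # close to 1.5 °C/century

# One year of daily data: the uncertainty dwarfs the signal.
slope1, se1 = simulate(365)
print(se1 * 36_500)  # tens of °C/century of standard error
```

        With a century of daily readings the least-squares fit recovers the trend, but with a single year the standard error of the slope is tens of degrees per century, so any “trend” extracted from it is meaningless. That is the data-volume argument being made above.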

        I think you misunderstand the analogy I tried. A dragster performance test tries to minimize variables so they can get good data out – since the effects of hundredths of a second matter. Can you imagine if there were random puddles of water on the track? What if they used dirty air filters, or had another uncontrolled variable. Could the performance test pull out the very slight variations in engine performance?

        you said

        I was surprised when I saw it, I don’t particularly understand it, same with the regional swings in min, but that is what the average of those values is, that I’m sure about.

        Find someone who knows about signal analysis – maybe an RF engineer familiar with low-power radio communications. Ask them for help comparing your SNR.
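        As a rough back-of-the-envelope SNR, compare the trend accumulated over a decade against the daily scatter; both figures below are illustrative assumptions, not values taken from the graphs under discussion:

```python
import math

signal_amplitude = 0.15  # °C of trend accumulated over a decade (assumed)
noise_sigma = 4.0        # typical day-to-day temperature scatter, °C (assumed)

# Standard amplitude-ratio SNR in decibels, as an RF engineer would state it.
snr_db = 20 * math.log10(signal_amplitude / noise_sigma)
print(f"SNR ~ {snr_db:.1f} dB")  # roughly -28 dB: signal far below the noise
```

        Under these assumed numbers the per-sample SNR is deeply negative, which is why averaging over very large amounts of data is the only way to pull the trend out.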

        Again, I don’t need you to explain what you did or how you crunched numbers. I think you don’t understand what’s come out the other side and I’m trying to help you to see what’s there and what’s not there.

        –CG

    • Mi Cro, it evades me how min temp anomalies could vary so much compared to max — an order of magnitude in some spots. Inexplicable.

      • “Mi Cro, it evades me how min temp anomalies could vary so much compared to max — an order of magnitude in some spots. Inexplicable.”
        That doesn’t surprise me as much as average temps having a trend so similar to max temps, while being so different from min temps. But I am confident that’s what the data says.

        Now, I do leave open the possibility it’s from adjustments, though I would be surprised if they spread them around to different regions, so I go back to: min temps are impacted by ocean temps, and for some reason max temps are not.

      • Yep, I agree. Those spikes need some kind of explanation. They may point to a deeper problem with the analysis – possibly that noise is the dominant term in the charted data.

  2. Well roared, lion. This will be another hammer blow against the failing CAGW theory. We skeptics are delivering a steady drumbeat and I think the block is cracking nicely. Good for Mr. Coleman.

  3. One hopes that this will be taken up by media organizations that have more credibility than WND and Drudge. One hopes!

      • Yes, it tends to go like this: is the person making the statement more qualified than I am?

        If yes, insinuate he or she is bought and sold by Big Oil.

        If no, just call them stupid and unqualified and hope nobody actually looks at my Potemkin village data.

  4. If this letter from such an honorable and knowledgeable man as John Coleman is ignored by the news agencies it was sent to, then it is proof that they are not interested in the truth and are involved with this huge fraud.

    Jim Francisco

    • I wonder if TWC will say anything about it?
      John plus others built it into something that could be trusted and relied upon. That reputation made it worth buying.
      Those who bought it made it into something that can’t be relied upon and are parasites feeding off the reputation John Coleman and others earned.
      I wonder what they will say?

  5. One can only hope that after the 2016 elections the US National Science Foundation and its equivalents in Australia, Canada, China, Russia and maybe even India (home of the UN IPCC’s chief) will join the InterAcademy Council in accepting reality. Such actions might make even the hardcore pro-CACA regimes of Western Europe rethink their disastrous policies.

    A good start on the composition of a climate change “reanalysis” by the NSF would be “William Happer, Ph.D., Princeton University, Richard Lindzen, Ph.D., Massachusetts Institute of Technology, Willie Soon, Ph.D., Harvard Smithsonian Observatory, John Christy, Ph.D., University of Alabama” and some of the 9,000 other Ph.D. scientists who approved this letter. Plus maybe R. G. Brown, PhD, Duke University.

    • Even the election this year might make a difference, if Republicans regain control of the Senate. The GOP-majority US House of Representatives has already considered legislation to cut off funding for the UN’s corrupt IPCC.

      I’d also suggest shutting down Schmidt’s GISS and Trenberth’s NCAR as equally corrupted. NCAR unfortunately is located in Colorado, a swing state.

      • I advocated deporting as undesirable aliens both Gavin & Kevin, back to their native lands from which they could search for the missing heat in the waters around their home islands. A New Zealand commenter suggested a famously active volcanic island there as the location for Kevin’s new research center.

      • LLNL (under the US DoE) near Berkeley California would be my target, if I were on the House Appropriations committee. LLNL is where Ben Santer and his supercomputer-driven CMIP3/5 model fantasies are generated and warehoused. The GCM’s are a colossal waste of taxpayer money. The supercomputers and programmers and data analysts should be solving problems in hard sciences: nuclear physics, protein folding, molecular interactions, geomagnetic dynamo simulations, etc.

        As RGB has often pointed out, our climate is likely chaotic, nonlinear. We’ll never know all the initial conditions or even all the variables at play to properly initialize a climate model run. The DOE has thrown billions of tax dollars into that super computer CMIP furnace. That must end.

        Next up for the budget chopping block would be GISS, and all the watermelon jobs there. The unemployment line for them is my answer. Besides, I can’t think of a more expensive place in the US than New York City to put a major NASA “lab”. Columbia University, with which GISS collaborates, is well known throughout US academic circles as having probably the most communist-socialist faculty and staff in the US.

        After LLNL’s GCM activities and GISS (in its entirety) would come EPA staff reductions of about 40% with amendments to the Clean Air Act to remove CO2 as a regulatable emission.

      • Joel,

        I couldn’t agree more. I’ve also spoken with my Congressman (OR Senators are hopeless) about cutting EPA & bringing it under Interior to rein in its excesses as an independent agency & law unto itself, rather lawless.

        IMO the few useful functions of NCAR could be combined with the NSIDC, to limit worse than worthless computer modeling. Remember that the “nuclear winter” scam also came out of NCAR.

        All the resources squandered on climate modeling, each run designed to be scarier & more remote from reality than the last, remind me of the money wasted in the war on cancer in the 1970s before basic science had advanced enough to warrant such expenditures.

      • milodonharlani commented

        Alright then, White Island it is for Kevin & the presumably safer “Isle of Mann” for Gavin.

        Both of those places seem far too nice; they need to spend some quality time in a desert, first to learn what warm is, and second to get an object lesson in how quickly it cools when the Sun sets through all of that dangerously warming CO2.

      • White Island is in a nice place, but is also an active volcano, so it does get pretty warm. The Isle of Man, however, you’re right, is far too cool.

      • milodonharlani commented

        White Island is in a nice place, but is also an active volcano

        Think setting him up on a lava flow is [too] harsh?
        Nahhhhhhh!

      • White Island was suggested by commenter from New Zealand, although I concurred. Kevin would probably have time to evacuate before a major eruption. In the meantime he could happily find lots of heat hiding in the surrounding ocean.

      • White Island is in a nice place, but is also an active volcano

        Think setting him up on a lava flow is [too] harsh?

        I wouldn’t worry about it being too harsh. The lava might get a bit upset, but it will get back to normal after a while.

      • I think CA would be just perfect. Send all the climate watermelons there. And no they can’t have any water piped out of the Columbia or Snake. The icing on the cake: Anth*** is welcome to bring his family out of CA to NE Oregon. We have lovely homes and acreage here.

        [Or Colorado. .mod]

      • Kevin would probably have time to evacuate before a major eruption.

        Well, he wouldn’t have to worry about his laundry ;)

      • OK, I started us down the slippery slope, so please let that be the point at which we reach bottom. So to speak.

  6. Too bad John Coleman is an established skeptic (a quick search on Youtube reveals he’s been critical of global warming hype for more than 7 years), so he also has an established audience, and people who already chose not to listen to him will simply continue doing so.

  7. Easy to dispute:

    He’s a weatherman, not a climate scientist. Soon/Christy/Lindzen are all flat earthers and they are in the pay of the Koch brothers, even if Coleman isn’t.

    Not that any of that is relevant, but that is what will come out of this. Just give the NASA boys enough time to change the data and all will be well with the theory.

  8. This is a very big oops moment for the AGW congregation; the return to sanity continues. Thanks, Tony Abbott, for sticking to your guns on the maniacal “Carbon Tax”. Surely even the most rabid greenut must begin to see a glimmer of light?

  9. Carbon Dioxide is the base molecule for photosynthesis. Photosynthesis is the source of all life on Earth.
    It is highly ironic that the “enviro movement” demonizes CO2, the basic building block for all life on Earth. The political claim that CO2 is “pollution” is simply insane.
    People should be putting as much CO2 as possible into the atmosphere.

    • It’s just a trace gas, after all.
      Literally all life on Earth is dependent on CO2 at a paltry 400 ppm. A 400 ppm concentration still limits the growth of plants. If CO2 were to double, the scaremongers would tell you the planet is doomed.
      When the truth is that “the planet”, humanity included, would thrive. Mother Gaia needs more CO2, not less.

      • On the whole, I’m a sceptic.
        But I’m really embarrassed by the content of some of the comments we get.
        “It’s just a trace gas, after all”, and then you remind us that all life on earth is dependent on it (not quite true, perhaps, but we get the idea).
        Any disjunct there?

      • Mothcatcher,

        Not necessarily a disconnect there. If CO2 doubled to 800 ppm, it would still be a trace gas with little effect on air temperature or climate, yet most plants & other photosynthetic organisms would flourish as a result.

      • Maybe a trace gas by some volume definition you have, but it is used with the implication ‘how can it have a significant effect, because it is a trace gas’, and then it shows a significant, indeed an overwhelming, effect.
        Can’t see why that isn’t nonsense. But maybe that was RobRoy’s point, in which case I apologise.

      • A Rob Roy is about 33% alcohol by volume, while CO2 is 0.04% of dry air.

        The climatic effect of CO2 above around 200 ppm is not significant (unlike the first 200 ppm), but the effect of higher concentrations on plant life is important.

        IMO, it’s a valid distinction.

  10. “In 2010 a high-level inquiry by the InterAcademy Council found there was “little evidence” to support the IPCC’s claims about global warming.”

    Erm, I think that might merit a [Citation Needed] tag. The InterAcademy Council report can be found here for reference: http://reviewipcc.interacademycouncil.net/report/Climate%20Change%20Assessments,%20Review%20of%20the%20Processes%20&%20Procedures%20of%20the%20IPCC.pdf

    The report isn’t about the science of climate change at all, but rather is about how to improve the IPCC process after criticism surrounding the 2007 report (e.g. the Himalayan glacier incident). It certainly doesn’t find “little evidence” to support the IPCC report contents, concluding that:

    “The Committee concludes that the IPCC assessment process has been successful overall and has served society well. The commitment of many thousands of the world’s leading scientists and other experts to the assessment process and to the communication of the nature of our understanding of the changing climate, its impacts, and possible adaptation and mitigation strategies is a considerable achievement in its own right. Similarly, the sustained commitment of governments to the process and their buy-in to the results is a mark of a successful assessment. Through its unique partnership between scientists and governments, the IPCC has heightened public awareness of climate change, raised the level of scientific debate, and influenced the science agendas of many nations.”

    • That’s revolting in & of itself, but made more so by the fact that that’s about as condemnatory as the public face of Big Science gets.

    • “their buy-in to the results is a mark of a successful assessment.”
      Governments read the political version given to them, not the scientific papers. Who writes the political version? WWF or Greenpeace or other avowed socialist one world order nuts. Ask Dr Patrick Moore for a fuller answer.
      So quoting your own political conclusion is hardly evidence. More post normal junk.

  11. John Coleman is great. I’m a Brit so I only became aware of him a few years ago, through this excellent website. I can’t remember if I saw him in The Great Global Warming Swindle, Doomsday Cancelled or some other movie/interview.

    I just so much respect and admire a person who will have the strength of their own conviction and “stick to their guns” and the truth, no matter what, through thick and thin.

    Great to see so many other real scientists rallying to his side also – most encouraging (while still being aware of what happened to the likes of David Bellamy in the UK).

  12. John Coleman was a great weatherman. He is missed on the airwaves here in San Diego. Hope he enjoys a long and fulfilling retirement, nobody deserves it more….

  13. …the IPCC has heightened public awareness of climate change, raised the level of scientific debate, and influenced the science agendas of many nations.”

    NO! The IPCC has spread pre-determined “science” propaganda, propaganda readily disseminated by a fawning, Progressive mainstream media.
    The IPCC has no debates on the validity of its prime theory or the efficacy of its global climate models.

    The IPCC has indeed influenced the agendas of nations, Western nations. This was the IPCC’s aim all along.

    • When I say IPCC, I include the contributors as well, as the IPCC itself doesn’t claim the “science”. They embrace the correct thinking. The IPCC welcomes any research as long as it agrees with their AGW propaganda. After all, the science is settled.

    • The IPCC’s Charter is predicated on the idea that CAGW is real. The idea that they ever existed to debate or advance science is laughable.

    • Well, the IPCC has heightened public awareness of “global warming/climate change”. The fact that we haven’t been warming lately and that the climate always changes are two items that the IPCC has somewhat hidden from that same public.

      The IPCC has also influenced the “science” agendas of many nations, too. Not necessarily real science, but what they term “science” nonetheless.

      However, “raised the level of scientific debate”?! Well, going from the level of “no debate” (because there is no need to debate the catastrophism, except perhaps as a thought exercise) to any discussion or debate at all is a raise of sorts. Since so much of the IPCC reports is environmentalism-based, what they have done is mostly stifle actual science debates while encouraging debate on “what we must do to avoid terrible things”.

    • I would edit “the IPCC has heightened public awareness of climate change, raised the level of scientific debate, and influenced the science agendas of many nations.”
      to
      “the IPCC has heightened public confusion such that they think a normal climate does not change unless CO2 is added, raised the level of scientific misinformation, and influenced the political science agendas of many nations.”

      And I stand by that edit!

  14. “There is no significant man-made global warming at this time”

    There has never been any significant “man-made” global warming unless you count the bogus figures coming from the government funded “data” sets.

    No one has ever proved that adding a tiny amount of CO2 to the atmosphere (at ground level, by the way) can do anything on net to the temperature of the earth. Further, no one has proven that we can even measure the temperature of the earth to a resolution good enough to gauge the effect of a tiny addition of CO2.

    It gets tiresome to listen to those alarmist bozos.

    • Besides which, other human activities would cool the earth if their effect ever got big enough to measure. No one can even know the sign of a possible human effect on climate, if any.

  15. His open letter in the Express already made the Drudge Report yesterday. The Leftist-Progressive-Greens are really going to start hating the free-speech-enabling internet, where they can’t control the message to the masses.

    They are really sharpening their knives for the open internet and free speech/free press.

  16. The photo of John Coleman next to the Weather Channel banner must make blood shoot out of the eyes of the current WC crew.

    • Dunno.

      I suspect those Weather Channel folks who have been there a long time may be looking at that picture and remember (or long for) the good old days when the channel reported on weather rather than politics disguised as weather reporting.

  17. I’ve been a fan of John Coleman since he was the weatherman on ABC 7 in Chicago (boy, was THAT a long time ago!) and I was thrilled to see him take up the fight for science and against “consensus” in the CAGW battles.

    Yes, he did invent TWC, but was screwed out of his due compensation by some corrupt “business partners”. For those who use Weather.com on their digital devices, you might want to support Coleman’s new venture, http://weathernationtv.com/ instead.

  18. Due to Coleman’s open letter, Mann will be more on the defensive at his talk tonight. That is good news.

    Within the next few years Mann eventually will have to start debating the failure of his theory of significant AGW from fossil fuels. Ultimately, he needs to debate to remain relevant at all. He has already lost effective initiative in the face of critics.

    John

    • Hope you’re right, but I’d be surprised if Mikey ever engages in a public debate. Gavin refuses to do so, & he’s ostensibly a public servant.

    • milodonharlani on October 23, 2014 at 12:21 pm

      – – – – – – – – –

      milodonharlani,

      It would be good if we get several reports from critics who attend tonight’s Mann speech.

      It is a possibility that Mann will be forced to debate eventually; just look at his speech sponsored by the Cabot Institute at the University of Bristol recently (the one our WUWT host attended). Mann spent ~50% of that speech bashing USA republicans. It is likely that he will use tonight at UCLA the same basic speech portion where he bashes republicans; I do not think he can help himself from doing so. Given that, should both houses of Congress be controlled by republicans after the November elections, and should Obama weaken even further than his current very weak rating . . . . Mann may need to debate just to have any public chance to defend himself against open criticism by the government and public.

      Get the popcorn!

      John

  19. The response to this is easier to predict than the weather. He will be answered with character assassination, not science. Good for him. Great letter.

  20. EXPRESS reporter Jason Taylor said,

    “[. . .]

    Climate expert William Happer, from Princeton University, supported Mr Coleman’s claims.

    He added: “No chemical compound in the atmosphere has a worse reputation than CO2, thanks to the single-minded demonization of this natural and essential atmospheric gas by advocates of government control and energy production.

    “The incredible list of supposed horrors that increasing carbon dioxide will bring the world is pure belief disguised as science.”

    In 2010 a high-level inquiry by the InterAcademy Council found there was “little evidence” to support the IPCC’s claims about global warming.

    It also said the panel had purposely emphasised the negative impacts of climate change and made “substantive findings” based on little proof.”

    Indeed, the IPCC’s post-modern philosophy based view of science is exactly what Dr. Happer describes it to be, namely, the IPCC’s philosophy of science is “pure belief disguised as science”.

    As to EXPRESS reporter Jason Taylor’s assessment of what the IAC report on the IPCC said, I agree that the thrust of the IAC report definitely found the IPCC significantly lacking in scientific discipline and professionalism. But, Taylor’s two statements wrt the IAC are condensations of the IAC report not literal statements from the report. Yes, one can read the whole IAC report and in substance agree with Taylor; I do. But also one could disagree with Taylor as well because the IAC report was ambiguously wordsmithed to not openly question the failing theory of significant AGW from fossil fuels.

    John

  21. Aside from the lack of warming indicated by satellite data (and satellites, by the way, provide the preferred data sets for alarmists attempting to scare us about sea level change and arctic/antarctic ice loss, among many other things), there is a profound lack of any harmful impacts of climate change that can be proven to be linked to humans. Where are the increases in tornadoes, hurricanes? Where are the sinking islands? Any year to year weather-related crises cannot be blamed on climate because we need two decades of data to establish any link. Yet, the crisis-addicted scientists and politicians continue with their ridiculous proclamations about climate change!

    By the way, EVERYONE can do their part to STOP the climate industry from taking away our freedoms. Drop CAGW-sympathetic media like CNN, MSNBC, The Weather Underground, The Weather Channel, etc. from your computer, laptop and smart phone. Register your anger with their advertisers, who perhaps don’t realize the damage these people are doing to both science and society. And finally, for those folks in the U.S., PLEASE vote appropriately in November. This is perhaps one of the last elections where we can attempt to reverse the societal damage brought upon us by the extreme left-wing progressives. If we can win back the Senate, we stand a very good chance of defunding CAGW climate “science” once and for all. Thanks.

  22. Hmm… interesting.
    Intriguing from a UK point of view. It seems that the sceptic lobby has many competent (and of course, many less so) adherents, but the warmists have a near-monopoly on policy-makers. My job here is usually (having no technical expertise to offer) to ask questions which I hope will encourage explanations.

    The sceptics need a flag-bearer. Who is this guy? What weight will his words carry? My experience of the Weather Channel on US visits is of excruciatingly repetitive and irritating hypes of ordinary weather variations. In UK, the Daily Express is a toilet-paper publication whose main claim to fame is a full front-page headline warning of extreme weather every time we are likely to have a hot day, or an Atlantic depression is approaching ( which it does about 30 times a year).

    Is John Coleman a one-day tabloid headline or will it mean anything more?

    • Among skeptical policy makers are the PMs of Canada & Australia & a number of members of the US Congress. There’s a good chance that the Republican nominee for president in 2016 will also be a skeptic.

    • People in the UK lead a sheltered life! John Coleman was revered by my grandfather in his later years, when The Weather Channel was just a series of forecast slides and very little watchable “programming”. Still, Grandpa loved it. I started watching it too, and it became one of my favorite must-see programs that flickered across my TV screen every day. Given that I last saw a turn of the century a few years ago, I can attest to its John Coleman beginnings, meteoric rise, and then subsequent corruption by watermelons.

  23. I expect the UK’s largest independent news outlet (BBC) will be on to this like a flash ready to broadcast it far and wide.

    Oh God here comes the nurse again with the tablets.

  24. I’m curious what was Mr. Coleman’s role, if any, in the Weather Channel’s official statement on global warming:

    “More than a century’s worth of detailed climate observations shows a sharp increase in both carbon dioxide and temperature. These observations, together with computer model simulations and historical climate reconstructions from ice cores, ocean sediments and tree rings all provide strong evidence that the majority of the warming over the past century is a result of human activities. This is also the conclusion drawn, nearly unanimously, by climate scientists.”

    http://www.weather.com/encyclopedia/global/

  25. Cutting CO2 emissions prescribed by our climate “doctor” is akin to bloodletting by medical doctors centuries ago because of ignorance.

    It would be like prescribing a fast for a starving patient or cutting back on fluids for a dehydrated patient.

    Increasing CO2 has been the best thing humans have ever done for plants and all the creatures living on this planet.

    The latest USDA crop estimates for US corn and soybean production from this year’s growing season are more record crops……….by a wide margin.
    You can thank the increase in CO2 for part of that record…….and this Summer’s cooool weather and timely rains.

    http://usda.mannlib.cornell.edu/usda/current/CropProd/CropProd-10-10-2014.txt

    “Corn production is forecast at 14.5 billion bushels, up less than 1 percent
    from the previous forecast and up 4 percent from 2013. Based on conditions as
    of October 1, yields are expected to average 174.2 bushels per acre, up
    2.5 bushels from the September forecast and 15.4 bushels above the 2013
    average. If realized, this will be the highest yield and production on record
    for the United States”

    “Soybean production is forecast at a record 3.93 billion bushels, up slightly
    from September and up 17 percent from last year. Based on October 1
    conditions, yields are expected to average a record high 47.1 bushels per
    acre, up 0.5 bushel from last month and up 3.1 bushels from last year.”

    Not one of the gloom and doom projections over the last 3 decades has occurred. In fact, just the opposite has happened…………the earth is greening up, vegetative health and the biosphere are booming, and crop yields/world food production are soaring.

  26. “…I can be reached at 858-xxx-xxxx (redacted by Anthony)…” And no email address either. Does he really want to reject everyone who has anything to say?

    REPLY: Don’t be foolish, I’m not in the mood- Anthony

  27. “There is no significant man-made global warming at this time”

    The past 12 months—October 2013–September 2014—was the warmest 12-month period among all months since records began in 1880, at 0.69°C (1.24°F) above the 20th century average.

    This breaks the previous record of +0.68°C (+1.22°F) set for the periods September 1998–August 1999; August 2009–July 2010; and September 2013–August 2014.

    http://www.ncdc.noaa.gov/sotc/global/2014/9

    And to think there was no El Niño during this period!

      • Even if those 12 months were as warm as pretended (which I grant you they weren’t), it wouldn’t matter. The trend is what matters, & from 2000 to 2013 that was cooling, even in HadCRU’s heavily bent, folded, spindled, mutilated, abused, molested & manhandled until its mother wouldn’t recognize it series.

        http://notrickszone.com/2013/09/12/no-warming-left-to-deny-global-cooling-takes-over-cet-annual-mean-temperature-plunges-1c-since-2000/

        And that within an even longer period of flat GASTA “data” (as “adjusted”) without any statistically significant global warming, starting, depending upon series source, c. 1997.

        Moreover, the world has been in a pronounced cooling trend at least since the Minoan Warm Period, c. 3300 years ago, & arguably since the peak of the Holocene Climatic Optimum. The centennial scale ups & downs (such as the Medieval WP & LIA) since then have just been fluctuations around the declining trend line.

      • According to the measurements, there isn’t a cooling trend in max temps, nor, it looks like, in average temps either. There is a big cooling trend in min temps, though.
        My record on attribution isn’t very good in my opinion, so I’m trying to avoid that.
        But, I’ve been professionally doing database work for almost 2 decades now, so I am confident this is what the data says. And it’s all available for anyone who wants to check my work.

      • Mi Cro

        Am fascinated by what you have done and still trying to get my head around it. Would love for Anthony to allow you to post an article so it can be explored by all.

        Two questions;
        The 1997/8 global T anomaly just shows up as a “normal” small blip. Why? (ie what are the implications statistically given the large jump on all databases).

        What sort of graph would ensue with the adjusted data? (wondering if a similar graph evolves and hence am trying to understand what info your method brings to the fore.)

        Fascinating and thanks for the insight.

      • TonyM,
        I’ll do a better job of explaining in the morning, but I’d suggest following the url to the reports folder, getting the continents zip and the zip of daily reports, and looking at those.
        Those and my description of the process above will be something to ponder on :)

      • tonyM commented

        Am fascinated by what you have done and still trying to get my head around it. Would love for Anthony to allow you to post an article so it can be explored by all.

        He did; it was based on earlier work, but the basics are the same. I’ve just tried to reduce the influence of partial-year data to make sure that can’t be used as an excuse for the fact it doesn’t show any warming.
        https://wattsupwiththat.com/2013/05/17/an-analysis-of-night-time-cooling-based-on-ncdc-station-record-data/

        Two questions;
        The 1997/8 global T anomaly just shows up as a “normal” small blip. Why? (ie what are the implications statistically given the large jump on all databases).

        So, while pondering this I realized the anomaly I’ve created is different from GAT anomaly charts. Those are based (I think) on the anomaly against a baseline temp. Mine are based on the temp the station measured yesterday. I started this because of how quickly it cools once the Sun sets. I was trying to see, when looking at very large numbers of stations, whether today was warmer than yesterday, at least to the resolution of the measuring equipment (the station). These charts show there isn’t such a difference in max temps, but there are differences in min temps, some years up, some down.

        What sort of graph would ensue with the adjusted data? (wondering if a similar graph evolves and hence am trying to understand what info your method brings to the fore.)
        Fascinating and thanks for the insight.

        This I don’t have an answer to. I suppose it would have a lot to do with how the adjustments are made.
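If I read Mi Cro’s description correctly, the day-over-day comparison works roughly like the sketch below. This is only my reconstruction, not his actual code, and the station IDs and readings are invented:

```python
# Rough sketch of a day-over-day comparison: for each station, take
# today's max temp minus yesterday's max temp, then average that
# difference across all stations for the day. Station IDs and readings
# below are invented for illustration.
def daily_diffs(readings):
    """readings: station -> list of daily max temps, all the same length."""
    n_days = len(next(iter(readings.values())))
    diffs = []
    for day in range(1, n_days):
        deltas = [temps[day] - temps[day - 1] for temps in readings.values()]
        diffs.append(sum(deltas) / len(deltas))
    return diffs

stations = {
    "KSAN": [21.0, 21.4, 20.9, 21.1],
    "KJFK": [15.0, 15.2, 15.1, 15.3],
}
result = daily_diffs(stations)
print(result)  # three day-over-day averages, roughly [0.3, -0.3, 0.2]
```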

      • Have you taken your average daily data and then plotted the cum value vs time (running sum)? This is a way of seeing if there is buried in the “anomaly” data a small signal of temp change over time. If it is truly random noise fluctuations in the data, the cum should show a fairly flat line with no significant rise or fall.
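MikeO’s running-sum idea can be sketched like this; the series below is synthetic noise with an injected trend, purely to show the mechanism (none of it is real temperature data):

```python
import random

# Running sum of day-over-day differences: pure zero-mean noise wanders
# around a flat line, while a persistent trend accumulates into a steady
# rise. The series here is synthetic, only to illustrate the idea.
def running_sum(diffs):
    total, out = 0.0, []
    for d in diffs:
        total += d
        out.append(total)
    return out

random.seed(42)
days = 36500                                   # ~100 years of daily values
noise = [random.gauss(0.0, 0.1) for _ in range(days)]
signal = 1.5 / days                            # 1.5 C spread over the century
with_trend = [n + signal for n in noise]

flat = running_sum(noise)
trend = running_sum(with_trend)
# Same noise in both series, so the gap at the end is the accumulated trend.
print(trend[-1] - flat[-1])                    # ~1.5
```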

      • Hi MikeO,

        I think that Mi Cro needs some reference work to help him understand the data. I’ve been trying to explain the low signal to noise ratio problem with differentiating two temperatures to determine a rate of change for the entire day. Since global temps are rising at about 1-2 C per century, it would seem that the dominant, unaccounted terms in the day/night cooling are orders of magnitude larger than a small change due to CO2 increases year to year.

        Is there any good on-line resource for understanding how to calculate the ratio of noise to signal in the temperature data, and the subsequent rate of change as calculated by Mi?

        –CG

      • c grier commented

        Since global temps are rising at about 1-2 C per century, it would seem that the dominant, unaccounted terms in the day/night cooling are orders of magnitude larger than a small change due to CO2 increases year to year.

        When it’s 10 or 20 below for a week in January, come August when it’s statistically 2 degrees warmer, when was the energy that warmed the surface thermalized? 60 years ago, or this summer?

      • Could you give a link to the data? Just going to SourceForge is not specific enough, and I cannot find the file referenced.

      • MikeO commented

        Have you taken your average daily data and then plotted the cum value vs time (running sum)? This is a way of seeing if there is buried in the “anomaly” data a small signal of temp change over time. If it is truly random noise fluctuations in the data, the cum should show a fairly flat line with no significant rise or fall.

        I don’t think I’ve done exactly this, but I have added a Sum by period, and I’ve done an average and now sum for the entire report run.
        If you haven’t already, it might help to go to SourceForge, get the continents.zip, and start looking at some of the report data. It has both yearly and daily averages, and the sums can be calculated by multiplying the averages by the counts.
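For what it’s worth, the averages-times-counts recovery mentioned above is just:

```python
# Recovering a period sum from a reported average and observation count,
# as described above. The numbers and field names are hypothetical --
# I don't have the actual report layout in front of me.
avg_temp = 21.3      # reported period average (degrees C)
obs_count = 1250     # reported number of observations in the period
period_sum = avg_temp * obs_count
print(period_sum)    # ~26625
```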

    • Martin,
      Are you seriously getting your panties in a twist about 1/100th of a degree? Not to mention that this means the temperature now is pretty much the same as it was in 1998, 16 years ago. Hey, it’s your numbers, I’m just pointing out what your numbers say. No significant warming, just like Coleman said.

      But I’d like to draw your attention to the balance of his statement, which included the words “man-made”. The earth has been warming up for the last 400 years at more or less the same rate. So, with only 1/100th of a degree in the last 16 years (again, your number not mine) we can only conclude that if there is a man-made component, then it is very, Very, VERY small.

      • Not to mention that earlier in the Holocene, & in prior interglacials, it has been much warmer than now, without benefit of man-made GHGs. Nothing unusual has happened since 1977, 1945, 1900, 1850 or 1750, thus the null hypothesis of normal natural climatic fluctuations cannot be rejected.

        The warming cycle in the early 18th century, coming out of the depths of the LIA during the Maunder Minimum, was both higher in amplitude & longer in duration than the warming of the late 20th century (1977-96). The warming of the early 20th century was virtually identical to that of the late.

      • The thing is 1998 was a super El Nino. Right now we are seeing temps above 1998 without an El Nino…

        “The hottest years on record were in 2005 and 2010, which just pipped the “super El Nino” year of 1998 – a year often used by climate change sceptics to claim global temperatures haven’t increased in as long as 18 years.

        The fact 2014 may challenge for the hottest year even with at most a weak El Nino is one reason climatologists warn action must be taken to curb the rise of greenhouse gas emissions that trap ever more heat from the sun.

        The bureau’s Dr Watkins said heat records could be broken even without a “full-blown El Nino” because of the planet’s broadscale warming. Sea-surface temperatures in the central Pacific, for instance, had increased by about 0.5 degrees since the 1950s.”

        Read more: http://www.smh.com.au/environment/weather/pacific-warms-towards-el-nino-levels-as-australia-heats-up-20141021-119bzl.html#ixzz3H2Eo6GCf

      • Martin,

        Your first bite of the poison apple is believing that this was as hot as 1998. It wasn’t. The snows say so.

        The surface data are being cooked via changes of methods, and the past is regularly made colder. Compare GHCN v1 and v3 for the same stations and the same times and you find that what ought to be the same data has a colder past in the v3 version.
        http://chiefio.wordpress.com/2012/06/20/summary-report-on-v1-vs-v3-ghcn/
        Reality is reaching the point where it is no longer possible to hide the fudge in the historical numbers. We’ve got crop failures in snow all over the N.H. and we have overlapping ski seasons in N and S hemispheres. As we continue cooling, this will get stronger.

        We’ve got glacial ice growth in many places. We’ve got record Arctic rate of ice growth, and a return to prior area / extent of ice. We’ve got Antarctic record ice extent.

        Wake up and smell the coffee, see the snow, feel the cold. If you still think it’s the ‘warmest ever’, I suggest medication adjustment…

  28. Martin October 23, 2014 at 8:57 pm

    Martin, I suggest you peruse the various posts by Bob Tisdale on sea surface temps, El Nino and other ocean processes. You will come away with a considerably better understanding of the data and how to interpret it.

  29. Martin;
    “The hottest years on record were in 2005 and 2010, which just pipped the “super El Nino” year of 1998 – a year often used by climate change sceptics to claim global temperatures haven’t increased in as long as 18 years.
    >>>>>>>>>>>>>>

    Do the math, Martin. 1998 was only 16 years ago. The lack of statistically significant warming extends to two years before the 1998 super El Nino. So, by your own assertion, 1998 is NOT what skeptics are using to calculate the 18 year period during which temps haven’t increased.

    But let’s get back to perspective. You still have your panties in a twist about a very small number. If you put $1 in a jar every day for 100 days, and after that you put a penny in every day, you could rightfully say that on day 101 you had more money in the jar than ever. And you could make the same claim for days 102, 103… but would you have enough more money to change what you could buy with it in any meaningful manner? No, you could not. So let’s put the temperature record into the same perspective:

    noaa_gisp2_icecore_anim_hi-def3.gif
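The jar analogy above is easy to put in numbers; a throwaway sketch (figures from the analogy, not from any temperature series):

```python
# The jar analogy in numbers: $1 a day for 100 days, then a penny a day.
# Each later day sets a new "record" total, but the relative change is tiny.
cents = 100 * 100              # 100 days at $1/day, tracked in cents
records = []
for day in range(101, 104):    # days 101-103 each add one penny
    cents += 1
    records.append(cents)
print(records)                 # [10001, 10002, 10003]: new records, each up only 0.01%
```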

    • Mods ~ interesting result in pasting this link in. I meant it only as a link. But it is displaying the animated gif instead, but only the first two slides. Is there a way to either enable all the slides or else reduce it to just a link?

      [Made it a link. Seems to give all the slides for me (may need a click?) -ModE]

  30. Coleman, he of the journalism degree and a career as a weatherman, is your patron saint?
    It’s laughable – he worked in weather, not climate science, yet you want to claim him as some climate science expert. He showed his colors when he lost his marbles in 2007 – he’s just another pro-biz conservative.

    • Rob Gunderson:
      Your statement sounds like some sort of appeal to authority, which precludes a substantive thought.

      Look, if we believe the 97% of climate scientists claim (and I do not), then 97% of climate scientists could not, and still cannot, accurately predict either weather or climate.

      Weathermen, who work in the field of climate, have a good track record of predicting weather, and by and large have also had a better track record with regard to climate predictions.

      This particular weatherman is perhaps one of the best in the field.

      I trust this gives you some insight as to why Coleman is so well hailed.

      • Not an appeal to authority – an outright ad hominem attack, the last refuge of a scoundrel who can’t muster an argument on its own merit.

      • Vince Causey October 24, 2014 at 12:05 pm
        Not an appeal to authority – an outright ad hominem attack, the last refuge of a scoundrel who can’t muster an argument on its own merit.
        ++++++
        I can live vicariously through a straight shooter like you, Vince. I pointed to the old argument that only the opinions of (fill in the blank) Climate Scientists matter. We agree that this is utter nonsense.

    • Of course he worked in weather. And he did so considerably longer than Mann and Jones worked in their spoof climatology. Mann, Jones et al. are mainstream profiteers who make their living out of warm-mongering. Coleman is an independent mind and can afford to call a spade a spade!

      • What Jones & Mann practice is the misnomer “climate science”, not climatology. Bogus, GIGO GCM-reliant “climate science” displaced genuine, observation-based, hypothesis-testing, falsifiable climatology in the ’80s. “Climate science (TM)” is anti-scientific.

  31. You really need to fix the claim WND is a “news service”. Google News lists the Coleman story in:
    Express, Daily Mail, Washington Times, Breitbart News. Have to wait for tomorrow’s Telegraph to see what Booker says.

  32. Missing from most discussions is thermalization of EMR energy. When a gas molecule absorbs a photon of EMR energy and conducts the energy to other gas molecules before it emits a photon, the absorbed EMR energy has been thermalized. If no time passed between absorbing and emitting, there would be no way to tell that it occurred and there would be no ‘greenhouse effect’.

    Calculations show that the interval between absorption and emission is very short, 10 microseconds. However, the time for conduction to take place, which is the time between impacts of gas molecules, is much shorter, about 0.0001 microseconds at sea level conditions. Thus it takes about 100,000 times longer to emit an absorbed photon than to thermalize the absorbed energy. Obviously absorbed terrestrial EMR energy is thermalized. References to the calculations are included in the Science explains… section of http://agwunveiled.blogspot.com

    If the absorbed photons were emitted, the flux would not decline. Instead the flux at 15 microns goes to zero in 300 meters or so. The energy has to go someplace. It is conducted to other molecules. It is thermalized. At TOA, the flux near 15 microns appears to be mostly S-B radiation from clouds. The ‘line’ at 15 microns is from the sparse CO2 molecules that have been excited to radiate by reverse thermalization from (collision with) non ghg molecules.

    Thermalized energy carries no identity of the molecule which absorbed the photon. In the terrestrial radiation spectrum (nearly all 5-50 microns) water vapor is about 15,000 ppmv with 465 absorption ‘lines’ per molecule while the 100 ppmv CO2 increase has only 1 absorption ‘line’ per molecule. Because the increase in absorption opportunities of a 100 ppmv increase in CO2 is only about 1 in 70,000, the effect of this increase in CO2 is insignificant. The process that eventually results in the energy being radiated from the planet is described in Science explains…
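Both of the ratios quoted in this comment follow directly from the numbers given; a quick arithmetic check (the inputs are the comment’s own figures, not independently verified values):

```python
# (1) Emission delay vs collision interval, using the comment's figures:
#     ~10 microseconds to emit vs ~0.0001 microseconds between collisions.
emission_us = 10.0
collision_us = 0.0001
timing_ratio = emission_us / collision_us      # ~100,000 collisions per emission window

# (2) Absorption "opportunities": ~15,000 ppmv water vapor with 465 lines
#     per molecule vs a 100 ppmv CO2 increase with 1 line per molecule.
h2o_opportunities = 15_000 * 465               # 6,975,000
co2_opportunities = 100 * 1                    # 100
line_ratio = h2o_opportunities / co2_opportunities  # 69,750, i.e. roughly 1 in 70,000

print(round(timing_ratio), line_ratio)
```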

  33. What is even more surprising than the UK Daily Express’s publication of this article is that it is still online after three days. Articles like this normally disappear by noon on the same day after a phone call from the University of East Anglia to the digital editor of the Daily Express. Are UEA losing their touch?

  34. On the lighter side of science, I have been in horticulture and bonsai for years. I can tell you this: plants grow better when it rains than if you just use a garden hose. Why? Rain brings down gases from the atmosphere that plants love. Aren’t plants a great example of life on this planet? If they get more rain than usual they grow and reproduce and transpire more CO2 and oxygen! And there is little we humans can do about changing it. Good on you, Mr Coleman, I have faith in you and your companions and in humans’ ability to adapt.
    When we can’t for some reason, we will perish.

Comments are closed.