Guest essay by Joseph D’Aleo, CCM
Last month was the hottest June since record keeping began in 1880, the US National Oceanic and Atmospheric Administration (NOAA) said Monday. It marked the third month in a row that global temperature reached a record high. According to the NOAA data, April and May were also global record-breakers. “The combined average temperature over global land and ocean surfaces for June 2014 was record high for the month at 16.22 degrees Celsius, or 0.72 degree Celsius above the 20th century average of 15.5 degrees Celsius,” the NOAA said in its monthly climate report. “This surpasses the previous record, set in 2010, by 0.03 degrees Celsius.”
Nine of the ten hottest Junes on record have occurred during the 21st century, including each of the past five years, the U.S. agency said.
However, as we have shown here, the warming is all in the questionable adjustments made to the data, with a major cooling of the past and no allowance for UHI contamination in recent decades. The all-time record highs and days over 90F tell us we have been in a cyclical pattern, with the 1930s as the warmest decade.
NOAA and NASA (which uses data gathered by NOAA’s climate center in Asheville) have been commissioned to participate in special climate assessments to support the ideological and political agenda of the government. From Fiscal Year (FY) 1993 to FY 2013, total US expenditures on climate change amounted to more than $165 billion. More than $35 billion of that is identified as climate science. The White House reported that in FY 2013 the US spent $22.5 billion on climate change. About $2 billion went to the US Global Change Research Program (USGCRP). The principal function of the USGCRP is to provide Congress with a National Climate Assessment (NCA). The latest report uses global climate models, which are not validated and are therefore speculative, to speculate about regional influences from global warming.
The National Climate Data Center and the NASA climate group also control the data used to verify these models, which is like putting the fox in charge of the hen house. At the very least, their decisions and adjustments may reflect a genuine belief in their models, leading them to find the warming the models show – a form of confirmation bias.
Please note: This is not an indictment of all of NOAA where NWS forecasters do a yeoman’s job providing forecasts and warnings for public safety.
NCEP gathers real time data that is used to run the models. When we take the initial analyses that go into the models and compute monthly anomalies, we get very small departures from normal for the 1981 to 2010 base period on a monthly or year to date basis.
The satellite data from RSS and UAH, available only since 1979, also show no warming for over a decade (two decades in the RSS data). They need none of the adjustments that NOAA claims are required for station and ocean data.
This government manipulation of data may be simply a follow up to the successful manipulation of other government data that has largely escaped heavy public scrutiny.
Over the last 12 months, the CPI has increased 2.1%. Real inflation, using the reporting methodologies in place before 1980, hit an annual rate of 9.6 percent in February, according to the Shadow Government Statistics newsletter. The BLS U6 measure (the total unemployed, plus all persons marginally attached to the labor force, plus those employed part time for economic reasons, as a percent of the civilian labor force plus all persons marginally attached to the labor force) is 12.1%.
CPI is used to adjust Social Security benefits and military pay, and to a large degree as one factor in industry wages. If you feel you are falling behind, it is because the real costs of goods and services have risen more than any income or benefits you receive. That is why GDP actually fell early this year: between the high cost of energy and food, discretionary income for retail and restaurant spending fell.
Unemployment fell to 6.1% according to the government, but real unemployment is much higher. Using the employment-population ratio, the percentage of working-age Americans who actually have a job has been below 59 percent for more than four years in a row. That means that more than 41 percent of all working-age Americans do not have a job.
The sad news is that if NOAA keeps providing the government with tainted data to justify the EPA’s assault on our country’s only reliable energy sources, inflation will skyrocket and unemployment will follow.
![201406[1]](http://wattsupwiththat.files.wordpress.com/2014/07/2014061.gif?resize=640%2C494)
![screenhunter_1225-jul-22-08-14[1]](http://wattsupwiththat.files.wordpress.com/2014/07/screenhunter_1225-jul-22-08-141.gif?resize=640%2C544)
![ncep_cfsr_t2m_anom_062014[1]](http://wattsupwiththat.files.wordpress.com/2014/07/ncep_cfsr_t2m_anom_0620141.png?resize=640%2C512&quality=75)
![Screen_shot_2014-07-21_at_11.38.43_PM[1]](http://wattsupwiththat.files.wordpress.com/2014/07/screen_shot_2014-07-21_at_11-38-43_pm1.png?resize=576%2C437&quality=75)
![ncep_cfsr_t2m_anom_ytd_%281%29[1]](http://wattsupwiththat.files.wordpress.com/2014/07/ncep_cfsr_t2m_anom_ytd_281291.png?resize=640%2C512&quality=75)
![cfsr_t2m_2005[1]](http://wattsupwiththat.files.wordpress.com/2014/07/cfsr_t2m_20051.png?resize=640%2C480&quality=75)
![Screen_shot_2014-07-16_at_10.47.07_AM[1]](http://wattsupwiththat.files.wordpress.com/2014/07/screen_shot_2014-07-16_at_10-47-07_am1.png?resize=618%2C463&quality=75)
///////////////
http://ww2.tnstate.edu/ganter/BIO311-Ch6-Eq1.gif
…
Which means if “n” is 3000, and you want a 0.01 error bound, as long as “s” (the standard deviation) of the ARGO buoys is less than 0.5477… you get 0.01 accuracy.
///////////////
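Assuming the linked image is the standard-error-of-the-mean formula, SE = s/√n (an assumption, since only the image URL survives here), the quoted numbers can be checked with a short sketch:

```python
import math

n = 3000          # roughly the number of ARGO floats
target_se = 0.01  # desired error bound (deg C)

# Standard error of the mean: SE = s / sqrt(n).
# Solving SE <= target_se for s gives the largest allowable
# standard deviation of the individual buoy readings:
max_s = target_se * math.sqrt(n)
print(round(max_s, 4))  # -> 0.5477
```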
Not so fast… Sure, that’s the case in basic statistics in a book. But the buoys drift, and thermometer accuracy drifts, and they dive and come back up, and old stations die and new ones come online, and then transmit data which gets loaded into a computer, and the code is adjusted by some grad student, and an algo is applied to correct for all these things, and a million other little factors come into play that we haven’t thought of yet.
So, when you roll dice or flip coins in a stats book, it’s all neat and clean. In the real world, the errors are much larger than what this equation…
http://ww2.tnstate.edu/ganter/BIO311-Ch6-Eq1.gif
…predicts.
In my many years of statistical estimation and forecasting in real time in the real world, I find the uncertainty is almost always greater than what I originally assumed. Often this is because I made assumptions without even realizing them.
If the errors were as small as predicted by the standard stats formulas, then we wouldn’t get widely varying readings from month to month in the different temp data sets.
Chuck,
a post from RGB at Duke University (physics):
= = = = = = = = = =
N[o]te well that the situation with this data is far worse than even this suggests, because while we have comparatively dense surface station readings (at least in some heavily oversampled regions like the US and Europe) the surface area of the Earth is 70% Ocean and our knowledge of that 70% sucks, especially in the era preceding the satellite record (which started to give us accurate SSTs).
If there were an honest human being working in climate science today, they would stop posting two decimal points — for example, 287.16 — for the Earth’s mean temperature. They would stop posting one decimal point — 287.2. They would post no decimals at all, and they would add a confidence interval such that it is e.g. 95% likely that the true “mean temperature” of the Earth’s surface (averaged over God knows what for God knows how long, given that it is a coarse grained average in space and time and not a relevant measure for dynamic energy balance — for that one would like the fourth-root-T-to-the-fourth “average”) lies within the confidence interval, or otherwise post a meaningful error bar.
http://wattsupwiththat.com/2012/12/25/bethlehem-and-the-rat-hole-problem/#comment-1183054
= = = = = = = = = = = = =
plus:
===
“Numbers are often rounded to avoid reporting insignificant figures. For instance, if a device measures to the nearest gram and gives a reading of 12.345 kg (which has five significant figures), it would create false precision to express this measurement as 12.34500 kg (which has seven significant figures).”
http://en.wikipedia.org/wiki/Significant_figures
===
and if all that didn’t help, maybe this will:
= = = = = = = = = =
The idea is this: Suppose you measure a block of wood. The length is 5.6 inches, the width is 4.4 inches, and the thickness is 1.7 inches, at least as best you can tell from your tape measure. To find the volume, you would multiply these three dimensions, to get 41.888 cubic inches. But can you really, with a straight face, claim to have measured the volume of that block of wood to the nearest thousandth of a cubic inch?!? Not hardly! Each of your measurements was accurate (as far as you can tell) to two significant digits: your tape was marked off in tenths of inches, and you wrote down the closest tenth of an inch that you could see. So you cannot claim five decimal places of accuracy, because none of your measurements exceeded two digits of accuracy. You can only claim two significant digits in your answer. In other words, the “appropriate” number of significant digits is two, and you would report (in your physics lab report, for instance) that the volume of the block is 42 cubic inches, approximately.
http://www.purplemath.com/modules/rounding3.htm
= = = = = = = = = =
That’s from a high-school lesson in mathematics.
Khwarizmi says:
July 29, 2014 at 10:52 pm
The above example of significant digits is a good indication of how poorly we understand accuracy. If a block of wood was measured to within +/- 0.1 in. to be 5.6 x 4.4 x 1.7, there would be two end members, namely
5.5 x 4.3 x 1.6 (the smallest volume) and
5.7 x 4.5 x 1.8 (the largest volume).
The former is 37.84 cu in. (38) and the latter is 46.17 (46), compared to the measured 41.888 (42). This variation is +/- 4 cu in. or a 10% error of the volume estimate. The appropriate number would be 42 cu in +/- 4 cu in. (one significant digit) and it wouldn’t even include all possible outcomes (the end members).
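The end-member arithmetic above is easy to verify; a minimal sketch, assuming each dimension is good only to +/- 0.1 in.:

```python
# Nominal dimensions of the block and the measurement tolerance
l, w, t = 5.6, 4.4, 1.7
tol = 0.1

nominal = l * w * t                        # 41.888 cu in.
low = (l - tol) * (w - tol) * (t - tol)    # smallest possible volume
high = (l + tol) * (w + tol) * (t + tol)   # largest possible volume
print(round(nominal, 3), round(low, 2), round(high, 2))  # 41.888 37.84 46.17
```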
You cannot take several hundred temperature measurements using different instrumentation located in different places and reduce the error in the overall measurement by combining them. On a good day the accuracy of the earth’s temperature is +/- 0.5 deg and the trend is measurable to half a century.
Steve from Rockwood says:
July 30, 2014 at 5:27 am
“You cannot take several hundred temperature measurements using different instrumentation located in different places and reduce the error in the overall measurement by combining them.”
…
That is true.
However, an “average” is not a temperature measurement; it’s a descriptive statistic for a set of numbers. Don’t confuse the apples with the oranges.
..
The standard error of an estimated population mean can be made smaller by increasing the number of observations.
Khwarizmi says:
July 29, 2014 at 10:52 pm
…
“That’s from a high-school lesson in mathematics.”
…
I have made no statement regarding the accuracy of any specific measurement. You, like others, are confusing your apples and oranges.
..
The calculated average of a set of numbers is ***NOT*** a measurement of a physical quantity. It is a number that describes the SET of numbers it is derived from. Estimating a population mean from a sample gets closer and closer to the real value of the population mean as the number of observations increases. In the limit, you have EXACT precision in measuring the population mean when the number of observations equals the number of elements in the population.
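The claim that the spread of a sample-mean estimate shrinks as observations accumulate can be demonstrated with a toy simulation (illustrative, made-up numbers, not real temperature data):

```python
import random
import statistics

random.seed(1)
# A synthetic "population" of readings: mean 15.0, spread 0.5
population = [random.gauss(15.0, 0.5) for _ in range(100_000)]

def se_of_mean(sample_size, trials=2000):
    """Empirical spread of the sample mean over many resamples."""
    means = [statistics.fmean(random.sample(population, sample_size))
             for _ in range(trials)]
    return statistics.stdev(means)

# The spread shrinks roughly as 1/sqrt(n):
print(se_of_mean(100))   # near 0.5/sqrt(100) = 0.05
print(se_of_mean(400))   # near 0.5/sqrt(400) = 0.025
```

This illustrates chuck’s point as stated: the precision of the *mean* improves with n, which is a separate question from whether each individual reading is accurate.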
Mary Brown says:
July 29, 2014 at 9:49 pm
..
“[I]n the real world, I find the uncertainty is almost always greater”
…
Is the standard deviation for the temp (& salinity) measurements greater or less than 0.5477? You are free to include any and all sources of measurement errors that you can think of to answer this question. You must also take into account that the instruments in each buoy are calibrated before being used.
dbstealey says:
July 29, 2014 at 7:58 pm
” That you cherry-pick factoids like any other alarmist”
..
A swing and a miss, strike one.
…
You have made the biggest invalid assumption you could possibly make
..
What is your evidence of me being an “alarmist?”
dbstealey says:
July 29, 2014 at 7:58 pm
” Tiny anomalies are not worth worrying about.”
..
Yes they are, especially when you are measuring the heat content of the oceans, or the mass of a Higgs boson. You seem to forget that when you multiply a very “tiny” anomaly by a very large number (the quantity of water in the oceans), the result is significant.
chuck,
You seem to be in an argumentative mood. In fact, you’re arguing with everyone here.
Let me cut to the chase:
Global warming has stopped. And not just yesterday — global warming stopped almost twenty years ago.
Everything else is obfuscation.
dbstealey…
” Tiny anomalies are not worth worrying about.”
Chuck…
Yes they are, especially when you are measuring the heat content of the oceans, or the mass of a Higgs boson. You seem to forget that when you multiply a very “tiny” anomaly by a very large number (the quantity of water in the oceans), the result is significant.
Mary…
If you start with a tiny number that may have a large error and multiply “with a very large number (quantity of water in oceans)” the error can be enormous.
The standard error in ARGO data and sfc temperature would be miniscule in a perfect world of statistical sampling. But mentioned above are all the examples of why this is not the case.
For global sfc temps, the measurement error is often quoted as 0.15 deg C, which seems reasonable using a gut check and the monthly disagreements between data sets. Pure statistical number crunching would put the number at much less than 0.01 deg C.
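Mary’s amplification point can be put in numbers. A sketch with assumed round values (not official figures): multiplying by an enormous factor scales the absolute uncertainty by the same factor, so the relative uncertainty is unchanged.

```python
# Assumed round numbers, for illustration only:
ocean_mass = 1.4e21   # kg, approximate mass of the world's oceans
c_water = 4.0e3       # J/(kg K), rough specific heat of seawater
delta_t = 0.02        # K, a "tiny" temperature anomaly
err_t = 0.01          # K, an assumed uncertainty in that anomaly

heat = ocean_mass * c_water * delta_t     # heat content change, ~1.1e23 J
err_heat = ocean_mass * c_water * err_t   # its uncertainty, ~5.6e22 J

# The relative error survives the multiplication intact:
print(err_heat / heat)  # -> 0.5, i.e. still a 50% uncertainty
```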
dbstealey…
” Tiny anomalies are not worth worrying about.”
My mom taught me this concept. I would be praying for snow and would turn on the flood lights at night and squint and then yell “Mom! It’s snowing”.
She would glance out and say “I don’t see anything”
“Look closer. In the light. Right there !”
Mom…”If I have to look that close, it’s not worth seeing”.
Much of the global warming debate is like that. If it has taken my entire (long) life to warm just a fraction of a degree, is it really worth looking out the window?
dbstealey says:
July 30, 2014 at 9:11 am
“Everything else is obfuscation.”
…
Trying to educate people who don’t understand statistics or the mathematics behind measurement is not “arguing”.
…
Please note, I have not mentioned ANYTHING about warming or cooling. Do yourself a favor and stop making rash assumptions.
Mary Brown says:
July 30, 2014 at 9:29 am
” that may have a large error ”
Of course they MAY have a small error. The error bars and relevant parameters are included in the analysis of the data, in fact the “error” is already known.
chuck argues:
Trying to educate people that don’t understand statistics or the mathematics behind measurement is not “arguing”.
chuckles doesn’t understand the central issue:
Global warming has stopped. Everything else is obfuscation.
dbstealey says:
July 30, 2014 at 10:17 am
..
“Global warming has stopped.”
…
Some people say that, some say the opposite……time will tell.
..
In the meantime, please try to understand the issues underlying the use of statistics in science.
Chuck says…
“The error bars and relevant parameters are included in the analysis of the data, in fact the “error” is already known.”
The true error of data in climate science can be estimated but certainly never ‘known’. And you are completely ignoring my point: that there can be substantial errors which cannot be crunched away with textbook statistics. Like Antarctic sea ice coding errors, and satellite drift errors in the UAH data, and TOB (time of observation) errors in sfc temps, and urban heat island contamination, and instrument manufacture changes, and station site changes, and on and on and on.
And then there was the old LFM model, which simply multiplied the precip by two due to a coding error. Nobody noticed it for years, but everybody knew that it had a nasty “wet bias”. Stuff happens. Nobody can convince me of ARGO or sfc temp data errors less than 0.01 deg C. No way. Not even close.
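The distinction being drawn here, between random error that averaging shrinks and systematic error (like that LFM coding bug) that averaging cannot touch, is easy to simulate with made-up numbers:

```python
import random
import statistics

random.seed(0)
true_value = 10.0
bias = 0.5   # a hypothetical systematic error, e.g. a coding bug

def biased_reading():
    # Random noise PLUS a fixed offset that never averages out
    return true_value + bias + random.gauss(0.0, 1.0)

# More samples shrink the random noise, but the bias stays:
for n in (100, 10_000):
    mean_err = statistics.fmean(biased_reading() for _ in range(n)) - true_value
    print(n, round(mean_err, 2))  # hovers near +0.5 at any n
```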
chuck says:
Some people say that, some say the opposite…
And which group do you fall into? I think we know.
Satellite measurements — the most accurate data we have — show conclusively that global warming has stopped. Those who question that data show that they cannot accept what Planet Earth is clearly telling us.
Because if they accepted the fact that global warming has stopped, then all the things they have been telling their friends, and writing in blogs, and that they believe themselves, have been flat wrong. Not because skeptics tell them they were wrong, but because the ultimate Authority — the real world — is showing that their beliefs are wrong.
That’s hard for some folks to take. Skeptics are used to it, because skeptics want knowledge. Being right has a much lower priority. But climate alarmists need to be right, more than anything else. So they say things like, “Some people say that, some say the opposite.” They will not let go of their debunked beliefs.
Being wrong is hell on their egos. So they obfuscate. But climate alarmists set the ground rules, not scientific skeptics. Live by your conjecture, die by your conjecture.
dbstealey says:
July 30, 2014 at 11:36 am
“And which group do you fall into? I think we know.”
…
I’m glad you “know”
…
It’s amazing that you can “know” something you have no evidence of. Nothing beats evidence-less proof of your “belief”. I guess you are the “religious” type.
Mary Brown says:
July 30, 2014 at 11:33 am
” Nobody can convince me ”
…
You are free to ignore how mathematics works if you so choose. But the numbers tell a different story. The standard deviation of the temp sensors is significantly below 0.5477
OK, Chuck. You follow your rigid stat formula and all its rigid assumptions that don’t hold in the climate system.
I’ll continue to do stats in the real world where I’m compensated according to my success.
End of discussion
chuck says:
I’m glad you “know”
It’s easy-peasy. You cannot bring yourself to admit that global warming stopped many years ago.
Ergo: alarmist.
Got your number, chuckles.
You can make me wrong. Just admit that global warming stopped, therefore all the wild-eyed alarmist predictions were wrong, and there is nothing to support the “carbon” scare. It was a complete false alarm.
Go on, say it. ☺
dbstealey says:
July 30, 2014 at 1:25 pm
“You cannot bring yourself to admit that global warming stopped many years ago.”
..
It may have, it may not have. You can conclude nothing when I do not answer the question one way or the other.
…
Besides math, you don’t do well at logic either.
Mary Brown says:
July 30, 2014 at 1:19 pm
“You follow you rigid stat formula ”
…
You betcha.
The funny thing about mathematics is that the rules and formulas ARE absolute, therefore very “rigid”.
chuckles says:
It may have, it may not have.
BINGO!
Alarmist exposed. Can I smoke ’em out, or what? ☺
dbstealey says:
July 30, 2014 at 3:31 pm
“BINGO!” ???
..
A swing and a miss.