Climate Science Double-Speak

A Quick Note from Kip Hansen

 

A quick note for the amusement of the bored but curious.

While in search of something else, I ran across this enlightening page from the folks at UCAR/NCAR [The University Corporation for Atmospheric Research/The National Center for Atmospheric Research — see pdf here for more information]:

What is the average global temperature now?

We are first reminded that “Climate scientists prefer to combine short-term weather records into long-term periods (typically 30 years) when they analyze climate, including global averages.”  As we know,  these 30-year periods are referred to as “base periods” and different climate groups producing data sets and graphics of Global Average Temperatures often use differing base periods, something that has to be carefully watched for when comparing results between groups.
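(For those who like to see the machinery: converting an anomaly from one base period to another takes nothing more than the difference between the two base-period means for that data set. A minimal sketch in Python — every number below is invented for illustration:

    # Invented numbers for illustration only -- not from any real data set.
    mean_1951_1980 = 14.00    # deg C, hypothetical base-period mean
    mean_1981_2010 = 14.31    # deg C, hypothetical later base-period mean
    offset = mean_1981_2010 - mean_1951_1980    # 0.31 C between the baselines

    anomaly_vs_1981_2010 = 0.45                 # hypothetical anomaly, deg C
    anomaly_vs_1951_1980 = anomaly_vs_1981_2010 + offset    # same year, re-based

Compare anomalies from two groups without this re-basing step and you have silently built a few tenths of a degree of disagreement into the comparison.)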

Then things get more interesting, in that we get an actual number for Global Average Surface Temperature:

“Today’s global temperature is typically measured by how it compares to one of these past long-term periods. For example, the average annual temperature for the globe between 1951 and 1980 was around 57.2 degrees Fahrenheit (14 degrees Celsius). In 2015, the hottest year on record, the temperature was about 1.8 degrees F (1 degree C) warmer than the 1951–1980 base period.”

Quick minds see immediately that 1.8°F warmer than 57.2°F is actually 59°F [or 15°C], which they simply could have said.
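The arithmetic, spelled out in Python for the doubtful:

    base_f = 57.2             # 1951-1980 mean, deg F, as quoted
    anomaly_f = 1.8           # 2015 anomaly, deg F, as quoted
    absolute_f = base_f + anomaly_f          # 59.0 deg F
    absolute_c = (absolute_f - 32) / 1.8     # 15.0 deg C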

UCAR/NCAR goes on to “clarify”:

“Since there is no universally accepted definition for Earth’s average temperature, several different groups around the world use slightly different methods for tracking the global average over time, including:

    NASA Goddard Institute for Space Studies

    NOAA National Climatic Data Center

    UK Met Office Hadley Centre”

We are told, in plain language, that there is no accepted definition for Earth’s average temperature, but assured that it is scientifically tracked by the several groups listed.

It may seem odd to the scientifically minded that Global Average Temperature is measured and calculated to a claimed precision of hundredths of a degree Celsius without first having an agreed upon definition of what is being measured.

When I went to school, we were taught that all data collection and subsequent calculation requires the prior establishment of [at least] an agreed upon Operational Definition of the variables, terms, objects, conditions, measures, etc. involved.

A brief statement of the concept: “An operational definition, when applied to data collection, is a clear, concise detailed definition of a measure. The need for operational definitions is fundamental when collecting all types of data. When collecting data, it is essential that everyone in the system has the same understanding and collects data in the same way. Operational definitions should therefore be made before the collection of data begins.”

Nonetheless, after having informed the world that there is no agreed upon definition for Global Average Temperature, UCAR assures us that:

“The important point is that the trends that emerge from year to year and decade to decade are remarkably similar—more so than the averages themselves. This is why global warming is usually described in terms of anomalies (variations above and below the average for a baseline set of years) rather than in absolute temperature.”

In fact, the annual anomalies themselves differ from one another by more than 0.49°C — an amount just slightly smaller than the whole reported temperature anomaly from 1987 to date (a 30-year climate period). [That is the difference between GISS June 2017 and UAH June 2017.]
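To give UCAR its due on the narrow point: a constant offset between two records cannot change a fitted trend, which is easy to demonstrate with invented numbers (Python):

    import numpy as np

    years = np.arange(1987, 2018)
    rng = np.random.default_rng(0)
    # Invented record: 0.017 C/yr trend plus weather noise.
    record_a = 14.0 + 0.017 * (years - 1987) + rng.normal(0, 0.05, years.size)
    record_b = record_a + 0.49      # same record, baseline shifted by 0.49 C

    slope_a = np.polyfit(years, record_a, 1)[0]
    slope_b = np.polyfit(years, record_b, 1)[0]
    print(np.isclose(slope_a, slope_b))    # True -- the offset is invisible to the trend

The catch, as above, is that the real-world records do not differ from one another by a mere constant.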

So, let’s summarize:

  1. We are told that 2015, the HOTTEST year ever, was …. what? ….. 59°F or 15°C – which is not hot except maybe in the opinion of the Inuit and other Arctic peoples — which may be a clue as to why they really talk in anomalies instead of absolute temperatures.
  2. Although a great deal of fuss is being made out of Global Average Temperature, there is no agreed upon definition of what Global Average Temperature actually means or how to calculate it.
  3. Despite the problems of #2 above, major scientific groups around the country and the world are happily calculating away on the as-yet undefined metric, each in a slightly different way.
  4. Luckily (literally, apparently) the important point is that although all the groups get different answers to the Global Average Surface Temperature question – we suppose because of that lack of an agreed upon definition of what they are calculating — the trends they find are “remarkably similar”. [That choice of wording does not fill me with confidence in the scientific rigor of the findings — it sounds so much like my term – “luckily”.]  Even less reassuring is being told that the trends are “more [remarkably similar] … than the averages themselves.”
  5. And finally: because there is no agreed upon definition of Global Average Temperature, and the results for the undefined metric from the various groups are less [remarkably] similar than the trends, even the calculated anomalies themselves from the different groups are as far apart from one another as the entire claimed temperature rise over the last 30-year climatic period.

# # # # #

 

Author’s Comment Policy:

Although some of this brief note is intended tongue-in-cheek, I found the UCAR page interesting enough to comment on.

Certainly a far cry from settled science — both parts, by the way: not settled, and [some of it] not solid science.

I’m happy to read your comments and reply — but not to Climate Warriors.

# # # # #

 

 

 

gator69
August 16, 2017 10:15 am

Using anomalies to study temperatures, when we have such a ridiculously small set of numbers for an almost incomprehensible reality, is simply nonsense.
1- There is no such thing as “normal” in climate or weather.
2- What exactly am I supposed to expect in the future based upon the range of possibilities we see in the geologic record? Are the changes we see happening really all that extreme?
3- No.
a·nom·a·ly
əˈnäməlē/
noun
1. something that deviates from what is standard, normal, or expected.
Anomalies only exist in the minds of the creators of “normal”.

August 16, 2017 10:16 am

There is a way to calculate a meaningful average, and this is the Stefan-Boltzmann temperature of an ideal black body emitting the average emissions of the surface; it is the convention used for establishing temperatures from satellite measurements, whose sensors only measure emissions. Averaging emissions is far more valid than averaging temperatures, since Joules are Joules and each is capable of the same amount of work; in fact, the units of work are Joules. The system is quite linear in Joules, which should be expected based on the constraints of COE [conservation of energy]. The problem is that the temperature-centric sensitivity used by the consensus is intrinsically nonlinear, since emitted energy is proportional to the temperature raised to the fourth power.
The reason they do this is that expressing the sensitivity as 0.8C per W/m^2 sounds plausible, while expressing the same thing in the energy domain becomes 4.3 W/m^2 of incremental surface emissions per W/m^2 of forcing, which is obviously impossible. The 1 W/m^2 of forcing must result in 3.3 W/m^2 of ‘feedback’, and any system whose positive feedback exceeds the input is unconditionally unstable. The climate system is quite stable, otherwise we wouldn’t even be here to discuss it.
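A sketch of that nonlinearity, with made-up temperatures (Python; only the Stefan-Boltzmann constant is a real number here):

    import numpy as np

    SIGMA = 5.67e-8                            # Stefan-Boltzmann constant, W/m^2/K^4
    temps = np.array([230.0, 270.0, 300.0])    # invented surface temperatures, K

    mean_of_temps = temps.mean()                       # ~266.7 K
    mean_emission = (SIGMA * temps**4).mean()          # average emission, W/m^2
    t_equivalent = (mean_emission / SIGMA) ** 0.25     # ~271.1 K

    # T^4 is convex, so the equivalent temperature of the average emission
    # always comes out at or above the plain average of the temperatures.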

Reply to  co2isnotevil
August 16, 2017 10:57 am

Climate science has abused the meaning of Black Body. What it means in climate science is not what it means to Kirchhoff.

Reply to  co2isnotevil
August 16, 2017 10:59 am

and Kirchhoff has been proven wrong too, in the lab. The material makeup of the “black body” matters. Kirchhoff claimed it did not.

pochas94
Reply to  Mark - Helsinki
August 16, 2017 10:53 pm

Kirchhoff’s radiation law refers to bodies in local thermal equilibrium. Inside your refrigerator, or your living room, all items come to the same temperature eventually, regardless of emissivity. That is Kirchhoff’s law. Now, in a far-from-equilibrium situation like sun, earth, and space, where equilibrium does not occur, emissivities do make a difference — unless the bodies concerned are greybodies, which also follow Kirchhoff’s law at thermal equilibrium.

pochas94
Reply to  Mark - Helsinki
August 16, 2017 10:58 pm

The earth is sometimes considered a blackbody as an approximation. The sun is a blackbody.

Reply to  Mark - Helsinki
August 17, 2017 9:07 am

Kirchhoff’s Law of Thermal Emission is proven false. Laboratory experiments have invalidated it.

Reply to  co2isnotevil
August 16, 2017 11:00 am

I’ll say it again: EARTH IS NOT A BLACK BODY AND CANNOT BE TREATED AS ONE

The Reverend Badger
Reply to  Mark - Helsinki
August 16, 2017 11:58 am

Correct! It surprises me they have got away with such nonsense for so long.

george e. smith
Reply to  Mark - Helsinki
August 16, 2017 12:20 pm

Well we can be sure that the earth is NOT a ” Black Body “.
There is NO such thing as a black body, in ANY sense of the term.
It is a fictional abstraction that simply cannot exist, and therefore does not exist, anywhere.
There is NO material or object OF ANY KIND that can and does absorb 100.000000..% of even ONE single frequency or wavelength of Electro-Magnetic Radiation energy that falls upon it. And a ” Black Body ” is required to do that for ALL frequencies and wavelengths from zero frequency up to zero wavelength.
In order to have zero reflectance, a black body would have to have the exact same permeability and permittivity as free space (mu-nought and epsilon-nought), at all frequencies and wavelengths.
So the Stefan-Boltzmann law, and the Planck Radiation formula for the spectral radiant emittance of a black body, are all simply theoretical.
But REAL bodies, that are quite good approximations over limited ranges to what a BB is supposed to do, can be constructed, and those theories are very useful for doing practical calculations and designs. We do know that no body can, due solely to its Temperature, emit any wavelength or frequency of radiation energy at a higher rate than predicted for a black body.
So BB radiation theory provides a boundary envelope constraining REAL THERMAL radiation spectral radiant emittance. NON-THERMAL sources of radiation energy can’t be compared to BBs, because their radiances are not driven by any Temperature, but by material specific properties.
BB radiation is entirely independent of ANY material considerations, and depends only on Temperature.
G

Reply to  Mark - Helsinki
August 16, 2017 12:37 pm

Mark,
I never said the Earth is a black body. Only that the surface itself (excluding the effects of the atmosphere) has an emissivity close enough to 1 that we can consider it to be a black body without introducing any significant error. The equivalent surface temperatures calculated from satellite data based on considering the surface an ideal BB match measured surface temperatures quite well and track changes in surface temperatures even better.
Nothing is an ideal bb, but there’s a lot that is approximately a bb, including the surface of the Earth (actually the top of the oceans and bits of land that poke through), the surface of the Moon and much more.
The Earth itself, at least viewed from space, looks more like a gray body with an effective emissivity of about 0.61 than a black body, relative to its surface temperature.
We can also use the concept of an equivalent black body to establish an energy domain equivalence. For example, the average of 239 W/m^2 emitted by the Earth has an EQUIVALENT black body temperature of 255K. What this means is that the total radiation (energy) leaving the Earth is the same amount as an ideal BB at 255K would emit.
I don’t get why so many people get so bent out of shape regarding the concept of EQUIVALENCE. It all boils down to EQUIVALENT energy and since Joules are Joules, why is this such a big deal?
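The equivalence is one line of algebra — invert the Stefan-Boltzmann law. A minimal sketch (Python), using the 239 W/m^2 figure from the comment above:

    SIGMA = 5.67e-8                          # Stefan-Boltzmann constant, W/m^2/K^4
    outgoing = 239.0                         # W/m^2, average emission of the Earth
    t_equiv = (outgoing / SIGMA) ** 0.25     # ~255 K, the equivalent BB temperature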

Clyde Spencer
Reply to  Mark - Helsinki
August 16, 2017 2:00 pm

George,
Many years ago I read an article in Scientific American (back when it was worth reading) about an ‘invention’ that came very close to being a Black Body. It was a stack of double-edged razor blades bolted together. The sharp edges allowed light to enter between the blades, where it got trapped in the ‘canyons’ between them. Thus, it came very close to absorbing all the light that impinged on the razor blade edges.

Mike B - Toronto
Reply to  Mark - Helsinki
August 16, 2017 3:38 pm
skorrent1
Reply to  Mark - Helsinki
August 16, 2017 6:28 pm

cnoevil,
Do I understand you to say both that satellite EST, based on BB radiation, is very close to measured surface temperature (14C or 287K), and that BB EST is really 255K? Please explain.

graphicconception
Reply to  Mark - Helsinki
August 17, 2017 2:59 am

“It was a stack of double-edged razor blades …”
Will any razor blades work or do they need to be some of Occam’s?

sycomputing
Reply to  Mark - Helsinki
August 17, 2017 8:04 am

@graphicconception
“Will any razor blades work or do they need to be some of Occam’s?”
Ha!

Reply to  Mark - Helsinki
August 17, 2017 8:39 am

Skorrent1,
Yes, that is correct. The surface temperatures derived from observed emissions at TOA track surface measurements very well.
The total radiation at TOA is not measured by weather satellites, which only measure emissions in a couple of bands in the transparent region of the atmosphere and a narrow band that’s sensitive to water vapor content. However, the same analysis that applies a radiative transfer model to measured results to establish the surface temperature can be used to determine the total power leaving the planet, and this is very close to the 240 W/m^2 corresponding to a 255K average temperature.

Reply to  Mark - Helsinki
August 17, 2017 9:13 am

There is such a thing as a black body: take a box and coat the inside with graphite or soot. Seal it, and the soot or graphite will create a radiation curve and induce thermal equilibrium.
This is essential to the gaseous sun model.
Kirchhoff claimed it does not matter what material constitutes the black body, and used soot or graphite to speed up the process of thermal equilibrium, as those materials absorb and emit.
But Kirchhoff was incorrect; it does indeed matter what the cavity consists of. This has been tested in laboratory experiments, and using different materials produces different results.
Kirchhoff assumed any material would produce the same results, but obviously he might have had to wait 10 days for the thermal equilibrium and the radiation curve he wanted, so he covered the inside with soot or lamp black.
The thing is, it was the lamp black that produced the results he wanted, NOT the cavity.
This means THE GASEOUS SOLAR MODEL IS DEAD

Reply to  Mark - Helsinki
August 17, 2017 9:16 am

If one understands the standard model for a gaseous sun and the implications of Kirchhoff’s law being wrong, it means the sun cannot be a gaseous body.
Furthermore, if the sun has even a tiny amount of condensed matter, it cannot collapse into a black hole, ever.
Black holes are anti-science nonsense.
For the earth to be treated as a “black body” there would have to be NO inputs and NO outputs.
Anti-science JUNK

Reply to  Mark - Helsinki
August 17, 2017 9:52 am

It’s a lattice, similar to this, which is not possible with the gaseous sun: [image]

Reply to  Mark - Helsinki
August 17, 2017 9:54 am

Best guess, like Jupiter the sun has liquid metallic hydrogen. If true, a black hole can never form from a star.

Reply to  Mark - Helsinki
August 17, 2017 10:00 am

Liquid profile, matter. Not plasma or gas. You really have to reach for straws to claim this is not a liquid form of matter.
http://www.espritsciencemetaphysiques.com/wp-content/uploads/2016/01/article-0-1885c54e000005dc-549_634x632.jpg

Clyde Spencer
Reply to  Mark - Helsinki
August 17, 2017 4:09 pm

graphicconception,
Occam only had one razor. You will need multiple Occams.

Reply to  Mark - Helsinki
August 18, 2017 1:07 pm

Here is Kirchhoff’s law tested in the lab. If this law falls, MOST of current astronomy will have to be redone.
Pierre Marie Robitaille.
https://youtu.be/YQnTPRDT03U

Reply to  Mark - Helsinki
August 18, 2017 1:13 pm

Kirchhoff’s law is incorporated into Planck’s law, which is the birth of quantum physics. This will also fall.
Most of what we think we know is wrong.

george e. smith
Reply to  co2isnotevil
August 16, 2017 12:02 pm

It is well understood that Temperature varies from time to time, and from place to place on the earth.
For example, during any ordinary week, it is not at all uncommon in Downtown Sunnyvale, where I live, to have a 24 hour variation in Temperature of 30 deg. F or more. It is quite routine. So 30 deg. F is 16.7 deg. C which is much greater than the 12 to 24 deg. C range that the entire global mean Temperature has remained within for about the last 650 million years.
Since it is known that Temperature varies with time, then the only way to get a VALID global MEAN Temperature is to SAMPLE a Nyquist VALID sampling of global locations, ALL AT THE VERY SAME INSTANT OF TIME.
Measuring one temperature here now, and another temperature over there at some other time, and maybe that place there next week, is total BS. You are simply gathering up instances of total noise.
And when you don’t even do that but read the difference from this thermometer, and what it might have said some 30 years ago, is also complete rubbish.
So we have rubbish to the nth degree.
And throughout all of this there could be, on any northern midsummer day, places on earth with Temperatures spanning an extreme range of about 150 deg. C, and a routine range of 120 deg. C.
And due to a clever argument, by Galileo Galilei, we know that at any time there can be an infinite number of places on earth having any specific Temperature in that range that you want to pick.
And throughout all of this, the planet pays no heed to these machinations, and could care less what GISS thinks the TEMP is anywhere at any time. It’s ALL ” Fake NEWS ”
G

Reply to  george e. smith
August 16, 2017 12:27 pm

George
The only data I consider suitable for establishing an average is satellite data sampled every 4 hours covering the entire globe with a grid size of 30 km or less.

goldminor
Reply to  george e. smith
August 16, 2017 1:54 pm

They had talking thermometers back in the old days, “…read the difference from this thermometer, and what it might have said some 30 years ago,…”?

Clyde Spencer
Reply to  george e. smith
August 16, 2017 1:55 pm

” ALL AT THE VERY SAME INSTANT OF TIME.”
I was watching a PBS program on Einstein last night where they claimed that Einstein proved that there was no such thing as simultaneity.
However, even more challenging is that because half the globe is dark and half light at what is approximately the same time, one really would need to take two global readings, approximately 12 hours apart, to get Tmax and Tmin. But, then one would be dealing with possible changes due to weather! So, I think that it is really better to take an annual time series where every station is sampled in the late afternoon for Tmax, and again just before sunrise for Tmin — or do continuous sampling in case a moving cold front would change the times of Tmax and Tmin.

Reply to  george e. smith
August 16, 2017 3:10 pm

Clyde,
” …one really would need to take two global readings …”
Modern satellite data is getting close to continuous in real time and the resolution keeps getting better and better. Most of the available data is aggregated as 3 hour samples, where most points on the globe are sampled 8 times a day from geosynchronous satellites and twice a day from polar orbiters, which is far more samples than required by the Nyquist rate to determine min/max accurately. Spatial resolution from the earliest satellites is about 30 km pixels, but newer data is available with < 10 km resolution. At the very least, going back about 3 decades, each point on the planet has been sampled twice a day by at least 1 polar orbiter, and there are usually at least 2 of these at any one time. The biggest problem with weather satellite data is that it comes from many different generations of satellites with different fields of view and sensor characteristics all at the same time, and merging them together is not a trivial task, but this is a solved problem.
Anything based on sparse, homogenized surface measurements is GIGO, but results arising from satellite data are generally much more representative of reality.

D. J. Hawkins
Reply to  george e. smith
August 16, 2017 5:05 pm

George;
What would you consider a “Nyquist valid sample of global locations”? I wonder what it would cost to provide that coverage?

Reply to  george e. smith
August 16, 2017 5:56 pm

A bigger issue is that you’re not asking a thermometer from 30 years ago but listening to Chinese whispers.
Or should that be Hansen whispers?

Reply to  george e. smith
August 16, 2017 9:35 pm

Averaging the high and low for a 24 hour period does not make much sense if one wants to know what the actual average temp is, or was.
During summer, the temp in a place like Southwest Florida may have been in the 88-92 F range for 12-15 hours of the day, and may have gotten as low as 75 or so for a half an hour during a brief downpour before warming right back up after the sun came back out, and then cooled gradually after sunset to get to about 80 by dawn. In fact that was about what happened at my house today.
The halfway point between the daily high and low is not at all what one gets if one records the temp in 15 minute increments and then averages those readings out.
Different sorts of trends at other places and times of year are very easily shown to give a similar disparity between the median of the high and low and the average of what the temp was during those 24 hours.
In winter lots of places are very cold all during the long night time and warm up briefly in the afternoon, for example.
And the illogical, error-introducing process by which they round F degrees and then convert to C degrees (or whatever the actual process is) makes it even more ridiculous.
Unless the temps are measured on some sort of grid pattern in three dimensions, I doubt the number they give for global average temp means much of anything.
Certainly not fit for scientific discussion or understanding of the atmosphere, let alone policy making that affects our entire economy over many years.
That these pulled-from-a-hat numbers are then compared to some 30 year period that is constantly adjusted away from the actual measurements makes the entire exercise of climate science in 2017 more of a joke than a serious avenue of scientific investigation.
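The point is easily illustrated: the (Tmax + Tmin)/2 midrange is not the time-average. A sketch with an invented diurnal profile loosely like the Florida day described above (Python):

    import numpy as np

    hours = np.arange(0, 24, 0.25)
    temp = np.where((hours > 8) & (hours < 20), 90.0, 80.0)    # invented, deg F
    temp[(hours >= 14) & (hours < 14.5)] = 75.0                # brief downpour dip

    midrange = (temp.max() + temp.min()) / 2    # (90 + 75) / 2 = 82.5
    true_mean = temp.mean()                     # ~84.6 -- two degrees warmer

A half-hour dip drags the midrange down two full degrees while barely moving the true average.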

Bill Murphy
Reply to  george e. smith
August 17, 2017 5:52 am

“Averaging the high and low for a 24 hour period does not make much sense if one wants to know what the actual average temp is, or was.”
Exactly. Whenever I hear a discussion about “average” temps I think about the events of Nov. 11, 1911, a day when a stable high was sitting over the central US setting record high temps and was followed by a fast moving Arctic cold front that set record lows the same day. Oklahoma City went from a record 83°F down to 17°F in a few hours. Springfield, MO dropped 40°F in 15 minutes and another 27°F by midnight. That was 80° at 3:45PM, 40° at 4PM and 13° by midnight.
So feel free to talk about 0.01° differences and TOBS etc. Granted 11/11/1911 was exceptional (and probably removed as an outlier from the record) but similar events happen almost every year. If there is such a thing as an “average temp” nobody knows what it is or ever was.

Reply to  george e. smith
August 17, 2017 8:41 am

DJ,
The Nyquist rate tells us that we can resolve a periodic function with only 2 samples per period, which considering diurnal variability to be periodic, requires only 2 samples per day.

D. J. Hawkins
Reply to  george e. smith
August 17, 2017 9:36 am


This is what George wrote, in part:

..a Nyquist VALID sampling of global locations, ALL AT THE VERY SAME INSTANT OF TIME

It appears that George is suggesting that there is a Nyquist sampling interval over a geographical area (one thermometer per 10, 100, or 1,000 square miles, for example) that is necessary to properly resolve the global average temperature. This would be separate from the minimum number of samples per day that would be required. And to the extent that air temperatures are non-sinusoidal, you may need more than 2 per day to capture the signal. Even with 2 per day on a sinusoid, that only means you can resolve the frequency. It certainly doesn’t guarantee you’ll catch the max and min.
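D. J.’s caveat is easy to demonstrate. Any realistic diurnal cycle contains a 12-hour harmonic, and two samples per day sit exactly at the Nyquist limit for that harmonic, so it aliases; twice-daily readings then recover neither the true max nor the true min. A sketch with an invented profile (Python):

    import numpy as np

    w = 2 * np.pi / 24.0
    t = np.linspace(0, 24, 2401)
    # Invented non-sinusoidal diurnal cycle: 24 h fundamental plus a 12 h harmonic.
    temp = 15 + 5 * np.sin(w * t) + 3 * np.sin(2 * w * t + 1.0)

    print(temp.max(), temp.min())          # ~20.3 and ~7.2, the true extremes
    sampled = temp[::1200]                 # two samples per day, at 0 h and 12 h
    print(sampled.max(), sampled.min())    # both land near 17.5 -- extremes missed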

Walter Sobchak
Reply to  co2isnotevil
August 16, 2017 8:07 pm

“any system who’s positive feedback exceeds the input is unconditionally unstable. The climate system is quite stable, otherwise we wouldn’t even be here to discuss it.”
Ergo, the claims of the warmunists that a feedback mechanism will kick in to drive temperatures higher, and their estimates of the CO2 equilibrium sensitivity (which are way too high), are twaddle.

Walter Sobchak
Reply to  Walter Sobchak
August 16, 2017 8:44 pm

higher.
Anthony: Edit function please.

August 16, 2017 10:17 am

Regarding the large difference between the GISS and UAH anomalies for June 2017: These datasets have different baseline periods. GISS uses 1951-1980, and only two of those years are in the UAH record. I don’t remember UAH’s baseline period, but I doubt it begins before 1985.

Editor
Reply to  Donald L. Klipstein
August 16, 2017 10:19 am

UAH is 1981-2010

Sheri
Reply to  Donald L. Klipstein
August 16, 2017 11:00 am

That’s part of the problem. If you use different base periods, you get different answers. That’s not really science—I’m not sure what it is. Also, as temperatures level off, the anomalies decrease, so using an older base period gives more warming.

DD More
Reply to  Sheri
August 16, 2017 11:57 am

” in that we get an actual number for Global Average Surface Temperature:”
But we did have that, from NCDC/NOAA. Try this.
(1) The Climate of 1997 – Annual Global Temperature Index “The global average temperature of 62.45 degrees Fahrenheit for 1997″ = 16.92°C.
http://www.ncdc.noaa.gov/sotc/global/1997/13
(2) http://www.ncdc.noaa.gov/sotc/global/199813
Global Analysis – Annual 1998 – Does not give any “Annual Temperature” but the 2015 report does state – The annual temperature anomalies for 1997 and 1998 were 0.51°C (0.92°F) and 0.63°C (1.13°F), respectively, above the 20th century average. So 1998 was 0.63°C – 0.51°C = 0.12°C warmer than 1997: 1998 = 16.92°C + 0.12°C = 17.04°C.
(6) average global temperature across land and ocean surface areas for 2015 was 0.90°C (1.62°F) above the 20th century average of 13.9°C (57.0°F) = 0.90°C + 13.9°C = 14.80 °C
http://www.ncdc.noaa.gov/sotc/global/201513
Now the math and thermometer readers at NOAA reveal – 2016 – The average global temperature across land and ocean surface areas for 2016 was 0.94°C (1.69°F) above the 20th century average of 13.9°C (57.0°F), =0.94°C + 13.9°C = 14.84 °C
https://www.ncdc.noaa.gov/sotc/global/201613
So NOAA says the results of 16.92 & 17.04 are less than 14.80 & 14.84. Which numbers do you think NCDC/NOAA thinks are the record high? Failure at 3rd grade math, or failure to scrub all the past. (See the ‘Ministry of Truth’, 1984).
For all the data adjusters, please apply Nils-Axel Mörner’s quote
“in answer to my criticisms about this “correction,” one of the persons in the British IPCC delegation said, “We had to do so, otherwise there would not be any trend.” To this I replied: “Did you hear what you were saying? This is just what I am accusing you of doing.”
http://www.21stcenturysciencetech.com/Articles_2011/Winter-2010/Morner.pdf
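The arithmetic in the comment above, laid out in Python (every figure is quoted from the NOAA pages linked):

    t1997_f = 62.45                       # deg F, from the 1997 report
    t1997_c = (t1997_f - 32) / 1.8        # 16.92 C
    t1998_c = t1997_c + (0.63 - 0.51)     # 17.04 C, via the quoted anomalies
    t2015_c = 13.9 + 0.90                 # 14.80 C, from the 2015 report
    t2016_c = 13.9 + 0.94                 # 14.84 C, from the 2016 report
    # The later "record" absolute values sit about 2 C below the earlier ones.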

A C Osborn
Reply to  Sheri
August 16, 2017 1:29 pm

DD, I have been quoting those figures, as I did about an hour before you, waiting for the usual suspects to come along and explain what was done and why in their twisted scientific world.
But Nick, Steve & Zeke have ignored it for the last 2 years.

Reply to  Sheri
August 16, 2017 10:10 pm

Simply stated, their methodology gives them the ability to make the numbers and thus the trend anything they want it to be.
The question of objective physical reality seems a quaint and unimportant notion to them, as they work to pound the square peg of reality into the round hole of warmista dogma.

TA
Reply to  Sheri
August 17, 2017 4:59 pm

Great post, DD More.

Clyde Spencer
Reply to  Kip Hansen
August 16, 2017 2:03 pm

“When you have many standards, you do not have a standard.”

Reply to  Kip Hansen
August 18, 2017 9:25 am

“Doublespeak” is synonymous with “equivocation.” An equivocation is an argument in which a term changes meaning in the midst of the argument. While an equivocation looks like a syllogism, it isn’t one. Thus, while it is logically proper to draw a conclusion from a syllogism, it is logically improper to draw a conclusion from an equivocation. To draw such a conclusion is the “equivocation fallacy.”
If a change in the meaning of a term is made impossible through disambiguation of the language in which an argument is made, then the equivocation fallacy cannot be applied. Thus I long ago recommended to the USEPA that they disambiguate their global warming argument. They did not respond to my recommendation. The National Climate Assessment did not respond to the same recommendation. Review of the arguments of both organizations reveals that they are examples of equivocations. Were they to disambiguate their arguments, these arguments would be revealed to be nonsensical.

Editor
August 16, 2017 10:19 am

And they work out their average, when half of the world has no data!
https://www.ncdc.noaa.gov/temp-and-precip/global-maps/

1saveenergy
Reply to  Paul Homewood
August 16, 2017 10:41 am

So what…50% of the population is below average intelligence (:-))

Reply to  1saveenergy
August 16, 2017 10:59 am

actually slightly more than that

john harmsworth
Reply to  1saveenergy
August 16, 2017 1:55 pm

The real problem is the 10-15% of the people who think they are smarter than they actually are. These are the ones who like to assimilate facts so they sound like they know what they’re talking about but never actually look into the things they think they already know.
Then there are liars, like Michael Mann and the hockey team, who think they are smart enough to get away with lies!

Clyde Spencer
Reply to  1saveenergy
August 17, 2017 4:17 pm

At the antipode for Lake Wobegon.

Sheri
Reply to  Paul Homewood
August 16, 2017 11:03 am

Is that a problem? Can’t they just make it up? (/sarc)

RWturner
Reply to  Paul Homewood
August 16, 2017 11:29 am

Interpolation of data is common in science. However, when your interpolation algorithm creates a blotchy pattern, as the anomaly maps do, that in no way reflects reality, you should change the algorithm.
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/enso_update/gsstanim.gif
Notice how the actual data is what you’d expect? The anomaly — the current temperature’s difference from “average” — creates a blotchy, unrealistic pattern that you never see. If you were to reconstruct what NOAA claims is average, you would have this blotchy pattern of temperature that you never see in reality; clearly the algorithm parameters used to create the “average” are erroneous. I have one guess on which way the algorithm skews past data.
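For readers unfamiliar with the generic technique: gridded anomaly maps are built by interpolating station or SST values onto grid cells. A minimal inverse-distance-weighting sketch — a textbook scheme with invented stations, not NOAA’s actual algorithm — shows where the knobs are (Python):

    import numpy as np

    # Columns: x, y, anomaly -- all invented.
    stations = np.array([[0.0, 0.0, 0.42],
                         [3.0, 1.0, 0.15],
                         [1.0, 4.0, -0.08]])
    cell = np.array([1.0, 1.0])     # grid cell to be filled

    d = np.hypot(stations[:, 0] - cell[0], stations[:, 1] - cell[1])
    w = 1.0 / d**2                  # inverse-square distance weights
    estimate = np.sum(w * stations[:, 2]) / np.sum(w)

The weighting exponent, the search radius, and the station list all shape the pattern that comes out, which is why artifacts in the output point at the algorithm rather than at the atmosphere.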

Reply to  Paul Homewood
August 16, 2017 1:55 pm

Didn’t Hansen/Lebedeff ‘fix’ this with homogenization? /sarc
So much of what’s wrong with climate science can be traced back to one individual who’s either the most incompetent or the most malevolent scientist there ever was. I prefer to think that it’s just incompetence driven by confirmation bias and group think rather than a vindictive quest driven by ego…

Reply to  co2isnotevil
August 16, 2017 10:30 pm

If you go back to the beginning of the whole thing, there was no group think…that came later.
And it is difficult to imagine where confirmation bias crept in, since there has been no confirmation other than what has been invented to match the original assertions.
I personally prefer to take an unflinching look at what sort of person he has demonstrated himself to be.
Giving the benefit of the doubt is fine for an initial assumption…but we have lots of evidence now with which to draw a conclusion.
When someone who makes clear predictions on a regular basis and has never been right yet, maintains an air of confident self-assurance…what should one think then?
His track record proves his incompetence, his unwillingness to accept being incorrect shows he is not fact but ego driven, and his vindictive attitude towards those who have been correct where he has been wrong shows he is not scientific and not a humble person.
Dissecting his character flaws is beside the point however.
He is wrong.

Jeff Alberts
Reply to  Paul Homewood
August 19, 2017 8:53 am

Although a great deal of fuss is being made out of Global Average Temperature, there is no agreed upon definition of what Global Average Temperature actually means or how to calculate it.

Calculating it isn’t the issue. Whether or not it has any physical meaning is the issue. And the answer is, it doesn’t. It’s a useless calculation no matter how it is derived.

Robert of Texas
August 16, 2017 10:24 am

🙂 This one made me laugh… It’s the precision they claim to have that drives me nuts. It’s just not possible given the data, the instruments used over the time period used, and the methods they use to calculate an “average”.
So now I have a new thought to make me chuckle – they don’t even know what it is they are measuring. That actually is not surprising – it explains how they (the activists) can so easily change their story – just go with the data and methods that back your story line instead of committing to a method.
I have a suggestion: why not just use the raw data as is and accept the error margin that goes with it? So you have data starting in the 1800’s with a wide error and better data as you get to the present. I bet that, used in such a way, the entire story of AGW disappears into the margin of error.

Latitude
Reply to  Robert of Texas
August 16, 2017 10:49 am

Tony’s graph…. [image]

Bryan A
Reply to  Latitude
August 16, 2017 12:33 pm

Now THAT really sheds light on the Adjustments and Manipulations NASA made to the Past Data

Reply to  Latitude
August 16, 2017 1:43 pm

I suspect that the same errors that cause the models to run hot in the future makes them run cold in the past, so they adjust past temperatures to match the models elevating the legitimacy of the models over the ground truth.

Reply to  Latitude
August 17, 2017 9:02 am

Kip,
I’ve done a lot of modelling of both natural and designed systems. One of the first tests I apply to any model is to vary the initial conditions to make sure the results converge to the same answer. Otherwise, it most likely indicates an uninitialized variable.

1saveenergy
Reply to  Robert of Texas
August 16, 2017 10:53 am

“RAW DATA !!!”
You can’t trust raw data, you don’t know if the people picking it washed their hands after going to the toilet !!
Raw data needs cleaning & cooking before consumption.
(:-))

Reply to  1saveenergy
August 16, 2017 1:53 pm

“RAW DATA !!!”
You can’t trust raw data, you don’t know if the people picking it washed their hands after going to the toilet !!

I always forget whether I’m supposed to wash my hands before or after I’ve gone to the bathroom when I handle food. That’s why my wife does the cooking. (Kidding!)

Raw data needs cleaning & cooking before consumption.

Don’t you mean, “Raw data needs cleaning & cooking before presumption.”?
(Or maybe the “presumption” comes first? I always forget. 8-)

Reply to  1saveenergy
August 16, 2017 4:45 pm

Raw data needs cleaning & cooking before consumption.
“… before corruption”

buggs
August 16, 2017 10:29 am

And in spite of calculating an actual number there is never any discussion of measurement error. Anyone who has actually done any sort of statistical analysis knows this is possible to do, and it will give remarkably useful information about your average from year to year. Specifically, it will tell you whether the variability from year to year is within the relative norms expected. We could then actually have a discussion of whether we should be using 1-2-3 standard deviations in the discussion, depending on the degree of confidence you wanted in those numbers (99-95-68%). But we don’t do that. Why? Because it would detract from the anxiety.

john harmsworth
Reply to  buggs
August 16, 2017 1:57 pm

I like the overall approach they use.
“We prefer to look at long term averages of approximately 30 years.
But Hey! LOOK AT THIS NUMBER!”

Mariano Marini
August 16, 2017 10:34 am

In 2015, the hottest year on record, the temperature was about 1.8 degrees F (1 degree C) warmer than the 1951–1980 base period.

Does this mean that 2015 was hotter than every preceding year, and warmer by 1 degree C than the 30-year mean? I think a mean must be compared to another mean, or a single year’s warm figure to other single years’ warm figures.
Where am I wrong in this?

Sheri
Reply to  Mariano Marini
August 16, 2017 11:06 am

It’s warmer than the base period of 1951 to 1980. Yes, the 30 year mean. It’s a mean for 2015 compared to the mean of the base period—mean compared to mean.

Mariano Marini
Reply to  Sheri
August 16, 2017 12:56 pm

mean compared to mean

No, it is a 39-year mean compared to a 1-year mean!

marianomarini
Reply to  Sheri
August 16, 2017 12:58 pm

Sorry. 30 years not 39.

Sheri
Reply to  Sheri
August 16, 2017 2:27 pm

That’s what I said: The mean for 2015 is compared to the mean of the base period 1951 to 1980.

A C Osborn
August 16, 2017 10:34 am

I keep belabouring this point, so here goes again.
The hottest year ever was 2015, and it was 59F or 15C; and yet in 1998 it was (and still is) clearly stated that the temperature in 1997 was 62.45F or 16.92C, and in 1999 they said 1998 was even hotter.
They currently show 1998 as being 58.13F or 14.53C just by changing the baseline – Yeah Right!

Latitude
August 16, 2017 10:34 am

Anomalies make it easier to hide the cheating……

August 16, 2017 10:44 am

There is no such thing as an average temperature. You can’t even measure the average temperature in a room (without measuring the temperature of every molecule), let alone a planet’s atmosphere.

The Reverend Badger
Reply to  Phillip Bratby
August 16, 2017 12:04 pm

+1,000,000. Average annual temperature of the Earth is an utterly nonsensical concept. It’s hard to believe that hundreds of thousands of University Professors have not got the cojones to make an honest statement like this. It’s basic and elementary. Undergraduate physics, really.

Sheri
Reply to  The Reverend Badger
August 16, 2017 2:28 pm

Apparently it is not undergraduate physics anymore…..

Sheri
Reply to  The Reverend Badger
August 16, 2017 5:47 pm

Kip: That is truly disturbing.

scraft1
Reply to  The Reverend Badger
August 17, 2017 5:38 am

Kip – to be fair, what you are taught in college – undergrad – depends on your major. For example, engineering students take heavy duty math and physics in freshman year. They’re used as weed-out courses to separate the sheep from the goats. Majors in pre-med and the basic sciences, as a rule, have to pass a calculus course. Business majors, at least at UNC and NC State, have to do the same.
But for sure, rigor in math and sciences is restricted to engineering and science majors.
In my semi-rural county on the NC coast, every high school kid on a graduation track has to pass Algebra II. Anyone showing an interest and facility for math takes calculus.
English/humanities majors in college get a pass in math if they place out. Seems reasonable to me.

MarkW
Reply to  Phillip Bratby
August 17, 2017 7:43 am

What you do is measure multiple points, and then give error bars to account for the fact that the molecules you didn’t measure might be different from the ones you did.

Joe Armstrong
August 16, 2017 10:45 am

I’m reminded of a story about the difference between accuracy and precision.
Seems a recently minted engineer was doing calculations out to several decimal places when his boss looked over his shoulder and observed, “I don’t know about that last number, but the first one is wrong.”

Sheri
Reply to  Joe Armstrong
August 16, 2017 11:06 am

+1

commieBob
Reply to  Joe Armstrong
August 16, 2017 11:51 am

Yep

1.8 degrees F (1 degree C) warmer

Someone obviously calculated a figure of approximately 1 deg. C. Someone else converted that to Fahrenheit and in the process implied an accuracy that simply isn’t justifiable.
As Kip points out the various global averages are …

… as far apart from one another as the entire claimed temperature rise over the last 30 year climatic period.

Specifying the anomalies to two decimal places is risible.

rocketscientist
August 16, 2017 10:47 am

I believe Norman Augustine had a law regarding “On making a precise guess.”
“The weaker the data available upon which to base one’s conclusion, the greater the precision which should be quoted to give the data authenticity.”

Reply to  rocketscientist
August 16, 2017 6:19 pm

True story:
8 people around a table 40 years ago.
Private engineer pulls a statistic out of the air and states, “80% of I & I (inflow/infiltration) comes from roof drain connections (so our proposal is obviously the best….)”
Head of State agency turns to agency engineer and asks, “is that correct?”
Agency engineer doesn’t know but answers “Yes” to avoid admitting he doesn’t know something.
Three weeks later Head of State agency is quoting the made up stat at various meetings.
Stat was educated guess for specific location and wasn’t far off, but it became accurate and authentic for a number of years in all locations.

TA
Reply to  rocketscientist
August 17, 2017 5:10 pm

“Norman Augustine”
There’s a name I haven’t heard in a while.

August 16, 2017 10:48 am

IPCC AR5 Annex III: Glossary
Energy budget (of the Earth) The Earth is a physical system with an energy budget that includes all gains of incoming energy and all losses of outgoing energy. The Earth’s energy budget is determined by measuring how much energy comes into the Earth system from the Sun, how much energy is lost to space, and accounting for the remainder on Earth and its atmosphere. Solar radiation is the dominant source of energy into the Earth system. Incoming solar energy may be scattered and reflected by clouds and aerosols or absorbed in the atmosphere. The transmitted radiation is then either absorbed or reflected at the Earth’s surface. The average (WAG) albedo of the Earth is about 0.3, which means that 30% of the incident solar energy is reflected into space, while 70% is absorbed by the Earth. Radiant solar or shortwave energy is transformed into sensible heat, latent energy (involving different water states), potential energy, and kinetic energy before being emitted as infrared radiation.
With the average surface temperature of the Earth of about 15°C (288 K), (SEE THAT??!!)
the main outgoing energy flux is in the infrared part of the spectrum. See also Energy balance, Latent heat flux, Sensible heat flux. (That was back in ‘13 ‘14 when AR5 was being published.)
Global mean surface temperature An estimate of the global mean surface air temperature. However, for changes over time, only anomalies, as departures from a climatology, are used, most commonly based on the area-weighted global average of the sea surface temperature anomaly and land surface air temperature anomaly.
Land surface air temperature The surface air temperature as measured in well-ventilated screens over land at 1.5 m above the ground.
The SURFACE IS NNNOOOOTTTTT the GROUND!!!!!!!! In fact most weather measuring stations do not even measure or record GROUND temperature.
Now, during the day the earth and air both get hot but objects sitting in the sun get hotter than both. Once the sun goes down the air cools rapidly, the ground does not so the idea that the air warms the ground is patently BOGUS!!!!
The genesis of RGHE theory is the incorrect notion that the atmosphere warms the surface. Explaining the mechanism behind this erroneous notion demands RGHE theory and some truly contorted physics, thermo and heat transfer, energy out of nowhere, cold to hot w/o work, perpetual motion.
Is space cold or hot? There are no molecules in space so our common definitions of hot/cold/heat/energy don’t apply.
The temperatures of objects in space, e.g. the earth, moon, space station, mars, Venus, etc. are determined by the radiation flowing past them. In the case of the earth, the solar irradiance of 1,368 W/m^2 has a Stefan Boltzmann black body equivalent temperature of 394 K. That’s hot. Sort of.
But an object’s albedo reflects away some of that energy and reduces that temperature.
The earth’s albedo reflects away 30% of the sun’s 1,368 W/m^2 energy, leaving 70% or 958 W/m^2 to “warm” the earth, at an S-B BB equivalent temperature of 361 K — 33 C cooler than the earth with no atmosphere or albedo.
The earth’s albedo/atmosphere doesn’t keep the earth warm, it keeps the earth cool.
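The two S-B equivalent temperatures in that argument are easy to reproduce (Python; these are the commenter’s flat-irradiance numbers, not a full energy balance):

    SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/m^2/K^4
    S0 = 1368.0          # solar irradiance, W/m^2
    albedo = 0.3

    t_no_albedo = (S0 / SIGMA) ** 0.25                      # ~394 K
    t_with_albedo = ((1 - albedo) * S0 / SIGMA) ** 0.25     # ~361 K, ~33 K cooler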

Dave
Reply to  Nicholas Schroeder
August 16, 2017 12:02 pm

Shouldn’t the energy budget of Earth include heat from the core?

The Reverend Badger
Reply to  Dave
August 16, 2017 12:10 pm

If the heat from the core comes from radioactive decay then YES, but let’s make it small so we can virtually ignore it.
If the heat from the core comes from the gravito-thermal effect in solids then NO because then it will come from the sun and that will COMPLETELY mess up the K-T diagram.
Perhaps it comes from somewhere else?
(/s, possibly somewhere in the above)

BillRocks
Reply to  Dave
August 16, 2017 3:12 pm

Heat from the “ground”, called heat flow, is very small compared to heat from the sun. It is measured in milliwatts per square metre and is less than 100 mW/m2. Wiki says it is 91.6 mW/m2. Geologists, geophysicists, and engineers work with heat flow routinely.

Reply to  Nicholas Schroeder
August 16, 2017 10:46 pm

“Once the sun goes down the air cools rapidly, the ground does not ”
Respectfully disagree, based on observations anyone can make nearly every day in every location.
Surfaces in the sun warm far more rapidly than the air does…the air never gets as hot as surfaces that have the sun shining on them.
And likewise, after sunset, surfaces cool far more rapidly than air.
This is why dew forms long before fog.
Dew can form on grass and hoods of cars and such before twilight is even over, while fog might not set in for several hours more.
At colder temps and in drier air, frost can form and persist at an air temp of 38 degrees when there is little wind (below 5 mph).
This shows again that surfaces in fact cool far more rapidly than air does.

Reply to  Menicholas
August 17, 2017 3:29 am

>>
And likewise, after sunset, surfaces cool far more rapidly than air.
<<
And if fog forms, it’s called “radiation fog.” I still remember the five types of fog from my meteorology training many, many years ago.
Jim

August 16, 2017 10:48 am

the average annual temperature for the globe between 1951 and 1980 was around 57.2 degrees Fahrenheit (14 degrees Celsius). In 2015, the hottest year on record, the temperature was about 1.8 degrees F (1 degree C) warmer than the 1951–1980 base period.
It’s false analysis to compare a 30-year average with the measurement of a single year.
The valid comparison is 1951-1980 with 1996-2015. That difference is about 0.5 C = 0.9 F, half of what NCAR reports.

john harmsworth
Reply to  Pat Frank
August 16, 2017 2:12 pm

Agreed Frank. The caveat to your 30/30 comparison should also be that this is a recognized cold period compared to a recognized warm period. A further comparison of the previous warm period would therefore be enlightening in terms of cyclicality. Something like 1910- 1940 perhaps.

August 16, 2017 10:51 am

Nobody is interested in coming up with a standard definition of what constitutes “the globe” that is being averaged because it is the equivalent of setting the goalposts in cement. Remember when the global average temperature wasn’t rising, so they had to add in heat from the deep oceans? A loose definition allows for easy adjustment.

The Reverend Badger
Reply to  Hoyt Clagwell
August 16, 2017 12:15 pm

Badgers don’t have Kangos.

Reply to  Hoyt Clagwell
August 16, 2017 1:28 pm

Yes, the IPCC’s self-serving consensus needs all the wiggle room they can fabricate in order to provide what seems like plausible support for what the laws of physics preclude. The more unnecessary levels of indirection, additional unknowns and imaginary complexity they can add, the better they can support their position.

Peter Muller
August 16, 2017 11:05 am

Speaking of operational definitions, how about the term anomaly. Standard dictionary definitions, such as the OED on-line definition “Something that deviates from what is standard, normal, or expected”, do not seem applicable to annual global mean temperatures. Can someone please explain to me what is the “standard” or “normal” or “expected” annual global mean temperature? The use of the term anomaly in this case seems to continue the recent “trend (sic)” of diluting the specificity of language.

Sheri
Reply to  Peter Muller
August 16, 2017 11:08 am

I believe that was exactly the idea.

Reply to  Kip Hansen
August 16, 2017 10:51 pm

Plus it sounds more sciencier.

Clyde Spencer
Reply to  Peter Muller
August 16, 2017 2:12 pm

Peter,
Yes, the term “residual” might be a better choice than “anomaly.”

August 16, 2017 11:11 am

In aviation the standard atmospheric temperature is 15 C. It is now and it was 30 years ago.

Alan McIntire
August 16, 2017 11:14 am

I found this blogpost about the difference between average temperature and average irradiance interesting and informative.
http://motls.blogspot.com/2008/05/average-temperature-vs-average.html?m=1

Reply to  Alan McIntire
August 16, 2017 11:35 am

Trenberth et al 2011jcli24 Figure 10
This popular balance graphic and its assorted variations are based on a power flux, W/m^2. A W is not energy but power, i.e. energy per unit time: 3.412 Btu/h (English) or 3.6 kJ/h (SI). The 342 W/m^2 ISR is determined by spreading the average discular 1,368 W/m^2 solar irradiance/constant over the spherical ToA surface area. (1,368/4 = 342)
There is no consideration of the elliptical orbit (perihelion = 1,415 W/m^2 to aphelion = 1,323 W/m^2) or day or night or seasons or tropospheric thickness or energy diffusion due to oblique incidence, etc.
This popular balance models the earth as a ball suspended in a hot fluid with heat/energy/power entering evenly over the entire ToA spherical surface. This is not even close to how the real earth energy balance works. Everybody uses it. Everybody should know better.
An example of a real heat balance based on Btu/h is as follows. Basically (Incoming Solar Radiation spread over the earth’s cross sectional area, Btu/h) = (U*A*dT et. al. leaving the lit side perpendicular to the spherical surface ToA, Btu/h) + (U*A*dT et. al. leaving the dark side perpendicular to spherical surface area ToA, Btu/h) The atmosphere is just a simple HVAC/heat flow/balance/insulation problem.
Latitude Range   Net Area (m^2)   Incident Flux × cos ϴ (W/m^2)   Power In (W)
 0 to 10         3.875E+12        1,362.8                         5.280E+15
10 to 20         1.151E+13        1,321.4                         1.520E+16
20 to 30         1.879E+13        1,239.8                         2.329E+16
30 to 40         2.550E+13        1,120.6                         2.857E+16
40 to 50         3.143E+13          967.3                         3.041E+16
50 to 60         3.642E+13          784.7                         2.857E+16
60 to 70         4.029E+13          578.1                         2.329E+16
70 to 80         4.294E+13          354.1                         1.520E+16
80 to 90         4.429E+13          119.2                         5.280E+15
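Judging by the areas, the rows bin the sunlit hemisphere by angle from the subsolar point (the table’s “latitude”), with the flux taken at each band’s midpoint — an inference from the numbers, since the comment doesn’t say. A sketch that reproduces the arithmetic and checks it against the simple intercepted-disc total (Python):

    import numpy as np

    S0, R = 1368.0, 6.371e6                     # W/m^2; Earth radius, m
    edges = np.radians(np.arange(0, 91, 10))    # band edges, degrees from subsolar point

    total = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        area = 2 * np.pi * R**2 * (np.cos(a) - np.cos(b))   # band area on the sphere, m^2
        flux = S0 * np.cos((a + b) / 2)                     # incident flux at band midpoint
        total += area * flux

    print(total)                   # ~1.75e17 W -- the sum of the table's last column
    print(np.pi * R**2 * S0)       # ~1.74e17 W -- the intercepted-disc total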

Alan McIntire
Reply to  Nicholas Schroeder
August 17, 2017 7:59 am

You might note that the average temperatures of those latitude zones are not proportional to the fourth root of the radiation flux they receive. Thanks to Hadley circulation, regions closer to the equator are cooler than predicted, regions closer to the poles are warmer than predicted. Average temperatures equal predicted temperatures around 40 degrees north and south.

RWturner
August 16, 2017 11:34 am

On a side note, check out the equatorial Pacific heat trend. A weak La Nina is forming, methinks.
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/enso_update/heat-last-year.gif

pbweather
Reply to  RWturner
August 17, 2017 3:29 am

Actually this is a very good point. The real SSTs show a La Nina-like pattern, with an anomalously cool tongue of trade-wind-induced water in the eastern tropical Pacific. This is clear to see. What hides this La Nina-like SST pattern is anomalies….. those normals are not what they seem, because each Nino has its strongest warm pool in a different location, so the normals have huge spatial errors and smear out the “so-called normal” SST pattern.
You can clearly see that the tropical eastern Pacific is much cooler than elsewhere at similar latitudes, but the anomalies say there is nothing out of the ordinary to see here. However, Outgoing Longwave Radiation (OLR) also shows a lack of convection associated with these cool tropical SSTs; hence tropical convection has adopted a La Nina-like pattern over the Pacific and has done so for many months now.
The question is not whether a La Nina-like pattern is occurring, but how strong it gets.
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/enso_update/sstanim.gif
OLR
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/enso_update/olra-30d.gif

MrGrimNasty
August 16, 2017 11:39 am

Seeing as the temperature in 2015 is 1 year/noise, not climate (by their own definition), they should be saying how the period 1986 to 2015 compares to 1951-1980, to be scientific instead of misleading and political!

August 16, 2017 11:43 am

Using 1951-1980 as the base period is interesting, since temperature was falling and CO2 rising during that period. Why not 1911-1940, which was warmer than 1951-1980 and had a warming rate quite similar to post 1980? Using 1911-1940 would isolate CO2 increase effects (?) from temperature and provide a better baseline for comparison with the current temperature trend. Until 2014, two of our major northern California cities, Santa Rosa and Ukiah, showed cooling compared to the 1930s, and only exhibited warming after “homogenization.” Michael Crichton found the same for Alice Springs, Australia, in his novel “State of Fear,” http://www.thesavvystreet.com/state-of-fear-gets-hotter-with-global-warming/ which I just read again and find its observations and conclusions as timely as in 2004.

commieBob
Reply to  majormike1
August 16, 2017 12:15 pm

re. State of Fear

In Appendix I, Crichton warns both sides of the global warming debate against the politicization of science. Here he provides two examples of the disastrous combination of pseudo-science and politics: the early 20th-century ideas of eugenics (which he directly cites as one of the theories that allowed for the Holocaust) and Lysenkoism.

He also points out that scientists are far from impartial observers.

As a result of potential conflicts of interest, the scientists conducting research on topics related to global warming may subtly change their findings to bring them in line with their funding sources. Since climatology can not incorporate double-blind studies, as are routine in other sciences, and climate scientists set experiment parameters, perform experiments within the parameters they have set, and analyze the resulting data, a phenomenon known as “bias” is offered as the most benign reason for climate science being so inaccurate.

Scientists know how ‘human’ they are but they won’t admit that in public. Instead, we have folks like Dr. Michael Mann claiming godlike certainty. It’s actually evil.

Most of the greatest evils that man has inflicted upon man have come through people feeling quite certain about something which, in fact, was false. Bertrand Russell

the other Ed Brown
August 16, 2017 11:53 am

Thank you, Kip, for another thought provoking article.
Re Global Averages, I am curious if anyone has updated that now decades old question: What is the Global Average telephone number?

commieBob
Reply to  Kip Hansen
August 16, 2017 1:42 pm

If we ignore access codes, international calling codes and area codes, we are left with seven-digit subscriber numbers. The largest possible seven-digit number is 999 9999 or, expressed differently, 9,999,999. That’s one less than 10,000,000.
000 0000 isn’t a possible phone number, but for ease of arithmetic, let’s pretend it is. That makes the average 4,999,999.5, which isn’t a valid phone number except at Hogwarts. Possibly it’s 499 9999 ext. 5.
I’m guessing that a seven-digit phone number can’t have a leading zero because of the way the old mechanical switches worked. That means the lowest possible phone number would be 100 0000. In that case the average would be 549 9999.5.
Hope that helps. 😉
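For anyone who wants to check the arithmetic, here is a throwaway Python sketch (purely for fun, same assumptions as above):

lo, hi = 0, 9_999_999   # pretend 000 0000 is a valid number
print((lo + hi) / 2)    # 4999999.5

lo = 1_000_000          # no leading zero allowed
print((lo + hi) / 2)    # 5499999.5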

john harmsworth
Reply to  Kip Hansen
August 16, 2017 2:17 pm

Whatever it is, it’s a government number and it’s warmer than it was 20 years ago. If you phone it they will want some tax money. It was 1.5 less 200 years ago.

john harmsworth
Reply to  Kip Hansen
August 16, 2017 2:18 pm

Call immediately as it will be ringing under water by next week, or so.

Clyde Spencer
Reply to  Kip Hansen
August 16, 2017 2:18 pm

Kip,
Of course it isn’t! Because 555 5555 is the phone number given out by good-looking women in the bar who don’t want to be bothered by ugly geeks.

Reply to  Kip Hansen
August 16, 2017 4:32 pm

My old phone number was 41W; the W indicated that it was a party line, and we shared ours with the Methodist minister. Private lines had only numbers; my best friends had 2 and 45. These numbers would not fit into an average very well.

MarkW
Reply to  Kip Hansen
August 17, 2017 7:50 am

commie, not all phone numbers are possible. In the US, there are no phone numbers that start with 0, 911, 811 or 411.

jclarke341
August 16, 2017 12:02 pm

“When I went to school, we were taught that all data collection and subsequent calculation requires the prior establishment of [at least] an agreed upon Operational Definition of the variables, terms, objects, conditions, measures, etc. involved.”
Yes…that is required for good science, but AGW is not about science. It is about politics and advocacy. In those realms it is a great advantage to NOT have agreed upon operational definitions. It is very important to be able to make things mean anything you want them to mean any time you want. The latest example is the use of the terms alt-right and alt-left. No one can define those terms, but they can be thrown around very effectively in the realms of politics and persuasion.

Tom Dayton
August 16, 2017 12:09 pm

A good (for the umpteenth time) explanation of the use of anomalies versus absolutes is a recent post at RealClimate: http://www.realclimate.org/index.php/archives/2017/08/observations-reanalyses-and-the-elusive-absolute-global-mean-temperature/

Reply to  Tom Dayton
August 16, 2017 6:54 pm

The global anomalies are calculated from a theoretical even spread of data that is (very loosely, it appears) based on the actual data. If they really were the result of only averaging and weighting the records, then he would have a point, and we would have a global temperature anomaly similar to the one in the ’70s that induced a global cooling scare.

billw1984
Reply to  Tom Dayton
August 17, 2017 5:30 am

Hey, don’t complain! He was being honest about errors. I think it was sort of like tiptoeing through a minefield 🙂

John W. Garrett
August 16, 2017 12:22 pm

Now I’m really confused 😉
Thank you (as always), Kip.

Gamecock
August 16, 2017 12:29 pm

“I believe that climate scientists put decimal points in their forecasts to show they have a sense of humor.”
H/T William Gilmore Simms

Chris4692
August 16, 2017 12:34 pm

An average is not necessary. Think instead in terms of an index, such as the Dow Jones or the S&P 500. It does not matter what the number is when finding a trend; it only matters that the index is calculated in the same way each time.
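A toy sketch of that point in Python (all numbers invented): two different, but each internally consistent, index rules give different levels yet identical year-to-year changes.

readings = [14.1, 14.3, 14.2, 14.6, 14.5, 14.9]  # hypothetical global means, deg C

index_a = [r - readings[0] for r in readings]  # departure from the first year
index_b = [r - 15.0 for r in readings]         # departure from an arbitrary 15.0

# The levels differ, but the year-to-year changes (the trend) are identical.
changes_a = [round(b - a, 2) for a, b in zip(index_a, index_a[1:])]
changes_b = [round(b - a, 2) for a, b in zip(index_b, index_b[1:])]
print(changes_a == changes_b)  # True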

MarkW
Reply to  Chris4692
August 17, 2017 7:54 am

Chris, if the index you calculate doesn’t at least approximate the movement of the whole, then it is worthless.
That’s one reason the Dow has fallen out of favor. 100 years ago, the top 50 companies represented the bulk of the total value in the market. Today they represent only a tiny fraction of total market value.

Chip
August 16, 2017 12:35 pm

I’ve been a science geek all my life, with hobbies ranging from astronomy to ornithology, so initially I took the climate scientists at face value. Then I stumbled into Climate Audit and WUWT, and was blown away by the politics and deceit.
It’s very unfortunate that such a new and promising scientific field was hijacked by politicians and ideologues. Caution, skepticism and moderation are no match for fear and paranoia. But I think we’ve passed peak hysteria, even if the politicians and media will refuse to let go.

David Cage
August 16, 2017 12:42 pm

If the Earth approximates a black body, or is even remotely close to one, are all the photographs of it from space shown by NASA faked in a studio or photoshopped?

Reply to  David Cage
August 16, 2017 1:21 pm

David,
The Earth, as viewed from space, is not a black body, but the Moon is once you subtract the reflected energy, and the Earth would be too if not for its atmosphere. Relative to the emission behavior of something like a planet or moon, reflected light is irrelevant to the radiant balance, except indirectly by its absence. It might seem that the Moon is very bright, but its albedo is only about 0.12; if its albedo were 1.0, it would be as bright as the Sun while having a temperature of absolute zero and no emissions in the LWIR!
If you looked at Earth in the LWIR and were only concerned with the total average emissions, it would be indistinguishable from an ideal BB at about 255K. If you further examined the emitted spectrum, you would notice that the peak average emissions (color temperature per Wien’s displacement law) for clear skies correspond to the average temperature of the surface below, and for cloudy skies to the temperature of the cloud tops when adjusted for non-unit cloud emissivity; but in both cases the spectrum has gaps arising from GHG absorption, reducing the total emitted energy to what an ideal BB at 255K would emit.
It’s important to point out that the emission temperature of Earth is dominated by the emission temperature of clouds covering about 2/3 of the planet, which for Earth’s clouds is about 262K, so the NET absorption-band attenuation required of GHGs for the emissions to be equivalent to a BB at 255K is not a whole lot.
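For reference, the 255K quoted above is the standard effective emission temperature, which follows from a one-line energy balance; a minimal Python sketch, assuming the usual textbook values (solar constant ~1361 W/m2, Bond albedo ~0.30):

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # solar constant, W/m^2 (assumed)
albedo = 0.30     # Bond albedo (assumed)

# Absorbed sunlight, spread over the sphere, balances thermal emission:
# S * (1 - albedo) / 4 = SIGMA * T_e**4
T_e = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25
print(round(T_e, 1))  # about 254.6 K, i.e. the "255K" above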

Lance Wallace
August 16, 2017 1:34 pm

The first (and only) time I ever looked at all 40 or so of the CMIP models, I made the “mistake” of calculating absolute temperatures rather than anomalies. I was astounded to note that the different models varied by about 3 degrees C in their baseline absolute temperature for their starting year (1880, I think). Now consider two models differing by 3 C. Each model will include some areas of the globe that are below the freezing point of water, but one will have a much higher area of ice than the other, affecting estimates of albedo, etc. So that alone would lead to major changes in how well each model matches reality. Since all models are tuned, each will adopt a different method of tuning in order to match historical records. So we would see some more or less arbitrary choices of aerosols, clouds, and other items of great uncertainty in order to make the fudge factors work.

Reply to  Lance Wallace
August 16, 2017 1:44 pm

Yes. See Mauritsen (2013) on the absolute temperature disparities in CMIP3 and CMIP5 and the model-tuning implications; discussed in the essay “Models All the Way Down.”

Michael Jankowski
August 16, 2017 1:34 pm

Well at least they didn’t try to report it to 0.01.

hunter
August 16, 2017 1:48 pm

Well, look at how religions in the past reconciled the contradictory, vague and deceptive aspects of their various scriptures and dogmas.
The climatocracy is doing much the same.

john harmsworth
Reply to  hunter
August 16, 2017 3:07 pm

Who is their God? Al Gore? Lol! He’s big enough.

u.k.(us)
August 16, 2017 2:02 pm

Who wants to get rich beyond their wildest dreams ?
Invent an a/c compressor that isn’t so loud and annoying that the cicadas have to compete with its noise.

August 16, 2017 2:12 pm

Independent of incomparable baselines, the global anomalies aren’t fit for purpose for a basic reason: inadequate coverage. This is true for land only, where large swaths of Africa, South America, and northern Eurasia have either no data or no long-term data. The same is true, except more so, for the oceans in the pre-float/Argo era. Best would be to create a Dow Jones-like global index of good, well-maintained stations with long records. For example: Rutherglen Ag and Darwin in Australia, De Bilt in the Netherlands, Sulina in Romania, Armagh in Ireland, Hokkaido in Japan, Lincoln (university station) in Nebraska, Reykjavik in Iceland, Durban in South Africa. Note not all are GHCN. No homogenization. Perhaps coverage-area weighted. That way one has an unbiased land-record anomaly trend. Why has this not been done? I suspect because it would show little or no warming, just like each of the named candidates for the index.

August 16, 2017 2:23 pm

When will the next “base” period be defined and used?

Malrob
August 16, 2017 2:29 pm

Is global average temperature a meaningful concept? A bit like averaging all the numbers in the phone book – only one phone will answer.

Reply to  Malrob
August 16, 2017 2:44 pm

The global average in C is not particularly useful because there is too much latitudinal and regional variation. But a correctly computed global anomaly is (for climate trends), because it refers to change over time relative to each specific station independently. That change over time can meaningfully be averaged globally. The remaining big problem is individual station quality. As said above, most of GHCN is not fit for purpose, and there are many fit-for-purpose stations not in GHCN: Rutherglen in Australia, the University of Nebraska at Lincoln, and the University of Durban in South Africa are examples noted above.
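A minimal sketch of that each-station-against-itself calculation in Python (station names borrowed from above, all values invented):

stations = {
    "Rutherglen": [15.2, 15.1, 15.3, 15.4, 15.6],
    "De Bilt":    [9.8, 9.7, 9.9, 10.1, 10.2],
    "Reykjavik":  [4.4, 4.5, 4.4, 4.7, 4.8],
}
BASE = slice(0, 3)  # pretend the first three values are the base period

def anomalies(series):
    # Each station is compared only with its own base-period mean.
    base_mean = sum(series[BASE]) / 3
    return [t - base_mean for t in series]

# Only then are the per-station anomalies averaged globally.
global_anomaly = [sum(vals) / len(vals)
                  for vals in zip(*(anomalies(s) for s in stations.values()))]
print([round(a, 2) for a in global_anomaly])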

MarkW
Reply to  Malrob
August 17, 2017 7:59 am

A global average with absolute precision is impossible.
However, you can get an average; it’s just that the error bars will depend on the number and distribution of your sensors.
The more sensors you have, and the more complete their distribution, the lower your error bars will be.
The error bars for the current climate network would have to be at least 5C, given the paucity of sensors and their extremely poor distribution (most are in N. America and W. Europe).
As you go back into the past, the quality, number, and distribution of the sensors all get worse.
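For what it’s worth, the textbook scaling is easy to see in Python, assuming (optimistically) independent sensors with the same spread; real networks cluster geographically, so the effective sensor count is far smaller than the raw one:

sigma = 5.0  # assumed per-sensor uncertainty, deg C
for n in (10, 100, 1000, 10000):
    # The standard error of the mean shrinks as 1/sqrt(n) for independent sensors.
    print(n, round(sigma / n ** 0.5, 3))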

knr
August 16, 2017 3:10 pm

These 30-year periods have no scientific meaning or value. The period came about because it was hoped that, given this long, the failure of reality to match the models as regards the relationship between CO2 and temperature increases would be overcome by a change in reality.
It simply has no meaning, no value, no validity other than as a political tool. It could just as easily have been 40 or 35 years without making any difference at all.
It is indeed a classic example of an area where numbers are picked out of thin air and whose only value comes from their perceived impact in supporting ‘the cause’.

Gary Pearse
Reply to  knr
August 17, 2017 12:32 am

60 years would be a better base; then we would have the other half of the sine wave on the main sub-century natural variability curve. This helped warming-disaster proponents over the first half of the wave, but now it has peaked and is going down again, much to their chagrin. You will see this base period changed before they endure the return of the Pause.

D. J. Hawkins
Reply to  knr
August 17, 2017 10:48 am

The concept of climate normals goes back to the 1930s in the US. I suspect the interval was chosen partly due to limited coverage, and more so due to the onerous task of doing the necessary calculations by hand.

Robber
August 16, 2017 3:27 pm

Great news. The world’s average temperature is 15 degrees, and that is the hottest the world has been since the industrial revolution began. Global warming? Still a bit chilly isn’t it? Where’s Josh with an appropriate cartoon?

August 16, 2017 5:27 pm

“In fact, the annual anomalies themselves differ one-from-another by > 0.49°C —”
Apologies for not being able to put up the plot now, but it would be good to see a moving SD for 120 months of differences. Places like Argentina (hardly a backwater in the early 20th century) have data only for Buenos Aires until 1960. Surely the different methods mean that the spread of differences decreases with time?

Reply to  Robert B
August 16, 2017 10:03 pm

[plot omitted: moving SD of the difference between BEST and CRUTEM]
And it does, until mid-century, down to what is expected for monthly uncertainties of 0.1 C (times √2 for a difference). This is for the difference between BEST and CRUTEM.
Why does it get worse as third-world countries start taking temperatures seriously?

billw1984
Reply to  Robert B
August 17, 2017 5:38 am

Do both of these include the oceans, or leave them out? You need to compare two similar things. Also, I’m not sure you can really do a standard deviation with just two numbers. Comparison with three or four land-only data sets would be more informative. But I understand what you are getting at.

Reply to  Robert B
August 17, 2017 8:21 pm

They’re both land only, and the SD is of 60 values (the differences in 60 consecutive months); the 60 is an arbitrary choice, as an indicator of more precise measurements as more and better data come in. That they seem to correlate better when the ’40s blip needs to go, rather than over the past 30 years, is a concern.
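A sketch of that moving calculation in Python (here best and crutem stand for monthly land anomalies from the two data sets, aligned to the same months):

import statistics

def moving_sd_of_difference(best, crutem, window=60):
    # SD of (best - crutem) over each run of `window` consecutive months.
    diff = [a - b for a, b in zip(best, crutem)]
    return [statistics.stdev(diff[i:i + window])
            for i in range(len(diff) - window + 1)]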

Robert from oz
August 16, 2017 6:31 pm

How dare you guys send the hockey schtick Mann to Oz. Can’t you keep him over there somehow? We have enough fake scientists here already.

Pamela Gray
August 16, 2017 7:01 pm

On geological time scales, what we are discussing in this post is an indiscernible rise in temperature, at a time when we should be warm anyway. Carry on.

Gary Pearse
August 17, 2017 12:21 am

Kip, that’s for starters. When they systematically, through an algorithm, keep changing the past data, both their global temperatures and the anomalies of previous base periods have a life of only one month. As Mark Steyn said at the Senate Committee, how can one consider what the temperature will be in 2100 when we still don’t know what it will be in 1950? This means even their models are tuned to something that doesn’t exist anymore.

J Streb
August 17, 2017 3:58 am

Standard day, sea level definition: 59 deg F / 15 deg C, 1013 millibars / 14.7 psi.

Mark Rae
August 17, 2017 5:51 am

Thanks! Interesting article

tom0mason
August 17, 2017 7:05 am

“Although some of this brief note is intended tongue-in-cheek, I found the UCAR page interesting enough to comment on.
Certainly a far cry from settled science — both parts by the way — not settled — and [some of it] not solid science.”

But is it not the same with so much of climate science?
Just testing to see how long before this comment is deleted.

D. J. Hawkins
Reply to  Kip Hansen
August 17, 2017 10:51 am

It’s simple projection, Kip. It’s SOP for the warmunists, so they believe everyone must do it.

tom0mason
Reply to  Kip Hansen
August 17, 2017 11:33 am

I’ve had some problems with WordPress and/or Firefox lately. Although it appeared my comments were being accepted, they were not. Things seem to have settled down after uninstalling and reinstalling the browser (Firefox).

tom0mason
Reply to  tom0mason
August 17, 2017 1:40 pm

@Forrest Gardener
Basically my comment above was a test, as my comments appeared to have been accepted and posted; however, after closing the Firefox browser and then restarting any browser, the comments had disappeared.
I finally realized it was probably Firefox, and remembered it had updated itself twice recently. I can only think something got screwed up in the update process.
It was not just this site but also other WordPress sites, and only with Firefox.
As I said, a complete uninstall/reinstall of Firefox today appears (I hope) to have cleared the issue.
P.S. I am on a Linux system which has been very stable for more than 5 years.

August 17, 2017 7:17 am

Worse than not knowing the present temperature: the pre-industrial temperature is even more uncertain. We are told by COP21 that we should not exceed 2 C above the pre-industrial temperature. But the best temperature record has a range of 7 to 10 C; a 3-degree spread is greater than the 2-degree target. And there is no SST data. The uncertainty is unknown. They have no idea what absolute temperature they are aiming for.
http://blogs.nature.com/news/files/2012/07/berkeley.jpg

Reply to  Kip Hansen
August 17, 2017 7:16 pm

Here’s the link. Slightly different 1750s range: 6.6 to 9.6 C.
http://berkeleyearth.lbl.gov/regions/global-land

tadchem
August 17, 2017 1:24 pm

I was taught that an operational definition is a definition of a term so explicit that all persons applying it as a criterion for identifying something would come to exactly the same conclusion as to whether or not the definition applies in any particular instance of its attempted application.
In empirical science an operational definition of a quantity is a definition that references the complete, replicable process for quantifying the result of the operation. This, in principle, allows separate investigators to apply the same process to the determination of a quantity and to directly compare their results. For example, ‘temperature’ can be measured by a process that involves the comparison of voltages between two thermocouples, one of which is in thermal contact with the object of interest and the other is in contact with a specific medium of precisely known reference temperature.
Logically an operational definition identifies a well-characterized parent group to which a term belongs, along with necessary and sufficient criteria to distinguish it from all other members of the same group. For example, to define ‘sanguine’ as ‘the color of blood’ identifies the parent group (‘colors’) and provides a criterion (‘is your color the same color as blood?’) that clearly distinguishes it from other members of the parent group.
The important point of an operational definition is that it completely removes all individual variation among observers from the exercise.

August 18, 2017 8:42 am

Hansen:
I have previously criticized you for trying to be a ‘jack of all trades’ writer, covering too many subjects to be an expert in all of them. I particularly criticized your article on obesity, where you claimed calories didn’t matter (something fat people love to hear!).
To demonstrate that I have nothing against you, and only judge what you write: I can’t tell you how disappointed I am, after reading this article, to find it was better than a related post I made on my climate change blog:
http://elonionbloggle.blogspot.com/2017/08/total-confusion-on-absolute-mean-global.html
I congratulate you on a good article, and on selecting a far too often forgotten subject: what is the absolute mean global temperature?
A secondary question, ignored just as often, and perhaps a subject for your next article: how can one number represent the ever-changing climate on our planet?

Reply to  Kip Hansen
August 22, 2017 6:20 am

Hansen:
I started reading your “Law of Averages” series when the parts were published, but stopped during Part 2, not satisfied with your understanding of economics. (I’ve written a finance and economics newsletter since 1977 as a hobby, and have a finance MBA.) But I read your Part 3 yesterday, and it turned out to be the best of the three parts, by far.
I had typed a comment on your Averages Part 2 article right after I read it in June, but never posted it: I decided to leave you alone after giving you so much grief about your obesity article. I changed my mind today, because my comments on economic data quality and data adjustments might lead you to write a new article on climate data quality and data adjustments, assuming you haven’t already done that.
There are four types of economics “adjustments”:
1) Needed “adjustments” with good explanations,
2) Needed adjustments that are ignored,
3) Unnecessary “adjustments” with no logical explanation, and
4) “Adjustments” made long after the initial data release, hoping no one will notice.
Hansen wrote in Averages Part 2: “I am not an economist …” and then proved it. Your “economics” did not include needed data adjustments in most of the charts, and there is a better way to compare households.
I wrote an income inequality article in my January 2013 economics newsletter that explained the many data adjustments needed. I also found a better way to measure “inequality”: “Spending is easy to measure, and there has been little or no change of Rich vs. Poor spending inequality in the past few decades. Income is hard to measure accurately. The increasing income inequality trend in the past few decades has been greatly exaggerated by ‘data mining’.”
Four examples of the many factors that distort typical long-term household income analyses unless data adjustments are made:
(1) Household size and age have been changing: smaller and older. More single-person households = lower household income. More households with only retired people = lower household income.
(2) 1980s Adjusted Gross Income (AGI) definition changes: the upper class shifted where they reported their business income after their personal tax rates suddenly became lower than the corporate tax rates they had been paying. The IRS even warned about that: “(AGI) Data for years 1987 and after are not comparable to pre-1987 data because of major changes in the definition of adjusted gross income.”
(3) People changing income quintiles during their lives: the Top x% are not the same people / households every year, especially the Top 1% and Top 5%. Comparisons usually assume they are.
(4) Middle-class income deferred until after retirement: IRA and 401k retirement savings contributions “hide” middle-class income until withdrawals after retirement. The maximum contribution limits allow high-income households to “hide” a much smaller percentage of their incomes than middle-class households.

Reply to  Kip Hansen
August 25, 2017 10:01 am

TIME magazine is as useful and accurate as Wikipedia = hopeless. Left-wing bias and pro-consensus on every subject. I recall your other favorite “source” on obesity was the left-biased New York Times! Your points on obesity were wrong, even if there were a 99.9% consensus in your favor, which of course there was not. Calorie intake and usage is simple physics that you, or anyone else, have not proven wrong. And you are also as stubborn as a junkyard dog, although I like that characteristic.
I simply pointed out that the economics used in YOUR Averages Part 2 post was biased in favor of the consensus that income inequality has been increasing at a fast rate in recent decades. I showed how easy it is to present data and charts that lead readers to a wrong conclusion if you don’t thoroughly understand the data in the charts. The same is true in climate science.
Understanding the data, and making the adjustments needed for a fair comparison, would show that the income inequality gap has not changed that much in the past 40 years, and that the claim of a rapidly growing income gap is grossly overstated. Making needed data adjustments (or ignoring needed adjustments) can completely change the conclusion drawn from raw data.
In climate science, I believe “adjustments” have too often been used to make the temperature actuals more closely match the CO2-controls-the-climate theory / models.

August 28, 2017 9:26 am

I think one of the most important points is actually that 59F is not warm; the Earth is generally colder than you might think. Yet this is the peak of an interglacial. In the glacial portions, which are the majority of the time over the last 10 million years or so, the temperature has been 16F colder, i.e. a worldwide average of 43F, barely above freezing. About 900 million years ago the Earth was locked in a phase called iceball/snowball Earth, in which it was frozen solid year-round for 300 million years.
These are pretty scary to think of.