What Does Gavin Schmidt's 'Warmest' Year Tell Us About Climate Sensitivity to CO2?

Guest post by Jim Steele,

Director emeritus, Sierra Nevada Field Campus, San Francisco State University, and author of Landscapes & Cycles: An Environmentalist’s Journey to Climate Skepticism

A friend of mine who works for the EPA emailed me a link to NASA’s Earth Observatory page pitching 2014 as the warmest year on record, and asked if “I dismiss their findings.” The following is an edited version of my reply, suggesting that the Global Average Chimera tells us precious little about the climate’s sensitivity to CO2, and that the uncertainty is far greater than the error bars illustrated in Anthony Watts’ post “2014: The Most Dishonest Year on Record.”

I simply asked my friend to consider all the factors involved in constructing the global average temperature trend, then decide for himself the scientific value of the graph and whether there was any political motivation.

1. Consider that the greatest warm anomalies are over the Arctic Ocean because more heat is ventilating through thinner ice. Before the Arctic Oscillation removed thick insulating sea ice, air temperatures were declining. Read Kahl, J., et al. (1993) Absence of evidence for greenhouse warming over the Arctic Ocean in the past 40 years. Nature 361, 335–337.

[Figure]

After subfreezing winds removed thick ice, then air temperatures rose. Read Rigor, I.G., J.M. Wallace, and R.L. Colony (2002), Response of Sea Ice to the Arctic Oscillation, J. Climate, v. 15, no. 18, pp. 2648 – 2668. They concluded, “it can be inferred that at least part of the warming that has been observed is due to the heat released during the increased production of new ice, and the increased flux of heat to the atmosphere through the larger area of thin ice.”

CO2 advocates suggest CO2 leads to “Arctic Amplification,” arguing that dark open water absorbs more heat. But the latest estimates show the upper 700 meters of the Arctic Ocean are cooling (see illustration below), which again supports the notion that ventilating heat raised air temperatures. Read Wunsch, C., and P. Heimbach (2014) Bidecadal Thermal Changes in the Abyssal Ocean. J. Phys. Oceanogr., http://dx.doi.org/10.1175/JPO-D-13-096.1.

So how much of the global warming trend is due to heat ventilating from a cooling Arctic Ocean?

[Figure]

2. Consider that NOAA’s graph is based on homogenized data. Researchers analyzing homogenization methods reported “results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.”

Read Steirou, E., and Koutsoyiannis, D. (2012) Investigation of methods for hydroclimatic data homogenization. Geophysical Research Abstracts, vol. 14, EGU2012-956-1.

So how much of the recent warming trend is due to the virtual reality of homogenized data?

3. Consider the results from Menne, M., et al. (2009) The U.S. Historical Climatology Network Monthly Temperature Data, Version 2. Bulletin of the American Meteorological Society, in which they argued their temperature adjustments provided a better understanding of the underlying climate trend. Notice that the “adjusted” anomalies in their graph below remove or minimize observed cooling trends. More importantly, ask why Menne (2009) reports a cooling trend for the eastern USA from 1895 to 2007, while NASA’s graph (below Menne’s) shows a slight warming trend for all of the USA from 1950 to 2014. Does that discrepancy indicate more homogenization, or that they cherry-picked a cooler period to start their warming trend?

[Figure]

[Figure]

4. Consider that much of the warming in North America, as illustrated by Menne (2009) above, occurred in the montane regions of the American West. Now consider the paper Oyler, J., et al. (2015) Artificial amplification of warming trends across the mountains of the western United States, in which they conclude, “Here we critically evaluate this network’s temperature observations and show that extreme warming observed at higher elevations is the result of systematic artifacts and not climatic conditions. With artifacts removed, the network’s 1991–2012 minimum temperature trend decreases from +1.16°C/decade to +0.106°C/decade.”

 

So how much of the recent warming trend is due to these systematic artifacts?

 

5. Consider that NOAA’s graph is based on adjusted data, and that NOAA now re-homogenizes temperature data every month, so that climate trends change from month to month and year to year. As an example, below is a graph I created from the US Historical Climatology Network (USHCN) Cuyamaca weather station in southern California, a station that never altered its location or instrumentation. In the 2010 version, the raw data temperature trend does not differ much from the homogenized trend (Maximum Adj.).

[Figure]

Just a few years later, the homogenized century trend had increased by more than 2°F: compare the 2011 trend (in black) with the 2015 trend (in red). I have archived several other similar examples of USHCN homogenization causing rapid “virtual warming.” Then ask yourself which trend is more real: the more cyclical changes observed in non-homogenized data, or the rising trend created by a homogenized virtual reality?

[Figure]
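For readers who want to repeat this kind of check, a minimal sketch of a raw-versus-adjusted century-trend comparison follows. The series below are made-up placeholders, not the actual Cuyamaca data; you would substitute the raw and homogenized series you have archived from the USHCN files.

```python
# Minimal sketch: compare the century-scale linear trend of a station's raw
# and homogenized annual series. The arrays below are synthetic placeholders,
# NOT real USHCN data; load your archived raw/adjusted series in their place.
import numpy as np

years = np.arange(1900, 2015)
rng = np.random.default_rng(0)

# Placeholder annual-mean maximum temperatures (degrees F).
raw = 75 + 0.2 * np.sin(2 * np.pi * (years - 1900) / 60) + rng.normal(0, 0.5, years.size)
adjusted = raw + 0.015 * (years - 1900)   # illustrative warming adjustment only

def century_trend(series, t=years):
    """Least-squares slope, expressed in degrees per century."""
    return np.polyfit(t, series, 1)[0] * 100.0

print(f"raw trend:      {century_trend(raw):+.2f} F/century")
print(f"adjusted trend: {century_trend(adjusted):+.2f} F/century")
print(f"difference:     {century_trend(adjusted) - century_trend(raw):+.2f} F/century")
```

Running the same few lines of arithmetic on successive yearly archives of the homogenized file is how a shifting “century trend” like the one above can be documented.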

6. Consider that climate change along western North America was fully explained by the Pacific Decadal Oscillation and the associated cycles of ventilation and absorption of heat. Read: Johnstone and Mantua (2014) Atmospheric controls on northeast Pacific temperature variability and change, 1900–2012. Such research suggests non-homogenized data may better represent climate reality.

Knowing that the upper 10 feet of the oceans contain more heat than the entire atmosphere, ask yourself whether decadal warming trends are simply artifacts of the redistribution of ocean heat.
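As a back-of-the-envelope check on that comparison (round textbook values chosen by me, an order-of-magnitude sketch only), the heat capacity of the top ten feet of ocean does come out comparable to that of the whole atmosphere:

```python
# Order-of-magnitude comparison: heat capacity of the top ~10 ft of ocean
# versus the entire atmosphere. Round handbook values; not a measurement.
OCEAN_AREA_M2 = 3.6e14       # ~71% of Earth's surface
LAYER_DEPTH_M = 3.05         # 10 feet
RHO_SEAWATER = 1025.0        # kg/m^3
CP_SEAWATER = 3990.0         # J/(kg K)
MASS_ATMOSPHERE_KG = 5.1e18
CP_AIR = 1004.0              # J/(kg K), dry air at constant pressure

ocean_layer = OCEAN_AREA_M2 * LAYER_DEPTH_M * RHO_SEAWATER * CP_SEAWATER  # J/K
atmosphere = MASS_ATMOSPHERE_KG * CP_AIR                                  # J/K

print(f"top 10 ft of ocean: {ocean_layer:.1e} J/K")
print(f"entire atmosphere:  {atmosphere:.1e} J/K")
print(f"ratio (ocean layer / atmosphere): {ocean_layer / atmosphere:.2f}")
```

With these round numbers the ratio comes out close to one, which is the point: small vertical redistributions of ocean heat are enough to move the temperature of the entire atmosphere.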

7. Consider that temperature data is increasingly collected at airports. A 2010 paper by Imhoff, “Remote sensing of the urban heat island effect across biomes in the continental USA,” published in Remote Sensing of Environment 114 (2010) 504–513, concluded that “We find that ecological context significantly influences the amplitude of summer daytime urban–rural temperature differences, and the largest (8 °C average) is observed for cities built in biomes dominated by temperate broadleaf and mixed forest. For all cities combined, Impervious Surface Area is the primary driver for increase in temperature explaining 70% of the total variance. On a yearly average, urban areas are substantially warmer than the non-urban fringe by 2.9 °C, except for urban areas in biomes with arid and semiarid climates.”

So how much of this recent warming trend can be attributed to increases in Impervious Surface Area in and around weather stations in rural, suburban and urban settings?

8. Consider that direct satellite observations show lost vegetation has a warming effect: transitions from forest to shrubland, or from grassland to urban area, raise skin surface temperatures by 10 to 30°F. Satellite data reveal the canopies of the world’s forests average about 86°F, and in the shade beneath the canopy, temperatures are much lower. Grassland temperatures are much higher, ranging from 95 to 122°F, while the average temperatures of barren ground and deserts can reach 140°F. Read Mildrexler, D., et al. (2011) A global comparison between station air temperatures and MODIS land surface temperatures reveals the cooling role of forests. J. Geophys. Res., 116, G03025, doi:10.1029/2010JG001486.

Ask yourself, “How much of the warming trend is due to population effects that remove vegetation?” How much is due to citizens of poorer nations removing trees and shrubs for cooking and heating fuel, or for slash-and-burn agriculture?

9. Consider that neither of the satellite data sets suggests 2014 was the warmest year ever recorded.

[Figure]

10. Consider that many tree ring data sets show recent warming does not exceed that of the 1940s, as exemplified by Scandinavian tree ring data (from Esper, J., et al. (2012) Variability and extremes of northern Scandinavian summer temperatures over the past two millennia. Global and Planetary Change 88–89, 1–9).

[Figure]

Consider that international tree ring experts have concluded, “No current tree ring based reconstruction of extratropical Northern Hemisphere temperatures that extends into the 1990s captures the full range of late 20th century warming observed in the instrumental record.” Read Wilson, R., et al. (2007) A matter of divergence: tracking recent warming at hemispheric scales using tree-ring data. Journal of Geophysical Research: Atmospheres, 112, D17103, doi:10.1029/2006JD008318.

In summary, after acknowledging the many other factors contributing to local temperature change, and after recognizing that data homogenization has lowered the peak warming of the 1930s through the 1950s in many original data sets by as much as 2 to 3°F (a peak warming also observed in many proxy data sets less tainted by urbanization effects), ask yourself: does NOAA’s graph and its record 2014 temperature really tell us anything about climate sensitivity or heat accumulation from rising CO2? Or does it tell us more about climate politics and data manipulation?

[Figure]

141 Comments
jl
January 25, 2015 5:17 pm

“No” and “yes”.

Barry
January 25, 2015 5:24 pm
jmichna
Reply to  Barry
January 25, 2015 5:29 pm

Graphed series ends in 2005. 10 years of flat-line/decline missing…

Barry
Reply to  jmichna
January 25, 2015 5:40 pm

And that data set has a long history of errors and adjustments of its own:
http://en.wikipedia.org/wiki/UAH_satellite_temperature_dataset

AndyG55
Reply to  jmichna
January 25, 2015 6:13 pm

Most climate stuff in Wikipedia is written or edited by W. Connolley.
It is NOT accepted by any scientific institution as a reliable reference.

Reply to  jmichna
January 25, 2015 9:01 pm

OK, here is what Roy Spencer had to say:

They all stem from the fact that there is not a single satellite which has been operating continuously, in a stable orbit, measuring a constant layer of the atmosphere, at the same local time every day, with no instrumental calibration drifts.
Instead, what we have is multiple satellites (we use 14 of them for the UAH processing) with relatively short lifetimes (2 to 16+ years), most of which have decaying orbits which causes the local time of measurement to slowly change over the years, slightly different layers sampled by the earlier (pre-1998) MSU instruments compared to the later (post-1998) AMSU instruments, and some evidence of small calibration drifts in a few of the instruments.
An additional complication is that subsequent satellites are launched into alternating sun-synchronous orbit times, nominally 1:30 a.m. and p.m., then 7:30 a.m. and p.m., then back to 1:30 a.m. and p.m., etc. Furthermore, as the instruments scan across the Earth, the altitude in the atmosphere that is sampled changes as the Earth incidence angle of view changes.
All of these effects must be accounted for, and there is no demonstrably “best” method to handle any of them. For example, RSS uses a climate model to correct for the changing time of day the observations are made (the so-called diurnal drift problem), while we use an empirical approach. This correction is particularly difficult because it varies with geographic location, time of year, terrain altitude, etc. RSS does not use exactly the same satellites as we do, nor do they use the same formula for computing a lower tropospheric (“LT”) layer temperature from the different view angles of AMSU channel 5.

Reply to  jmichna
January 25, 2015 9:18 pm

I agree with Spencer’s post, and I think it illustrates the man’s integrity and his acknowledgement of the uncertainty in satellite data. If only IPCC spokespeople or Gavin Schmidt acknowledged the tremendous uncertainty in the “global average temperature” chimera! His acknowledgement of that uncertainty makes me trust his graphs more than Schmidt’s.

mpainter
Reply to  jmichna
January 26, 2015 6:44 am

Gavin Schmidt, on the other hand, displays a lack of integrity, publicly and before the media.

Reply to  jmichna
January 27, 2015 1:13 pm

Jim Steele…your comment is right on target.

TRM
Reply to  Barry
January 25, 2015 6:22 pm

The nice thing about the satellite temperature projects is that one (UAH) is run by a skeptic of CO2 controlling the climate and the other (RSS) is run by a scientist who feels CO2 does control the climate. By the way it is 3rd and 6th for those 2.

Christopher Hanley
Reply to  Barry
January 25, 2015 9:35 pm

Irrelevant. It’s the pre-satellite data that’s the issue, for instance the relationship of the current global temperature to the early ’40s.

Jimbo
Reply to  Barry
January 26, 2015 1:54 am

Barry and Nick Stokes – whatever happened to 1998? One good cherrypick does deserve another.
http://www.globalwarmingart.com/images/7/7e/Satellite_Temperatures.png

Reply to  Jimbo
January 26, 2015 3:01 am

Well, what happened to 2010 or 2014 in your second plot? or even 2005 surface? And what is the surface measure?
Troposphere measures give a big response to El Nino.

Bubba Cow
January 25, 2015 5:30 pm

We have got to ride the global warming issue. Even if the theory of global warming and CO2 is wrong, we will be doing the right thing in terms of economic and environmental policy.

Timothy Wirth, U.S Undersecretary of State for global issues, Clinton-Gore administration
Thanks for the data, though, particularly illustrations of the adjusted and unadjusted.

kenin
January 25, 2015 5:35 pm

Somebody decided to cloud the skies with an airplane or two or three over Toronto today for most of the afternoon. Ah, whatever maybe i’m seeing things. Why would anybody do such a thing? lol

Barry
January 25, 2015 5:45 pm

#1 – Yes, arctic oceans cool (release heat) when ice thins, because of increasing air temperatures. But then if warming were not occurring on a global scale, how do the other oceans (not ice covered) warm?
http://mashable.com/2015/01/23/ocean-warming-broke-chart/

Reply to  Barry
January 25, 2015 6:16 pm

For all you ever wanted to know about oceans and ocean heat distribution, go to bobtisdale.wordpress.com.

Robert B
Reply to  Barry
January 25, 2015 6:36 pm

Now if the chart were in average °C anomaly, as is used for surface temperatures, the chart wouldn’t have broken. “Ocean warming broke chart”? “DHead can’t draw a graph” would have been more appropriate.

george e. smith
Reply to  Barry
January 25, 2015 8:15 pm

Cooling rates in the polar regions are as much as a factor of 12 lower than the cooling rates in the hottest deserts. So the polar regions are not a significant factor in global cooling.
In addition ocean and atmospheric circulations constantly move heat energy from the tropics to the poles, which is why the polar regions warm faster than the tropics do.
The southern oceans (Pacific, Atlantic, and Indian) can’t reach much further south than the coast of Antarctica, which is basically the Antarctic circle. But the northern oceans can penetrate all the way to the north pole.
So the southern ocean [areas] which are more water than land can’t cool as fast as the northern ocean [areas] which are more land than water.
That’s why the southern oceans have been warming, but the northern ones haven’t. The Northern hemisphere is where all the hot tropical deserts are, so it can cool radiatively more than the southern hemisphere can.

RACookPE1978
Editor
Reply to  Barry
January 25, 2015 8:50 pm

Barry

#1 – Yes, arctic oceans cool (release heat) when ice thins, because of increasing air temperatures.

Would you explain the heat transfer involved here?

mpainter
Reply to  Barry
January 26, 2015 12:57 am

Barry:
Via insolation.
1. Cloud data show decrease in cloud cover since mid-eighties
2. Increased atmospheric CO2 makes no contribution to SST because water is opaque to far IR.

johnmarshall
Reply to  Barry
January 26, 2015 3:05 am

An object can only lose heat to cooler surroundings. So the Arctic waters cool because the air is colder, not because the air is warmer; warmer air would warm the Arctic waters.

Phil Cartier
Reply to  Barry
January 26, 2015 11:47 am

mashable: “We also found out that ocean temperatures reached record levels in 2014, with portions of every major ocean basin attaining new highs.” The authors apparently didn’t notice that the heat collected in certain areas due to ocean currents (weather), the same areas where the majority of Argo floats circulate. One Argo float, on average, represents about 60,000 cubic miles of ocean, measured over roughly a 2 ft square cross-section, 1–2000 meters tall. In the southern oceans they are spread out ~10X as far. There are so few floats relative to the size of the oceans that proper error bars on the graph would probably fill the whole thing!
The other point: the “deep” oceans have possibly risen about 0.02 degC. If and when that water does get pushed to the surface, how does it transfer heat to its surroundings? The answer is that it doesn’t. At best the surface temps warm the same 0.02 degC that year, and the cold upwelling feeds millions of tons of aquatic life and possibly starts another La Niña cycle.
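As a rough cross-check on that 60,000 cubic mile figure (my own round assumptions for ocean area, profiling depth and float count, not numbers from the comment above):

```python
# Rough check: ocean volume "sampled" per Argo float, assuming floats profile
# the upper 2000 m and roughly 3,500 floats were active at the time.
OCEAN_AREA_M2 = 3.6e14
PROFILE_DEPTH_M = 2000.0
N_FLOATS = 3500                      # assumed active-array size, circa 2014
M3_PER_CUBIC_MILE = 1609.344 ** 3    # ~4.17e9 m^3

volume_per_float_mi3 = OCEAN_AREA_M2 * PROFILE_DEPTH_M / N_FLOATS / M3_PER_CUBIC_MILE
print(f"ocean volume per float: ~{volume_per_float_mi3:,.0f} cubic miles")
```

That lands in the same tens-of-thousands-of-cubic-miles ballpark, so the spirit of the point stands regardless of the exact float count assumed.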

Reply to  Barry
January 27, 2015 1:36 pm

Keep reading at mashable and you will remain in the dark.

Barry
January 25, 2015 5:51 pm

#2 – Here’s an interesting discussion of the non-peer-reviewed conference presentation (cited as if it were peer-reviewed):
http://variable-variability.blogspot.com/2012/07/investigation-of-methods-for.html

SMC
Reply to  Barry
January 25, 2015 6:15 pm

Barry, your first comment is from a global warming advocacy site.
Your second comment is from Wikipedia.
Your third comment is from a MSM news article.
And your fourth comment is basically an ad hominem against Mr. Watts.
You’re going to have to do better than that if you want to convince people. Data please, and a reasoned analysis would be nice, too.

Barry
Reply to  SMC
January 25, 2015 6:30 pm

I’m not making an ad hominem, just pointing folks to a website. If anything, Mr. Steele at least should have credited Mr. Watts for pointing this out several years ago.

simple-touriste
Reply to  SMC
January 25, 2015 6:54 pm

“I’m not making an ad hominem, just pointing folks to a website.”
with such content as:
“As a goal-oriented guy, Anthony Watts”(…)
Really, no ad hom.
/sarc

mpainter
Reply to  SMC
January 26, 2015 1:01 am

Well Barry, would you care to defend your other sources? Like the mainstream media? Or Wikipedia?

Jeff Alberts
January 25, 2015 6:00 pm

A friend of mine who works for the EPA emailed me a link to NASA’s Earth Observatory page pitching 2014 as the warmest year on record, and asked if “I dismiss their findings.”

The only answer needed was “There is no global temperature”. Anyone who thinks there is needs to explain why averaging intensive properties returns anything meaningful.

JohnWho
Reply to  Jeff Alberts
January 25, 2015 6:34 pm

I wonder if “climate scientists” could even agree on what would be needed to obtain a reasonable global average surface temperature?
What we have now arguably only gives us a number of local/regional average surface temperatures.

george e. smith
Reply to  JohnWho
January 25, 2015 8:34 pm

Well the differences in regional temperature (at any instant) are what drives heat flow from one region to another.
But local temperatures are never recorded all at the same time, so you never ever get a valid temperature map of the world at any instant of time, which would give you a function that you could average.
And of course you need samples that conform to the Nyquist sampling theorem, and you never have that, either spatially or temporally; the shortfall (relative to the signal bandwidth) is big enough that even the zero-frequency spectral component (the average) is corrupted by aliasing noise.
So nyet on getting a real global average temperature.
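A toy illustration of that aliasing point, using a purely synthetic station whose true temperature is a constant mean plus a diurnal cycle (this is not how any actual index is built; it only shows how undersampling a periodic signal corrupts even the mean):

```python
# Synthetic demo: sampling a diurnal cycle once per day aliases the cycle
# down to zero frequency, so the computed "mean" depends on the sampling
# time, and a slow drift in observation time shows up as a spurious trend.
import numpy as np

days = np.arange(3650)          # ten years of once-daily sampling
TRUE_MEAN = 15.0                # degrees C
DIURNAL_AMP = 8.0

def true_temperature(day, hour):
    """True temperature: constant mean plus a diurnal cycle peaking at 15:00."""
    return TRUE_MEAN + DIURNAL_AMP * np.cos(2 * np.pi * (hour - 15.0) / 24.0)

fixed_6am = true_temperature(days, 6.0)                          # always read at 06:00
drifting = true_temperature(days, 6.0 + 4.0 * days / days[-1])   # obs time drifts 4 h in 10 yr

print(f"true long-term mean:            {TRUE_MEAN:.2f} C")
print(f"mean of fixed 06:00 readings:   {fixed_6am.mean():.2f} C")
spurious = np.polyfit(days / 365.25, drifting, 1)[0]
print(f"spurious trend from time drift: {spurious:+.2f} C per year")
```

The same mechanism is behind the diurnal-drift corrections Roy Spencer describes earlier in this thread.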

Reply to  Jeff Alberts
January 25, 2015 11:15 pm

Or refer to the standard method of deriving a global surface average temperature at the International Standards Organisation: http://www.iso.org/iso/home.html
Good luck with that!

Walt D.
Reply to  Jeff Alberts
January 26, 2015 5:06 am

You make a very good point. What type of average you use depends on what you want to use it for. What is the average voltage on a house AC circuit? Zero, so we use the root mean square. What do you do if you want to falsify cost-of-living numbers? Use the geometric average. What type of average would you use to calculate the black body radiation from a surface where there is a large variation in temperature?
It is not that there is no global temperature. Rather, that the land and sea measurements do not actually measure it. See the recent post by Tim Ball about data inadequacy. If we go back before 1956, there were no temperature measurements at the South Pole. Yet we hear talk about global records going back to 1880, 30 years before anybody even went to the South Pole.
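To make that last question concrete, here is a small sketch with two invented surface patches: the arithmetic mean temperature and the temperature that matches the mean blackbody emission (the fourth root of the mean of T^4) are not the same number.

```python
# Two equal-area patches at different temperatures: compare the arithmetic
# mean with the "radiative" mean implied by Stefan-Boltzmann (T^4 weighting).
import numpy as np

T = np.array([233.0, 313.0])                 # kelvin: one cold patch, one hot patch
arithmetic_mean = T.mean()
radiative_mean = np.mean(T ** 4) ** 0.25     # temperature matching the mean emission

print(f"arithmetic mean:      {arithmetic_mean:.1f} K")
print(f"radiative (T^4) mean: {radiative_mean:.1f} K")
```

With these two values the difference is about 8 K, which is why “the” average temperature depends on what the average is for.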

Barry
January 25, 2015 6:05 pm

#3 – HCN v.3 has been available since 2011. Why not cite that data set?

Reply to  Barry
January 25, 2015 7:16 pm

I used Menne 2009 for the visual. Do you have a similar map for HCN v3?
But see my point #5. My guess is they have homogenized in an increasing warming trend, as there are several other examples similar to Cuyamaca where homogenized trends increase yearly, as discussed for Death Valley: http://wattsupwiththat.com/2015/01/07/peter-miesler-helps-expose-ushcn-homogenization-insanity-and-antarctic-illusions/.
It seems the most rapid climate warming trend is caused by homogenization.

Reply to  Barry
January 25, 2015 11:16 pm

Because ISO doesn’t?

Bill Illis
Reply to  Barry
January 26, 2015 6:25 am

Here are the adjustments to Cuyamaca in GHCN v3.2.2 (why does there need to be a new version every few months?).
No station moves, no instrument changes, but the NCDC takes Cuyamaca’s modest warming of +0.5C and increases it to +2.3C of warming.
The more you look, the more examples there are like this. We need forensic auditors to go in and fix this mess.
http://s30.postimg.org/fu0v180z5/Cuyamaca_Adjusted_HCN_42500042239.gif

rooter
Reply to  Bill Illis
January 26, 2015 8:43 am

Why don’t you be that auditor who fixes this mess? The better the fixing, the better the homogenization.
How do you know there have been no changes at that location?

Bart
January 25, 2015 6:09 pm

Yes, the trends are likely overestimated. But, it’s moot. Even the greatest trend from the most dubiously processed data is still nowhere near what the models said it would be.

JohnWho
January 25, 2015 6:20 pm

“Or does it tell us more about climate politics and data manipulation?”
Yeah, I’m going with this one.
Although if Gavin had said, “Warmest year on data manipulated record” none of us would have anything to be skeptical of.
/grin

Reply to  JohnWho
January 25, 2015 11:20 pm

Except Gavin would have told you that “warmest” actually meant something completely different and would have provided you with information that was completely irrelevant to the question you had asked.

Barry
January 25, 2015 6:24 pm

#4 – “This network” is vague. NOAA has a few different networks. The HCN has about 1200 stations, and does not include the SNOTEL stations reviewed by Oyler et al. There is also a network of about 10,000 stations that includes the 700 SNOTEL stations. So I would guess that about (700/10000) x 1 C of the minimum temperature trend could be attributed to systematic bias (over a period of 10 years, mid-90s to mid-2000s). An important conclusion of the study was omitted, though: High-elevation minimum temperatures are still warming about as much as minimum temperatures at lower elevations.

John West
January 25, 2015 6:36 pm

“asked if “I dismiss their findings.””
Of course not. I do however put their findings in context. Why, do you suppose, “scientists” would portray inconclusive evidence of a warming trend as absolute proof of dangerous anthropogenically dominated warming?

FrankKarr
January 25, 2015 7:01 pm

Barry, you really need to do a study of past high temperatures during the Holocene, Medieval and Roman periods. Just because the temps are rising by tenths of a degree does not mean it’s due to us. For example, in Central England temps at three widely distributed gauges increased by 2C from the mid 17th century over 40-plus years. Nothing to do with CO2. Just because there is a rising trend is not proof of AGW. It’s all happened before, and currently temps have flatlined for over 18 years.

Chip
January 25, 2015 7:04 pm

The NOAA reported a record high by .04 degree on the assumption that they knew the temperature to a similar precision over the Pacific in 1890, the Sahara in 1903, the Atlantic in 1920 and a million other assumptions.
Not raw, precise data. But modeled assumptions of a complex and chaotic climate over more than a century.

Reply to  Chip
January 25, 2015 11:21 pm

Never to forget that for half a century Antarctic temperatures were being “measured” in the Falkland Islands.

Mike M.
January 25, 2015 7:35 pm

Mr. Steele asks several times: “So how much of the recent warming trend is due to …”. In each case, I think the answer must be “perhaps some, but not much”. I say this because the ocean covers some 70% of the Earth’s surface. So if you cut the warming over land down to match the warming of the sea surface (land warming is about 70% larger, according to the official figures), you only reduce the global trend by about 14%. So if the sea surface temperatures are OK, then the global trend is not that different from the official values.
I am not saying that the land temperatures are OK; it does seem there are problems. I am just saying that it is sea surface temperatures that really matter.
So is there some good reason I should not trust the sea surface temperatures?

Reply to  Mike M.
January 25, 2015 8:31 pm

No. Trust sea surface temperatures, but dig deeper than the average to wonder why the Indian and south Atlantic are causing all the average warming while all the other oceans are flat or cooling. You are right that it is ocean temperatures that really matter. The average ocean warming virtually guarantees that there must be some atmospheric warming going on, yet warm air rises and the satellites would detect the land surface warming if it were real.

Mike M.
Reply to  gymnosperm
January 26, 2015 8:45 am

Gymnosperm writes “satellites would detect the land surface warming if it were real”. But satellites do detect the warming, although they seem to give a slightly lower amount of warming than terrestrial measurements.
The warming of the oceans means there must be a radiative imbalance at the top of atmosphere. I agree that changes in the temperature distribution of the oceans is very important to understand since that can cause atmospheric temperatures to not track with the ocean temperatures.

Reply to  Mike M.
January 25, 2015 8:46 pm

Mike M., you ask the wrong question. It is not a matter of trusting sea surface temperatures but of putting those changes into a proper perspective. The question I asked is what the temperature tells us about climate sensitivity to CO2. A slowdown in the winds can cause sea surface temperatures to rise. The redistribution of heat concentrated within a warm pool can warm surfaces elsewhere without any additional accumulation of heat, and without any noticeable cooling within the region of stored heat. So your question is a good one in essence, but better said as “what does a change in sea surface temperatures tell us about ocean sensitivity to CO2,” because often the change is just weather, not climate.
The upper 300 meters are a better indicator of climate trends, and according to Xue 2010 (A Comparative Analysis of Upper-Ocean Heat Content Variability from an Ensemble of Operational Ocean Reanalyses), there has been no warming, or a slight cooling, since 2003 in the upper 300 meters.

Reply to  jim Steele
January 25, 2015 11:25 pm

The issue is an imbalance between incoming energy from the sun and outgoing energy from the Earth. The correct metric for this is enthalpy. Temperature is not a measure of enthalpy.

Mike M.
Reply to  jim Steele
January 26, 2015 8:53 am

Jim Steele writes “sea surface temperatures but putting those changes into a proper perspective”. That is what I am trying to do; I think you are the one with the mistaken perspective.
“A slow down in the winds can cause sea surface temperatures to rise.” No, it can’t. Sea surface temperature is normally measured at a depth of 3 meters and is representative of the mixed layer, which has an average depth at least an order of magnitude greater. As you point out yourself, that volume of water has a heat capacity that is large compared to the heat capacity of the atmosphere.
Because of the large thermal inertia of the oceans, ocean temperatures pertain more to climate than weather. It is the response of the ocean to CO2 that ultimately matters.

Mike M.
Reply to  jim Steele
January 26, 2015 8:57 am

The Pompous Git is correct that “The issue is an imbalance between incoming energy from the sun and outgoing energy from the Earth. The correct metric for this is enthalpy. Temperature is not a measure of enthalpy.”
But one can not measure enthalpy, one can only measure changes in enthalpy. One normally measures changes in enthalpy by measuring changes in temperature. So the statement “Temperature is not a measure of enthalpy” is misleading.

Reply to  jim Steele
January 26, 2015 10:24 am

Mike wrote “Because of the large thermal inertia of the oceans, ocean temperatures pertain more to climate than weather. It is the response of the ocean to CO2 that ultimately matters.”
Mike, you are engaging in a bit of bait and switch! Or revealing your own lack of perspective.
Ocean surface temperatures are a different ball game than ocean heat content. Indeed the oceans are the real critical piece when analyzing accumulated heat. And to understand whether heat that is ventilating from the oceans was stored this year, last decade, or during the Medieval Warm Period requires a whole lot more observations and an improved understanding than what we currently have. Read Deep Oceans Are Cooling Amidst A Sea of Modeling Uncertainty: New Research on Ocean Heat Content http://landscapesandcycles.net/cooling-deep-oceans.html
On the other hand, surface temperatures are greatly influenced by weather, wind speeds, El Niños and upwelling. You say the response to CO2 is what ultimately matters? Really? Have you ever considered the sun and clouds? El Niños and La Niñas? The tropical oceans receive more heat from the sun than temperatures suggest because much of that heat is distributed poleward, determining extra-tropical climate. The IPCC’s Scientific Basis admits that added CO2 in the tropics has little significance due to deep convection and the extreme moisture content. (That’s why the focus is on the Arctic.) Slightly higher periods of solar irradiance are correlated with periods when the tropical Pacific is in a more La Niña-like state. During La Niñas there are fewer clouds in the tropical eastern Pacific and greater insolation. Fewer clouds can allow an increase of as much as 200 W/m2.
Mike, I suggest you broaden your perspective.

Reply to  jim Steele
January 26, 2015 10:30 am

Mike M.
So does a cubic metre of water at 0°C that is all liquid have the same heat content as a cubic metre of water at 0°C that is mostly solid? Does a cubic metre of air at 0% humidity have the same heat content as a cubic metre of air at 100% humidity? Does a cubic metre of stationary air have the same energy content as a cubic metre of air moving at 100 km/hr? Since energy content is dependent on volume, where is the dependency of temperature on volume? Does adding a litre of water at 50°C to a second litre of water at 50°C give us two litres of water at 100°C? Not in the physics I was taught back in the late 1960s.
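For rough scale on the first two questions (standard handbook values, rounded; my numbers, not figures from the comment above):

```python
# Rough magnitudes: heat content differences hiding behind the same 0 C reading.
RHO_ICE = 917.0          # kg/m^3
L_FUSION = 334e3         # J/kg, latent heat of fusion of water
L_VAPOR = 2.5e6          # J/kg, latent heat of vaporisation
SAT_VAPOR_0C = 4.85e-3   # kg/m^3, approximate saturation vapour density at 0 C

melt_energy = RHO_ICE * L_FUSION        # energy to melt 1 m^3 of ice at 0 C
latent_humid = SAT_VAPOR_0C * L_VAPOR   # latent heat held in 1 m^3 of saturated 0 C air

print(f"melting 1 m^3 of 0 C ice:                  ~{melt_energy / 1e6:.0f} MJ")
print(f"latent heat in 1 m^3 of saturated 0 C air: ~{latent_humid / 1e3:.0f} kJ")
```

Every one of those cubic metres reads 0°C on a thermometer, yet their energy contents differ enormously, which is the point about temperature not being a measure of enthalpy.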

Mike M.
Reply to  jim Steele
January 26, 2015 11:14 am

The Pompous Git asks “So does a cubic metre of water at 0°C that is all liquid have the same heat content as a cubic metre of water at 0°C that is mostly solid? ” etc.
Of course not. And how do you demonstrate that fact? Hint: It involves a thermometer.
To say that “Temperature is not a measure of enthalpy” is technically true, as I stated earlier. But to say that as an isolated statement is misleading, since it would seem to ignore the fundamental relationship between enthalpy and temperature.

mpainter
Reply to  jim Steele
January 26, 2015 11:58 am

Mike M:
CO2 has no effect on SST because water is opaque to far IR. See the absorption spectrum of water, with particular attention to the attenuation curve at 15 microns, the wavelength emitted by CO2. The data show that this wavelength is absorbed within 3 microns of the surface. All IR emitted by CO2 which is incident on water is converted into latent heat within a few seconds.

Reply to  jim Steele
January 26, 2015 12:55 pm

Mike M.
Explain to me then why I should accept temperature change as a proxy for change in energy content.

DD More
Reply to  Mike M.
January 26, 2015 11:51 am

Mike asks “So is there some good reason I should not trust the sea surface temperatures?”
On a recent posting involving sea temperatures I found:
Adjustments galore. Went looking for accuracy and just what they are able to measure after Bob’s last post and had a real awakening. Seems that overall measuring of ‘sea surface’ has problems: original bucket & thermometer (no depth control), ship intake (well below the surface), buoys (which seem to rock in the waves, with a depth resolution of a meter), then IR satellite (cannot get through the clouds) to microwave (gets through the clouds, but not the rain & surface mist). Oh, and did I mention one of the satellites was doing reasonably well until they had to boost the altitude, then had problems with pitch, yaw and just what the height was? The number of adjustments to correct for is staggering. It includes (but is not limited to): wind speed, rain, cloud amount/percent and cloud water vapor, daytime diurnal warming, high latitudes, aerosols, SSTs under 10C, columnar water vapor, a slight warm bias at higher latitudes, seasonal cycle wind direction for SST retrieval, fast moving storms and fronts, wind direction error and instrument degradation.
http://images.remss.com/papers/rsspubs/gentemann_jgr_2014.pdf
Still their abstract reads –
Errors were identified in both the MW and IR SST data sets: (1) at low atmospheric water vapor a post-hoc correction added to AMSR-E was incorrectly applied and (2) there is significant cloud contamination of nighttime MODIS retrievals at SST < 10°C. A correction is suggested for AMSR-E SSTs that will remove the vapor dependency. For MODIS, once the cloud contaminated data were excluded, errors were reduced but not eliminated. Biases were found to be −0.05°C and −0.13°C and standard deviations to be 0.48°C and 0.58°C for AMSR-E and MODIS, respectively. Using a three-way error analysis, individual standard deviations were determined to be 0.20°C (in situ), 0.28°C (AMSR-E), and 0.38°C (MODIS).

Now put that in relation to historical data, where they use the 1872 to 1876 voyage of the HMS Challenger.
69,000-nautical-mile track, crossing the Atlantic, Indian and Pacific oceans and 300 ocean-temperature profiles. What a base to compare to.

Mike M.
Reply to  DD More
January 26, 2015 1:43 pm

Thanks, DD More, looks interesting.

January 25, 2015 7:56 pm

Its not warming
there was no Little Ice age
the climate doesnt change

Reply to  Steven Mosher
January 25, 2015 8:57 pm

Who is arguing there is no warming or that climate doesn’t change? That is a meaningless comment. What does the impending blizzard in New England tell us about sensitivity to CO2? What does the record high in Death Valley in 1913 tell us about sensitivity to CO2? I see no one answering the question.
There is no doubt climate has warmed since the LIA. But again, what does that tell us about sensitivity to CO2 versus other climate dynamics? Johnstone and Mantua 2014 explained regional warming via the Pacific Decadal Oscillation, and every paper I have read suggests models forced by rising CO2 have no skill at simulating the PDO. Furthermore, the changes in SST in that paper agree with most of California’s raw data but conflict with the homogenized data. What does that tell us?

lokenbr
Reply to  jim Steele
January 25, 2015 11:18 pm

Thank you Jim Steele. I’ve been reading here for some time, and I find your logic unassailable in general. I am not sure what’s up with Mosher, but I suppose that’s the way he likes it. Don’t let him distract you 😉

Mike M.
Reply to  jim Steele
January 26, 2015 9:04 am

Jim Steele writes: “What does the impending blizzard in New England tell us about sensitivity to CO2. What does the record high in Death Valley in 1913 tell us about sensitivity to CO2. I see no one answering the question.”
OK, I’ll answer those questions: Nothing. Those are weather events.
So far as I know, Steele is correct that the models have no skill at simulating PDO (or AMO, for that matter). That tells us that even if the models have climate sensitivity right (unlikely IMO), they won’t give reliable results over short time periods, such as the 27 year period used by IPCC to test the models or the nearly 20 years of the pause.

Reply to  Steven Mosher
January 25, 2015 11:31 pm

Having fun yet?

Reply to  The Pompous Git
January 25, 2015 11:31 pm

Directed at Mosh…

January 25, 2015 8:12 pm

“So how much of the recent warming trend is due to these systematic artifacts???”
AFAIK, none. SNOTEL is a USDA system. Do you know of any climate index that uses SNOTEL data?

george e. smith
January 25, 2015 8:43 pm

Well, the ideal gas law applies to ideal gases. We don’t have any ideal gases, and especially we don’t have any in our atmosphere, so the ideal gas law wouldn’t do much good. And the ideal gas law contains a volume; the atmosphere doesn’t have any specific volume. So just what are you going to calculate with the ideal gas law?
The ideal gas law is a law relating to thermal energy (heat, the noun). The “greenhouse effect” is a consequence of electromagnetic radiation energy.

Reply to  george e. smith
January 25, 2015 11:18 pm

And it’s a trivial matter to consult MODTRAN: http://www.modtran5.com/

rogerknights
January 25, 2015 8:48 pm

11. Much of the current warm anomaly is due to the artificial rise in Siberian temps. after the fall of the USSR, which had rationed fuel supplies preferentially to districts that reported the lowest temps.

Reply to  rogerknights
January 25, 2015 9:12 pm

And despite northern Eurasia having one of the lowest densities of weather stations, it has one of the highest anomalies, which calls into question homogenization’s evil twin, “infilling.” Is it a coincidence that the eastern USA has the greatest density of weather stations and the least warming, and even cooling? Here is a map of GHCN station density. The warmest regions typically have the least coverage.
[Link missing? .mod]

Reply to  jim Steele
January 25, 2015 9:27 pm

Here is the link of GHCN stations
http://landscapesandcycles.net/image/100009576.png

Reply to  jim Steele
January 25, 2015 11:30 pm

Infilling being what got Australian meteorologist Warwick Hughes started. He asked Phil Jones how the centre of the “hotspot” was hotter than the surrounding stations used to infill the temperature in the hotspot where there were no stations.

Reply to  jim Steele
January 26, 2015 1:17 am

Fundamentally the problem is this: these groups have built a *model of temperature history*. It doesn’t matter if countless examples of the corruption of pristine station data are shown to them. Any one of these examples should be sufficient to debunk the model. However these refutations of the model are always hand-waved away, treated as anomalies of little impact on the record overall. Why? Because their model shows a great deal of internal consistency. Of course it does. That’s what you get when you build a model that is designed to be *consistent*. What other result could you expect? This is where, I suspect, Mosher does the greatest job of fooling himself into thinking he has done something smart.
Now of course, the model is not presented as a model of temperature history, but as THE temperature history. A particular station, due to the unique morphology of the region and microclimate, might show a cooling trend due to, say, a change in circulation patterns in that region for, say, 20 years. For the model, that natural phenomenon becomes an aberration to be smoothed out of existence. It doesn’t matter if that is what really happened over that time period in that region, because weather and short term climate can be messy and can operate in cyclical patterns. But if that’s not what the model assumes, all that information is lost. That is why the historical record is changing so dramatically. (The notion that a temperature measurement today alters a record a decade ago is so utterly absurd it defies credulity. Scientists working in other fields, engineers such as myself, are left speechless. And the geniuses who have designed this monstrosity can’t defend it either and don’t attempt to. They just say, “that’s the way it’s done.” These people shouldn’t be taken seriously. Not until they clean up their methods.)
Their models are not just ‘fixing’ discontinuities due to measurement error, but erasing what happened and replacing it with what the model says *should have* happened.
In the field of psychology, where I was trained, messy data is a fact of life. And the best way to spot fraudulent data was to be presented with rather consistent measurements. Because everybody knew that was not what real world data ever looked like. You didn’t try to “fix” the data set, you used it or you didn’t use it. As soon as you tried to “fix” the data in that way (and I mean by this, substantially altering) you no longer had the original data but had inserted all your assumptions – the very thing you were seeking – into the data itself. Little wonder they found what they were expecting to find.

rooter
Reply to  jim Steele
January 26, 2015 6:14 am

Just add stations and increase coverage. This is the result:
http://woodfortrees.org/graph/crutem4vgl/compress:12/plot/crutem3vgl/compress:12
The exact opposite of what you think.
Or try this:
http://i.imgur.com/4XO4whq.gif
with this result
http://i.imgur.com/Snmh9Xf.gif
(both unadjusted)

mpainter
Reply to  jim Steele
January 26, 2015 7:10 am

Will Nitschke:
Mosher, in a comment here some months ago reported that ” new data” could change past data. He ended his comment “read’em and weep”
I consider Mosher as a hopeless case.

January 25, 2015 11:48 pm

The Git learnt his basic climatology from TR Oke’s Boundary Layer Climates (among other texts). Oke has not been updated to reflect CAGW. Nor, to the best of The Git’s knowledge, is there a tertiary-level text comparable to Oke. Until a warmist directs me to such, The Git will continue to accept The Received View, i.e. climatology as it is taught at the tertiary level.

Mike the Morlock
Reply to  The Pompous Git
January 26, 2015 12:42 am

Hi, thanks for your reply on the other thread. Now ISO-9000, heh heh…. I was an internal auditor for many years where I work. If you don’t comply you can’t export to Europe. If an area failed an internal audit, the area Supervisor would be dinged on his review. Some of these people had been my supervisors in the past. Guess the rest.
I would show up dressed in a black leather trench coat, black leather finger gloves and a real-life black Soviet ushanka, complete with the CCCP emblem. It was best on new hires: I would greet them in Russian and then say, “I trust everything is in order?” My “normal” partner would ask, “Is something wrong?” It was a BLAST.
still laughing
michael

Mike the Morlock
Reply to  Mike the Morlock
January 26, 2015 12:43 am

oops for The Pompous Git

Reply to  Mike the Morlock
January 26, 2015 2:02 am

ROFLMAO!

January 26, 2015 1:34 am

“Or does it tell us more about climate politics and data manipulation?” ~ from the essay
I think your response to your friend over at the EPA was very good. There is so much more that you could have gone into, but one needs to keep it short and to the point to have any chance at changing a mind (if that is really possible at this point).
The politics of the AGW delusion is interesting. So many of the main players are getting well paid in many ways to keep finding that mankind’s use of energy (our industrial society in other words) is the main cause of the “coming runaway heat wave”. We can no longer expect any data set that can be tampered with to remain untampered. That is just the way it is.
As more and more CO2 is added to the atmosphere and the temperatures do not rise in sync, perhaps someday in the far future people will see that the “climate sensitivity” to CO2 is zero (or so close to zero that we cannot honestly measure it). Perhaps when the IPCC releases its 100th assessment we will finally see that the Jim Hansen theory of CO2 driving climate is utter cow droppings. (The current players will have to die off, as they will never admit that they were deluded.)
“Consider that NOAA’s graph is based on homogenized data …” I would ask you to consider that the whole delusion is based on homogenized, infilled, and false data. Mencken once wrote that “For every complex problem there is an answer that is clear, simple, and wrong” and I believe the alarmists have found it!

The other Casper
Reply to  markstoval
January 26, 2015 6:16 am

“perhaps someday in the far future people will see that the “climate sensitivity” to CO2 is zero. (or so close to zero that we can not honestly meausure it)”
Or clearly above zero, but moderate enough that we’ll be able to adapt to it, just as humans have been adapting to climate changes for thousands of years already.
And that adaptation will be a whole lot easier if we continue to create wealth now.

mpainter
Reply to  The other Casper
January 26, 2015 7:37 am

CS is indeterminable

Eric
January 26, 2015 2:06 am

The first question I would ask is, “Is it possible to measure the temperature of the earth with any accuracy?” The temperature changes every so many feet in the atmosphere. When the wind blows the temperature changes. The oceans are miles deep. How many temperature data points would be necessary to accurately take the temperature of a dynamic planet?
Point 2: We are talking 100th’s of a percent in temperature. What is the + or – room for instrument error?

Mindert Eiting
January 26, 2015 2:08 am

There are several procedures for obtaining a time series of global average temperatures, and several databases. Run your program on your database and print the time series. If you have the source code, ask a programmer to substitute, for every procedure that computes a mean, a procedure that computes a median. Test the program, rerun it on the database, and print the result. Let me know whether you are as surprised as I was when I did this. BTW, medians are less sensitive to outliers than means.
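A minimal synthetic version of that experiment is sketched below. The data and the contamination scenario are invented purely for illustration (a shared warming signal plus a handful of series with a slowly growing warm bias); it is not any of the real databases, but it shows how mean-based and median-based aggregates can diverge.

```python
# Toy experiment: aggregate 50 synthetic station anomaly series by mean and
# by median when a few stations carry a slowly growing warm bias.
import numpy as np

rng = np.random.default_rng(42)
n_years, n_stations = 100, 50
years = np.arange(n_years)

signal = np.linspace(0.0, 0.6, n_years)                         # shared warming, deg C
anoms = signal[:, None] + rng.normal(0.0, 0.3, (n_years, n_stations))

# Five stations pick up a growing warm bias (a made-up stand-in for creeping
# siting or urbanisation problems at a minority of stations).
biased = rng.choice(n_stations, 5, replace=False)
anoms[:, biased] += np.linspace(0.0, 2.0, n_years)[:, None]

mean_series = anoms.mean(axis=1)
median_series = np.median(anoms, axis=1)

print("trend from means:   %+.3f C/decade" % (np.polyfit(years, mean_series, 1)[0] * 10))
print("trend from medians: %+.3f C/decade" % (np.polyfit(years, median_series, 1)[0] * 10))
```

The median largely shrugs off the five contaminated series; the mean does not.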

Genghis
January 26, 2015 2:25 am

I have often wondered at the wisdom of averaging the temperature from the tropical and subtropical zones, which receive 73% of the total solar insolation and where clouds reduce temperature, with the polar regions, which receive only 6% of the solar insolation and where clouds increase the temperature.
On what planet would that make sense? A pancake planet that doesn’t flip? Warmers probably read too much Paul Bunyan growing up.

rooter
January 26, 2015 5:17 am

Many wrongs are no guarantee of being right. This post is an example.
#1. Polar amplification happens during global warming, including when causes other than increased CO2 are the main cause.
The temperature of the Arctic is like the temperature in the Northern Hemisphere, but with more swing: more cooling from 1940 to 1970, and more warming after that than globally. Polar amplification up and down.