Are We in a Pause or a Decline? (Now Includes at Least April* Data)

WoodForTrees.org – Paul Clark – Click the pic to view at source

Image Credit: WoodForTrees.org

Guest Post By Werner Brozek, Edited By Just The Facts

*At least April data was my intention. However, as of June 8, HadCRUT3 for April is still not up! Could it be because, as of the end of March, the slope of 0 has lasted 16 years and 1 month and they do not want to add another month or two? What do you think? WoodForTrees (WFT) is up to date however; thank you very much, Paul!

The graph above shows a few different things for three data sets on which there has been no warming for at least 16 years. WFT only allows one to draw straight lines between two points; climate, however, does not move in straight lines. Temperatures often vary in a roughly sinusoidal fashion, which cannot yet be shown using WFT. We can do the next best thing, however, and show what is happening over the first half of the 16 years and over the last half. As shown, the first half shows a small rise and the last half a small decline. Note that neither the rise in the first half nor the drop in the last half is statistically significant. However, the lines do suggest that we are simply continuing a 60-year sine wave that started in 1880, according to the following graphic:

Dr. Syun-Ichi Akasofu – Clive Best – Click the pic to view at source
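The "linear trend plus a roughly 60-year cycle" idea can be sketched numerically. The toy model below is purely illustrative (the trend rate, cycle amplitude, and phase are made-up numbers, not fitted to any data set); it shows how such a series can produce a small rise over the first half of a 16-year window and a small decline over the second half:

```python
import numpy as np

def toy_anomaly(year):
    # Linear background warming of ~0.5 C/century plus a 60-year
    # oscillation; all coefficients are illustrative only.
    trend = 0.005 * (year - 1880)
    cycle = 0.2 * np.sin(2 * np.pi * (year - 1990) / 60)  # crest near 2005
    return trend + cycle

years = np.arange(1997, 2013.5, 1 / 12)   # monthly steps over ~16 years
first = years < 2005
s_first = np.polyfit(years[first], toy_anomaly(years[first]), 1)[0]
s_last = np.polyfit(years[~first], toy_anomaly(years[~first]), 1)[0]
print(s_first > 0 > s_last)  # True: small rise, then small decline
```

With the cycle crest placed near the middle of the window, the fitted first-half slope comes out slightly positive and the second-half slope slightly negative, mirroring the split-trend lines in the WFT graph.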

Do you agree? What are your views on the question in the title? Do you think we are presently in a pause or in a decline or neither?

In the sections below, we will present you with the latest facts. The information will be presented in three sections and an appendix. The first section shows the period over which there has been no warming on various data sets. The second section shows the period over which there has been no “significant” warming on several data sets. The third section shows how 2013 to date compares with 2012 and with the warmest years and months on record. The appendix illustrates Sections 1 and 2 in a different format. Graphs and a table will be used to illustrate the data.

Section 1

This analysis uses the latest month for which data is available on WoodForTrees.org (WFT). All of the data on WFT is also available at the specific sources as outlined below. We start with the present and go back to the furthest month in the past where the slope is at least slightly negative. So if the slope from September is +4 x 10^-4 but is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest when we say the slope is flat from a certain month.
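That start-month search can be sketched in a few lines. This is a minimal illustration with synthetic numbers, not the actual WFT computation; it assumes the slope-versus-start-month curve changes sign only once, as in the September/October example above:

```python
import numpy as np

def flat_since(anomalies):
    """Index of the furthest-back month from which the least-squares
    trend to the present is at least very slightly negative (<= 0);
    None if no start month gives a non-positive slope.
    `anomalies`: monthly values, oldest first."""
    y = np.asarray(anomalies)
    months = np.arange(len(y))
    for start in range(len(y) - 2):       # need 3+ points for a trend
        slope = np.polyfit(months[start:], y[start:], 1)[0]
        if slope <= 0:
            return start                  # earliest qualifying month
    return None

print(flat_since([0.01 * i for i in range(24)]))   # rising series: None
print(flat_since([-0.01 * i for i in range(24)]))  # falling series: 0
```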

On the data sets below, the periods with a slope that is at least very slightly negative range from 8 years and 5 months to 16 years and 6 months.

1. For GISS, the slope is flat since January 2001 or 12 years, 4 months. (goes to April)

2. For Hadcrut3, the slope is flat since March 1, 1997 or 16 years, 1 month. (goes to March 31, 2013)

3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 12 years, 6 months. (This goes to May. I realize that Hadcrut3 is not up to date, but on the basis of its present slope and the latest numbers that I do have from the other three sets, I am confident in making this prediction.)

4. For Hadcrut4, the slope is flat since November 2000 or 12 years, 6 months. (goes to April)

5. For Hadsst2, the slope is flat from March 1, 1997 to April 30, 2013, or 16 years, 2 months.

6. For UAH, the slope is flat since January 2005 or 8 years, 5 months. (goes to May)

7. For RSS, the slope is flat since December 1996 or 16 years and 6 months. (goes to May) RSS is 198/204 or 97% of the way to Ben Santer’s 17 years. This 97% is real!

The next graph shows just the lines to illustrate the above for what can be shown. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the sloped wiggly line shows how CO2 has increased over this period.

WoodForTrees.org – Paul Clark – Click the pic to view at source

When two quantities are plotted as I have done, only the temperature anomaly is shown on the left scale. It goes from 0.1 C to 0.6 C. A change of 0.5 C over 16 years works out to about 3 C over 100 years, and 3 C is about the average of what the IPCC says the temperature increase may be by 2100.
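The arithmetic behind that extrapolation, assuming a constant linear trend as the post does:

```python
# Scale a 0.5 C change over 16 years to a per-century rate
# (a straight-line extrapolation, as in the text above).
change_c, years = 0.5, 16
rate_per_century = change_c / years * 100
print(rate_per_century)  # 3.125, i.e. about 3 C per century
```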

So for this to be the case, the slope for all of the data sets would have to be as steep as the CO2 slope. Hopefully the graphs show that this is totally untenable.

The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted.

WoodForTrees.org – Paul Clark – Click the pic to view at source

Section 2

For this analysis, data was retrieved from SkepticalScience.com. The analysis indicates how long there has been no significant warming according to their criteria. The numbers below start from January of the year indicated. Data go to their latest update for each set. In every case, note that the magnitude of the second number is larger than that of the first, so a slope of 0 cannot be ruled out. (To the best of my knowledge, SkS uses the same criteria that Phil Jones uses to determine significance.)
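The criterion just described (a trend is "not significant" whenever the two-sigma interval includes zero) can be sketched as a simple check; the two example calls use numbers quoted later in this section:

```python
def significant_at_two_sigma(trend, half_width):
    """True only if the 2-sigma interval [trend - hw, trend + hw]
    excludes zero; otherwise a slope of 0 cannot be ruled out."""
    return trend - half_width > 0 or trend + half_width < 0

# C/decade at the two sigma level, as quoted below:
print(significant_at_two_sigma(0.123, 0.131))  # RSS from 1990: False
print(significant_at_two_sigma(0.093, 0.108))  # Hadcrut4 from 1995: False
```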

The situation with GISS, which used to have no statistically significant warming for 17 years, has now been changed with new data. GISS now has over 18 years of no statistically significant warming. As a result, we can now say the following: On six different data sets, there has been no statistically significant warming for between 18 and 23 years.

The details are below and are based on the SkS site:

For RSS the warming is not significant for over 23 years.

For RSS: +0.123 +/-0.131 C/decade at the two sigma level from 1990

For UAH the warming is not significant for over 19 years.

For UAH: 0.142 +/- 0.166 C/decade at the two sigma level from 1994

For Hadcrut3 the warming is not significant for over 19 years.

For Hadcrut3: 0.092 +/- 0.112 C/decade at the two sigma level from 1994

For Hadcrut4 the warming is not significant for over 18 years.

For Hadcrut4: 0.093 +/- 0.108 C/decade at the two sigma level from 1995

For GISS the warming is not significant for over 18 years.

For GISS: 0.103 +/- 0.111 C/decade at the two sigma level from 1995

For NOAA the warming is not significant for over 18 years.

For NOAA: 0.085 +/- 0.104 C/decade at the two sigma level from 1995

If you want to know the times to the nearest month that the warming is not significant for each set to their latest update, they are as follows:

RSS since August 1989;

UAH since June 1993;

Hadcrut3 since July 1993;

Hadcrut4 since July 1994;

GISS since October 1994 and

NOAA since May 1994.

Section 3

This section shows data about 2013 and other information in the form of a table. The table shows the six data sources along the top and bottom, namely UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS. Down the columns are the following rows:

1. 12ra: This is the final ranking for 2012 on each data set.

2. 12an: Here I give the average anomaly for 2012.

3. year: This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.

4. ano: This is the average of the monthly anomalies of the warmest year just above.

5. mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first two letters of the month and the last two numbers of the year.

6. ano: This is the anomaly of the month just above.

7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0.

8. sig: This is the whole number of years for which warming is not significant according to the SkS criteria. The additional months are not added here, however for more details, see Section 2.

9. Jan: This is the January, 2013, anomaly for that particular data set.

10. Feb: This is the February, 2013, anomaly for that particular data set.

11. Mar: This is the March, 2013, anomaly for that particular data set.

12. Apr: This is the April, 2013, anomaly for that particular data set.

13. May: This is the May, 2013, anomaly for that particular data set.

21. ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months. However if the data set itself gives that average, I use their number. Sometimes the number in the third decimal place differs by one, presumably due to all months not having the same number of days.

22. rnk: This is the rank that each particular data set would have if the anomaly above were to remain that way for the rest of the year. Of course it won’t, but think of it as an update 20 or 25 minutes into a game. Expect wild swings from month to month at the start of the year. As well, expect huge variations between data sets at the start. Due to different base periods, the rank may be more meaningful than the average anomaly.
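The rank in row 22 is just a count of how many past years were warmer than the year-to-date average. A sketch of that counting (the list of past annual anomalies here is hypothetical, purely to illustrate, not taken from any of the six data sets):

```python
def rank_if_held(avg_to_date, past_annual_anomalies):
    """Rank (1 = warmest) the year-to-date average would take if it
    held for the rest of the year, among past annual anomalies."""
    return 1 + sum(1 for a in past_annual_anomalies if a > avg_to_date)

# Hypothetical past annual anomalies, for illustration only:
past = [0.55, 0.51, 0.50, 0.47, 0.44, 0.40, 0.35]
print(rank_if_held(0.48, past))  # 4: three past years were warmer
```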

Source   UAH    RSS    Had4   Had3   Sst2   GISS
1. 12ra  9th    11th   9th    10th   8th    9th
2. 12an  0.161  0.192  0.448  0.405  0.342  0.56
3. year  1998   1998   2010   1998   1998   2010
4. ano   0.419  0.55   0.547  0.548  0.451  0.66
5. mon   Ap98   Ap98   Ja07   Fe98   Au98   Ja07
6. ano   0.66   0.857  0.829  0.756  0.555  0.93
7. y/m   8/5    16/6   12/6   16/1   16/2   12/4
8. sig   19     23     18     19     n/a    18
9. Jan   0.504  0.441  0.450  0.390  0.283  0.61
10. Feb  0.175  0.194  0.479  0.424  0.308  0.52
11. Mar  0.183  0.204  0.411  0.387  0.278  0.58
12. Apr  0.103  0.219  0.425  n/a    0.353  0.50
13. May  0.074  0.139  n/a    n/a    n/a    n/a
21. ave  0.208  0.239  0.440  0.401  0.306  0.553
22. rnk  6th    8th    11th   12th   11th   10th
Source   UAH    RSS    Had4   Had3   Sst2   GISS

If you wish to verify all of the latest anomalies, go to the following links: UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS.

To see all points since January 2012 in the form of a graph, see the WFT graph below:

WoodForTrees.org – Paul Clark – Click the pic to view at source

A comment about this graph from WFT: it is right up to date. The only reason both HadCRUT3 and WTI stop at March is that WTI combines four data sets, one of which is HadCRUT3; if HadCRUT3 is not there for April, WTI cannot be there for April either.

Appendix

This part summarizes the data for each set separately.

RSS

The slope is flat since December 1996 or 16 years and 6 months. (goes to May) RSS is 198/204 or 97% of the way to Ben Santer’s 17 years.

For RSS the warming is not significant for over 23 years.

For RSS: +0.123 +/-0.131 C/decade at the two sigma level from 1990.

The RSS average anomaly so far for 2013 is 0.239. This would rank 8th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2012 was 0.192 and it came in 11th.

Following are two graphs via WFT. Both show all plotted points for RSS since 1990. Then two lines are shown on the first graph. The first upward sloping line is the line from where warming is not significant according to the SkS site criteria. The second straight line shows the point from where the slope is flat.

The second graph shows the above, but in addition there are two extra lines. These show the upper and lower bounds using the SkS site criteria. Note that the lower line is almost horizontal but slopes slightly downward. This indicates that there is a slight chance that cooling has occurred since 1990 according to RSS.

graph 1 and graph 2.

UAH

The slope is flat since January 2005 or 8 years, 5 months. (goes to May)

For UAH, the warming is not significant for over 19 years.

For UAH: 0.142 +/- 0.166 C/decade at the two sigma level from 1994

The UAH average anomaly so far for 2013 is 0.208. This would rank 6th if it stayed this way. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2012 was 0.161 and it came in 9th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to UAH.

Graph 1 and graph 2.

Hadcrut4

The slope is flat since November 2000 or 12 years, 6 months. (goes to April.)

For Hadcrut4, the warming is not significant for over 18 years.

For Hadcrut4: 0.093 +/- 0.108 C/decade at the two sigma level from 1995

The Hadcrut4 average anomaly so far for 2013 is 0.440. This would rank 11th if it stayed this way. 2010 was the warmest at 0.547. The highest ever monthly anomaly was in January of 2007 when it reached 0.829. The anomaly in 2012 was 0.448 and it came in 9th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to Hadcrut4.

Graph 1 and graph 2.

Hadcrut3

The slope is flat since March 1, 1997, or 16 years, 1 month. (goes to March 31, 2013)

For Hadcrut3, the warming is not significant for over 19 years.

For Hadcrut3: 0.092 +/- 0.112 C/decade at the two sigma level from 1994

The Hadcrut3 average anomaly so far for 2013 is 0.401. This would rank 12th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2012 was 0.405 and it came in 10th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to Hadcrut3.

Graph 1 and graph 2.

Hadsst2

For Hadsst2, the slope is flat since March 1, 1997 or 16 years, 2 months. (goes to April 30, 2013).

The Hadsst2 average anomaly for the first four months for 2013 is 0.306. This would rank 11th if it stayed this way. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2012 was 0.342 and it came in 8th.

Sorry! The only graph available for Hadsst2 is this.

GISS

The slope is flat since January 2001 or 12 years, 4 months. (goes to April)

For GISS, the warming is not significant for over 18 years.

For GISS: 0.103 +/- 0.111 C/decade at the two sigma level from 1995

The GISS average anomaly so far for 2013 is 0.553. This would rank 10th if it stayed this way. 2010 was the warmest at 0.66. The highest ever monthly anomaly was in January of 2007 when it reached 0.93. The anomaly in 2012 was 0.56 and it came in 9th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to GISS.

Graph 1 and graph 2.

Conclusion

Above, various facts have been presented along with the sources from which they were obtained. Keep in mind that no one is entitled to their own facts. It is only over the interpretation of the facts that legitimate discussion can take place. After looking at the above facts, do you think that we should spend billions to prevent the claimed catastrophic anthropogenic global warming? Or do you think we should take a “wait and see” attitude for a few years to be sure that future warming will be as catastrophic as some claim it will be? Keep in mind that even the Met Office felt the need to revise its forecasts. Look at the following, and keep in mind that the Met Office believes the 1998 mark will be beaten by 2017. Do you agree?

WoodForTrees.org – Paul Clark – Click the pic to view at source

By the way, here is an earlier prediction by the Met Office:

“(H)alf of the years after 2009 are predicted to be hotter than the current record hot year, 1998.”

When this prediction was made, they had Hadcrut3, and so far the 1998 mark has not been broken on Hadcrut3. 2013 is not starting well if they want a new record this year. Here are some relevant facts today: the sun is extremely quiet; ENSO has been between 0 and -0.5 since the start of the year; it takes at least 3 months for ENSO effects to kick in; and the Hadcrut3 average anomaly through March was 0.401, which would rank it in 12th place. Granted, it is only 3 months, but you are not going to set any records starting the race in 12th place after three months. So even if a 1998-type El Nino started to set in tomorrow, it would take at least 4 or 5 months for the maximum ENSO reading to be reached. Then it would take at least 3 more months for the high ENSO to be reflected in Earth’s temperature. How hot would November and December then have to be to set a new record? In my opinion, the odds of setting a new record in 2013 are extremely remote.

June 10, 2013 3:56 pm

barry says:
June 9, 2013 at 10:51 pm
Mario,
[Mario said]
Your ilk have used models to suggest the trend must continue towards warming.
[Barry said]
I don’t know who my ‘ilk’ are meant to be. I don’t feel I belong to any particular club in this debate.
Climate scientists rely on physics, not models, to posit that an enhanced greenhouse should warm the Earth. It is a notion shared by reputable climate scientists and Anthony Watts, who rightly forbids at this site the unphysical proposition to the contrary.
[Mario said]
Your ilk said a pause in warming could not happen for 10 then 15 years.
[Barry said]
The mid-range model ensembles include runs that have flat trends for 10, 15 and 20 years. Where do this ‘ilk’ say that a pause could not happen?
[Mario said]
You are creating another strawman argument because their argument fell on its face.
[Barry said]
What strawman? That the greenhouse effect is real?
+++++++++++++++++++
ILK = people who take my tax money forcibly by threatening people to vote to steal it from me using a problem that does not exist. This ILK (think Gore, Mann etc) becomes wealthy at my expense and at the expense of the poorest in our society.
First you deny that models are used to show warming, and instead say that physics is used to show warming. NO, Barry: physics does not show that the world is going to warm; only the models show that a single molecule called CO2 will trigger events leading to amplified warming. You seem to misunderstand the difference between the theory of the GHE and the hypothesis of CAGW.
Saying that the greenhouse effect is real is not proof of CAGW. The only evidence that CO2 caused the 20th Century warming is in “the models.” But as you know, the models are wrong. Sorry to rub salt in the wound, but we’re not going to boil over.
PS – You don’t seem to know what your ilk professes and you make a poor representative for them. Your side changes history moving goal posts over and over again. First it was 10 years… then it was certainly not more than 15 years.
Your ilk’s models show all sorts of things, ranging from flatlining to huge amounts of warming. Most of them show huge amounts of warming, and the only ones you and your ilk believe in are the ones that heat us up towards catastrophe. Those are the ones used for policy makers by the IPCC, and those are the ones that FAILED so quickly you have not taken the time to realize it.

Dinostratus
June 10, 2013 6:13 pm

Ian W says:
June 9, 2013 at 4:16 pm
Dinostratus says:
June 9, 2013 at 3:20 pm
It is foolish to create a linear ‘projection’ based on the outputs of a chaotic system. Is it any more logical to carry out a Fourier transform on the outputs of a chaotic system?
A) Just to be perfectly pedantic, we don’t know that climate is chaotic. What it is, I have no idea.
B) I think you get my point. Any convergent expansion we know of is invalid, IMHO.

Richard M
June 10, 2013 7:21 pm

We have had the assertion that models do in fact show periods of more than 15 years with no warming. However, I have not seen any reference to any such model. I strongly suspect that if we looked at the details of such a run we would find one or two volcanoes driving down the temperature. Since we haven’t had a single cooling volcanic eruption in the 16.5 years of current flat temps I doubt very much the claimed model has anything to do with what has actually occurred.
Let’s see the detail of the supposed 20 year model run with flat temps.

barry
June 10, 2013 7:30 pm

wbrozek,

If we assume this is the new goal post, then in the meantime, are the world governments willing to not waste money on something that may not be a problem after all?

25 years is my own reckoning. The classic climate period is 30 years to smooth out internal variability.
You are here moving on to talk about policy. Is this the impetus for your analysis?
The truth is we know that the world should heat up with an enhanced greenhouse effect. We don’t know by how much. Here you focus on a recent flattish trend in the data to argue that the models are broken, and that therefore there might not be a problem to fix. You are saying, essentially, that maybe (“surely”? “definitely”?) climate sensitivity is low. But we don’t know that either. It could be high or low.
I don’t like to discuss policy options on climate change (I’m not interested and not informed), but as a broad idea, I would say that policy makers should see climate change as a risk with unknown outcomes, potentially serious, potentially inconsequential, and act prudently. This is the position Roger Pielke Snr endorses. The tricky thing is that if the effects are serious, the longer action is delayed, the worse it will be, and we won’t know for a few decades. A very difficult conundrum for policy makers who tend to look little further ahead than the next four or five years. It would be great if global warming was long, slow and not terribly disruptive, but your analysis, and the best efforts from those arguing similarly, do not persuade me that we can be sure serious impacts will not happen. Is this what you want people to believe?

June 10, 2013 8:06 pm

tumetuestumefaisdubien1 says:
June 10, 2013 at 10:39 am
“What TSI trend was there?
The 1964-1994 (1964.0-1993.92) 30 years TSI trend (Solanki daily TSI reconstruction data)
+0.437 W.m-2 (0.146 W.m-2/decade)”
So I plot (you can do this too) the Solanki data for 1964 to 1994, and I see a periodic waveform with three cycles (assuming exactly three cycles are captured). Looking at the peaks, I see the first peak significantly depressed compared to the second two. Looking at the valleys, I see all valleys have about the same value. If I then calculate the total energy per cycle, I see the first cycle has significantly less energy than the next two. I see the second cycle having the greatest energy and the third cycle having slightly less energy than the second cycle. I conclude that a trend line for these three cycles is inappropriate.
Instead, if I look at the temperature data sets, and I assume the Solanki data is the only contributor, I would expect to see lowest temperatures for 1964-1974, highest temperatures for 1974-1984, and slightly lower temperatures for 1984-1994.
I plot two data sets of ocean surface temperatures.
http://www.woodfortrees.org/plot/esrl-co2/from:1964/to:1994/offset:-315/scale:0.007/offset:-0.125/plot/hadsst3gl/from:1964/to:1994/mean:50/plot/hadsst2gl/from:1964/to:1994/mean:50
I instead see the temperatures continually increasing (underneath the solar cycle) for two data sets.
Next I do an energy balance calculation from:
http://www.windows2universe.org/earth/climate/sun_radiation_at_earth.html
I find that if I use a 0.437 W/m2 increase, I only get a 0.02 degree temperature rise, not the actual 0.27 degree increase.
Just for kicks, I plotted CO2 on the three ocean data sets. I offset and scaled the data to match the temperature data for two sets. (I am assuming the general temperature increase is primarily due to CO2. This is not necessarily so as there could be other contributions that I am ignoring for now.) Note the very good match in the general temperature increase (ignoring the solar cycle) to the CO2 curve.
I could conclude from this that I can identify the solar cycle and the CO2 impacts on temperature. I also conclude that the impact of the average increase in solar irradiation over the given 30-year period is minimal compared to other contributions.


Reply to  Frank Mlinar
June 11, 2013 9:01 am

Frank Mlinar says:
June 10, 2013 at 8:06 pm
“I do an energy balance calculation from:
http://www.windows2universe.org/earth/climate/sun_radiation_at_earth.html
I find that if I use a 0.437 W/m2 increase, I only get a 0.02 degree temperature rise”
You don’t do a calculation; you just use a (flawed) script at a website without really knowing what you are doing.
The model at windows2universe.org is obviously a mess and in any case is not at all useful for estimating the ocean surface temperature anomaly rise.
The ocean absorbs most of the incoming shortwave spectrum, converting it to heat (a temperature rise), and most of the rest contributes to surface evaporation. And it works at most possible angles, because water has quite low reflectivity, and therefore quite a low albedo of <0.1, while it is highly transparent to the solar spectrum.
As a rule of thumb, the ocean is responsible for ~90% of all conversion of solar shortwave radiation to heat via photon extinction on this planet (I calculated it for myself, not read it somewhere on the internet); the rest occurs at the land surface, and only ~1-2% in the atmosphere*. So if you want the right numbers for the TSI rise -> SST rise dependence, you must calculate the absolute ocean surface heat figures by working out how much heat results from solar irradiation in the ocean surface layer. Then you’ll find it agrees very well with the TSI rise in the 20th century.
Here you have the Solanki TSI trends plot.
Here I’ve put the SSN into your WFT graph instead of CO2 and set the range from the 1964 solar minimum to the 1996 solar minimum, for you to see the solar activity -> SST dependence: the solar cycle signal is quite well distinguishable in the SST anomaly data, and together with intermittent ENSO signals it is in fact the strongest signal you can see there by eye. You can see nothing like that for CO2.
I’ll add that from the water absorption spectrum graph linked above it is absolutely clear that water is millions of times more opaque to the atmospheric 254 K (~10 µm) spectrum, so the GHE can’t have a significant effect on the temperature of the sea’s epipelagic layer, simply because the 254 K spectrum cannot penetrate it deeper than a few hundredths of a millimeter. So comparing SST data with CO2 data is nonsense. You can never find a true CO2 signal in the SST data, because it is simply physically impossible for it to be there.
The calculation of the 0.27 K SST rise dependence on the 0.437 W/m^2 TSI rise, in relative and absolute numbers, together with the comparison to the 1902-1932 period and to the whole 1900-2000 period, you can find here. It shows quite convincingly that the chief cause of the sea surface layer warming in the 20th century, including the latest warming period since the 1960s, was rising solar activity. And I just repeat: the ocean surface layer (the layer significantly penetrated by solar irradiance, down to 200 m depth) has 50+ times higher heat capacity than the whole atmosphere and almost 1000 times higher heat capacity than the surface atmosphere boundary layer, so it is clear that it is the ocean (warmed by the sun) which chiefly warms the atmosphere’s surface boundary layer, not vice versa.
——–
* By the way, even photosynthesis alone stores more heat per year on this planet than the whole atmosphere stores, due to the atmosphere’s low thermal capacity and relatively low average temperature of 254 K (weighted by pressure in the standard atmospheric model, and more or less equal to the atmosphere’s blackbody temperature; the atmosphere has that temperature not at the Earth’s surface but 5000+ meters above sea level, so all the alarmist Stefan-Boltzmann law tricks claiming the surface is 33 K warmer than predicted are obvious nonsense, showing only that the people claiming it have no clue what the Stefan-Boltzmann law is about; the atmosphere is in fact very well in accord with the Stefan-Boltzmann law and there is no “33 K difference”). Photosynthesis also sequesters ~10 times more carbon dioxide than current anthropogenic CO2 emissions, so the photosynthesis rate alone is the reason the anthropogenic-CO2-based CAGW pseudohypotheses are absolute bunk, designed to fool those who missed their high-school math and physics classes, to get money from them and let them feel like saviors of the world while the opposite is true.

Reply to  tumetuestumefaisdubien1
June 12, 2013 5:05 pm

tumetuestumefaisdubien1 says:
June 11, 2013 at 9:01 am
“You don’t do calculation, you just use a (flawed) script at a website, not much knowing what are you doing.”
First I wish to thank you for the additional references as they further support my conclusions.
Now:
I used the energy balance equation for the earth directly:
T = [solar irradiance * (1 - albedo) / (4 * the Stefan-Boltzmann constant)] ^ 0.25
Where:
solar irradiance = 1366 W/m2
albedo = 0.31, so the factor (1 - albedo) = 0.69 (I also tried 0.18, with minor impact)
This equation plus its derivation is all over the internet: http://eesc.columbia.edu/courses/ees/climate/lectures/radiation/ for example.
Here’s a discussion of seawater albedo: http://cove.larc.nasa.gov/papers/jingrl04.pdf
Think of the energy balance this way:
Construct an imaginary sphere around the earth at some distance (say at the top of the atmosphere for example).
Integrate the energy entering/leaving across the complete surface of the sphere.
According to thermodynamics, at equilibrium the net energy will be exactly zero.
The albedo used for earth energy balance is about 0.3 (I used 0.31). This number can be found in multiple places.
The equivalent temperature at equilibrium assumes a black body modified by the albedo. That temperature does not include the “greenhouse effect” from the atmosphere and things like water vapor, CO2, CH4, CFCs, etc.
The delta temperature between the black body and the earth’s surface temperature (or ocean surface) is the result of energy being “trapped”.
I changed the albedo to zero (0), and the temperature rise using your trend line for 1964 to 1994 was 0.022 degrees K. For an albedo of 0.31 the temperature rise was 0.020.
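That calculation can be reproduced directly; a short Python sketch using the values quoted here (S = 1366 W/m2, albedo 0.31, Stefan-Boltzmann constant 5.67e-8 W m^-2 K^-4):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temp(S, albedo=0.31):
    """Black-body equilibrium temperature for absorbed flux S*(1-a)/4."""
    return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

# Temperature response to the 0.437 W/m^2 TSI rise discussed above:
dT = effective_temp(1366 + 0.437) - effective_temp(1366)
print(round(dT, 3))  # about 0.02 K, matching the figure quoted above
```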
Regarding trend lines:
http://tumetuestumefaisdubien1.sweb.cz/Solanki%201900-2000%20Trends.png
The solar irradiance is a periodic function (close to a sine wave).
Any trend line over multiple periods should be measured over an integer number of periods. This is especially true when trending over just a few periods. You can start at any point of the starting cycle, and you must end at the corresponding point of the end cycle.
Your 1964 – 1994 trend line begins close to a minima and ends halfway up the side of the third period. That is you did not use an integer number of periods.
This is also true for your 1902 – 1932 trend line.
Easy places to use are at the maxima or minima.
I recalculated your trend lines using three complete cycles and the minima. My time frames are: 1965.2 to 1996.7 (about). My trend line slope is 0.076 W/m2/decade; about half of what you calculated.
For 1902 to 1932, I used 1902.1 to 1933.6 and got a slope of 0.071 where you got 0.094.
For 1900 to 2000, I used 1902.1 to 1996.7 and got a slope of 0.074 where you got 0.694.
As you can see, the start and stop points matter for periodic signals.
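The effect is easy to demonstrate with a synthetic cycle. This Python sketch uses an idealized 11-year sinusoid starting at a minimum (not the Solanki data): a fit over whole periods gives a near-zero slope, while a fit that ends mid-cycle manufactures a spurious trend.

```python
import numpy as np

t = np.arange(0, 33, 1 / 12)          # 33 years of monthly samples
cycle = -np.cos(2 * np.pi * t / 11)   # 11-year cycle, starts at a minimum

whole = np.polyfit(t, cycle, 1)[0]    # minimum to minimum: 3 full periods
m = t < 30.25                         # 2.75 periods: ends mid-cycle
partial = np.polyfit(t[m], cycle[m], 1)[0]
print(abs(whole) < abs(partial))  # True: the partial window fakes a trend
```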
Now for 1900 to 2000, just eyeballing the Solanki data, it appears that there is a general increase in solar energy from 1900 to about 1954 or so, From 1954 to 2000 there is a general decrease (or possibly flat) in solar energy. I would recommend two trend lines be used, and I would look for these trends in the SST data.
Some trend line calculations using minima:
1902.1 to
1933.6 0.071
1954 0.107
1965.2 0.126
1975.7 0.091
1986.2 0.087
1996.7 0.074
Note how the trend increases up to 1965.2 as additional solar cycles are added. Then the trend decreases for additional solar cycles. This further substantiates the desirability of two trend lines across 1900 to 2000.
Drawing a trend line from 1954 to 1996.7 results in a negative trend: -0.02.
Using peaks and the time frame 1958.9 to 1990.8 results in a negative trend: -0.003.
So for the solar energy to account for the general shape of SST, it should increase until about 1954 or so and then start decreasing (with wiggles). Again it does not.
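Frank's integer-number-of-periods point is easy to demonstrate numerically. This sketch (using a hypothetical 11-year sinusoidal cycle, not the actual Solanki data) fits a least-squares line to a pure, trendless cycle over an integer and a non-integer number of periods:

```python
import numpy as np

P = 11.0  # assumed cycle length in years (solar-like)

def signal(t):
    """A pure cycle with zero underlying trend, starting at a minimum."""
    return -np.cos(2 * np.pi * t / P)

t_full = np.arange(0.0, 3 * P, 0.01)     # exactly 3 cycles
t_part = np.arange(0.0, 2.75 * P, 0.01)  # 2 3/4 cycles: ends mid-rise

slope_full = np.polyfit(t_full, signal(t_full), 1)[0]
slope_part = np.polyfit(t_part, signal(t_part), 1)[0]
print(f"integer periods:     {slope_full:+.5f} per year")
print(f"non-integer periods: {slope_part:+.5f} per year")
```

The integer-period fit returns an essentially zero slope, while ending mid-cycle manufactures a trend of roughly +0.013 per year from a signal that has no trend at all.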
This all reminds me of a study I once did on gun control. I saw a paper where the author took three points of the gun murder rate (approximately 1960, 1970 and 1980) and tried to claim the murder rate was increasing rapidly when in fact it was not. The murder rate was flat before 1960, rose rapidly during the 60s, flattened out during the 70s, peaked again in the early 80s and then fell back to the 70s rate.
“Here I’ve put to your WFT graph the SSN instead of CO2 and made it 1964 solar minima – 1996 solar minima for you to see the solar activity -> SST dependence and see that the solar cycle signal is quite well distinguishable in the SST anomaly data and it is in fact the strongest signal you can see there by bare eye together with intermitent ENSO signals. Nothing like that you can see there for CO2.”
Look again. The SST has a general upward trend. That is, the solar signal minima continue to increase in the SST data. This cannot be accounted for from either the solar sunspot data or the Solanki data, as the minima remain at the same value. In other words, the solar irradiance signal is riding on top of another signal that is continually increasing. I can scale the CO2 data such that it matches this increase. I am not implying that CO2 is the only other contributor; rather, I am saying I can match the signal. In other, other words, solar irradiance does not account for this increase in overall temperature.
Regarding photosynthesis: plants grow and die, absorbing and releasing CO2, just like the energy balance equation. Humans keep increasing (and cutting down trees and planting asphalt) and burning more. If you want to use this argument, you need to back it up.

Reply to  Frank Mlinar
June 13, 2013 3:48 pm

Frank Mlinar says:
June 12, 2013 at 5:05 pm
“I used the energy balance equation for the earth directly:
T=[(solar irradiance * albedo)/(4 * Boltzmans constant)] ^ 0.25
Where:
solar irradiance = 1366 W/m2
albedo (actually 1 – albedo) = 0.31 ( I also tried 0.18 with minor impact)”

First, let's be clear that a solar irradiance of 1366 W/m^2 is not a correct value to use in the Stefan-Boltzmann law equation. The 1366 W/m^2 is an old, obsolete TSI estimate resulting from a systematic satellite instrument error caused by aperture backscatter. This was much improved with the SORCE-TIM instrument, which gives a last-decade TSI average of 1361.2628 W/m^2. Here you find the data; please mind also column 10, which contains the actual Earth-distance TSI values. These vary over ~1315-1410 W/m^2 during the year due to the Earth's elliptical orbit around the sun, with a perihelion maximum in January. This is not so important for short-term climate influences, but it is crucial for long-term ones because of the axial precession, which shifts the perihelion by 20 minutes 24 seconds each tropical year. In about 12,000 years it will shift the perihelion into the summer months, causing the ocean – which lies mainly in the southern hemisphere – to be considerably less insolated, producing a net solar forcing on the sea surface temperature of minus 5-7 W/m^2 averaged globally, triggering a slowdown of the southern warm thermohalines and of the northward Atlantic current, and leading to glaciation and an ice age. It is clear that even the most alarmist estimates of the CO2 forcing cannot offset this, and therefore a runaway GHE is impossible in the next 12,000 years even if the CO2 forcing estimates were true.
Now to the meaning of the Stefan-Boltzmann law. It predicts an atmospheric blackbody temperature of ~254.6 K (for TSI 1361.2628 W/m^2 and albedo 0.3). But the surface air temperature is not what the Stefan-Boltzmann law predicts. What it predicts is in fact the mean temperature at which the atmosphere radiates. This temperature, ~254.6 K, agrees very well with the pressure-weighted mean atmosphere temperature found at ~5100 m above sea level in the standard atmospheric model, and if you go from there towards the surface using the adiabatic lapse rate, the resulting surface temperature agrees very well with the mean surface air temperature.
In fact our atmosphere is very well in accord with the Stefan-Boltzmann law, and there is no mysterious “33 K difference” that cannot be explained by the wet adiabatic lapse rate. Moreover, the fact that the actual wet adiabatic lapse rate is so much lower than the theoretical dry adiabatic lapse rate confirms that it is indeed the water vapor which causes the actual tropospheric temperature gradient (~4-7 K/1000 m) to be so much less steep than it would be if there were no water vapor in the air (9.8 K/1000 m).
So to use the Stefan-Boltzmann law to estimate sea surface water temperature is nonsense, in the same way it is nonsense to use it to estimate the surface air temperature – the surface air temperature is not the mean atmosphere temperature. A similar nonsense is to claim, using it, that the surface air temperature is somehow (the mysterious “33 K”) higher than theoretically predicted.
It is not, and only people who don't much understand what the Stefan-Boltzmann law actually predicts claim something like that. It predicts the mean temperature at which the atmosphere radiates, which agrees well with the mean atmosphere temperature found at about 5100 m above MSL – not the average surface air temperature (which, moreover, is not “15 C” but only 13.4 C, because the mean Earth elevation is not zero but 245 m above MSL). This fundamental misunderstanding of the Stefan-Boltzmann law is the “mother” of all the CAGW nonsense the climatologists and their followers spew all around.
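The ~254.6 K figure and the lapse-rate argument can be reproduced in a few lines. This is a sketch: the ~6.5 K/km mean tropospheric lapse rate and the ~5100 m radiating level are the values asserted in the comment, not derived here.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/m^2/K^4
TSI = 1361.2628          # SORCE-TIM decadal-average TSI, W/m^2
ALBEDO = 0.30            # albedo used in the comment

# Effective radiating temperature: absorbed flux (TSI/4)*(1-albedo)
# balanced against sigma*T^4.
t_eff = ((TSI / 4.0) * (1.0 - ALBEDO) / SIGMA) ** 0.25

# Walk down from the ~5.1 km mean radiating level to the surface
# using a mean environmental lapse rate of ~6.5 K/km.
radiating_height_km = 5.1
lapse_k_per_km = 6.5
t_surface = t_eff + radiating_height_km * lapse_k_per_km

print(f"effective radiating temperature: {t_eff:.1f} K")
print(f"implied surface temperature:     {t_surface:.1f} K "
      f"({t_surface - 273.15:.1f} C)")
```

With these assumed inputs the radiating temperature comes out at ~254.6 K and the implied surface temperature at ~288 K, in the neighborhood of the observed mean surface air temperature.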
The CO2 content of the atmosphere near the surface is by far not the chief driver of surface temperatures. There is ~25 times more water by mass in the surface air than CO2, and that water is moreover in the form of vapor, carrying ~37.4 Joules per mole per Kelvin of specific heat plus 40.7 kilojoules per mole of latent heat, while CO2 carries only ~37.1 Joules per mole per Kelvin and can never release latent heat at the temperatures usual in the Earth's atmosphere, which rarely go below the minus 57 C needed in the troposphere for CO2 to condense – not even the cold tropopause has such a low temperature. CO2 has in fact only a minor effect compared with the TSI rise -> heat content rise -> surface temperature anomaly rise numbers for the ocean.
“Construct an imaginary sphere around the earth at some distance (say at the top of the atmosphere for example).”
No, I'll not do it; it is not a correct approach – as explained e.g. here. There is no imaginary sphere around the Earth; there is the atmosphere, which radiates chaotically in all directions at all elevations (the CO2 especially is distributed quite evenly in the atmosphere), nevertheless having a slightly higher aperture (rising with elevation) to space than back to the surface – due to the fact that the Earth is not flat but a sphere. So in the real world more radiating CO2 in the atmosphere means more heat dissipated by it back into space – not in the flat discworld the CO2-GHE climatology often reminds me of with its surreal models. Moreover there is the water vapor in the air, which is lighter than air and so transports the (huge, 40.7 kilojoules per mole) latent heat upward until it condenses and releases it, changing the adiabatic lapse rate quite considerably. Moreover it is mainly the surface liquids (ocean) and solids (the land with its green flora cover) which radiate the longwave mid-IR, not the atmosphere. And they radiate it at considerably higher temperatures than the mean atmosphere temperature (for example, the ocean has a ~290 K average surface temperature); most of this longwave radiation passes through the atmosphere unimpeded directly to space, only a minor part of it heats the atmospheric water (which has very high absorption rates for the 290 K spectra), and only a very tiny part of it heats the CO2 content of the atmosphere.
“The solar irradiance is a periodic function (close to a sine wave).”
No, it is considerably different from a sine wave. It is not symmetrical like a sine wave even if normalised to 1 AU distance, and in fact it varies by ~95 W/m^2 during the year due to the fact that the Earth's orbit is not circular but elliptical.
“Any trend line over multiple periods should be measured over an integer number of periods.“
I really don't see a point in doing that when we calculate absolute solar shortwave radiation -> heat conversion numbers for arbitrary 30-year periods, which serve to compare same-length periods – both starting at solar minima – for the 1st and 2nd halves of the 20th century. Moreover, I anyway included the comparison with the 1900-2000 period, which is arguably long enough to see the actual TSI trend / SST anomaly change ratio. In absolute numbers this agrees very well with the actual ocean epipelagic layer thickness and temperature anomaly change, and so shows it was indeed the sun which is responsible for most of the sea surface layer warming. Moreover, the absolute heat figure of ~2×10^23 Joules is a much larger heat content surplus from the TSI rise than the whole heat content of the atmospheric surface boundary layer; the epipelagic layer alone has 50+ times the heat capacity of the whole atmosphere and a 3-4 K higher absolute average temperature than the surface air layer (at the 245 m mean elevation), and therefore it is clear what heats what here (the water heats the air).
“This is especially true when trending over just a few periods. You can start at any point of the starting cycle, and you must end at the corresponding point of the end cycle.”
No, I must not do that when calculating in absolute solar shortwave radiation -> heat conversion numbers. In fact I used exactly 30-year periods just arbitrarily, because 30-year trends are considered significant (and in these 1902-1932 and 1964-1994 cases they ARE significant).
“desirability of two trend lines across 1900 to 2000.”
I actually did a two-trends chart using SSN monthly data (I didn't use TSI data, because the Solanki TSI reconstruction ends in 2004 and it is not a trivial task to homogenize it with the later satellite data). You can see it here.
The animated graph shows very well that the solar activity trends, in terms of sunspot activity, still rose well into the 2000s. In fact the turning point of the SSN solar activity trends from the 1964 solar minimum to downward slopes is in March 2006, and since then the trends have quickly acquired steep downward slopes due to the relatively very low solar activity in the current cycle – if we use L. Svalgaard's Waldmeier-discontinuity SSN correction, the current solar cycle's sunspot activity level tends to become comparable to solar cycle 5 at the beginning of the Dalton minimum. (Mind also that the trends tend to culminate not at the solar cycle maxima but typically 2-3 years after them, due to the periodic, unsymmetrical, cumulative nature of the solar cycle signal. This you must take into account when doing solar trend / temperature anomaly correlations; otherwise you'll cancel a big chunk of the relation in the process. It is simply not so simple to compare steep-sloped periodic signals with non-periodic signals such as the surface temperature anomalies – you must know what you are doing to get telling results from it.) The turning point coincides very well with the sea surface temperature anomaly trend turning point in the 2000s, although it will not become clearly visible until the current solar cycle ends in the 2020s.
“Look again. The SST has a general upward trend.”
And so what? Does it prove that the SST anomaly rises because of the rising CO2 content in the atmosphere?
NO. A correlation – such as the CO2-rise/temperature-rise one, and moreover a clearly intermittent one (you don't find such a positive correlation anymore) – is not proof of causality.
Especially not in the last ~12 years, where the CO2 rise / SST rise relation is an anticorrelation.
As I showed above with the animated SSN trends graph, the solar activity trends had generally rising slopes most of the time up until the mid 2000s. If you correlate them with the SST anomaly trends you'll get a much better correlation than with the rise in atmospheric CO2 content. As shown here, the SST trend vectors agree well with the solar cycle signal, except in the last period of rising solar activity in SC24.
Why? Simply because
the trend from the 1996 solar minimum to the current solar cycle maximum has a relatively steep downward slope even though it begins at a solar minimum and ends in the current solar cycle maximum period. (This is a case which shows well why it is nonsense to do the solar activity trends only from minima to minima and maxima to maxima.)
A quite steep downward trend of -0.377 W/m^2 (-0.224 W/m^2 per decade) is found with the ACRIM satellite TSI composite (which, unlike the PMOD, is now homogenized with the right SORCE-TIM 1361 W/m^2 level; after studying in detail how it was created and comparing it to the PMOD composite, I deem the ACRIM TSI composite far more reliable than the PMOD TSI composite. Nevertheless, you would find a significant downward TSI trend with the PMOD anyway too – although you would need to homogenize it to the SORCE-TIM level and fill the numerous missing daily values in it to be able to calculate trends from it.)
“That is, the solar signal minima continue to increase in the SST data.”
The sea surface temperature anomaly rose (it doesn't anymore on the decadal scale) because the surplus heat accumulated in the ocean surface layer. So if you add some surplus Joules in one solar cycle (on the order of 10^22 just in solar cycle 22, for example), the Joules you add in the next cycle add to the previous ones. That's why the sea surface temperature anomaly rose when the TSI trends had upward slopes. Yes, it's that simple, and the rising atmospheric CO2 content has not much to do with it – rather, it is mainly the result (due to the relatively steep dependence of CO2 solubility in water on temperature), and not so much the cause, of the sea surface warming!
“I can scale the CO2 data such that it matches this increase.”
No, you cannot anymore, because while the SST anomaly has had a downward trend for more than 10 years, the atmospheric CO2 content has a steep upward trend. There's no way to reconcile the two now.
And I predict that it will not be possible for at the very least the next 15 years, because the relatively low solar activity will continue to cause the downward mid-term SST anomaly trend slope at least until the next solar cycle maximum. Moreover, many now predict a solar cycle 25 similarly low as the current SC24, and if that happens, then it will not be possible to reconcile the two for the next quarter century. You'll see more and more clearly in the next decade that I was right – which will be killing for the CAGW political agenda.

Reply to  Frank Mlinar
June 13, 2013 3:51 pm

Frank Mlinar says:
June 12, 2013 at 5:05 pm
Sorry, here the link to the animated SSN trends graph.

Reply to  tumetuestumefaisdubien1
June 13, 2013 9:03 am

tumetuestumefaisdubien1 says:
June 10, 2013 at 10:39 am
“wbrozek says:
June 10, 2013 at 8:42 am
They love to talk of all the 10^22 Joules that supposedly have gone into the deep ocean, but when you do the calculations, you find it amounts to a small fraction of a degree that you cannot even detect if you were to go from one pool to the next at the slightly higher temperature.
Yes.
If we do a simple calculation, the upper 1 m of the ocean alone has more heat content than 10^22 Joules.
-The weight of the upper 1 m of the ocean is
1.026 [surface-layer mean seawater density in grams per cubic centimeter] × 3.61132×10^20 [volume of the upper 1 m in cubic centimeters, given an ocean area of 3.61132×10^14 m²] = 3.7×10^20 grams
and it has a heat capacity of 4.2 × (3.7×10^20) ≈ 1.5×10^21 J.K-1,
so the upper 1 m ocean heat content is (1.5×10^21) × 290 K = 4.35×10^23 Joules….”
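The quoted upper-1 m arithmetic can be verified with a short script (a sketch using the quoted density, the standard ~3.61×10^14 m² ocean area, and a 4.2 J/g/K seawater heat capacity):

```python
DENSITY = 1026.0         # surface seawater density, kg/m^3
OCEAN_AREA = 3.61132e14  # ocean area, m^2
DEPTH = 1.0              # layer thickness, m
CP = 4200.0              # seawater specific heat, J/kg/K
T = 290.0                # assumed absolute surface temperature, K

mass = DENSITY * OCEAN_AREA * DEPTH  # kg in the top 1 m
heat_capacity = mass * CP            # J per K
heat_content = heat_capacity * T     # J (relative to 0 K)

print(f"mass:          {mass:.2e} kg")
print(f"heat capacity: {heat_capacity:.2e} J/K")
print(f"heat content:  {heat_content:.2e} J")
```

The unrounded figure is ~4.5×10^23 J (the quoted 4.35×10^23 comes from rounding the heat capacity down to 1.5×10^21 J/K); either way it dwarfs 10^22 J.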
Let's pretend we have a column of water with the top of the column one square meter. Let's further pretend it is a sunny day at noon on the equator. Let's pretend the seawater albedo is 0 (all energy is absorbed). Let's use the 1964 average solar irradiance of 1365.7 W/m2. Therefore, using the above trend line, the average irradiance in 1994 is 1365.7 + 0.437 = 1366.137 W/m2. This is a 0.032% increase in total irradiance from 1964 to 1994 (using the trend line).
This is also 1365.7 J/s/m2 for 1964 and 1366.137 J/s/m2 for 1994. This energy is maintaining (for example) a 290 K water temperature (at least at the top of the column). Because of energy balance, the heat received during the day is reradiated at night, in both 1964 and 1994. (We already heated the water over millions of years and now we are in balance.)
In 1964, the water temperature is assumed to be 290 K. Therefore, adding the 0.27 K rise over the period, the 1994 water temperature is 290 + 0.27 = 290.27 K.
Heat capacity of seawater = 3993 J/kg/K
Weight of one cubic meter of seawater = 1000 kg per cubic meter (kg/m3)
( we could use any amount of water we want.)
At 290 degrees K
Total heat in one cubic meter of seawater = 3993 * 1000 * 290 = 1.15797 * 10^9 J.
At 290.27 degrees K
Total heat in one cubic meter of seawater = 3993 * 1000 * 290.27 = 1.15905 * 10^9 J.
This is a 0.093% increase.
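The percent-increase arithmetic above can be checked directly (a sketch reproducing the numbers already given, not a new calculation):

```python
CP = 3993.0    # seawater heat capacity used above, J/kg/K
MASS = 1000.0  # one cubic meter of seawater, kg

def heat(temp_k):
    """Total heat (relative to 0 K) in one cubic meter of seawater."""
    return CP * MASS * temp_k

e_1964 = heat(290.0)
e_1994 = heat(290.27)
pct_heat = 100.0 * (e_1994 - e_1964) / e_1964
pct_irr = 100.0 * (1366.137 - 1365.7) / 1365.7

print(f"1964 heat: {e_1964:.5e} J, 1994 heat: {e_1994:.5e} J")
print(f"heat increase: {pct_heat:.3f}%  vs  irradiance increase: {pct_irr:.3f}%")
```

This reproduces the 0.093% heat increase against the 0.032% irradiance increase, which is the mismatch the comment is pointing at.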
Now let's do this for 1900 to 2000:
Temperature rise = 0.657 K
Irradiance rise = 0.694 W/m2
Assumed temperature in 1900 = 290 K
Average irradiance in 1900 = 1365.37 W/m2
Average heat in 1900 = 1365.37 J/s/m2
Total heat in one cubic meter of seawater in 1900 = 3993 * 1000 * 290 = 1.15797 * 10^9 J.
Average irradiance in 2000 = 1366.084 W/m2
Average heat in 2000 = 1366.084 J/s/m2
Temperature in 2000 = 290.657 K
Total heat in one cubic meter of seawater in 2000 = 3993 * 1000 * 290.657 = 1.16059 * 10^9 J.
Percent increase in solar irradiance from 1900 to 2000 = 0.0523%
Percent increase in heat in one cubic meter of seawater from 1900 to 2000 = 0.227%
As an additional sanity check, look at a representative solar cycle where the irradiance changes (max to min) by about (eyeballing) 1.2 W/m2. Looking at the SST data (again eyeballing), the temperature changes over the max to min solar cycle by about 0.1 K. Since the solar irradiance increase for 1900 to 2000 of 0.694 W/m2 is approximately half of a single solar cycle variation of 1.2 W/m2, I would expect the 1900 to 2000 temperature rise to be about half of a single solar cycle variation or 0.05 K. This is an order of magnitude less than the actual temperature rise of 0.657 K.
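The solar-cycle sanity check is a simple proportionality. A sketch using the eyeballed figures from the paragraph above:

```python
cycle_irr_swing = 1.2     # eyeballed max-to-min TSI swing in a solar cycle, W/m^2
cycle_sst_swing = 0.1     # eyeballed SST response over that swing, K
century_irr_rise = 0.694  # 1900-2000 TSI trend from the comment, W/m^2
observed_rise = 0.657     # 1900-2000 SST rise, K

# Linear scaling: the century SST response expected if SST tracks TSI alone.
expected_rise = cycle_sst_swing * (century_irr_rise / cycle_irr_swing)
print(f"expected: ~{expected_rise:.3f} K, observed: {observed_rise} K "
      f"({observed_rise / expected_rise:.0f}x larger)")
```

The scaling gives ~0.06 K, roughly an order of magnitude short of the observed 0.657 K, which is the point of the sanity check.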
The errors I see in tumetuestumefaisdubien1's analysis are as follows:
1. Trend lines do not cover an integer number of solar cycles, which causes erroneous trends. See my post on June 12, 2013 at 5:05 pm.
2. The analysis does not take energy balance into account, as shown by considering only the change in heat and not the total heat delivered. (At the end of the day ;-), the total delivered heat is reradiated into space.) The following quote substantiates this: “The water in the ocean is very well insulated from the space by the atmosphere, so only significant means how the heat directly delivered to it by converting the shortwave solar spectra and stored in the seawater can escape back to the space is by heating the air by both mid-IR radiation and direct heat conduction from the ocean surface water to air.”
3. tumetuestumefaisdubien1 sums the delta heat received over 1900 to 2000: “+0.694 x 0.92 (1- water albedo 0.08 = 0.92) = 0.638 W.m-2 got under the sea surface x 3153600000 (x60 seconds x60 minutes x24 hours x365 days x100 years = 3153600000 number of seconds in 100 years)” but does not include the total heat received. If his statement (see 2 above) were true, the total heat received per day would also not escape.

June 10, 2013 9:04 pm

barry says:
“You are saying, essentially, that maybe (“surely”? “definitely”?) climate sensitivity is low. But we don’t know that either. It could be high or low.”
==============================
Well, barry is wrong once again. Sensitivity is certainly low. Either unmeasurably low, or possibly zero. We don’t know, because whatever it is, it is too small to measure.
We know that sensitivity cannot be high, because following a 40% rise in CO2, there has been no acceleration in global warming. None. That fact drives a stake into the heart of the CAGW conjecture.
barry continues: “It would be great if global warming was long, slow and not terribly disruptive, but your analysis, and the best efforts from those arguing similarly, do not persuade me that we can be sure serious impacts will not happen. Is this what you want people to believe?”
barry neglected to put “natural” before “global warming”, because there are no testable measurements showing that global warming is anything other than a natural recovery from the LIA.
It does not matter, as barry says, what people ‘want to believe’, and it does not matter what “persuades” barry. What matters is testable, reproducible, empirical evidence of catastrophic AGW — or of ordinary AGW, for that matter. Unfortunately, measurable evidence for AGW is non-existent. Therefore, people like barry are asserting based only on their religious faith. That is not good enough.
No wonder barry does not want to discuss policy.

Chris
June 10, 2013 9:27 pm

Re the California budget “surplus”:
It’s a surplus in name only, labelled a surplus because the politicians conveniently ignore the state’s future obligations. For example, underfunded pension plans.
“California, which faced a $26 billion deficit two years ago, expects a surplus of between $1.2 billion and $4.4 billion this year, thanks to a combination of tax increases, budget cuts and an improving economy. But it could be erased if the state were to adequately finance its teachers’ pension fund, which says it will need an additional $4.5 billion a year, much of it from the state, to pay the benefits it promised.”
(That’s just *one* pension plan.)
http://www.nytimes.com/2013/06/01/us/surpluses-help-but-fiscal-woes-for-states-go-on.html

Barry Elledge
June 10, 2013 10:13 pm

Barry at 9:19 am on 6/10 said:
“Climate models do have flat runs of up to 20 years.”
The slopes of linear regressions of T are one way of describing trends. Another way is to ask how long one would expect to wait before observing a new global T high if the IPCC models of T are correct; this relates the observed T directly to the models on which the AGW hypothesis is based.
Gavin Schmidt provided such an estimate in 2008. He summarized the results of 55 model simulations used to produce the T projections in the 2007 IPCC report. http://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/
Based on the spread in T values of the ensemble of simulations, Gavin calculated that one would expect to observe a new T high within 8 years of any given high value, with 95% probability. A new high exceeding the old by at least 0.1 deg C should occur, with 95% probability, within 18 years. These values were calculated for the IPCC model simulations over the interval from 1990 to 2030.
Comparing these projections from the IPCC climate models to the observed Ts, I conclude that all 6 data sets reported here fail to meet the predicted 8-year new-record interval. Four data sets (RSS, UAH, HADCRUT3 and SST) record highs in 1998, and thus haven't set new records for 14 years. The 2 data sets which did set new records in 2010 (HADCRUT4 and GISS) went 12 years before the new highs. Thus by Gavin's standard, the IPCC climate models appear to have been falsified already at the 95% confidence level.
The longer criterion is that global T should increase by at least 0.1 deg C within an interval of 18 years. All 6 data sets show T values in 1998 which have not since been exceeded by as much as 0.1 deg C. Thus if the current flat Ts persist for another 4 years, by Gavin's estimate the IPCC models will have been invalidated by both criteria.
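Gavin's record-wait criterion is easy to apply mechanically. This sketch (with made-up illustrative anomalies, not any actual data set) computes how long a series has gone without setting a new high:

```python
def record_gaps(anoms):
    """Return (longest gap between record highs, current open gap), in steps."""
    best = float("-inf")
    gap = longest = 0
    for v in anoms:
        if v > best:  # new record: reset the running gap
            best, gap = v, 0
        else:         # no record this year
            gap += 1
            longest = max(longest, gap)
    return longest, gap

# Hypothetical annual anomalies: records in years 0, 1 and 4, nothing since.
toy = [0.10, 0.30, 0.20, 0.25, 0.40, 0.35, 0.30, 0.38]
print(record_gaps(toy))  # → (3, 3)
```

Fed the actual annual means of a data set, a current open gap longer than 8 years is what fails Gavin's first criterion.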
I’m staying tuned.

June 10, 2013 10:34 pm

And let's not forget that the only way the models could succeed is if they actually modeled the climate. They do NOT model the climate; they model some imagined dream in which CO2 triggers feedbacks of other substances to behave in ways that nature shows they do not behave.
If the models match the observations, that does not mean they are working; it just means that observations did not prove them wrong. However, if models can't match observations, then the models are wrong. Somehow, political scientists want skeptics to be on their heels, when the burden of proof is actually on them.

barry
June 10, 2013 10:44 pm

Judith Curry had a quick look at whether modelling has been broken by observations:
http://judithcurry.com/2013/02/22/spinning-the-climate-model-observation-comparison/
There are four graphs, one showing current temps outside the envelope, and three with current temps within model projections. All 4 have surface temps within the range (Christy's includes satellite temps, which are just below the lowest run). The models are based on surface temps. Models may be stretched, but I don't think they have been broken just yet.
There have been comments upthread that ‘warmists’ keep shifting the goalposts on how much data is needed, but the classic 30 year period has been around for a long time. 2001 IPCC defines climate change thus:

Climate change refers to a statistically significant variation in either the mean state of the climate or in its variability, persisting for an extended period (typically decades or longer).

http://www.ipcc.ch/ipccreports/tar/wg1/518.htm
Consistent with 2007 IPCC: http://www.ipcc.ch/publications_and_data/ar4/syr/en/mains1.html
Goalposts have not been moved, but the shorter-term values given above and elsewhere are speaking to very specific issues, often in order to demonstrate that such short periods are problematic. 20 years, according to the IPCC, is a bare minimum, and that has been the position for at least 12 years.

Chris Schoneveld
June 11, 2013 1:01 am

dbstealey says:
“that global warming is anything other than a natural recovery from the LIA.”
There we go again. What is this “recovery”? It seems to act as a big deus ex machina that needs no explanation.

June 11, 2013 4:07 am

Chris Schoneveld,
An unknown cause was responsible for the LIA. Is that a Deus ex Machina?
No, it is a scientific fact without an adequate explanation. That happens often in science, from particle physics to cosmology.
Following the LIA, the climate began returning to the mean determined by the amount of incoming and exiting energy. Nothing mysterious there.
What would be mysterious is if the climate did not recover toward its natural balance. The real Deus ex Machina is the idea that CO2 is some kind of magic trace gas that causes runaway global warming.

June 11, 2013 4:24 am

Dikran, is the linear trend analysis of global temperature an oversimplistic model? RichardLH, are the global temperatures the sum of periodic signals with fixed periods? Why, for example, would ENSO have a fixed period?

Richard M
June 11, 2013 5:30 am

dbstealey is correct. All it takes for a recovery is regression to the mean. For example, if it was lower solar input to the planet that led to the LIA, then just returning to normal solar input would cause a slow increase over time.
BTW, I’m still waiting for someone to provide a reference to a model run with over 15 years of flat temps and no volcanic eruption. Bueller … Bueller … Barry?
Finally, the warming from 1975-2005 cannot be considered to be due totally to GHGs until that time period is corrected for the PDO. We know the 1915-1945 warming period was also associated with the PDO and showed similar warming. You can't attribute all the latter warming to something that could not have caused the previous warming.

barry
June 11, 2013 7:32 am

Richard,

BTW, I’m still waiting for someone to provide a reference to a model run with over 15 years of flat temps and no volcanic eruption.

You missed it.
http://wattsupwiththat.com/2013/06/09/are-we-in-a-pause-or-a-decline-now-includes-at-least-april-data/#comment-1332054
And from Barry Elledge’s (no relation) link to RC:
“Note that even for a 20 year period, there is one realisation that has a negative trend”
http://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/
It is by no means a common occurrence in modeling, and it is certainly valid to be taking an interest in the lack of trend in various (but not all) data sets over the last 15 years or so. But it is premature to say the models are broken. If the flat trend continues until the end of the decade, that will be quite unexpected, and a severe blow to the models.
I used the SkS trend plotter to establish the shortest time periods where trends achieve statistical significance (trend greater than uncertainty).
NOAA: 1996 – 2013 — 0.216 C/decade [+/- 0.191]
GISS: 1994 – 2013 — 0.131 C/decade [+/- 0.108]
Had4: 1993 – 2013 — 0.140 C/decade [+/- 0.098]
UAH: 1993 – 2013 —- 0.171 C/decade [+/- 0.161]
RSS: 1989 – 2013 —- 0.139 C/decade [+/- 0.126]
This suggests to me that for the surface temps at least 20 years should be assessed, and for the satellite temps 25, owing to the greater variability in those data sets.
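For readers wanting to reproduce this kind of significance test, here is a minimal sketch on synthetic monthly data (plain OLS with a 2-sigma slope uncertainty; note that the SkS plotter additionally corrects for autocorrelation, which widens the intervals and is not done here):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(240) / 12.0  # 20 years of monthly data
true_trend = 0.02              # K per year (0.2 K/decade)
temps = true_trend * years + rng.normal(0.0, 0.1, years.size)

# OLS slope and its standard error.
n = years.size
x = years - years.mean()
slope = (x * temps).sum() / (x * x).sum()
resid = temps - temps.mean() - slope * x
se = np.sqrt((resid * resid).sum() / (n - 2) / (x * x).sum())

print(f"trend: {slope*10:.3f} +/- {2*se*10:.3f} K/decade (2 sigma)")
print("significant" if abs(slope) > 2 * se else "not significant")
```

With white noise the 0.2 K/decade trend is already significant at 20 years; real temperature series are autocorrelated, which is exactly why the significant periods in the list above come out at 17-24 years rather than shorter.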
A better test would assess different periods in the long-term record, which is done in this post:
http://moregrumbinescience.blogspot.com.au/2009/01/results-on-deciding-trends.html
Grumbine makes a strong point.

“Choose is a key word in doing science. We try to avoid having choices. Choices can be made differently by different people, for different reasons, and not all those reasons will turn out to be good ones. Finding a scientific principle and then looking for how to satisfy that principle is far better.”

Far better than cite mining.

Chris Schoneveld
June 11, 2013 8:39 am

dbstealey says:
“What would be mysterious is if the climate did not recover toward its natural balance. ”
Is there such a thing as a natural balance in climate? Climate responds to a multitude of outside factors, of which the sun's activity is one, CO2 another (albeit probably very minor), and ocean currents a third. As Leif Svalgaard has argued here so many times, the sun's effect on the climate fluctuations of the last 500 years is not unequivocal, to put it mildly. What about deep ocean currents that take hundreds of years to surface? So the use of the term "recovery" seems to imply we have a clear idea what that recovery entails. When Akasofu draws a linear upward trend to represent that recovery, he can only do that if he knows the exact cause of such a linear increase, and I doubt he has made a sufficiently convincing case for that.

Reply to  Chris Schoneveld
June 11, 2013 9:52 am

Chris Schoneveld says:
June 11, 2013 at 8:39 am
“As Leif Svalgaard argued here so many times, the sun’s effect on the climate fluctuations of the last 500 years is not unequivocal, to put it mildly.”
I don't know about the last 500 years, but with all the respect I have for Leif, he – as far as I know – never actually did the needed calculations, which show quite clearly that the TSI rise was indeed the chief cause of the 20th century warming, via warming of the ocean surface layer. In short: there was a TSI level rise in the 20th century of about ~0.35 W per square meter (half of the 0.694 W.m-2 1900-2000 TSI trend in the Solanki daily TSI reconstruction), which is consistent with the SSN trends and quite consistent with the 0.66 K ocean surface temperature anomaly rise when one calculates, in absolute numbers, how much heat the TSI rise could directly have added to the ocean's surface water (~2×10^23 Joules) and how much the surface temperature anomaly should therefore rise. The absolute amount of heat added to the sea surface layer by rising TSI, via extinction of solar shortwave-spectrum photons, is orders of magnitude higher than what CO2 backradiation itself could possibly contribute, and therefore it is quite clear that it was the rising solar activity which chiefly caused the warming in the 20th century, including the warming in the period from the 1960s to the 1990s.
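Whether ~2×10^23 J can produce a ~0.66 K surface-layer rise depends entirely on the assumed layer thickness. A sketch, assuming a hypothetical ~200 m epipelagic layer (a figure not given in the comment):

```python
DENSITY = 1026.0         # surface seawater density, kg/m^3
OCEAN_AREA = 3.61132e14  # ocean area, m^2
CP = 4000.0              # seawater specific heat, J/kg/K
HEAT_IN = 2.0e23         # claimed century heat input from the TSI rise, J
layer_depth = 200.0      # assumed epipelagic layer thickness, m

mass = DENSITY * OCEAN_AREA * layer_depth
delta_t = HEAT_IN / (mass * CP)  # bulk warming if all the heat stays in the layer
print(f"layer mass: {mass:.2e} kg -> warming: {delta_t:.2f} K")
```

With these assumptions the number does come out near the quoted 0.66 K; with a 100 m layer it doubles, so the layer-thickness assumption carries much of the argument.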
There’s also an independent check that it was indeed not anthropogenic CO2 emissions which caused the warming: it is officially estimated that photosynthesis alone sequesters 10 times more CO2 than current anthropogenic emissions, so it is impossible that anthropogenic CO2 emissions are the cause of the rising level of atmospheric CO2. It is more likely that the CO2 rise in the atmosphere is the result, not the cause, of the rising surface temperature anomalies (especially the sea surface temperature anomaly), owing to the strong temperature dependence of the solubility of CO2 in water.
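[Ed. note: the comment’s ~2 x 10^23 J and ~0.66 K figures can be reproduced as a back-of-envelope calculation under one set of assumptions. The surface-layer depth (200 m) and the full-absorption, linear-ramp assumptions below are illustrative guesses, not endorsed values:]

```python
# Back-of-envelope check of the figures in the comment above.
# Assumed inputs (illustrative, not endorsed):
#   dTSI        - the comment's ~0.35 W/m^2 TSI rise over the 20th century
#   layer_depth - a hypothetical well-mixed ocean surface layer of 200 m
dTSI = 0.35                       # W/m^2 (from the comment)
ocean_area = 3.6e14               # m^2, global ocean surface
seconds = 100 * 365.25 * 24 * 3600  # one century
layer_depth = 200.0               # m, assumed mixing depth
rho = 1025.0                      # kg/m^3, seawater density
cp = 3990.0                       # J/(kg K), seawater specific heat

# A linear ramp in forcing delivers on average half its final value over
# the period, hence the factor 0.5; full absorption at the surface assumed.
energy = 0.5 * dTSI * ocean_area * seconds          # joules into the ocean
heat_capacity = ocean_area * layer_depth * rho * cp  # J/K of the layer
dT = energy / heat_capacity                          # implied warming, K

print(f"energy ~ {energy:.1e} J, dT ~ {dT:.2f} K")
```

With these assumptions the arithmetic lands near the comment’s numbers; with a shallower or deeper assumed layer the implied warming scales inversely.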

barry
June 11, 2013 9:30 am

dbstealey says:
“What would be mysterious is if the climate did not recover toward its natural balance. ”
What is the climate’s ‘natural balance’? I think you mean ‘normal’ balance; you’d argue that all of the swings have been natural, wouldn’t you? So what is this ‘normal balance’? What is the global temperature to which the climate is mysteriously attracted?

rgbatduke
June 11, 2013 10:41 am

As I said, I have no problem agreeing that there is an oscillation in the data that really does mean something about the climate; however, the fact that real physical processes can be identified that explain it is much better evidence than a correlation. I am a statistician, and as such I am normally much more easily convinced by physics than by statistics, as I know how easy it is to be misled by statistics unless you take the proper steps, such as hypothesis testing (which is deeply flawed, but a useful sanity check nevertheless).
Well said, actually, and I completely agree. Hypothesis testing has its uses, but the 0.05 cutoff for p-values is arbitrary and meaningless, especially when analyzing large successive sets of data. It leaves open all sorts of possibilities for data dredging; my favorite example is the xkcd comic “Green Jellybeans Cause Acne”.
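[Ed. note: the data-dredging point can be made concrete with a two-line calculation (a sketch, not from the original comment):]

```python
# Probability that at least one of 20 independent tests of true null
# hypotheses comes up "significant" at p < 0.05 -- the "green jellybeans"
# effect: test enough colours and one will look linked to acne by chance.
alpha = 0.05
n_tests = 20  # e.g. 20 jellybean colours, none actually linked to acne
p_false_positive = 1 - (1 - alpha) ** n_tests
print(f"P(at least one spurious 'discovery') = {p_false_positive:.2f}")
```

So even with nothing going on, a battery of twenty tests yields a roughly two-in-three chance of at least one “significant” result at the conventional cutoff.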
Nevertheless, it is fairly reasonable to doubt the assertion that the science is settled when the rather large collection of GCMs out there produces an absolute spaghetti snarl of future predictions, no two of which agree, all of which are supposedly based on “physics”, and none of which are in particularly good agreement with the data, past/hindcast or present. In a case like this I would argue that while the current more or less neutral trend in temperature since the 1997/1998 super ENSO does not falsify the hypothesis of CAGW driven by CO_2 concentration, neither is it particularly strong evidence in favor of it. Similarly, the observed 2.5 to 3.5 mm/year rate of SLR over the last 30 to 40 years (consistent, as both a rate/range and as “acceleration”, with features that match those in the natural unforced climate record in the 100 years prior to that for which we have tide gauge data, all during a time of supposed hockey stick temperature increases) doesn’t falsify Hansen’s grotesque prediction of 5 meters of SLR by the end of the century, but it damn sure doesn’t verify it or make it more plausible.
Bayesian reasoning isn’t that difficult. Negative evidence reduces the probable truth of a hypothesis (from any former state of prior belief). Positive evidence increases it. It’s that simple. At the moment the lack of positive evidence under circumstances where one expects it is gradually reducing the reasonable assessment of the probable truth of a variety of claims about global warming, especially the more extreme claims. That doesn’t mean that it is false, but it does mean that there is less and less reason to think that it is true or act as if it is true unless and until there is some positive evidence to push a reasonable probability that it is true back up into the action zone.
rgb
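[Ed. note: the Bayesian point in the comment above can be sketched in a few lines. The prior and likelihoods below are invented purely for illustration; they are not estimates of anything:]

```python
# Minimal Bayesian update for a binary hypothesis H, illustrating the
# comment's point: evidence that is *less* likely under H than under not-H
# lowers the posterior probability of H. All numbers are illustrative.
def update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) via Bayes' rule."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

prior = 0.70  # assumed prior belief in H
# Suppose the observed flat stretch is judged half as likely under H
# as under not-H (a 2:1 likelihood ratio against H):
posterior = update(prior, likelihood_h=0.3, likelihood_not_h=0.6)
print(f"prior {prior:.2f} -> posterior {posterior:.2f}")
```

Each further stretch of “evidence expected but not observed” compounds the same way, which is exactly the gradual erosion of probable truth the comment describes.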

dbstealey
June 11, 2013 11:02 am

barry says:
“What is the global temperature to which the climate is mysteriously attracted?”
Mysterious to barry, but not to the average WUWT reader. The global temperature has increased by ≈0.35º/century for hundreds of years. That natural rise has not accelerated, despite the ≈40% rise in CO2 — proving to all but the most rabid True Believer that CO2 has nothing measurable to do with global warming.
The planet warms and cools, both above and below its long term naturally rising trend line. But it always reverts to the mean — even following major anomalies such as the LIA. It would be very unusual if temperatures did not recover, or revert to the long term rising trend line.

John Tillman
June 11, 2013 11:22 am

Chris Schoneveld says:
June 11, 2013 at 8:39 am
dbstealey says:
“What would be mysterious is if the climate did not recover toward its natural balance. ”
Is there such a thing as a natural balance in climate? Climate responds to a multitude of outside factors, of which the sun’s activity is one, CO2 another (though probably a very minor one), and ocean currents a third. As Leif Svalgaard has argued here so many times, the sun’s effect on the climate fluctuations of the last 500 years is not unequivocal, to put it mildly. What about deep ocean currents that take hundreds of years to surface? So the use of the term “recovery” seems to imply that we have a clear idea of what that recovery entails. When Akasofu draws a linear upward trend to represent that recovery, he can only do that if he knows the exact cause of such a linear increase, and I doubt he has made a sufficiently convincing case for that.
——————————-
I don’t know what dbstealey means by “natural balance”, but at least during the Phanerozoic Eon (the past 543 million years or so), Earth’s climate has fluctuated within an average global temperature range of about 10 to 25 degrees C, with two possible geologically brief excursions a degree or two higher (per Scotese’s reconstruction):
http://thedragonstales.blogspot.com/2006/09/comtemplating-scotese-geological.html
Earth is homeostatic within certain bounds. During the Snowball or Slushball Earth episodes of the preceding Proterozoic Eon, global mean temperature might have hit 0 C, but I don’t know.
Our planet thus appears (advocates of chaotic climate might disagree) to experience climatic cycles on time scales of every order of magnitude from billions of years to centuries or decades. When the average of weather becomes climate isn’t clear (at least to me), but some argue for thirty years.
In any case, climate, however defined, is always changing, ie recovering from one trend or another, at different time frames. During the Little Ice Age, Earth (or large regions of it) was recovering from the Medieval Warm Period, ie regressing to the longer-term mean. The MWP was recovery from the Dark Ages (or Migration Age) Cold Period & also on a longer scale the downward trend in temperature from the Holocene Climatic Optimum, ending c. 5000 years ago, & Minoan Warm Period peak, c. 3300 years ago.
Whether these excursions above or below a trend line are truly cyclic & describe sine waves or not is perhaps debatable. However, if such waves are recoverable from proxy data, it isn’t necessary to be able to say with any degree of certainty what causes them. Of course climatologists & other scientists would like to know & honest seekers are trying to find out. Whatever the causes may be, CO2 concentration is liable to play a minor & supporting role.
Dr. Akasofu’s recovery trend line is derived empirically, based upon observed temperature data since the end of the Little Ice Age around 1850-60. It could also possibly be extended back to the depths of the LIA, c. 1690-1710, perhaps at a different slope. Clearly, the trend prior to elevated CO2 levels after c. 1945 is due primarily if not exclusively to natural causes rather than human activities. The slope for the period 1850-1945 appears not to differ from its inclination from 1945 to the present. Cooling was observed (although Hansen tried to obscure this fact) in the 1960s & ’70s before warming in the ’80s & ’90s (sexed up by GISS & other record keepers/adjusters).
The question for genuine climate scientists now is what caused recent observed warming (of whatever extent). Were the 1980s & ’90s warmer because of man-made GHGs or simply because of an upswing in the natural (quasi sine wave) fluctuation around the rising trend, observed in previous intervals (as suggested by Dr. Akasofu)? If primarily natural (instead of IPCC’s unfounded claim of 90% Man(n)-caused), then arises the more fundamental issue of what are the non-human “forcings”.
In science it’s OK (in fact a good thing, for starters) to say you don’t know yet but are working on hypotheses & testing them against reality rather than models. It’s less OK to claim on spurious to dubious grounds that the science is “settled”. From 1687 to 1905 the consensus on gravitation was settled, for instance, as was the immobility of the continents before the 1960s. Likewise the immobility of the Earth, long after Copernicus challenged that settled, consensus science.

John Tillman
June 11, 2013 11:26 am

dbstealey says:
June 11, 2013 at 11:02 am
That’s what I thought you meant.

Chris Schoneveld
June 11, 2013 11:53 am

My main objection to the word “recovery” is that it implies there is somewhere a normal temperature that the climate is supposed to have, and that there is a natural tendency to go back to that preferred temperature. Maybe that is indeed the case, but it also implies that we know what that preferred temperature is and that we also know the processes that cause temperatures to deviate from the normal. In other words, we would then have a full understanding of the natural processes that control climate. And that is not the case by a long shot.

Barry Elledge
June 11, 2013 12:01 pm

barry states at 7:32 am on June 11:
“But it is premature to say the models are broken.” He also quotes Gavin Schmidt from the realclimate blogpost I had referenced (at 10:13 pm on June 10 supra). Gavin pointed out that one of the 55 “realizations” of the models used to produce the 2007 IPCC AR4 projection had global T declining for 20 years into the future.
My point was that by at least one of the two criteria which Gavin proposed in 2007 the subsequent T did not agree with the models. Gavin proposed the 8-year interval for a new high global T of any magnitude, and an 18 year interval for a new high of at least 0.1 deg C. He assessed the probabilities of observing these new highs within these intervals at 95%.
By the usual scientific criteria for significance, the models used to produce AR4 have been invalidated on the 8-year interval standard. The prospects for the 18-year interval standard aren’t looking so hot (so to speak) at the moment. But another 4 years will tell.
By the way, the reason Gavin chose the 0.1 deg C level is because a rise of that magnitude should be reflected in all of the data sets.
Now Gavin calculated the probabilities of a T rise over the stated intervals by evaluating the variance within the whole ensemble of 55 simulations. So the fact that one of the 55 simulations showed declining T for 20 years does not in any way contradict or invalidate Gavin’s estimate: that simulation was used along with the other 54 to create the estimate.
I tend to like the “new high” method of calculating agreement between models and observations for two reasons. One is procedural: Gavin is pro-AGW, he established the standard 5 years ago, and it is at least relatively straightforward to evaluate. The other is more fundamental: there is no particular a priori reason to expect global T to vary linearly with time, and fitting a straight line to observations is done because it’s simple.
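[Ed. note: a “new high within N years” probability of the kind discussed above can be sketched by Monte Carlo. This is not Gavin Schmidt’s actual ensemble procedure; the trend, noise level, and independence assumption below are all simplifications for illustration:]

```python
import random

# Monte Carlo sketch of the "new high" criterion: model annual global T as
# a linear trend plus independent Gaussian interannual noise, starting the
# clock at a record year (anomaly 0), and ask how often the record falls
# within the window. Assumed, illustrative parameters throughout.
random.seed(42)
trend = 0.02    # assumed warming rate, C/year
sigma = 0.10    # assumed interannual variability, C
years = 8       # the shorter of the two criterion windows
trials = 10000

hits = 0
for _ in range(trials):
    # Does any of the next `years` anomalies exceed the standing record?
    if any(trend * (k + 1) + random.gauss(0, sigma) > 0 for k in range(years)):
        hits += 1
print(f"P(new record within {years} yr) ~ {hits / trials:.3f}")
```

Under an assumed ~0.2 C/decade trend the sketch makes a new record within 8 years nearly certain, which is why the observed absence of one bites against the projected rate rather than against warming per se.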
“barry” also argues that the flat trends should continue for at least 20 or 25 years before we can conclude that the models are broken. This argument cuts both ways: after all, the GCM models were never verified in any engineering sense before being adopted. Any real verification should be prospective, and that hasn’t been done. So let both sides of the argument relax and wait for the evidence to trickle in over time before making any decisions about the validity of the models, much less decisions about public policy based on those models.

dbstealey
June 11, 2013 12:16 pm

Chris Schoneveld,
When I wrote that the global temperature “…always reverts to the mean — even following major anomalies such as the LIA,” I should have added: “except when it doesn’t.”
Looking at a long term graph of global temperatures, we see that the trend remains the same, until something changes it.
The central argument in the debate is whether or not CO2 has a measurable effect on global temperature. It may. However, there are no testable, verifiable, empirical measurements showing any such effect.
Without any verifiable measurements, AGW is no more than a conjecture; an opinion. The fact that global warming has stopped for the past decade and a half is a very good reason to doubt that CO2 has much of an effect. And some climatologists [such as Dr Ferenc Miskolczi] argue that CO2 has zero warming effect. The current situation provides support for that view.

Werner Brozek
June 11, 2013 2:22 pm

barry says:
June 11, 2013 at 7:32 am
NOAA: 1996 – 2013 — 0.216 C/decade [+/- 0.191]
This was the “land” only.
If you want global:
since 1994: 0.107 ±0.098 °C/decade (2σ)
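[Ed. note: figures like “0.107 ±0.098 °C/decade (2σ)” come from an ordinary least squares fit plus the standard error of the slope. The sketch below uses made-up annual anomalies and the plain OLS error formula; published trend uncertainties are usually computed from monthly data with an autocorrelation correction this sketch omits:]

```python
# OLS slope and its 2-sigma uncertainty, computed from first principles.
def trend_with_2sigma(t, y):
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sxx
    intercept = ybar - slope * tbar
    # Standard error of the slope from the residual variance.
    resid = [yi - (intercept + slope * ti) for ti, yi in zip(t, y)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope, 2 * se

# Invented annual anomalies (degrees C) for ten consecutive years:
years = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
anoms = [0.10, 0.14, 0.09, 0.17, 0.15, 0.12, 0.20, 0.16, 0.22, 0.19]
slope, two_sigma = trend_with_2sigma(years, anoms)
print(f"{slope * 10:.3f} +/- {two_sigma * 10:.3f} C/decade (2-sigma)")
```

When the ±2σ interval straddles zero, as in the quoted “since 1994” figure’s near miss, the trend is not statistically distinguishable from flat at that confidence level.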