Are We in a Pause or a Decline? (Now Includes at Least April* Data)

WoodForTrees.org – Paul Clark – Click the pic to view at source

Image Credit: WoodForTrees.org

Guest Post By Werner Brozek, Edited By Just The Facts

*At least April data was my intention. However, as of June 8, HadCRUT3 for April is still not up! Could it be because, as of the end of March, the slope of 0 had lasted 16 years and 1 month and they do not want to add another month or two? What do you think? WoodForTrees (WFT) is up to date, however. Thank you very much, Paul!

The graph above shows a few different things for three data sets where there has been no warming for at least 16 years. WFT only allows one to draw straight lines between two points; however, climate does not move in straight lines. Often, temperatures vary in a roughly sinusoidal fashion, which cannot yet be shown using WFT. We can do the next best thing, however, and show what happened over the first half of the 16 years and what happened over the last half. As shown, the first half shows a small rise and the last half a small decline. Note that neither the rise in the first half nor the drop in the last half is statistically significant. However, the lines do suggest that we are simply continuing a 60 year sine wave that began in 1880, according to the following graphic:

Dr. Syun-Ichi Akasofu’s graphic – Clive Best – Click the pic to view at source

Do you agree? What are your views on the question in the title? Do you think we are presently in a pause or in a decline or neither?

In the sections below, we will present you with the latest facts. The information will be presented in three sections and an appendix. The first section shows the period over which there has been no warming for various data sets. The second section shows the period over which there has been no “significant” warming on several data sets. The third section shows how 2013 to date compares with 2012 and with the warmest years and months on record. The appendix illustrates sections 1 and 2 in a different format. Graphs and a table will be used to illustrate the data.

Section 1

This analysis uses the latest month for which data is available on WoodForTrees.org (WFT). All of the data on WFT is also available at the specific sources as outlined below. We start with the present date and go to the furthest month in the past where the slope is at least slightly negative. So if the slope from September is 4 x 10^-4 but it is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest when we say the slope is flat from a certain month.
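The search described above can be sketched in a few lines of Python. This is a hypothetical illustration with toy numbers, not the actual WFT data or code:

```python
# Sketch of the "flat since" search: walk start months from the past
# toward the present and report the earliest start from which the
# least-squares slope to the end of the series is zero or negative.
import numpy as np

def flat_since(anomalies):
    """Return the earliest start index with a non-positive trend,
    or None if every start gives a rising trend."""
    n = len(anomalies)
    for start in range(n - 2):  # need at least 3 points for a trend
        y = np.asarray(anomalies[start:], dtype=float)
        x = np.arange(len(y))
        slope = np.polyfit(x, y, 1)[0]
        if slope <= 0:
            return start
    return None

# Toy series: rising at first, then flat-to-falling from index 3
series = [0.1, 0.2, 0.3, 0.42, 0.40, 0.41, 0.39, 0.40, 0.38]
print(flat_since(series))  # -> 3
```

The real analysis works in calendar months rather than list indices, but the logic is the same: move the start date back until the fitted slope first dips to or below zero.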

On the data sets below, the times for which the slope is at least very slightly negative range from 8 years and 5 months to 16 years and 6 months.

1. For GISS, the slope is flat since January 2001 or 12 years, 4 months. (goes to April)

2. For Hadcrut3, the slope is flat since March 1, 1997 or 16 years, 1 month. (goes to March 31, 2013)

3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 12 years, 6 months. (This goes to May. I realize that Hadcrut3 is not up to date, but on the basis of its present slope and the latest numbers that I do have from the other three sets, I am confident in making this prediction.)

4. For Hadcrut4, the slope is flat since November 2000 or 12 years, 6 months. (goes to April)

5. For Hadsst2, the slope is flat from March 1, 1997 to April 30, 2013, or 16 years, 2 months.

6. For UAH, the slope is flat since January 2005 or 8 years, 5 months. (goes to May)

7. For RSS, the slope is flat since December 1996 or 16 years and 6 months. (goes to May) RSS is 198/204 or 97% of the way to Ben Santer’s 17 years. This 97% is real!

The next graph shows just the lines, to illustrate the above for the data sets that can be shown on WFT. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the sloped wiggly line shows how CO2 has increased over this period.

WoodForTrees.org – Paul Clark – Click the pic to view at source

When two things are plotted as I have done, the left scale shows only the temperature anomaly. It goes from 0.1 C to 0.6 C. A change of 0.5 C over 16 years works out to about 3.0 C over 100 years, and 3.0 C is about the average of what the IPCC says the temperature increase may be by 2100.
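The arithmetic behind that extrapolation is just a linear scaling of the full left-hand range (the IPCC comparison is the author's; the numbers are read off the scale above):

```python
# Linear extrapolation: the full 0.5 C span of the left-hand scale,
# spread over 16 years, scaled to a century.
rise_c = 0.5
years = 16
per_century = rise_c / years * 100
print(per_century)  # -> 3.125, i.e. roughly 3 C per century
```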

So for that projection to hold, the slope for all of the data sets would have to be as steep as the CO2 slope. Hopefully the graphs show that this is totally untenable.

The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted.

WoodForTrees.org – Paul Clark – Click the pic to view at source

Section 2

For this analysis, data was retrieved from SkepticalScience.com. This analysis indicates for how long there has not been significant warming according to their criteria. The numbers below start from January of the year indicated. Data go to their latest update for each set. In every case, note that the magnitude of the second number is larger than the first number so a slope of 0 cannot be ruled out. (To the best of my knowledge, SkS uses the same criteria that Phil Jones uses to determine significance.)
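A minimal sketch of that criterion, assuming (as the text does) that "not significant" simply means zero lies inside the two-sigma interval around the trend:

```python
# Significance test used throughout this section: a trend differs
# significantly from zero at two sigma only if the interval
# trend +/- two_sigma excludes zero.
def significant_warming(trend, two_sigma):
    """True if |trend| exceeds the two-sigma half-width."""
    return abs(trend) > two_sigma

# The RSS and UAH figures quoted below (C/decade):
print(significant_warming(0.123, 0.131))  # RSS from 1990 -> False
print(significant_warming(0.142, 0.166))  # UAH from 1994 -> False
```

In every entry below, the second number (the two-sigma half-width) exceeds the first (the trend), so this function would return False for each.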

The situation with GISS, which used to have no statistically significant warming for 17 years, has now been changed with new data. GISS now has over 18 years of no statistically significant warming. As a result, we can now say the following: On six different data sets, there has been no statistically significant warming for between 18 and 23 years.

The details are below and are based on the SkS site:

For RSS the warming is not significant for over 23 years.

For RSS: +0.123 +/-0.131 C/decade at the two sigma level from 1990

For UAH the warming is not significant for over 19 years.

For UAH: 0.142 +/- 0.166 C/decade at the two sigma level from 1994

For Hadcrut3 the warming is not significant for over 19 years.

For Hadcrut3: 0.092 +/- 0.112 C/decade at the two sigma level from 1994

For Hadcrut4 the warming is not significant for over 18 years.

For Hadcrut4: 0.093 +/- 0.108 C/decade at the two sigma level from 1995

For GISS the warming is not significant for over 18 years.

For GISS: 0.103 +/- 0.111 C/decade at the two sigma level from 1995

For NOAA the warming is not significant for over 18 years.

For NOAA: 0.085 +/- 0.104 C/decade at the two sigma level from 1995

If you want to know the times to the nearest month that the warming is not significant for each set to their latest update, they are as follows:

RSS since August 1989;

UAH since June 1993;

Hadcrut3 since July 1993;

Hadcrut4 since July 1994;

GISS since October 1994 and

NOAA since May 1994.

Section 3

This section shows data about 2013 and other information in the form of a table. The table shows the six data sources along the top and bottom, namely UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS. Down the columns are the following rows:

1. 12ra: This is the final ranking for 2012 on each data set.

2. 12an: Here I give the average anomaly for 2012.

3. year: This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.

4. ano: This is the average of the monthly anomalies of the warmest year just above.

5. mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first two letters of the month and the last two numbers of the year.

6. ano: This is the anomaly of the month just above.

7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0.

8. sig: This is the whole number of years for which warming is not significant according to the SkS criteria. The additional months are not added here, however for more details, see Section 2.

9. Jan: This is the January, 2013, anomaly for that particular data set.

10. Feb: This is the February, 2013, anomaly for that particular data set.

11. Mar: This is the March, 2013, anomaly for that particular data set.

12. Apr: This is the April, 2013, anomaly for that particular data set.

13. May: This is the May, 2013, anomaly for that particular data set.

21. ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months. However if the data set itself gives that average, I use their number. Sometimes the number in the third decimal place differs by one, presumably due to all months not having the same number of days.

22. rnk: This is the rank that each particular data set would have if the anomaly above were to remain that way for the rest of the year. Of course it won’t, but think of it as an update 20 or 25 minutes into a game. Expect wild swings from month to month at the start of the year. As well, expect huge variations between data sets at the start. Due to different base periods, the rank may be more meaningful than the average anomaly.
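As a check on rows 21 and 22: the year-to-date average is a plain mean of the monthly anomalies, and the provisional rank counts how many past annual means exceed it. The RSS monthly values below are from the table; the list of past annual means is only an illustrative stand-in, not the full RSS record:

```python
def ytd_average(monthly_anomalies):
    """Mean of the monthly anomalies published so far this year."""
    return sum(monthly_anomalies) / len(monthly_anomalies)

def provisional_rank(ytd, past_annual_means):
    """1-based rank the current year would hold if its average held."""
    return 1 + sum(1 for a in past_annual_means if a > ytd)

rss_2013 = [0.441, 0.194, 0.204, 0.219, 0.139]  # Jan-May 2013, RSS
print(round(ytd_average(rss_2013), 3))  # -> 0.239, matching row 21
```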

Source   UAH     RSS     Had4    Had3    Sst2    GISS
1. 12ra  9th     11th    9th     10th    8th     9th
2. 12an  0.161   0.192   0.448   0.405   0.342   0.56
3. year  1998    1998    2010    1998    1998    2010
4. ano   0.419   0.55    0.547   0.548   0.451   0.66
5. mon   Ap98    Ap98    Ja07    Fe98    Au98    Ja07
6. ano   0.66    0.857   0.829   0.756   0.555   0.93
7. y/m   8/5     16/6    12/6    16/1    16/2    12/4
8. sig   19      23      18      19      —       18
9. Jan   0.504   0.441   0.450   0.390   0.283   0.61
10. Feb  0.175   0.194   0.479   0.424   0.308   0.52
11. Mar  0.183   0.204   0.411   0.387   0.278   0.58
12. Apr  0.103   0.219   0.425   —       0.353   0.50
13. May  0.074   0.139   —       —       —       —
21. ave  0.208   0.239   0.440   0.401   0.306   0.553
22. rnk  6th     8th     11th    12th    11th    10th
Source   UAH     RSS     Had4    Had3    Sst2    GISS

If you wish to verify all of the latest anomalies, go to the following links: UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS.

To see all points since January 2012 in the form of a graph, see the WFT graph below:

WoodForTrees.org – Paul Clark – Click the pic to view at source

I wish to make a comment about this graph from WFT. It is right up to date. The only reason that both HadCRUT3 and WTI go only to March is that WTI uses four data sets, one of which is HadCRUT3; if HadCRUT3 is not there for April, WTI cannot be there for April either.

Appendix

In this part, we are summarizing data for each set separately.

RSS

The slope is flat since December 1996 or 16 years and 6 months. (goes to May) RSS is 198/204 or 97% of the way to Ben Santer’s 17 years.

For RSS the warming is not significant for over 23 years.

For RSS: +0.123 +/-0.131 C/decade at the two sigma level from 1990.

The RSS average anomaly so far for 2013 is 0.239. This would rank 8th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2012 was 0.192 and it came in 11th.

Following are two graphs via WFT. Both show all plotted points for RSS since 1990. Then two lines are shown on the first graph. The first upward sloping line is the line from where warming is not significant according to the SkS site criteria. The second straight line shows the point from where the slope is flat.

The second graph shows the above, but in addition there are two extra lines. These show the upper and lower bounds using the SkS site criteria. Note that the lower line is almost horizontal but slopes slightly downward. This indicates that there is a slight chance that cooling has occurred since 1990 according to RSS.

graph 1 and graph 2.

UAH

The slope is flat since January 2005 or 8 years, 5 months. (goes to May)

For UAH, the warming is not significant for over 19 years.

For UAH: 0.142 +/- 0.166 C/decade at the two sigma level from 1994

The UAH average anomaly so far for 2013 is 0.208. This would rank 6th if it stayed this way. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2012 was 0.161 and it came in 9th.

Following are two graphs via WFT. Everything is as with RSS except that the lines apply to UAH.

Graph 1 and graph 2.

Hadcrut4

The slope is flat since November 2000 or 12 years, 6 months. (goes to April.)

For Hadcrut4, the warming is not significant for over 18 years.

For Hadcrut4: 0.093 +/- 0.108 C/decade at the two sigma level from 1995

The Hadcrut4 average anomaly so far for 2013 is 0.440. This would rank 11th if it stayed this way. 2010 was the warmest at 0.547. The highest ever monthly anomaly was in January of 2007 when it reached 0.829. The anomaly in 2012 was 0.448 and it came in 9th.

Following are two graphs via WFT. Everything is as with RSS except that the lines apply to Hadcrut4.

Graph 1 and graph 2.

Hadcrut3

The slope is flat since March 1, 1997 or 16 years, 1 month. (goes to March 31, 2013)

For Hadcrut3, the warming is not significant for over 19 years.

For Hadcrut3: 0.092 +/- 0.112 C/decade at the two sigma level from 1994

The Hadcrut3 average anomaly so far for 2013 is 0.401. This would rank 12th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2012 was 0.405 and it came in 10th.

Following are two graphs via WFT. Everything is as with RSS except that the lines apply to Hadcrut3.

Graph 1 and graph 2.

Hadsst2

For Hadsst2, the slope is flat since March 1, 1997 or 16 years, 2 months. (goes to April 30, 2013).

The Hadsst2 average anomaly for the first four months for 2013 is 0.306. This would rank 11th if it stayed this way. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2012 was 0.342 and it came in 8th.

Sorry! The only graph available for Hadsst2 is this.

GISS

The slope is flat since January 2001 or 12 years, 4 months. (goes to April)

For GISS, the warming is not significant for over 18 years.

For GISS: 0.103 +/- 0.111 C/decade at the two sigma level from 1995

The GISS average anomaly so far for 2013 is 0.553. This would rank 10th if it stayed this way. 2010 was the warmest at 0.66. The highest ever monthly anomaly was in January of 2007 when it reached 0.93. The anomaly in 2012 was 0.56 and it came in 9th.

Following are two graphs via WFT. Everything is as with RSS except that the lines apply to GISS.

Graph 1 and graph 2

Conclusion

Above, various facts have been presented along with the sources from which all facts were obtained. Keep in mind that no one is entitled to their own facts. It is only in the interpretation of the facts that legitimate discussion can take place. After looking at the above facts, do you think that we should spend billions to prevent the claimed catastrophic anthropogenic global warming? Or do you think we should take a “wait and see” attitude for a few years to be sure that future warming will be as catastrophic as some claim? Keep in mind that even the Met Office felt the need to revise its forecasts. Look at the following, and keep in mind that the Met Office believes the 1998 mark will be beaten by 2017. Do you agree?

WoodForTrees.org – Paul Clark – Click the pic to view at source

By the way, here is an earlier prediction by the Met Office:

“(H)alf of the years after 2009 are predicted to be hotter than the current record hot year, 1998.”

When this prediction was made, they had Hadcrut3, and so far the 1998 mark has not been broken on Hadcrut3. 2013 is not starting well if they want a new record this year. Here are some relevant facts today: the sun is extremely quiet; ENSO has been between 0 and -0.5 since the start of the year; it takes at least 3 months for ENSO effects to kick in; and the Hadcrut3 average anomaly through March was 0.401, which would rank it in 12th place. Granted, it is only 3 months, but you are not going to set any records starting the race in 12th place after three months. So even if a 1998-type El Nino started to set in tomorrow, it would take at least 4 or 5 months for the maximum ENSO reading to be reached. Then it would take at least 3 more months for the high ENSO to be reflected in Earth’s temperature. How hot would November and December then have to be to set a new record? In my opinion, the odds of setting a new record in 2013 are extremely remote.

Comments
rogerknights
June 10, 2013 7:44 am

rgb says:
This leaves catastrophic anthropogenic global warming enthusiasts in a difficult position. In law it might be called habeus corpus — the need to produce a body before you go around trying somebody for murder.

The term you wanted was corpus delicti (habeas corpus means release the prisoner):

(Found by Googling):
The phrase corpus delicti might be used to mean the physical object upon which the crime was committed, such as a dead body or the charred remains of a …

Greg Goodman
June 10, 2013 7:51 am

Just a comment on the last one: this is rate of change. So what we see is _deceleration_ that started in 1998, and the rate of change crossed into negative territory around 2005.
This is exactly the turning point that Willis found in his look at north Pacific yesterday:
http://wattsupwiththat.files.wordpress.com/2013/06/cumulative-monthly-north-pacific-index.jpg

June 10, 2013 7:51 am

sceptical says:
June 10, 2013 at 5:48 am
There has been lots of warming within the last 16 years. Since 2008 there has been strong warming. Looks like the rate of change is on a significant upswing.
The year 2008 ranks 22nd on RSS. And until we get a La Nina as deep as the one in 2008, it is quite possible that even in 5 years’ time the slope could be positive from 2008. However, as far as “a significant upswing” is concerned, the error bars are huge! They are 0.224 ± 1.238 °C/decade (2σ) according to SkS. (And I know I told you this on an earlier thread.)

RichardLH
June 10, 2013 7:57 am

My prediction, based on the UAH data series, is that UAH Global values will average above 0.13 until the end of 2013 and will then drop with minimums below -0.2 through 2014 and return to above the 0.13 figure in 2015/6.
What fancy methodology is the basis for this conclusion you ask? It is just a summary of the UAH Global data so far, the observation of the fairly clear periodic nature in the signal and hence the most probable future based on that data.
http://i1291.photobucket.com/albums/b550/RichardLH/uahtrendsinflectionfuture_zps7451ccf9.png.html
This shows plotting a sequence of moving average filters (in cascade and with particular values designed to remove the digital sampling artifacts that are otherwise created) on the UAH Global data series to identify the major underlying periodic features contained therein. These are evident (in the record and not from any external theory) at 37 months, 4 years and ~60 years. The nodal points indicated are where the various filter outputs ‘cross’ revealing the local ‘zero’ crossings nodal points in the otherwise fairly noisy signal.
Removal of those identifiable cycles would probably reduce the whole series to white noise but that would require much more analysis and, preferably, a much longer data series :-).
In any case it is interesting to pose the question, when is the next falling node due and what value should be placed on it? Also what are the reasons for the above periodic features?
The ones that come to mind are 37 months = Lunar, 4 years = Solar and ~60 years Jupiter? That would mean that tiny lateral tidal/gravitational forces in the atmosphere modify where the jet streams form with consequent (and probably lagged) impacts on Global temperature data.
It is likely that data from the RSS data series will show similar characteristics (as they actually come from the same data source after all).
It may also be possible to detect this in the average North/South position of the jet streams which should move based on the above periodicity if the suggestion about gravity being the cause is true.
It is, therefore, possible to claim that all of the observed data variation so far in the UAH Global series can be explained just by using natural periodic cycles of particular periods, phases and magnitudes. This claim is based purely on the data so far collected and the most probable future extension of that data set given the periodic nature of the data observed so far.

June 10, 2013 8:11 am

Chris Schoneveld says:
June 10, 2013 at 6:11 am
What is the physical cause of the recovery?
I am not sure if we know that. Perhaps the sun and ocean cycles play a role. However it seems certain that CO2 had nothing to do with it based on the following graphs.
http://www.woodfortrees.org/plot/hadcrut4gl/from:1880/plot/hadcrut4gl/from:1880/to:1910/trend/plot/hadcrut4gl/from:1910/to:1945/trend/plot/hadcrut4gl/from:1945/to:1975/trend/plot/hadcrut4gl/from:1975/to:2002/trend/plot/hadcrut4gl/from:2002/to:2014/trend/plot/esrl-co2/from:1958/normalise

Henry Galt
June 10, 2013 8:14 am

wbrozek says:
June 9, 2013 at 10:50 pm
“barry says:
June 9, 2013 at 9:43 pm
So you are deliberately choosing your start date to find the trend you want
I did not “choose” any date. The last date is the most recent month for which WFT had data and I had no choice in that. The furthest date in the past was where the slope is very slightly negative. I did not “choose” that date either. I just went where the data took me to find the longest period of no warming.”
This, folks, IS cognitive dissonance in action.

dikranmarsupial
June 10, 2013 8:16 am

Werner Brozek “I agree with you that when errors bars are included, we cannot say there is a pause and we cannot say there is a decline.”
I am glad we are in agreement on this, in future, please can you make this point explicitly when you present your summaries of observed trends, otherwise you are likely to mislead your audience and cause them to look ignorant when promulgating canards of the form “no global warming since [insert cherry picked date here]”.

jai mitchell
June 10, 2013 8:17 am

@rgbatduck
When you make carbon based energy artificially more expensive, you know what you do? You kick a state like California into a depression that continues there even though it is over nearly everywhere else.
—————
California has a 4 billion budget surplus this year.
—————
#2 — WRONG
building lots of war material, and then transforming that domestic spending from war material to be blown up into roads and bridges and other infrastructure.
Just like we will do when you capitulate and realize that you have been sold a flat-earth bill of goods by people PAID to do it by those who want to continue making money selling carbon-based fuels.

Henry Galt
June 10, 2013 8:20 am

No warmist I have presented this approach to can handle it – they all turn into foam-flecked imbeciles.
Thank you, thank you, thank you Werner B.

June 10, 2013 8:22 am

barry says:
June 10, 2013 at 6:47 am
Santer et al posited 17-year trends as a minimum for climate studies. Others say more.
From the article:
“For RSS, the slope is flat since December 1996 or 16 years and 6 months. (goes to May) RSS is 198/204 or 97% of the way to Ben Santer’s 17 years. This 97% is real!”
As for Santer’s 17 years and more by others, were any of these time periods quoted 10 or more years ago or is it a matter of people shifting goal posts because the original goal posts were too narrow?

TomB
June 10, 2013 8:26 am

Water you talkin’ about? Everbody nos that “The world is getting warmer faster than anticipated. A new report from the International Energy Agency says global temperatures will rise twice as fast as projected if countries don’t act to slash their admissions soon.”
http://news.yahoo.com/world-getting-warmer-faster-expected-132734289.html
So we have ta stop all admissions, right now! /sarc

jai mitchell
June 10, 2013 8:33 am

@M Courtney
do you have any idea how many children in the united states have died as a result of ideologically biased, politicized “scientists” who, working for PR firms paid by the tobacco industry, worked to cloud the issue regarding the hazards of second-hand smoking? At that time it was still legal to smoke in airplanes. . .
These things are well documented in the papers released by Stephen Glantz as a result of the lawsuits against the tobacco industry.
(they also showed how the tobacco industry used cartoons to specifically target children for their products and that they were intentionally manipulating the nicotine levels in their products to make people more addicted).
At the same time the CEOs of the tobacco industry stood up before congress and stated, under oath, that they did not “believe” that nicotine was addictive.
do YOU believe that nicotine is addictive???
If ONLY a scientist could tell you whether nicotine is addictive or not (based on proven studies and scientific experiments) would you believe it was addictive?
so don’t tell me that evil corporations aren’t interested in killing innocent people and children to make a buck. The fact is, they are and they spend LOTS and LOTS of money doing it.

dikranmarsupial
June 10, 2013 8:36 am

wbrozek, Santer’s paper is concerned with the statistical power of the hypothesis test; if you don’t have much data, you will fail to reject the null hypothesis when when it is false. This is not a matter of shifting goal posts, the statistical analysis is pretty standard, the reason that the Santer paper was written was to address a common misunderstanding of climate data, namely “no warming since 1998”. If climate blogs and the media had not been so keen to promulgate this misunderstanding, Santer would have had no need to point out where the goalposts actually were.
Climatologists have traditionally used periods of around 30 years because they know from experience that conclusions drawn from shorter periods are unreliable, so there was previously no need for a paper to be published stating what the research community already knew to be the bleedin’ obvious.
Another paper of this nature is Easterling and Wehner (2009) http://onlinelibrary.wiley.com/doi/10.1029/2009GL037810/full, which shows that periods of little or no warming have happened before in the observations and also crop up in model projections (i.e. the models say that this sort of thing will happen now and again, but they are chaotic phenomena so the models can’t predict when they will happen).

June 10, 2013 8:42 am

JP says:
June 10, 2013 at 7:41 am
I still would like to see what “deep ocean heating” look like.
It looks like a desperate ploy to keep the idea of CAGW alive. They love to talk of all the 10^22 Joules that supposedly have gone into the deep ocean, but when you do the calculations, you find it amounts to a small fraction of a degree that you cannot even detect if you were to go from one pool to the next at the slightly higher temperature. Furthermore, if in fact the huge oceans are such a huge buffer that absorbs 98% of the excess heat, then we of course have nothing to worry about. So now the warmists have a further problem, namely to explain how this extra extremely diluted extra heat will somehow concentrate in one place and spike the air temperature.

Reply to  wbrozek
June 10, 2013 10:39 am

wbrozek says:
June 10, 2013 at 8:42 am
They love to talk of all the 10^22 Joules that supposedly have gone into the deep ocean, but when you do the calculations, you find it amounts to a small fraction of a degree that you cannot even detect if you were to go from one pool to the next at the slightly higher temperature.
Yes.
If we do simple calculation only the upper 1m of the ocean has more heat content than 10^22 Joules.
-The weight of the upper 1 m of the ocean is
1.026 [surface-layer mean sea water density in grams per cubic centimeter] x 3.61132×10^20 [upper 1 m ocean volume in cubic centimeters] = 3.7×10^20 grams
and has the heat capacity 4.2 × (3.7×10^20) = 1.5×10^21 J.K-1
so the upper 1 m ocean heat content is (1.5×10^21) × 290 K = 4.35×10^23 Joules
To put the things into perspective I’ll add a sea surface anomaly dependence on Total solar irradiance calculation for the 30 years period 1964 solar minima to 1994 I made recently for my upcoming paper:
How much the SST anomaly changed during the 1964-1994 period in question?
+0.27 K (HadSST2 11 year running average to filter out the periodic solar signal)
What TSI trend was there?
The 1964-1994 (1964.0-1993.92) 30 years TSI trend (Solanki daily TSI reconstruction data)
+0.437 W.m-2 (0.146 W.m-2/decade)
Now the calculation what could be the thickness of the warmed ocean layer relative to the measured SST global anomaly change:
+0.437 x 0.92 (1- water albedo 0.08 = 0.92) = 0.402 W.m-2 got under the sea surface
x 946080000 (x60 seconds x60 minutes x24 hours x365 days x30 years = 946080000 number of seconds in 30 years)
= up to 380362003 Joules.m-2 (1 Joule = 1 W·s)
How much water this heats it +0.27 K?
380362003/1.1283 [4.179 J.cm-3.K-1 volumetric water heat capacity x 0.27K SST anomaly rise = 1.1283 Joules.cm-3 was needed to heat the water the 0.27 K.]
=337110700 cm3
/1000000 (cm3 in 1m3)
=337 m3 water per m2 of the ocean surface
/4 (We divide by two because always only one half of the Earth is insolated and we divide by another 2 because only half of the linear trend value realizes as the TSI mean rise throughout the whole trend period – the average of the zero on the beginning and the full value at the trend’s end is half of its value)
= 84.25 m3
x 0.82 [1 – the rest of the terrestrial albedo]
= 69.09
– means that up to a 69 meter upper layer of the ocean could on average be heated by the sun by 0.27 K, realizing the TSI trend of +0.437 W.m-2, converting the solar shortwave radiation into heat and to sea temperature rise during the period 1964-1994.
How much surplus heat in absolute numbers the 1964-1994 0.437 W.m-2 TSI rising trend could add to the ocean?
0.2185 [half of the 1964-1994 TSI trend] x 361132000000000 [ocean area in square meters] x 946707782 [number of seconds in 30 years] x 0.7 [1-terrestrial albedo] = 5.23 x 10^22 Joules could be delivered by the solar irradiance to the sea – which moreover translates to heat content contained in just 5.23 x 10^22 / 4.35×10^23 [the heat content of the upper 1m of the ocean] = 0.12 which means such a heat is contained in the uppermost 12 centimeters of the ocean surface layer.
So why talk about “missing 10^22 Joules” in the first place when all the solar energy surplus delivered to the ocean by the elevated solar irradiance was just in order of 10^22 Joules during the 30 years 1964-1994?
——-
Purely under line I yet compare to 100 years period 1900-2000:
How much the SST anomaly changed during the period in question?
+0.657 K (HadSST2 11 year running average to filter out the periodic solar signal)
The 1900-2000 (1900.0-1999.92) 100 years TSI trend (Solanki daily TSI reconstruction data)
is +0.694 W.m-2 (0.069 W.m-2/decade)
The warmed ocean layer thickness relative to the measured SST anomaly change calculation:
+0.694 x 0.92 (1- water albedo 0.08 = 0.92) = 0.638 W.m-2
x 3153600000 (x60 seconds x60 minutes x24 hours x365 days x100 years)
= up to 2013510528 Joules.m-2 (1 Joule = 1 W·s)
How much water this heats it +0.657 K?
/ 2.745603 (4.179 J.cm-3.K-1 volumetric water heat capacity x 0.657 K SST anomaly rise = 2.745603 Joules.cm-3)
=733358220 cm^3
/1000000 (cm^3in 1m^3)
=733 m^3 water per m^2 of the ocean surface
/4 (We divide by two because always only one half of the Earth is insolated and we divide by another 2 because only half of the linear trend value
realizes as the TSI mean rise throughout the whole trend period – the average of the zero on the beginning and the full value at the trend’s end is half
of its value)
= 183.33 m^3 / m^2 ocean surface
x 0.82 [1 – the rest of the terrestrial albedo]
= 150.3
– which means that, on average, the upper roughly 150 meters of the ocean could have been heated by 0.657 K by the sun realizing the TSI trend of +0.694 W.m-2, converting solar shortwave radiation to heat and sea temperature rise during the period 1900-2000. The number fits well with the thickness of the ocean’s epipelagic surface layer, which holds 50+ times the heat content of the whole atmosphere, and therefore we can conclude that the bulk of the 20th century sea surface temperature rise can be attributed to the rise in total solar irradiance as the chief cause.
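The step chain above can be run end to end as a quick check (a sketch; every input is the commenter's figure, not independently verified):

```python
# Reproduce the 1900-2000 warmed-layer calculation step by step (sketch).
tsi_trend = 0.694                     # TSI trend over 100 years, W/m^2
forcing = tsi_trend * 0.92            # remove water albedo (0.08) -> ~0.638 W/m^2
seconds_100y = 60 * 60 * 24 * 365 * 100
energy_j_m2 = forcing * seconds_100y  # ~2.01e9 J per m^2 of surface

heat_per_cm3 = 4.179 * 0.657          # J to warm 1 cm^3 of water by 0.657 K
volume_cm3 = energy_j_m2 / heat_per_cm3
depth_m = volume_cm3 / 1e6            # cm^3 in a 1 m^2 column -> metres, ~733 m

depth_m /= 4                          # half the Earth lit x half the linear trend
depth_m *= 0.82                       # remaining terrestrial albedo factor
print(round(depth_m, 1))              # ~150.3 m
```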
Just to check further: how much surplus heat, in absolute numbers, could the 1900-2000 rising TSI trend of 0.694 W.m-2 add to the ocean?
0.347 [half of the 1900-2000 TSI trend] x 361132000000000 [ocean area in square meters] x 3153600000 [number of seconds in 100 years] x 0.7 [1-terrestrial albedo] = 2.766 x 10^23 Joules. Divided by the heat content of the upper 1 m of the ocean, 2.766 x 10^23 / 4.35 x 10^23 = 0.63, which means the heat delivered by the rising TSI and absorbed in the ocean’s upper layer over the 100 years equals the heat content of the uppermost 63 centimeters of the ocean surface layer. It is nevertheless a number well in the order of 10^23 Joules that can be directly attributed to rising solar activity in the ocean’s surface layer alone.
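The same recipe used for the 1964-1994 period checks the 1900-2000 figures as well (a sketch; the ocean area, albedo factor, and upper-1 m heat content are the commenter's assumed values):

```python
# Surplus heat delivered to the ocean by a linear TSI trend (sketch).
def surplus_heat(tsi_trend_w_m2, years):
    ocean_area = 3.61132e14                 # ocean surface area, m^2
    seconds = 60 * 60 * 24 * 365 * years
    # Half the trend (average forcing) x area x time x (1 - terrestrial albedo).
    return 0.5 * tsi_trend_w_m2 * ocean_area * seconds * 0.7

joules = surplus_heat(0.694, 100)
print(f"{joules:.3g} J")              # ~2.77e23 J
ratio = joules / 4.35e23              # vs. heat content of the top 1 m of ocean
print(f"{ratio:.2f}")                 # ~0.64, i.e. roughly the top 63-64 cm
```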
So maybe Trenberth is missing “the 10^22 Joules” somewhere, but does it even matter?

beng
June 10, 2013 8:48 am

***
jai mitchell says:
June 9, 2013 at 8:57 pm
you have so many misunderstandings about the basics it makes me wonder if you are actually doing this to purposefully lie.
***
ROFLMFAO. Rgb’s understanding is a whale compared to your gnat.

June 10, 2013 9:02 am

Henry Galt says:
June 10, 2013 at 8:14 am
This, folks, IS cognitive dissonance in action.
Please see:
http://canadianawareness.org/2013/02/ipcc-head-rajendra-pachauri-acknowledges-17-year-stall-in-global-warming/
“IPCC Head Rajendra Pachauri Acknowledges 17 Year Stall In Global Warming”
Did Rajendra Pachauri do anything differently than what I did to determine that there was a 17 year stall in global warming? If so, please enlighten me.

Henry Galt
June 10, 2013 9:16 am

jai,
Do you for one second think that the fossil companies will not sell their entire inventories regardless of what the right/left do/believe?

June 10, 2013 9:19 am

dikranmarsupial says:
June 10, 2013 at 8:36 am
Climatologists have traditionally used periods of around 30 years because they know from experience that conclusions drawn from shorter periods are unreliable
In light of the 60 year sine wave that apparently has some role, perhaps 60 years should be used. In my opinion, way too much emphasis has been placed on the 30 year upswing part of the 60 year cycle from about 1975 to 2005.

barry
June 10, 2013 9:19 am

However we also cannot say any warming is catastrophic. But there is one thing we can say and that is that the climate models are no good.

Climate models do have flat runs of up to 20 years. These were in the IPCC mid-range ensemble, which end up warming in the long-term. To repeat what Dikran said, the models can’t predict when flat trends will occur.

“We show that the climate over the 21st century can and likely will produce periods of a decade or two where the globally averaged surface air temperature shows no trend or even slight cooling in the presence of longer-term warming….
An individual simulation, as opposed to a multi-model, multi-realization average, reveals interesting decadal scale features that can provide insight into the single trajectory that the actual climate is taking. We highlight two periods in Figure 2, 2001–2010 [10 yrs] and 2016–2031 [16 yrs]. Both of these periods show a small, statistically insignificant negative trend based on a simple least-squares trend line and there are other periods, such as the last seven years of this simulation, that show a similar lack of trend. This behavior occurs without any simulated volcanic eruptions or solar variability (natural forcing) that could result in a widespread cooling for some period of years and thus is presumed entirely due to natural internal variability.”

http://www.crd.lbl.gov/assets/pubs_presos/grl25859.pdf
Those models are based on surface, not satellite (lower troposphere) temps. If we get 25 years of flat or negative trend, then the models are definitely broken. I think it is fair to ask the question at this time, but it is overreach to reckon conclusively.

RichardLH
June 10, 2013 9:22 am

wbrozek says:
June 10, 2013 at 9:19 am
“In my opinion, way too much emphasis has been placed on the 30 year upswing part of the 60 year cycle from about 1975 to 2005.”
This summary of the data says otherwise. http://i1291.photobucket.com/albums/b550/RichardLH/uahtrendsinflectionfuture_zps7451ccf9.png.html

June 10, 2013 9:24 am

“Are we in a pause or a decline?” And the answer is…based on just the last 16 years, we are in a pause (or maybe a decline). If we look at much larger timescales, we could say this is a wiggle in the graph similar to what we have experienced before that overlays a continual increase (and not a straight line increase). There are many causes for the shape of the data, and those causes should be pointed out (and how they contribute) before we can make an increase/pause/decline assessment.

dikranmarsupial
June 10, 2013 9:35 am

wbrozek wrote “In light of the 60 year sine wave that apparently has some role, …” I note that you ignored the substantive point of my post, which was pointing out that Santer’s paper was not at all moving the goal posts and was just the application of basic statistical principles in response to promulgation of canards.
Is there statistically significant evidence to show that there is an [enter cherry picked period here] year periodic signal in the dataset and that it isn’t just the effect of noise on top of the effect of known changes in the forcings? No, but if you have such evidence I’d very much like to see it (genuinely).

RichardLH
June 10, 2013 9:37 am

dikranmarsupial says:
June 10, 2013 at 9:35 am
“Is there statistically significant evidence to show that there is an [enter cherry picked period here] year periodic signal in the dataset ”
That depends on if you can see such signals in this data set. http://i1291.photobucket.com/albums/b550/RichardLH/uahtrendsinflectionfuture_zps7451ccf9.png.html

Werner Brozek
June 10, 2013 9:44 am

Frank Mlinar says:
June 10, 2013 at 9:24 am
There are many causes for the shape of the data, and those causes should be pointed out (and how they contribute) before we can make an increase/pause/decline assessment.
So let us suppose that, neglecting error bars, we can show that temperatures dropped over the last 8 years. Furthermore, let us suppose that someone pointed out that the last 8 years had twice as many La Ninas as El Ninos. Are you suggesting that if this was indeed the case, that we could turn around and say the temperatures really went up?

Reply to  Werner Brozek
June 10, 2013 11:28 am

“So let us suppose that neglecting errors bars, we can show that temperatures dropped over the last 8 years. Furthermore, let us suppose that someone pointed out that the last 8 years had twice as many La Ninas as El Ninos. Are you suggesting that if this was indeed the case, that we could turn around and say the temperatures really went up?”
What I am saying is that until one identifies and accounts for the contributors, one cannot say which way the temperatures are going long term. If you look at long term data, it appears that the latest pause is a reasonable one. That is, it has happened before. I personally do not know the why behind the current pause nor how long it will last. The most I can do is extrapolate from the long term data.
Regarding La Ninas and El Ninos, I assume that “signal” can also be detected, but it would be a high frequency signal when compared to things such as solar, Pacific Decadal, etc. CO2 does not show a periodicity and would be a very low frequency (almost DC) signal (actually a signal without a cyclical component). CFCs are man made and are also a low frequency signal, but I have no idea what the shape is other than what I have read. Aerosols are another signal that depends on natural and man made emissions.
