Jim Hansen's 99% Surety on Global Warming Doesn't Hold Up

Guest essay by Pat Frank –

When Jim Hansen testified before the Senate Committee on Energy and Natural Resources, on June 23, 1988, he said that he was 99% sure human-caused global warming was already happening.

Ever wonder how he got so sure?

I discovered the answer while researching the validity of the global surface air temperature record.

The story is worth attention because Jim Hansen’s 1988 testimony set the low evidentiary standard subsequently adopted by consensus climatology.

The background is well-known. Senator Tim Wirth arranged to have the committee meeting on what was historically the hottest day of the summer. And the record 98 F that day fully met his needs. Senator Wirth also ensured that the meeting-room windows were left open overnight, so that the air conditioning was ineffective. The room was sweltering. Jim Hansen was a hit. It was a fine victory of cynicism and circumstance over scruples and science.

1. The 99% Solution: The substance of Jim Hansen’s testimony that day is provided in the little Appendix at the bottom of this essay; see [1] for the full record. But the essence of 99% is in the next Figure, the GISS 1987+ global air temperature record, complete with ludicrously small error bars (1sigma = ±0.035 C, or ±0.025 C).

[Figure: GISS global surface air temperature record through mid-1988; original caption below.]

Original Caption: “Global surface air temperature change for the past century, with the zero point defined as the 1951-1980 mean. Uncertainty bars (95% confidence limits) … [are] the result of incomplete spatial coverage by measurement stations, primarily in ocean areas. The 1988 point compares the January-May temperature to the mean for the same 5 months in 1951-1980.”

This is pretty much Jim Hansen’s Figure 1 presented to the senate committee. I’ve added the green box, showing the ±0.13 C 1sigma jitter of global temperature during the 1951-1980 reference period.

The 1987 record was Figure 1 in Hansen and Lebedeff published in April 1988, about 3 months before his testimony, [2] and was Figure 6 of Hansen and Lebedeff, November 1987. [3]

In his testimony Jim Hansen implied that this 1sigma = ±0.13 C jitter was the sum total of natural climate variability. The rise in air temperature by mid-1988, nearly 0.4 C, was then 3sigma beyond nature. Obviously, that made the trend 99% unnatural.

That’s the whole ball of wax. Don’t believe it? Check out the quotes in the Appendix.

Somehow the 1884-to-1937 trend was overlooked by both Jim Hansen and the Senators. Right before their eyes was a 0.84 C global air temperature increase. Let’s see: that’s more than 6sigma beyond nature. In Jim Hansen’s world, that makes the trend more than 99.99966% likely to be unnatural. Hmmm … what could possibly have caused that?

What about the probable ~1 C (an unnatural 7.7sigma) increase in global air temperature between the Little Ice Age, circa 1650, and 1900? [4] Humans couldn’t have done it. Climate gremlins, maybe?
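For readers who want to check the arithmetic, the sigma-to-probability conversion is a one-line Gaussian tail calculation. The sketch below is mine, not Hansen’s: it assumes, as he did, that natural variability is Gaussian with 1sigma = 0.13 C, and computes the one-sided chance of each excursion. (Strictly, the one-sided 3sigma tail is about 0.13%, so Hansen’s “about 1%” was itself a rounded figure, and 99.99966% is the conventional “six sigma” quality number rather than the raw Gaussian tail.)

```python
# Sketch: convert each temperature excursion into a one-sided Gaussian
# exceedance probability, under Hansen's assumption that natural
# variability is Gaussian with 1 sigma = 0.13 C. Illustrative only.
from math import erfc, sqrt

SIGMA = 0.13  # Hansen's assumed 1951-1980 annual 1-sigma, deg C

def one_sided_tail(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * erfc(z / sqrt(2.0))

for label, delta_t in [("0.4 C by mid-1988", 0.4),
                       ("0.84 C, 1884-1937", 0.84),
                       ("~1 C, LIA to 1900", 1.0)]:
    z = delta_t / SIGMA
    print(f"{label}: {z:.2f} sigma, chance probability {one_sided_tail(z):.2e}")
```

The point of the sketch is only that, granted the ±0.13 C premise, ever more extreme “unnatural” probabilities fall out automatically; the premise itself is what the essay disputes.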

And those darn Dansgaard-Oeschger and Heinrich events, with their trends of multiple degrees Centigrade of global air temperature change per decade. Unnatural, too?

Or maybe they never happened. There’s an exciting new challenge the AGW stalwarts can take up for the cause: ‘We have to get rid of the Dansgaard-Oeschger and Heinrich periods.’

2. Enter Physical Causality: But the testimony didn’t end there. Jim Hansen next offered his GISS Model II global warming scenarios A, B, and C to prove that the recent 99% unnatural warming was caused by CO2 emissions. After all, physics provides causality. The next Figure shows what the senators saw and what JGR published, after peer-review and all. [5]

The committee saw, and peer-reviewed JGR published, predictions without error bars. Pace JGR, but that makes them physically meaningless. They cannot and do not signify any physical causality at all.

If one goes ahead and imports scientific credibility by computing physically valid error bars (±8.9 C in 1988), the scenarios show themselves to be, well, physically meaningless. [6] Oh, well. No rescue there.
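For illustration only (the per-step number and the root-sum-square assumption here are mine, not the Model II calculation in [6]): if an iterated projection carries an independent uncertainty at every step, the uncertainty envelope grows as the square root of the number of steps, which is why even a modest per-step uncertainty can swamp a multi-decade projection.

```python
# Hypothetical sketch of how a per-step uncertainty compounds in an
# iterated projection, assuming independent errors per step (root-sum-
# square growth). The per-step value below is illustrative, NOT the
# figure derived in reference [6].
from math import sqrt

def propagated_uncertainty(u_step, n_steps):
    """1-sigma envelope after n_steps independent steps of uncertainty u_step."""
    return u_step * sqrt(n_steps)

u_step = 1.0  # hypothetical per-step uncertainty, deg C
for years in (1, 10, 30, 100):
    print(f"after {years:3d} steps: +/- {propagated_uncertainty(u_step, years):.1f} C")
```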

[Figure: GISS Model II scenario A, B, and C temperature projections; testimonial legend below.]

From the testimonial legend: “Annual mean global surface air temperature computed for trace gas scenarios A, B and C described in reference 1 (reference [5] below – P). … The shaded range is an estimate of global temperature during the peak of the current and previous interglacial periods, about 6,000 and 120,000 years before present, respectively. …”

So there you have it, Jim Hansen’s 99% surety: for his purposes, the entire 1sigma range of natural variability in global air temperature is ±0.13 C. The fact that there is no physical justification at all for his choice didn’t seem to bother anyone, including a trained Ph.D. astrophysicist. It is a very opportune statistic, though.

Jim Hansen’s physical causality? Established by reference to warming scenarios of unrevealed, unremarked, and almost certainly uncalculated accuracy, computed using a model that was (and remains) unvetted by any published critical physical analysis.

In my view, the analysis is horridly incompetent. But it set the standard of consensus climatology that has remained in force right up to the present.

Appendix

Jim Hansen’s oral proof testimony to the committee: “[The] global temperature … is the highest of the period of record (then about 100 years). The rate of warming over the past 25 years … is the highest on record. 1988 will be the warmest year on the record.

“Causal association requires first that the warming be larger than natural climate variability and, second, that the magnitude and nature of the warming be consistent with the greenhouse mechanism.

 

“The warming is almost 0.4 degrees Centigrade by 1987 relative to climatology, which is defined as the 30 year mean, 1950 to 1980 and, in fact, the warming is more than 0.4 degrees Centigrade in 1988. The probability of a chance warming of that magnitude is about 1 percent. So, with 99 percent confidence we can state that the warming during this time period is a real warming trend.

 

“The main point to be made here is that the expected global warming [Jim Hansen’s Model II Scenarios A, B, and C – P] is of the same magnitude as the observed warming. Since there is only a 1 percent chance of an accidental warming of this magnitude, the agreement with the expected greenhouse effect is of considerable significance.” [1]

Jim Hansen’s written proof testimony to the committee: “The present observed global warming is close to 0.4 oC, relative to … the thirty year (1951-1980) mean. A warming of 0.4 oC is three times larger than the standard deviation of annual mean temperature in the 30-year climatology. The standard deviation of 0.13 oC is a typical amount by which the global temperature fluctuates annually about its 30 year mean; the probability of a chance warming of three standard deviation is about 1%. Thus we can state with about 99% confidence that current temperatures represent a real warming trend rather than a chance fluctuation of the 30 year period.” [1]

And, just to lock it in, here’s what the authoritatively peer-reviewed Hansen and Lebedeff say in GRL about the trend: “What is the significance of recent global warming? The standard deviation of annual-mean global-mean temperature about the 30-year mean is 0.13 oC for the period 1951-1980. Thus the 1987 global temperature of 0.33 oC, relative to the 1951-1980 climatology, is a warming of between 2sigma and 3sigma. If a warming of 3sigma is reached, it will represent a trend significant at the 99% confidence level. However, causal connection of the warming with the greenhouse effect requires examination of the expected climate system response to a slowly evolving climate forcing, a subject beyond the scope of this paper.” [2]

The “expected climate response” was Hansen’s Model II A, B, and C scenarios, both published, [5] and presented before the committee, [1] without any error bars.

From the testimony scenario Figure legend: “[Scenario A assumes continued growth rates of trace gas emission rates typical of the past 20 years, i.e., about 1.5 % yr-1 emission growth; scenario B has emission rates approximately fixed at current rate; scenario C drastically reduces trace gas emissions between 1990 and 2000].”

 

[Note: 1s, 2s, 3s, etc., have been changed to 1sigma, 2sigma, 3sigma throughout for clarity.]


 

References:

1. Hansen, J. Statement of Dr. James Hansen, Director, NASA Goddard Institute for Space Studies. 1988 [Last accessed: 11 August 2014; Testimony before the US Senate Committee on Energy and Natural Resources: The Greenhouse Effect: Impacts on Current Global Temperature and Regional Heat Waves]. Available from: http://image.guardian.co.uk/sys-files/Environment/documents/2008/06/23/ClimateChangeHearing1988.pdf.

2. Hansen, J. and S. Lebedeff, Global Surface Air Temperatures: Update through 1987. Geophys. Res. Lett., 1988. 15(4): p. 323-326.

3. Hansen, J. and S. Lebedeff, Global Trends of Measured Surface Air Temperature. J. Geophys. Res., 1987. 92(D11): p. 13345-13372.

4. Keigwin, L. Bermuda Rise Box Core Data. IGBP PAGES/World Data Center-A for Paleoclimatology Data Contribution Series # 96-030. 1996 [Last accessed: 14 September 2007]. Available from: ftp://ftp.ncdc.noaa.gov/pub/data/paleo/paleocean/by_contributor/keigwin1996/.

5. Hansen, J., et al., Global Climate Changes as Forecast by Goddard Institute for Space Studies Three‐Dimensional Model. J. Geophys. Res., 1988. 93(D8): p. 9341-9364.

6. Frank, P., Propagation of Error and the Reliability of Global Air Temperature Projections; Invited Poster, in American Geophysical Union Fall Meeting. 2013: San Francisco, CA; Available from: http://meteo.lcd.lu/globalwarming/Frank/propagation_of_error_poster_AGU2013.pdf (2.9 MB pdf).



157 Comments
Nancy C
February 18, 2015 9:02 am

If you look at current GISS data sets and compare them to the one shown here, it appears they not only didn’t know what the temperature would be in the future, they didn’t even know what the temperature was at the time. Still, they were 95% sure they had the data right and 99% sure they knew what it meant.

rw
Reply to  Nancy C
February 18, 2015 12:10 pm

Excellent point. I’d go so far as to say that one can’t be a warmist if one can spot contradictions like this.

highflight56433
February 18, 2015 9:15 am

I wonder how much the atmospheric O2 is declining? Shouldn’t it be about a 2:1 oxygen-to-carbon ratio for the amount of fossil-fuel carbon oxidized? Never hear about that. Prost!

Reply to  highflight56433
February 18, 2015 9:45 am

The decline in atmospheric O2 is well documented, about 2 ppm/year, try Googling it.

RACookPE1978
Editor
Reply to  Phil.
February 18, 2015 10:32 am

Phil.

The decline in atmospheric O2 is well documented, about 2 ppm/year, try Googling it.

From [209,004 ppm to 209,002 ppm to 209,000 ppm to 208,998 ppm to 208,996 ppm] …

the Trace gases followed Hansen’s Scenario C, if anything he underestimated the reduction in emissions.

So, are you claiming that “trace gases” now are more important to the earth’s radiation heat balance than CO2?
After all, trace gases were very small but CO2 increased from 1945 to 1966, but global average temperatures went down.
Trace gases and CO2 increased from 1966 to 1976, but global average temperatures were steady.
Trace gases and CO2 increased from 1976 through 1996, and global average temperatures increased.
Trace gases were steady but CO2 increased from 1996 through 2015, and temperatures were steady.

Reply to  Phil.
February 18, 2015 11:04 am

RA , I think you are off by an order of magnitude .
20% is 200,000 ppm .
CO2 was originally somewhere around 300,000 ppm before photosynthesis since it contained that entire 200,000 ppm O2 .
[fixed. .mod]

RACookPE1978
Editor
Reply to  Bob Armstrong
February 18, 2015 11:35 am

Very good point. You are correct.
But, then again, we are told time and time again that this blog is not “peer-reviewed” so I must be right, right? 8<)
Mod: request the 20,000 ref point be "0 'ed" properly.

Reply to  RACookPE1978
February 18, 2015 12:34 pm

This is 21st century “peer review” . The old high cost print “pal review” model is of rapidly diminishing value or influence .
The fact that the original 0% O2 , 30% CO2 atmosphere , rather than causing the planet to broil , enabled the explosion of green life is overwhelming evidence that this whole absurdity fails grade school science ( cool sun not withstanding ) .

highflight56433
Reply to  Phil.
February 18, 2015 2:27 pm

Everybody knows that, but there is only one gas exploited by the get-rich-on-carbon-tax crowd. I imagine that they might be missing out! Help them tax every gas in the atmosphere…maybe create some new ones! Invest in futures, get credits when gas “X” declines, buy low, sell high…. Prost !

Unmentionable
February 18, 2015 9:28 am

Jim Hansen? … isn’t he the same guy who made Guy Smiley?
An incredible talent with Muppets.

A C Osborn
February 18, 2015 9:34 am

The other thing to note is that Hansen actually destroys the case of CAGW with his grey shaded area in the second graph showing the “Estimated Temperature during the Altithermal and Eemian”, the current and previous interglacial periods, about 6,000 and 120,000 years before present.
Well, we still haven’t got above there yet, even with all their Quality Control Adjustments to the Raw data.

Dudley Horscroft
February 18, 2015 9:54 am

So atmospheric CO2 has been increasing. ISTR that it was about 400 ppm about 4 years ago. What is it now? Or has it stopped increasing (so as to create the plateau?).

RACookPE1978
Editor
Reply to  Dudley Horscroft
February 18, 2015 10:24 am

Dudley Horscroft
Dudley Horscroft

So atmospheric CO2 has been increasing. ISTR that it was about 400 ppm about 4 years ago. What is it now? Or has it stopped increasing (so as to create the plateau?).

The “shelf” (now 18 years – 3 months) began before atmospheric CO2 reached 400 ppm, continued just as flat as it passed 400 ppm, and has continued as it exceeded 400 ppm.

Reply to  Dudley Horscroft
February 18, 2015 10:33 am

The resident liberal-left newspaper here, ‘The Irish Times’, prints daily on its weather page the previous day’s temperature maxima from major cities around the world, as do most other newspapers. The curious thing is that their temperatures are pretty consistently between one and two degrees Celsius higher than the temperature maxima in many other papers. Their source is the MeteoGroup. Does anyone know anything about this group and whether they have a “track record”?

Kevin Kilty
February 18, 2015 10:17 am

I have written about this particular testimony many times in the past. There is a very large difference between his spoken testimony that day in front of congress and the conclusion made in the peer-reviewed paper. Making fundamentally different statements to different audiences seems a hallmark of pathological science. But the more important point is how he arrived at this.
He assumed that in the absence of human interference earth temperature would be a stationary random process with zero mean referenced to the 1951-1980 period and a standard deviation of 0.13 C per yearly average. And his probability statement rests on the distribution being gaussian. I am unconvinced that mean earth temperature would be stationary under the stated conditions. There seems little evidence that it is gaussian distributed. It might even be that the distribution of such a statistic is such that the central limit theorem does not apply.
Uncertainty in measured quantities ought to consider four things: 1) bias, 2) uncertainty contributed by the instrument, 3) the observer, and 4) the underlying process. To my mind no one has ever conducted a proper Gage R&R study to separate the influences; the Surface Stations project was the first serious step in this direction in my view. And rarely will researchers in this field consider bias except in one direction — that historical temperatures have always been biased too high and must be corrected downward.
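Kevin Kilty’s stationarity objection is easy to demonstrate numerically. A hypothetical sketch (synthetic data, not real temperatures): compare how often a value 8 “years” past a 30-year baseline exceeds the baseline mean plus 3 sample sigma, for iid Gaussian noise versus a trendless random walk.

```python
# Sketch: "3 sigma above a 30-year baseline" is only rare if the series
# is stationary and independent. A trendless random walk (non-stationary)
# clears the threshold far more often than iid Gaussian noise does.
# Purely illustrative; not real temperature data.
import random
import statistics

def exceed_rate(make_series, trials=10_000, baseline=30, test_index=37):
    """Fraction of runs whose value at test_index exceeds the
    baseline mean + 3 * baseline sample sigma."""
    rng = random.Random(42)
    hits = 0
    for _ in range(trials):
        series = make_series(rng, test_index + 1)
        base = series[:baseline]
        mu = statistics.fmean(base)
        sd = statistics.stdev(base)
        if series[test_index] > mu + 3.0 * sd:
            hits += 1
    return hits / trials

def iid_noise(rng, n):
    # stationary, independent "annual anomalies"
    return [rng.gauss(0.0, 0.13) for _ in range(n)]

def random_walk(rng, n):
    # trendless but non-stationary: each year drifts from the last
    x, out = 0.0, []
    for _ in range(n):
        x += rng.gauss(0.0, 0.05)
        out.append(x)
    return out

print("iid noise exceedance  :", exceed_rate(iid_noise))     # well under 1%
print("random walk exceedance:", exceed_rate(random_walk))   # far more common
```

The contrast is the whole point: the same 3-sigma excursion that is “99% unnatural” under the stationary-Gaussian premise is routine for a drifting series.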

rgbatduke
Reply to  Kevin Kilty
February 18, 2015 10:42 am

He assumed that in the absence of human interference earth temperature would be a stationary random process with zero mean referenced to the 1951-1980 period and standard deviation of 0.13 per yearly average. And his probability statement rests on the distribution being gaussian. I am unconvinced that mean earth temperature would be stationary under the stated conditions. There seems little evidence that it is gaussian distributed. It might even be that the distribution of such statistic is such that the central limit theorem does not apply.

Oh, please. One glance at the temperature record over any significant time span is sufficient proof that it isn’t stationary. Including the same graph above where he makes this “assumption”. It isn’t true in the immediate past of 1950 to 1980. It isn’t true in any century long span of the thermal record. Indeed:
http://commons.wikimedia.org/wiki/File:Holocene_Temperature_Variations.png
show me the “stationary” period during the last 12,000 years! Note well that the black line is a) an average of all of the colored spaghetti without regard to its presumed accuracy or location; b) smoothed over roughly 300 years. Which means (as one looks at the spaghetti, the black line, and Hansen’s 30 year baseline data) that temperature is never stationary, it is always gradually increasing or decreasing, sometimes rapidly, sometimes slowly. It also strongly suggests that at least regionally, temperatures can vary by whole degrees per century, if not more.
So one doesn’t need to remain “unconvinced” that the earth’s mean temperature would have been stationary if it weren’t for CO_2 over the last decade, century, or millennium, as there isn’t a shred of evidence for probable stability on any of those timescales in historical or proxy-derived data. We simply cannot predict what it would have done with, or without, CO_2. There is also no reason to think that the climate mean temperature is more than (possibly) gaussian on an annual or monthly basis around a comparatively slowly varying (but constantly varying) mean.
If you want a butt-kicking good discussion of the evil of applying normal stats to non-stationary time series, read some of William Briggs’ extensive writings on the subject:
http://wmbriggs.com/post/5172/
This is just one of several of his enormously biting comments on the subject, but since it kicks off a series that really walks through it by the numbers, it is worth the read.
None of this is “modern statistics” by the way. It’s just that Hansen is, as noted, either an idiot or a liar or both. Quite possibly both.
rgb

Bryan A
February 18, 2015 10:31 am

Not that I agree with Mr. Hansen’s (Henson) Muppetteering of the records in many cases but I think the point was missed
“This is pretty much Jim Hansen’s Figure 1 presented to the senate committee. I’ve added the green box, showing the ±0.13 C 1s jitter of global temperature during the 1951-1980 reference period.
The 1987 record was Figure 1 in Hansen and Lebedeff published in April 1988, about 3 months before his testimony, [2] and was Figure 6 of Hansen and Lebedeff, November 1987. [3]
In his testimony Jim Hansen implied that this 1s = ±0.13 C jitter was the full sum total of natural climate variability. The rise air temperature by mid-1988, nearly 0.4 C, was then 3s beyond nature. Obviously, that made the trend 99% unnatural.
That’s the whole ball of wax. Don’t believe it? Check out the quotes in the Appendix.
Somehow the 1884 and 1937 trend was overlooked by both Jim Hansen and the Senators. Right before their eyes was a 0.84 C global air temperature increase. Let’s see, that’s more than 6s beyond nature. In Jim Hansen world, that makes the trend more than 99.99966 % likely to be unnatural. Hmmm … what could possibly have caused that?
What about the probable ~1 C, unnaturally 7.7 s, increase in global air temperature between the Little Ice Age, 1650, and 1900? [4] Humans couldn’t have done it. Climate gremlins, maybe?”
Mr. Hansen was referring to the temperature/time ratio.
The quoted trend of 0.84 C from 1884 to 1937 was over a 53-year period: 0.1 C per 6.3 years.
The quoted probable trend of ~1 C from 1650 to 1900 spans 250 years: 0.1 C per 25 years.
While he is referring to an increase (trend) of 0.4 C over a 6.5-year period: 0.1 C per 1.6 years.
I guess the Muppetteer is concerned His hand will sweat in his sock puppets
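Bryan A’s rate arithmetic checks out; a quick script (magnitudes and intervals as quoted in his comment) reproduces his “0.1 C every N years” figures.

```python
# Check the per-interval warming rates quoted in the comment above.
intervals = [
    ("1884-1937", 0.84, 53.0),
    ("1650-1900 (LIA recovery)", 1.0, 250.0),
    ("0.4 C rise over 6.5 years", 0.4, 6.5),
]
for label, dT, years in intervals:
    rate = dT / years
    print(f"{label}: {rate:.4f} C/yr -> 0.1 C every {0.1 / rate:.1f} years")
```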

rgbatduke
Reply to  Bryan A
February 18, 2015 11:35 am

And you are dead right on every account. It was absurd even as he presented it. Sadly — and I say this as a professional who has founded two companies based on advanced statistical modeling and who has had to try to explain how it works to corporate executives of fortune 500 companies:
NOBODY UNDERSTANDS STATISTICS
This is a statement that is a self-referential statistical truth. If you go down a phone book (to cite a nearly obsolete reference but you understand what I mean) and use random numbers to select individuals, call them on the phone, and ask them what the Central Limit Theorem is, not one in a 100 — maybe not even one in 1000 — will be able to answer at all, and it is more likely one in 10,000 that MIGHT actually be able to tell you that it says that the mean of any collection of N > ~30 independent, identically distributed samples drawn without bias from a distribution with a suitably bounded variance forms, in the limit, a gaussian/normal distribution around the true mean of the distribution. The percentage that would know what an error function is, is even smaller, even though a surprising number would be able to tell you that the standard deviation has something to do with probable accuracy.
This is almost independent of educational level achieved, although in mathematics, statistics, computer science and many of the sciences people with bachelor’s degrees in principle should know it. This includes many, but not all, physicians and health care providers (it should be all, but sadly the world isn’t perfect).
Of people who know it, you can subtract away those that don’t know about joint and conditional probabilities, marginal probabilities, and Bayes’ theorem in its various generalizations, hypothesis testing, stationary vs non-stationary distributions, various distributions, and modelling in general, and you are left with a thin, thin fraction of all of humanity, one that excludes most scientists, a fair number of mathematicians, and most computer scientists. You are left with the small fraction of humanity that is marginally competent to judge statistical results for things like bias, uncertainty, predictivity, probable truth. Maybe one in 100,000? With luck? Maybe even one in a million.
Sure, they are concentrated in the well-educated first world countries so there are more than 3000 people in the US or Europe, but how many are there in Africa (per capita) or Asia (per capita)?
rgb
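For anyone who wants to see the Central Limit Theorem rgb describes in action, here is a minimal sketch: means of 30 draws from a flat (uniform) distribution pile up in a near-Gaussian bell around the true mean.

```python
# CLT demo: the mean of N iid uniform(0, 1) draws is approximately
# Gaussian with mean 0.5 and variance (1/12)/N, even though a single
# uniform draw is nothing like Gaussian.
import random
import statistics

rng = random.Random(0)
N, TRIALS = 30, 50_000

# each trial: the mean of N iid uniform(0, 1) draws
means = [statistics.fmean(rng.random() for _ in range(N)) for _ in range(TRIALS)]

mu = statistics.fmean(means)
sd = statistics.stdev(means)
within_1sd = sum(abs(m - mu) <= sd for m in means) / TRIALS

print(f"mean of means: {mu:.4f}  (theory: 0.5000)")
print(f"sd of means  : {sd:.4f}  (theory: {(1.0 / 12.0 / N) ** 0.5:.4f})")
print(f"within 1 sd  : {within_1sd:.3f}  (Gaussian theory: ~0.683)")
```

Note the theorem is about iid draws with bounded variance; as the surrounding comments stress, it says nothing about autocorrelated or non-stationary series.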

Tom in Florida
Reply to  rgbatduke
February 18, 2015 12:24 pm

You, Sir, are one in a million.

george e. smith
Reply to  rgbatduke
February 18, 2015 12:44 pm

Well as I have said on several occasions. Statistics is always performed on a (any) given set of exactly known numbers, whose origin is entirely irrelevant, to the process. (as far as the validity of the results of whatever statistical algorithms you choose to apply.)
So statistics is literally numerical origami, and whether you end up with a crane or a swan after you do the work, is entirely a function of the algorithm and not the numbers in the set.
The mischief starts when you try to attach meaning to what your folded paper turned out to be.
However you fold, you can never get any information about any number that is not in the set. You can’t predict whether it will be higher, lower, or equal to the last number in the set, or for that matter, to any other number in the set.
That is true if your number set is 1,2,3,4,5,6,7,8,9 ; or if it is the set of the very first actual numbers to be found on each page of that telephone book you mentioned.
It is simply a matter of faith to believe that statistics can tell you anything about any numbers not in the statisticated data set. The numbers themselves contain ALL of the information that is present in the data set.
Some persons would write a nice story or a piece of music on an empty clean sheet of paper.
Still other persons, can’t think of anything more creative to do, than to fold it up , and see what it looks like.
Just my opinion of course

sleepingbear dunes
Reply to  rgbatduke
February 18, 2015 3:07 pm

rgb
As always I enjoyed your comments immensely. For some reason, I think I would have loved your classes 50 years ago.
A few of your testimonies before Congress would sink the boat quicker than anything I can imagine.

D.J. Hawkins
Reply to  rgbatduke
February 18, 2015 3:16 pm

I knew some of these things, once upon a time. I slogged through Student’s “T”, the erf and even F-tests as a student in engineering. I had a very small window of need for some of the basic functions in my very first job nearly 40 years ago, but since then, nada. If you don’t use it, you lose it. I wouldn’t dream of trying to explain the central limit theorem to my sons without a handy reference and a couple of nights review beforehand, and “go ask your teacher” is the more likely reply. The real challenge for us “civilians” is trying to figure out when “the experts” are dealing from the bottom of the deck and when they’re dealing straight.

Reply to  rgbatduke
February 19, 2015 2:32 am

I couldn’t answer your questions either, but I can still spot bogus stats a mile away. Just because I don’t know the vocabulary, doesn’t mean I can’t understand the data.

policycritic
Reply to  rgbatduke
February 19, 2015 3:05 pm

@rgbatduke,
Is my memory correct: did you call Keynes “a statistical GOD?”

Bryan A
Reply to  Bryan A
February 18, 2015 2:31 pm

Forgot to include the trend over the last 18 years
From 1997 to 2015 the trend is 0.0C per 18 years

Reply to  Bryan A
February 18, 2015 6:03 pm

Bryan, Hansen’s 99% certainty statistic doesn’t rely on a temperature/time ratio. It’s all about the 0.4 C magnitude of the trend. Not about its rate.

February 18, 2015 10:37 am

If you have ever had problems with mildew, you probably learned that it is pointless to attack the mildew directly; you must change the conditions that cause mildew to thrive. The mildew spores are omnipresent, just waiting for favorable climes.
People like Hansen are like mildew. They thrive under immoral conditions. What conditions are those?
Not long ago, in the U.S., paper money was a promise — a contract — by the government to give the bearer on demand a certain amount and fineness of silver or gold. The price of gold was not fixed in terms of the dollar, as is popularly assumed; the dollar was by contract redeemable in a fixed amount of silver or gold.
In 1933, with an unconstitutional stroke of the pen, FDR passed a law demanding all US citizens turn in to the government the largest circulating pool of gold coins ever, at risk of 10 years in prison and a $10,000 fine. He gave them IOU nothings in return. He unilaterally broke the government’s contract with the people. (Yet foreigners could continue to redeem dollars for gold.)
In 1971, Nixon likewise gave the middle finger to foreign dollar holders. Like the first, this was a fraudulent default, as the US had plenty of gold with which to honor its contract.
Since 1971 the US dollar has not been ‘money’, but rather, evidence of broken promises.
In other words, the moral dimension of money has been annihilated. Money used to be a token of good faith, of honorable dealings. Money and morality were inseparable concepts.
Now money, pseudo-money really, is a constant reminder that we are, via ‘legal tender laws’, forced at gunpoint to use broken promises redeemable in more of the same. The Treasury issues bonds redeemable in irredeemable Federal Reserve Notes that are themselves backed by bonds. The pretty name for this is ‘check-kiting.’ It is one gigantic lie.
So there you have it. In a world where money is a lie, and where morality has been eliminated ‘for the greater good’, anything goes. The end justifies the means.
So, back to Hansen… what do we expect? Money intercedes in every aspect of our lives. Money is now corrupt. Funding is now corrupt. Users of funding are now corrupt.
Hansen is like mildew. He and his slimy type won’t go away regardless of how hard you scrub. As long as our monetary system is corrupt, you will see an endless cavalcade of liars and con men (read: confidence men) masquerading as helpers, bent on destruction.
Anything goes.

Reply to  Max Photon
February 19, 2015 2:42 am

Wonderful Max. Yes the Global Warming Con is just part of a larger Banking Con, which in turn is part of a larger agenda for concentrated power.
I wish more people were aware of the deliberately created environment that keeps the majority focused on the various consequences of the environment but blind to its cause.

February 18, 2015 11:03 am

BBC news just showed an item on the north-east US and Canada’s 2015 big freeze; it is incomprehensible that such events could be a constituent of the so-called global warming theory.

Toneb
Reply to  vukcevic
February 18, 2015 12:16 pm

Correct – it’s nothing to do with AGW.
Just a normal PJS meridional extension south over the E half of N America.

knr
Reply to  vukcevic
February 18, 2015 2:41 pm

It’s the ‘magic’ of AGW: everything can prove it (hotter/colder, wetter/drier), and nothing can disprove it. Think religion and you will see how this works.

rabbit
February 18, 2015 11:06 am

“The standard deviation of 0.13 oC is a typical amount by which the global temperature fluctuates annually about its 30 year mean; the probability of a chance warming of three standard deviation is about 1%. Thus we can state with about 99% confidence that current temperatures represent a real warming trend rather than a chance fluctuation of the 30 year period.”
Makes perfect sense provided the natural global temperature behaves like a random variable with a Gaussian distribution of fixed mean and variance, and each year is independent of the next.
Almost none of which is likely true.

ShrNfr
February 18, 2015 11:13 am

I hate these #uc*ing morons who say that they are XXX% sure something is happening. You can never, ever, ever say that in the scientific method. All that you can say is that you can reject the (null) hypothesis that something is not happening at the XXX% significance level. Most people say “Well, isn’t that the same thing??” and the answer to that question is “Not on your life.” Francis Bacon would vomit on Hansen as not adhering to the scientific method. Get some bread to feed Hansen. He is a quack.
CAGW is an eschatological cargo cult and will go down in history as being even more absurd than phrenology. At least phrenologists had something vaguely correct. The shape, etc., of the brain does predict behaviour to a certain extent. It is just that the shape of the skull says nothing about the brain inside it. The phrenologists even had a journal once upon a time. https://archive.org/details/americanphrenol04unkngoog

Alx
February 18, 2015 11:32 am

In a fair world: tar, feathers, and Hansen riding a rail out of town.
In the real world: Hansen has a lucrative career, because incompetence and dishonesty pay.

bw
February 18, 2015 11:36 am

I doubt J. Hansen’s scientific reputation is what he is concerned about. His real motivations are entirely political. As far as his science goes, just read a few of his published papers over the years. Pathetic garbage.
For RACook, Atmospheric O2 at 20.9 percent is 209000 ppm.
[fixed. .mod]

RACookPE1978
Editor
February 18, 2015 11:37 am

Request you check the following sentence in the first paragraph above: The 0.035 and 0.025 are not clear there, nor how they are used with respect to the 0.13 “jitter”
“But the essence of 99% is in the next Figure, the GISS 1987+ global air temperature record, complete with ludicrously small error bars (1sigma = ±0.035 C, or ±0.025 C).”

Reply to  RACookPE1978
February 18, 2015 3:17 pm

In the original, the “±0.025 C” was red, to indicate it referred to the red error bar on the face of the Figure. But the color didn’t translate into HTML. Live and learn. 🙂 Thanks for pointing it out so I could clarify.
I just noticed, too, that I owe the moderator a vote of thanks, for writing in “sigma” where “s” once reigned. Thanks, mod. 🙂
[No, still not clear. But thank you.
So the +/- 0.025 is for the red error bar. But then what is the +/- 0.035 linked to? .mod]

Reply to  Pat Frank
February 18, 2015 4:39 pm

Mod, the (+/-)0.035 C is linked to the black error bar. It’s beneath the red error bar on the Figure. The black one is a bit bigger and sticks out, top and bottom.
The unsmoothed black temp line –> black error bar, = [black text, (+/-)0.035 C]. The smoothed red temp line –> red error bar = [red text, (+/-)0.025 C].
The original idea was to indicate meaning using color-code, so as to avoid having to use text. But, that all back-fired, as we all now see. 🙂

February 18, 2015 12:33 pm

Pat Frank leads off a paragraph with,
“Jim Hansen’s physical causality? . . . .”

I think Pat Frank’s guest post is an important retrospective. My take on it is that it shows Jim Hansen’s pre-science masquerading as science under the respectability of NASA’s once excellent scientific reputation. Thanks, Pat Frank.
The physical causality argument used by Jim Hansen (head of NASA GISS from 1981 to 2013) rests on the fallacy of ‘begging the question’ (petitio principii).
His question-begging causality argument is that since manmade CO2 must, ‘a priori’, destroy the Earth Atmospheric System (EAS)**, science must show that the EAS is being destroyed and that the destruction will accelerate in the future.
‘Begging the question’ illogic forms the supporting basis of many mythologies.
** regardless of whether or not all the rest of the dynamics in the EAS remain equal
John

February 18, 2015 1:48 pm

“The standard deviation of 0.13 °C is a typical amount by which the global temperature fluctuates annually about its 30 year mean”
I guess by “typical” Jim means the lowest deviation ever seen.

February 18, 2015 2:07 pm

Hansen based his stuff on a climate model. While many models have aged better than I have, not many make the cover of Sports Illustrated swimsuit issue year after year. SI just keeps finding new ones…that won’t age any better. 😎

Reply to  Gunga Din
February 19, 2015 8:27 am

It could be argued that the SI models expose more data with each iteration.
Not so the GCMs.
They’re pretty much the same covered up lies.

richard
February 18, 2015 2:53 pm

99% eh,
World Meteorological Organization – “The main problem arises from the fact that the shape of climate change signal is unknown”

richard
February 18, 2015 2:54 pm

WMO-
“The main problem of the application of absolute methods is that the separation between the climate change signal and the inhomogeneity is essentially impossible”

richard
February 18, 2015 2:59 pm

WMO-
“Data homogeneity is strongly related to the climate change problem, which is at the centre of scientific and policy debates. It has been recognized and widely accepted that long and reliable observation series are required to address climate change issues and impact studies. Unfortunately, these high quality meteorological data series seldom exist”

richard
February 18, 2015 3:02 pm

oh blimey what a mess.
WMO
“It was already mentioned that the long-term climatological time series are often plagued with discontinuities caused by station relocation, installation of new instruments, etc. Several types of disturbances can distort or even hide the climatic signal. Therefore, it is quite natural that the data are tested in order to locate possible discontinuities. However, usually the detection of the homogeneity breaks is not enough. The breaks appear to be so common that rejection of inhomogeneous series simply leave too few and too short series for further analysis”
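[As an editorial aside: the homogeneity-break detection the WMO passage describes can be illustrated with a bare-bones sketch. This is my own toy example — a maximum-t breakpoint scan on synthetic data — not any WMO-endorsed method; real homogenization uses reference series and far more careful statistics.]

```python
import numpy as np

rng = np.random.default_rng(0)

def find_break(series, min_seg=5):
    """Scan candidate split points and return the index giving the
    largest two-sample t statistic between the two segments."""
    best_idx, best_t = None, 0.0
    for i in range(min_seg, len(series) - min_seg):
        a, b = series[:i], series[i:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_idx, best_t = i, t
    return best_idx, best_t

# Synthetic 100-year station record: noise plus a 0.8-unit jump at
# year 60, standing in for a station relocation.
x = rng.normal(0.0, 0.2, 100)
x[60:] += 0.8
idx, t = find_break(x)
print(idx, round(t, 1))
```

Even this crude scan finds a large, clean step easily; the WMO's point is that real records contain many overlapping, smaller breaks, and discarding every affected series would leave almost nothing to analyze.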

February 18, 2015 10:38 pm

Oh what? The surety is error bars? Spare me your Bayesian bars and show me something deductive, even inductive. Show me reason, not statistics.

Reply to  gymnosperm
February 19, 2015 7:26 am

Bayesian Statistics (Definition): The art of describing your prejudices in Greek Letters and then estimating your chances of getting away with them.

Reply to  M Courtney
February 19, 2015 8:22 am

Not fair , but : LoL !

Reply to  M Courtney
February 19, 2015 9:34 am

We agree again, M Courtney. 🙂

RACookPE1978
Editor
Reply to  gymnosperm
February 19, 2015 10:02 am

reminds me of that old joke.
A climate scientist walked into an error bar …
But I digress. It could never happen. 8<)

knr
Reply to  RACookPE1978
February 19, 2015 3:43 pm

Is that because climate ‘scientists’ don’t believe in using error bars, or because theirs have such a wide range that they would die before they could ever walk into one?

Reply to  gymnosperm
February 19, 2015 1:31 pm

“The surety of error”? I’ll give Hansen that.

Samuel C Cogar
February 20, 2015 7:39 am

There is an ole saying that goes ….. “If you won’t listen, … then you will have to feel”.
So my question is, … just how many of those fossil fuel protesting college students and other individuals under the age of 30 ….. are for the 1st time in their lives, during this February 2015, actually feeling what 0 F to -20 F surface temperatures actually feel like or the “wind chill” temperatures which are much lower?
Whatta you wanna bet that 90+% of them are truly appreciating the “warmth” that fossil fuels are currently providing them?
My thermometer was reading -12 F here in central West Virginia, USA, at 6:00 AM this morning of 02-20-15, which are temperatures I last experienced here some 20 years ago.

February 21, 2015 11:09 pm

Maybe it’s me, but the math by some posters here is kinda funny. CO2 went from ~360 ppm to 400 ppm, an ~11% relative increase. Well, ya, kinda. That is the kind of sketchy math that makes it seem like it went up a bunch. In absolute terms it went from .036% to .040% of the atmosphere, an increase of .004 percentage points. Going from .03% to .04% is a .01 percentage-point increase, even though the relative increase is 33%. It is the same with temperature. What do you call it when someone uses Celsius instead of Kelvin in a heat transfer equation? A freshman mistake. For heat transfer calculations, temperature didn’t go from 20 C to 21.3 C (a 6.5% change); it went from 293.15 K to 294.45 K, a 0.44% change. You could put it the first way, but it would be misleading in scientific discussions.
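[Editorial aside: the arithmetic in the comment above mixes two different measures — relative (percent) change and change in percentage points. Both numbers are correct for what they measure; they just answer different questions. A quick sketch separating them, using the comment's own CO2 and temperature figures (20 °C taken as 293.15 K):]

```python
def rel_change_pct(old, new):
    """Relative change, in percent of the starting value."""
    return 100.0 * (new - old) / old

# CO2 concentration: ~360 ppm -> 400 ppm
print(rel_change_pct(360, 400))  # relative change, percent (~11.1)
print((400 - 360) / 1e4)         # absolute change in percentage points
                                 # of the atmosphere (1 ppm = 1e-4 %)

# Temperature: 20 C -> 21.3 C, expressed on the two scales
print(rel_change_pct(20.0, 21.3))      # Celsius: ~6.5%, but the zero is arbitrary
print(rel_change_pct(293.15, 294.45))  # Kelvin: ~0.44%, the thermodynamic ratio
```

The Celsius/Kelvin contrast shows why relative change on an interval scale with an arbitrary zero is meaningless for heat-transfer purposes, which is the commenter's valid core point.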