Hide the decline deja vu? Mann's 'little white line' as 'False Hope' may actually be false hype

Foreword by Anthony Watts 

An essay by Monckton of Brenchley follows, but first I wanted to draw attention to this graphic from Dr. Mann’s recent Scientific American article. In the infamous “hide the decline” episode revealed by Climategate, concerning the modern-day ending portion of the “hockey stick”, Mann has been accused of using “Mike’s Nature Trick” to hide the decline in modern (proxy) temperatures by splicing on the surface record. In this case, the little white line from his SciAm graphic shows how “the pause” is labeled a “faux pause” (a little play on words) and how the pause is elevated above past surface temperatures.

[Figure: earth-will-cross-the-climate-danger-threshold-by-2036]

Source: http://www.scientificamerican.com/sciam/assets/Image/articles/earth-will-cross-the-climate-danger-threshold-by-2036_large.jpg

Zoom of section of SciAm’s graph from Dr. Mann. The 1°C line was added for reference.

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since no citation seems to be given for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, and showing the current temperature equal to 1998, which doesn’t make any sense.

So, over the weekend I asked Willis Eschenbach to use his “graph digitizer” tool (which he has used before) to turn Mann’s little white line into numerical data, and he happily obliged.
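For readers unfamiliar with the technique, a graph digitizer of this general kind calibrates two known points on each axis in pixel space and then maps every traced pixel along the curve into data coordinates by linear interpolation. A minimal sketch of the idea (the calibration numbers below are hypothetical, not taken from Willis’s actual tool):

```python
def make_axis_map(px0, val0, px1, val1):
    """Return a function mapping a pixel coordinate to a data value,
    given two calibration points (pixel, value) read off an axis."""
    scale = (val1 - val0) / (px1 - px0)
    return lambda px: val0 + (px - px0) * scale

# Hypothetical calibration: x axis in years, y axis in degrees C anomaly.
# Note y pixels decrease as values increase (image coordinates run downward).
x_map = make_axis_map(100, 1900, 900, 2100)
y_map = make_axis_map(500, 0.0, 100, 2.0)

# A few pixel points traced along the plotted line, converted to data
traced = [(100, 480), (500, 400), (700, 300)]
data = [(x_map(px), y_map(py)) for px, py in traced]
```

Accuracy is limited by the thickness of the plotted line and the precision of the axis calibration, which is why digitized values carry some pixel-level uncertainty.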

Here is the result when Mann’s little white line is compared and matched to two well known surface temperature anomaly datasets:

[Figure: mann_falsehope_vs_GISS-HAD4]

What is most interesting is that Mann’s “white line” shows a notable divergence during the “pause” from both HadCRUT4 and GISS LOTI. Why would our modern era of “the pause” be the only place where a significant divergence exists? It’s like “hide the decline” deja vu.

The digitized data for Mann’s white line is available here: Manns_white_line_digitized.xlsx

As of this writing, we don’t know what dataset was used to create Mann’s white line of surface temperature anomaly, or the base period used. On the SciAm graphic it simply says “Source: Michael E. Mann” on the lower right.

It isn’t the GISS land-ocean temperature index (LOTI), which starts in 1880. And it doesn’t appear to be HadCRUT4 either. Maybe it is BEST, but without the data going back to 1750? That isn’t likely either, since BEST pretty much matches the other datasets, and while Mann’s graphic above peaks out above 1°C, none of those datasets reaches higher than 0.7°C. What’s up with that?

[Figure: land-and-ocean-other-results-1950]

Now compare that plot above to this portion of Dr. Mann’s SciAm plot, noting the recent period of surface temperature and the 1°C reference line which I extended from the Y axis:

[Figure: Manns_white_line_extended_1C]

I’m reminded of Dr. Mann’s claims about climate skeptics in this video: http://www.linktv.org/video/9382/inside-the-climate-wars-a-conversation-with-michael-mann

At 4:20 in the video, Dr. Mann claims that US climate skeptics are part of the “greatest disinformation campaign ever run”. If his position is so strong and pure, why then do we see silly things like this graph, with an elevated ending of global surface temperature (in contrast to five other datasets) and not a single data-source citation given?

UPDATE: Mark B writes in comments:

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since no citation seems to be given for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, and showing the current temperature equal to 1998, which doesn’t make any sense.

Explanation of graph including links to source code and data were given here: http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/

REPLY: Yes, I’ve seen that, but there is a discrepancy: the label on the image is “Historical Mean Annual Temperature” (white).

In http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/ it is written:

Historical Simulations. The model was driven with estimated annual natural and anthropogenic forcing over the years A.D. 850 to 2012. Greenhouse radiative forcing was calculated using the approximation (ref. 8) FGHG = 5.35log(CO2e/280), where 280 parts per million (ppm) is the preindustrial CO2 level and CO2e is the “equivalent” anthropogenic CO2. We used the CO2 data from ref. 9, scaled to give CO2e values 20 percent larger than CO2 alone (for example, in 2009 CO2 was 380 ppm whereas CO2e was estimated at 455 ppm). Northern Hemisphere anthropogenic tropospheric aerosol forcing was not available for ref. 9 so was taken instead from ref. 2, with an increase in amplitude by 5 percent to accommodate a slightly larger indirect effect than in ref. 2, and a linear extrapolation of the original series (which ends in 1999) to extend through 2012.

“Historical Mean Annual Temperature” is NOT the same as “Historical Simulations”. It looks to me like a bait and switch.
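The forcing approximation quoted in that methods note can be checked numerically; the 5.35 coefficient implies a natural logarithm (the standard simplified CO2-forcing expression). A minimal sketch using the 2009 figures given in the quote:

```python
import math

def ghg_forcing(co2e_ppm, preindustrial_ppm=280.0):
    """Greenhouse radiative forcing (W/m^2) per the quoted
    approximation F_GHG = 5.35 * ln(CO2e / 280)."""
    return 5.35 * math.log(co2e_ppm / preindustrial_ppm)

# Per the quote, in 2009 CO2 was 380 ppm and CO2e was estimated at 455 ppm
f_2009 = ghg_forcing(455.0)  # roughly 2.6 W/m^2
```

At the pre-industrial level the forcing is zero by construction, and each doubling of CO2e adds the same increment, which is what makes the logarithmic form convenient.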

UPDATE2: Note the lead-in text says “Global temperature rise…”

But in comments, Willis and Bill Illis have worked out that the white line represents only half the planet, the Northern Hemisphere. The white line is the HadCRUT NH value, not global.

Obviously we can’t take statements such as the lead-in text saying “global” at face value. Imagine if a climate skeptic made a graph like this. We’d be excoriated.

What needs to be done is to create a graph showing what this would have looked like had Mann not cherry-picked the NH and presented it on a graph with the text “Global temperature rise…”.

==============================================================

Mann’s ‘False Hope’ is false hype

By Christopher Monckton of Brenchley

The legendary Dr Walter Lewin, Professor of Physics at MIT, used to introduce his undergraduate courses by saying that every result in physics depends ultimately on measurement; that mass, distance, and time are its three fundamental physical units; that every observation in these and all of their derivative units is subject to measurement uncertainty; and that every result in physics, if only for this reason, is to some degree uncertain.

Contrast this instinctual humility of the true physicist with the unbecoming and, on the evidence to date, unjustifiable self-assurance of the surprisingly small band of enthusiasts who have sought to tell us there is a “climate crisis”. Not the least among these is Michael Mann, perpetrator of the Hokey-Stick graph that wrought the faux abolition of the medieval warm period.

In logic, every declarative statement is assigned a truth-value: 1 (or, in computer programs, –1) for true, 0 for false. Let us determine the truth-values of various assertions made by Mann, in a recent article entitled False Hope, published in the propaganda-sheet Scientific American.

Mann’s maunderings and meanderings will be in bold face, followed by what science actually says in Roman face, and the verdict: Truth-value 1, or truth-value 0?

Mann: “Global warming continues unabated.”

Science: Taking the mean of the five standard global temperature datasets over the satellite era (since 1979), the rate of warming has changed as follows:

1979 to Aug 1990 (140 months): +0.080 Cº/decade

1979 to Apr 2002 (280 months): +0.153 Cº/decade

1979 to Dec 2013 (420 months): +0.145 Cº/decade
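Rates like these are ordinary least-squares trends fitted to monthly anomalies and scaled from per-month to per-decade. A minimal sketch on a synthetic series (illustrative only, not one of the five datasets):

```python
def decadal_trend(anomalies):
    """OLS slope of a monthly anomaly series, scaled to degrees C per decade."""
    n = len(anomalies)
    x_mean = (n - 1) / 2
    y_mean = sum(anomalies) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(anomalies))
    var = sum((x - x_mean) ** 2 for x in range(n))
    return (cov / var) * 120  # 120 months per decade

# Synthetic 35-year series warming at exactly 0.15 C/decade
series = [0.00125 * m for m in range(420)]
trend = decadal_trend(series)
```

On real, noisy anomalies the fitted slope also carries a standard error, which is why trends over short windows are so sensitive to the chosen start and end dates.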

The slowdown in the global warming rate has arisen from the long pause, now 13 years 2 months in length on the mean of all five datasets (assuming that HadCRUT4, which is yet to report, shows a result similar to the drop in global temperatures reported by the other four datasets).

Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.

Mann: “… during the past decade there was a slowing in the rate at which the earth’s average surface temperature had been increasing. The event is commonly referred to as “the pause,” but that is a misnomer: temperatures still rose, just not as fast as during the prior decade.”

Science: During the decade February 2005 to January 2014, on the mean of all five datasets, there was a warming of 0.01 Cº, statistically indistinguishable from zero.

Verdict: Truth-value 0. Temperatures did not rise in any statistically significant sense, and the increase was within the measurement uncertainty in the datasets, so that we do not know there was any global warming at all over the decade. Here, Walter Lewin’s insistence on the importance of measurement uncertainty is well demonstrated.

Mann: “In response to the data, the IPCC in its September 2013 report lowered one aspect of its prediction for future warming.”

Science: In 2013 the IPCC reduced the lower bound of its 2007 equilibrium climate-sensitivity interval from 2 Cº to 1.5 Cº warming per CO2 doubling, the value that had prevailed in all previous Assessment Reports. It also reduced the entire interval of near-term projected warming from [0.4, 1.0] Cº to [0.3, 0.7] Cº. Furthermore, it abandoned its previous attempts at providing a central estimate of climate sensitivity.

Verdict: Truth value 0. The IPCC did not lower only “one aspect of its prediction for future warming” but several key aspects, abandoning the central prediction altogether.

Mann: “If the world keeps burning fossil fuels at the current rate, it will cross a threshold into environmental ruin by 2036. The ‘faux pause’ could buy the planet a few extra years beyond that date to reduce greenhouse gas emissions and avoid the crossover–but only a few.”

Science: Mann is asserting that on the basis of some “calculations” he says he has done, the world will face “environmental ruin” by 2036 or not long thereafter. However, Mann has failed to admit any uncertainty in his “calculations” and consequently in his predictions.

Verdict: Truth-value 0. Given the ever-growing discrepancy between prediction and observation in the models, and Mann’s own disastrous record in erroneously abolishing the medieval warm period by questionable statistical prestidigitation, the uncertainty in his predictions is very large, and a true scientist would have said so.

Mann: “The dramatic nature of global warming captured world attention in 2001, when the IPCC published a graph that my co-authors and I devised, which became known as the ‘hockey stick’. The shaft of the stick, horizontal and sloping gently downward from left to right, indicated only modest changes in Northern Hemisphere temperature for almost 1,000 years–as far back as our data went.”

Science: The Hokey-Stick graph falsely eradicated both the medieval warm period and the little ice age. At co2science.org, Dr. Craig Idso maintains a database of more than 1000 papers demonstrating by measurement (rather than modeling) that the medieval warm period was real, was near-global, and was at least as warm as the present just about everywhere. McIntyre & McKitrick showed the graph to be erroneous, based on multiple failures of good statistical practice. The medieval warm period and the little ice age are well attested in archaeology, history, architecture, and art. It was the blatant nonsense of the Hokey Stick that awoke many to the fact that a small academic clique was peddling unsound politics, not sound science.

Verdict: Truth value 0. Once again, Mann fails to refer to the uncertainties in his reconstructions, and to the many independent studies that have found his methods false and his conclusions erroneous. Here, he takes a self-congratulatory, nakedly partisan stance that is as far from representing true science as it is possible to go.

Mann: “The upturned blade of the stick, at the right, indicated an abrupt and unprecedented rise since the mid-1800s.”

Science: The graph, by confining the analysis to the northern hemisphere, overstated 20th-century global warming by half. Mann says the rise in global temperatures, shown on the graph as 1.1 Cº over the 20th century, is “unprecedented”. However, the Central England Temperature Record, the world’s oldest, showed a rise of 0.9 Cº in the century from 1663 to 1762, almost entirely preceding the industrial revolution, compared with an observed rate of just 0.7 Cº over the 20th century. The CETR is a good proxy for global temperature change. In the 120 years to December 2013 it showed a warming rate within 0.01 Cº of the warming rate taken as the mean of the three global terrestrial datasets.

Verdict: Truth value 0. The warming of the 20th century was less than the warming for the late 17th to the late 18th centuries.

[Figure: clip_image002]

Mann: “The graph became a lightning rod in the climate change debate, and I, as a result, reluctantly became a public figure.”

Science: For “lightning-rod” read “laughing-stock”. For “reluctantly” read “enthusiastically”. For “public figure” read “vain and pompous charlatan who put the ‘Ass’ in ‘Assessment Report’”.

Verdict: Pass the sick-bucket, Alice.

Mann: “In its September 2013 report, the IPCC extended the stick back in time, concluding that the recent warming was likely unprecedented for at least 1400 years.”

Science: The IPCC is here at odds with the published scientific literature. In my expert review of the pre-final draft of the Fifth Assessment Report, I sent the IPCC a list of 450 papers in the reviewed literature that demonstrated the reality of the warm period. The IPCC studiously ignored it. Almost all of the 450 papers are unreferenced in the IPCC’s allegedly comprehensive review of the literature. I conducted a separate test using the IPCC’s own methods, by taking a reconstruction of sea-level change over the past 1000 years, from Grinsted et al. (2009), and comparing it with the schematic in the IPCC’s 1990 First Assessment Report showing the existence and prominence of both the medieval warm period and the little ice age. The two graphs are remarkably similar, indicating the possibility that the sea-level rise in the Middle Ages was caused by the warmer weather then, and that the fall in the Little Ice Age was caused by cooler weather. The sea-level reconstruction conspicuously does not follow a Hokey-Stick shape.

[Figure: clip_image004]

Verdict: Truth value 0. The IPCC has misrepresented the literature on this as on other aspects of climate science. There are of course uncertainties in any 1000-year reconstruction, but if Grinsted et al. have it right then perhaps Mann would care to explain how it was that sea level rose and fell by as much as 8 inches either side of today’s rather average value if there was no global warming or cooling to cause the change?

Mann: “Equilibrium climate sensitivity is shorthand for the amount of warming expected, given a particular fossil-fuel emissions scenario.”

Science: Equilibrium climate sensitivity is a measure of the global warming to be expected in 1000-3000 years’ time in response to a doubling of CO2 concentration, regardless of how that doubling came about. It has nothing to do with fossil-fuel emissions scenarios.

Verdict: Truth value 0. Mann may well be genuinely ignorant here (as elsewhere).

Mann: “Because the nature of these feedback factors is uncertain, the IPCC provides a range for ECS, rather than a single number. In the September report … the IPCC had lowered the bottom end of the range. … The IPCC based the lowered bound on one narrow line of evidence: the slowing of surface warming during the past decade – yes, the faux pause.”

Science: For well over a decade there has been no global warming at all. The pause is not faux, it is real, as Railroad Engineer Pachauri, the IPCC’s joke choice for climate-science chairman, has publicly admitted. And the absence of any global warming for up to a quarter of a century is not “one narrow line of evidence”: it is the heart of the entire debate. The warming that was predicted has not happened.

Verdict: Truth value 0. Mann is here at odds with the IPCC, which – for once – paid heed to the wisdom of its expert reviewers and explicitly abandoned the models, such as that of Mann, which have been consistent only in their relentless exaggeration of the global warming rate.

Mann: “Many climate scientists – myself included – think that a single decade is too brief to accurately measure global warming and that the IPCC was unduly influenced by this one, short-term number.”

Science: Overlooking the split infinitive, the IPCC was not “unduly influenced”: it was, at last, taking more account of evidence from the real world than of fictitious predictions from the vast but inept computer models that were the foundation of the climate scare. Nor was the IPCC depending upon “one short-term number”.

James Hansen of NASA projected 0.5 C°/decade global warming as his “business-as-usual” case in testimony before Congress in 1988. The IPCC’s 1990 First Assessment Report took Hansen’s 0.5 C°/decade as its upper bound. It projected 0.35 C°/decade as its mid-range estimate, and 0.3 C°/decade as its best estimate.

The pre-final draft of the 2013 Fifth Assessment Report projected 0.23 C°/decade as its mid-range estimate, but the published version reduced this value to just 0.13 C°/decade – little more than a quarter of Hansen’s original estimate of a quarter of a century previously.

Observed outturn has been 0.08 Cº/decade since 1901, 0.12 C°/decade since 1950, 0.14 C°/decade since 1990, and zero since the late 1990s.

Three-quarters of the “climate crisis” predicted just 24 years ago has not come to pass. The Fifth Assessment Report bases its near-term projections on a start-date of 2005. The visible divergence of the predicted and observed trends since then is remarkable.

[Figure: clip_image006]

It is still more remarkable how seldom in the scientific journals the growing discrepancy between prediction and observation is presented or discussed.

Verdict: Truth value 0. Step by inexorable step, the IPCC is being driven to abandon one extremist prediction after another, as real-world observation continues to fall a very long way short of what it had been predicting.

Mann: “The accumulated effect of volcanic eruptions during the past decade, including the Icelandic volcano with the impossible name, Eyjafjallajökull, may have had a greater cooling effect on the earth’s surface than has been accounted for in most climate model simulations. There was also a slight but measurable decrease in the sun’s output that was not taken into account in the IPCC’s simulations.”

Science: So the models failed to make proper allowance for, still less to predict, what actually happened in the real world.

Verdict: Truth value 0. Eyjafjallajökull caused much disruption, delaying me in the United States for a week (it’s an ill wind …), but it was a comparatively minor volcanic eruption whose signature in the temperature record cannot be readily distinguished from the La Niña cooling following the El Niño at the beginning of 2010. The discrepancy between the models’ predictions and observed reality can no longer be plausibly dismissed in this fashion, and the IPCC knows it.

Mann: “In the latter half of the decade, La Niña conditions persisted in the eastern and central tropical Pacific, keeping global surface temperatures about 0.1 degree C colder than average …”

Science: There were La Niña (cooling) events in 1979, 1983, 1985, 1989, 1993, 1999, 2004, and 2008 – the last being the only La Niña in the second half of the noughties. There were, however, two El Niño (warming) events: in 2007 and 2010.

Verdict: Truth value 0. There is very little basis in the observed record for what Mann says. He is looking for a pretext – any pretext – rather than facing the fact that the models have been programmed to exaggerate future global warming.

Mann: “Finally, one recent study suggests that incomplete sampling of Arctic temperatures led to underestimation of how much the globe actually warmed.”

Science: And that “study” has been debunked. The numerous attempts by meteorological agencies around the world to depress temperatures in the early 20th century to make the centennial warming rate seem larger than it is have far outweighed any failure to measure temperature change in one tiny region of the planet.

Verdict: Truth value 0. Increasingly, as the science collapses, the likes of Mann will resort in desperation to single studies, usually written by one or another of the remarkably small clique of bad scientists who have been driving this silly scare. Meanwhile, the vrai pause continues. As CO2 concentrations increase, the Pause is not likely to continue indefinitely. But it is now clear that the rate at which the world warms will be considerably less than the usual suspects have predicted.

Mann: “When all the forms of evidence are combined, they point to a most likely value for ECS that is close to three degrees C.”

Science: The IPCC has now become explicit about not being explicit about a central estimate of climate sensitivity. Given that two-thirds of Mann’s suggested 3 Cº value depends upon the operation over millennial timescales of temperature feedbacks that Mann himself admits are subject to enormous uncertainties; given that not one of the feedbacks can be directly measured or distinguished by any empirical method either from other feedbacks or from the forcings that triggered it; and given that non-radiative transports are woefully represented in the models, there is no legitimate scientific basis whatsoever for Mann’s conclusion that a 3 Cº climate sensitivity is correct.

Verdict: Truth value 0. What Mann is careful not to point out is that the IPCC imagines that only half of the warming from a doubling of CO2 concentration will arise in the next 200 years. The rest will only come through over 1000-3000 years. Now, at current emission rates a doubling of the pre-industrial 280 ppmv CO2 will not occur for 80 years. However, 0.9 Cº warming has already occurred since 1750, leaving only another 0.6 Cº warming to occur by 2280, on the assumption that all of the 0.9 Cº was manmade. And that is if Mann and the models are right.
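The arithmetic in the paragraph above can be laid out explicitly. It assumes, as the text does, Mann’s 3 Cº equilibrium sensitivity, half of it realized within roughly 200 years of the doubling, and attribution of all 0.9 Cº of post-1750 warming to man:

```python
ecs = 3.0                  # assumed equilibrium sensitivity, C per CO2 doubling
fast_fraction = 0.5        # share of ECS assumed realized within ~200 years
observed_since_1750 = 0.9  # C, attributed here entirely to man

fast_warming = ecs * fast_fraction              # 1.5 C within ~200 years
remaining = fast_warming - observed_since_1750  # 0.6 C still to come by 2280
```

Change any of the three assumed inputs and the 0.6 Cº figure changes proportionately, which is the point of the closing caveat.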

Mann: “And as it turns out, the climate models the IPCC actually used in its Fifth Assessment Report imply an even higher value of 3.2 degrees C.”

Science: The 2007 Fourth Assessment Report said there would be 3.26 Cº warming at equilibrium after a CO2 doubling. But the 2013 Fifth Report said no such thing. It has fallen commendably silent.

Verdict: Truth value 0. Mann is, yet again, at odds with the IPCC, which has now begun to learn that caution is appropriate in the physical sciences.

Mann: “The IPCC’s lower bound for ECS, in other words, probably does not have much significance for future world climate–and neither does the faux pause.”

Science: This is pure wishful thinking on Mann’s part. In all Assessment Reports except the Fourth, the IPCC chose 1.5 Cº as its lower bound for equilibrium climate sensitivity to doubled CO2 concentration. In the Fourth it flirted briefly with 2 Cº, but abandoned that value when faced with the real-world evidence that Mann sneeringly dismisses as “the faux pause”.

Verdict: Truth value 0. Calling the vrai pause “the faux pause” is a faux pas.

Mann: “What would it mean if the actual equilibrium climate sensitivity were half a degree lower than previously thought? Would it change the risks presented by business-as-usual fossil-fuel burning? How quickly would the earth cross the critical threshold?”

Science: But what is the “critical threshold”? Mann fails to define it. Is there some value for global mean surface temperature that is the best of all temperatures in the best of all possible worlds? If so, Mann’s hypothesis can only be tested if he enlightens us on what that ideal temperature is. He does not do so.

Verdict: Truth value 0. In the absence of a clear and scientifically justified statement of an ideal temperature, plus a further justified statement that a given departure from that ideal temperature would be dangerous, there is no case for a “critical threshold”. Furthermore, there is at present little empirical basis for a global warming of more than 1 Cº over the coming century.

Mann: “Most scientists concur that two degrees C of warming above the temperature during preindustrial time would harm all sectors of civilization–food, water, health, land, national security, energy and economic prosperity.”

Science: No survey of scientists to determine whether they “concur” as to the 2 Cº above pre-industrial temperature that Mann considers on no evidence to be the “critical threshold” has been conducted. Even if such a survey had been conducted – and preferably conducted by someone less accident-prone than the absurd Cook and Nutticelli – that would tell us nothing about the scientific desirability or undesirability of such a “threshold”: for science is not done by consensus, though totalitarian politics is. And it was totalitarian politicians, not scientists, who determined the 2 Cº threshold, on no evidence, at one of the interminable paid holidays in exotic locations known as UN annual climate conferences.

Verdict: Truth value 0. There is no scientific basis for the 2 Cº threshold, and Mann does not really attempt to offer one.

Mann: “Although climate models have critics, they reflect our best ability to describe how the climate system works, based on physics, chemistry and biology.”

Science: Mann’s own model that contrived the Hokey-Stick graph shows what happens when a model is constructed with insufficient attention to considerations that might point against the modeler’s personal preconceptions. The model used a highly selective subset of the source data; it excluded hundreds of papers demonstrating the inconvenient truth that the medieval warm period existed; it gave almost 400 times as much weighting to datasets showing the medieval warm period as it did to datasets that did not show it; and the algorithm that drew the graph would draw Hokey Sticks even if random red noise rather than the real data were used.

The problem with any model of a sufficiently complex object is that there are too many tunable parameters, so that the modeler can – perhaps unconsciously – predetermine the output. To make matters worse, intercomparison tends to institutionalize errors throughout all the models. Besides, since the climate behaves as a chaotic object, modeling its evolution beyond around ten days ahead is not possible. We can say (and without using a model) that if we add plant-food to the air it will be warmer than if we had not done so; but (with or without a model) we cannot say with any reliability how much warming is to be expected.

Verdict: Truth value 0. Models have their uses, but as predictors of long-term temperature trends they are, for well-understood reasons, valueless.

Mann: “And they [the models] have a proved track record: for example, the actual warming in recent years was accurately predicted by the models decades ago.”

Science: Here is Hansen’s 1988 prediction of how much global warming should have occurred since then, according to his “GISS Model E”.

[Figure: clip_image008]

The trend shown by Hansen is +0.5 Cº per decade. The outturn since 1988, however, was just 0.15 Cº per decade, less than one-third of what Hansen described as his “business-as-usual” case. Models’ projections have been consistently exaggerated:

[Figure: clip_image010]

Verdict: Truth value 0. The models have consistently and considerably exaggerated the warming of recent decades. The next graph shows a series of central projections, compared with the observed outturn to date, extrapolated to 2050. This is not a picture of successful climate prediction. It is on the basis of these failed predictions that almost the entire case for alarm about the climate is unsoundly founded.

[Figure: clip_image012]

Mann: “I ran the model again and again, for ECS values ranging from the IPCC’s lower bound (1.5 Cº) to its upper bound (4.5 Cº). The curves for an equilibrium climate sensitivity of 2.5 Cº and 3 Cº fit the instrument readings most closely. The curves for a substantially lower (1.5 Cº) and higher (4.5 Cº) sensitivity did not fit the recent instrumental record at all, reinforcing the notion that they are not realistic.”

Science: Legates et al. (2013) established that only 0.3% of abstracts of 11,944 climate science papers published in the 21 years 1991-2011 explicitly stated that we are responsible for more than half of the 0.69 Cº global warming of recent decades. Suppose that 0.33 Cº was our contribution to global warming since 1950, that CO2 concentration in that year was 305 ppmv and is now 398 ppmv. Then the radiative forcing from CO2 that contributed to that warming was 5.35 ln(398/305) = 1.42 Watts per square meter. Assuming that the IPCC’s central estimate of 713 ppmv CO2 by 2100 is accurate, the CO2 forcing from now to 2100 will be 5.35 ln(713/398), or 3.12 W m–2. On the assumption that the ratio of CO2 forcing to that from other greenhouse gases will remain broadly constant, and that temperature feedbacks will have exercised 44/31 of the multiplying effect seen to date, the manmade warming to be expected by 2100 on the basis of the 0.33 Cº warming since 1950 will be 3.12/1.42 x 0.33 x 44/31 = 1 Cº. Broadly speaking, the IPCC expects this century’s warming to be equivalent to that from a doubling of CO2 concentration. In that event, 1 Cº is the warming we should expect from a CO2 doubling, and the only sense in which the 1.5 Cº lower bound of the IPCC’s interval of climate-sensitivity estimates is “unrealistic” is that it is probably somewhat too high.
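The chain of arithmetic in the paragraph above can be reproduced step by step, with all inputs as stated there (0.33 Cº attributed warming since 1950, 305 and 398 ppmv CO2, 713 ppmv projected for 2100, and a 44/31 feedback multiplier):

```python
import math

def co2_forcing(c_new, c_old):
    """Radiative forcing (W/m^2) from a CO2 change: 5.35 * ln(ratio)."""
    return 5.35 * math.log(c_new / c_old)

f_past = co2_forcing(398, 305)    # forcing behind the post-1950 warming, ~1.42 W/m^2
f_future = co2_forcing(713, 398)  # forcing from now to 2100, ~3.12 W/m^2

# Scale the 0.33 C of attributed warming by the forcing ratio and the
# assumed 44/31 feedback multiplier to project manmade warming to 2100
projected = f_future / f_past * 0.33 * (44 / 31)  # about 1 C
```

Every step is a plain multiplication or logarithm, so the reader can substitute a different attributed-warming figure or 2100 concentration and see how the 1 Cº result moves.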

Verdict: Truth value 0. Here, as elsewhere, Mann appears unaware of the actual evolution of global temperatures during the post-1950 era when we might in theory have exercised some warming influence. There has been less warming than They thought, and – on the basis of the scientific consensus established by Legates et al. – less of the observed warming is anthropogenic than They thought.

Mann: “To my wonder, I found that for an ECS of 3 Cº, our planet would cross the dangerous warming threshold of two degrees C in 2036, only 22 years from now. When I considered the lower ECS value of 2.5 Cº, the world would cross the threshold in 2046, just 10 years later.”

Science: Mann here perpetrates one of the fundamental errors of the climate-extremists. He assumes that the prediction of a climate model is subject to so little uncertainty that it constitutes a fact. This statement is one of a series by true-believers saying we have only x years to Save The Planet by shutting down the West. Ex-Prince Chazza has done it. Al Gore has done it. The UN did it big-time by saying in 2005 that there would be 50 million climate refugees by 2010. There weren’t.

Verdict: Truth value 0. Extreme warming that has been predicted does not become a fact unless it comes to pass. If you want my prediction, it won’t. And that’s a fact.

Mann: “So even if we accept a lower equilibrium climate sensitivity value, it hardly signals the end of global warming or even a pause. Instead it simply buys us a little bit of time – potentially valuable time – to prevent our planet from crossing the threshold.”

Science: No one is suggesting that the Pause will continue indefinitely. Theory as well as observation suggests otherwise. However, a Pause that has not occurred cannot “buy us a little bit of time”. Mann’s mention of “buying us a little bit of time” is, therefore, an admission that the Pause is real, as all of the temperature datasets show.

Verdict: Truth value 0. A low enough climate sensitivity will allow temperatures to remain stable for decades at a time, during periods when natural factors tending towards global cooling temporarily overwhelm the warming that would otherwise occur.

Mann: “These findings have implications for what we all must do to prevent disaster.”

Science: Warming of 3 Cº would not be a “disaster”. Even the bed-wetting Stern Review of 2006 concluded that warming of 3 Cº over the 21st century would cost as little as 0-3% of global GDP. But at present we are heading for more like 1 Cº. And even the IPCC has concluded that less than 2 Cº warming compared with 1750, which works out at 1.1 Cº compared with today, will be net-beneficial.

Verdict: Truth value 0. There is no rational basis for any suggestion that our adding CO2 to the atmosphere at the predicted rate, reaching 713 ppmv by 2100, will be anything other than beneficial.

Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm. Ironically, if the world burns significantly less coal, that would lessen CO2 emissions but also reduce aerosols in the atmosphere that block the sun (such as sulfate particulates), so we would have to limit CO2 to below roughly 405 ppm. We are well on our way to surpassing these limits.”

Science: What we are concerned with is not CO2 simpliciter, but CO2-equivalent. CO2 itself contributes only 70% of the anthropogenic enhancement of the greenhouse effect. The (admittedly arbitrary) target of 450 ppmv CO2-equivalent is thus a target of only 315 ppmv CO2 – the concentration that prevailed in 1958. Mann’s suggested target of 405 ppmv CO2e would represent just 284 ppmv CO2. And that would fling us back to the pre-industrial CO2 concentration.
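The conversions above follow from the essay's own assumption that CO2 supplies 70% of the anthropogenic greenhouse enhancement; a minimal sketch of that back-of-envelope arithmetic (a heuristic factor, not a radiative-transfer result):

```python
# The essay's assumption: CO2 contributes 70% of the anthropogenic greenhouse
# enhancement, so a CO2-equivalent target maps to a CO2-only target by a
# simple 0.7 factor (a rough heuristic, not a radiative calculation).
CO2_SHARE = 0.70

def co2_target_from_co2e(co2e_ppmv):
    """CO2-only concentration implied by a CO2-equivalent target."""
    return CO2_SHARE * co2e_ppmv

print(co2_target_from_co2e(450))  # ≈ 315 ppmv CO2, the 1958 concentration
print(co2_target_from_co2e(405))  # ≈ 283.5 ppmv, near the pre-industrial 284
```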

Verdict: Truth value 0. We are not “well on our way to surpassing these limits”: we passed them as soon as the industrial revolution began. The current CO2-equivalent concentration of 398 ppmv already exceeds the pre-industrial 284 ppmv by 40%, yet the world has warmed by only 0.9 Cº since then; our contribution to that warming may well be 0.33 Cº or less.

Mann: “Some climate scientists, including James E. Hansen, former head of the NASA Goddard Institute for Space Studies, say we must also consider slower feedbacks such as changes in the continental ice sheets.”

Science: The IPCC already takes changes in ice-sheets into account. It says that in the absence of “dynamical ice flow” that cannot happen, the Greenland ice sheet would not disappear “for millennia”. And there is no prospect of losing ice from the vast ice sheet of East Antarctica, which is at too high an altitude or latitude to melt. Even the West Antarctic Ice Sheet, which has lost some ice, is proving more robust than the usual suspects had thought. Sea level, according to the GRACE gravitational anomaly satellites, has been falling (Peltier et al., 2009). During the eight years of ENVISAT’s operation, from 2004-2012, sea level rose at a scary 1.3 inches per century.

Verdict: Truth value 0. There is no reason to suppose the major ice sheets will disintegrate on timescales of less than millennia.

Mann: “Hansen and others maintain we need to get back down to the lower level of CO2 that existed during the mid-20th century–about 350 ppm.”

Science: 350 ppmv is, again, CO2-equivalent. That implies 245 ppmv, a value well below the pre-industrial 280 ppmv. At 180 ppmv, plants and trees become dangerously starved of CO2. Flinging CO2 concentration back to that value would reduce CO2 fertilization and hence crop yields drastically, and would do major damage to the rain-forests.

Mann: “In the Arctic, loss of sea ice and thawing permafrost are wreaking havoc on indigenous peoples and ecosystems.”

Science: The Arctic has not lost as much sea ice as had been thought. In the 1920s and 1930s there was probably less sea ice in the Arctic than there is today. The decline in sea ice is small in proportion to the seasonal variability, as the graph from the University of Illinois shows. And the part of the satellite record that is usually cited began in 1979. An earlier record, starting in 1973, showed a rapid growth in sea ice until it reached its peak extent in 1979. Indigenous peoples, like the polar bears, prefer warmer to colder weather. And almost all ecosystems also prefer warmer to colder weather.

[Figure: University of Illinois Arctic sea-ice graph]

Verdict: Truth value 0. The decline in sea ice in the Arctic is far more of a benefit than a loss.

Mann: “In low-lying island nations, land and freshwater are disappearing because of rising sea levels and erosion.”

Science: On the contrary, detailed studies show not only that low-lying island nations are not sinking beneath the waves, but that their territory is in many cases expanding. The reason is that corals grow to meet the light. As sea level rises, the corals grow and there is no net loss of territory. Also, sea level rises less in mid-ocean, where the islands are, than near the continental coasts. And sea level has scarcely been rising anyway. According to Grinsted et al., it was 8 inches higher in the medieval warm period than it is today.

Verdict: Truth value 0. If the world were once again to become as warm as it was in the Middle Ages, perhaps sea level would rise by about 8 inches. And that is all.

Mann: “Let us hope that a lower climate sensitivity of 2.5 degrees C turns out to be correct. If so, it offers cautious optimism. It provides encouragement that we can avert irreparable harm to our planet. That is, if–and only if–we accept the urgency of making a transition away from our reliance on fossil fuels for energy.”

Science: Mann is here suggesting that a climate sensitivity of 3 Cº would be disastrous, but that 2.5 Cº would not. The notion that as little as 0.5 Cº would make all the difference is almost as preposterous as the notion that climate sensitivity will prove to be as high as 2.5 Cº. As we have seen, on the assumption that less than half of the warming since 1950 was manmade, climate sensitivity could be as low as 1 Cº – a value that is increasingly finding support in the peer-reviewed literature.

Verdict: Truth value 0. The central error made by Mann and his ilk lies in their assumption that models’ predictions are as much a fact as observed reality. However, observed climate change has proven far less exciting in reality than the previous predictions of Mann and others had led us to expect. The multiple falsehoods and absurdities in his Scientific American article were made possible only by the sullen suppression by the Press of just how little of what has been predicted is happening in the real climate. In how many legacy news media have you seen the Pause reported at all? But it will not be possible for the mainstream organs of propaganda to conceal from their audiences forever the inconvenient truth that even the most recent, and much reduced, projections of the silly climate models are proving to be egregious exaggerations.

256 Comments
Curious George
March 24, 2014 9:45 am

The “greatest disinformation campaign ever run”. Listen to Dr. Mann. He knows from personal experience.

Latitude
March 24, 2014 9:55 am

disinformation…
another oxymoron

Steve from Rockwood
March 24, 2014 9:58 am

Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.

The warming “rate” can be “declining” and global warming can still “continue” as long as the rate remains above zero. The rate of warming would have to decline to zero or negative to falsify Mann’s statement (which it has).
The quoted statement above should be changed to “since the warming rate is now negative”. Otherwise you could decline from a high rate of warming to a lower rate of warming (the second derivative is negative, but the first derivative remains positive) and still have warming.
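The commenter's first-versus-second-derivative distinction can be made concrete with two toy series (hypothetical numbers, not real temperature data):

```python
import numpy as np

t = np.arange(30.0)
# Warming that merely slows: rate = 0.020 - 0.0004*t stays positive for t <= 29.
slowing = 0.020 * t - 0.0002 * t**2
# Warming that reverses: rate = 0.020 - 0.0020*t turns negative at t = 10.
cooling = 0.020 * t - 0.0010 * t**2

rate_slowing = np.gradient(slowing, t)  # first derivative of the series
rate_cooling = np.gradient(cooling, t)

print(rate_slowing[-1] > 0)  # True: second derivative negative, still warming
print(rate_cooling[-1] < 0)  # True: the trend itself has gone negative
```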

ckb
Editor
March 24, 2014 10:02 am

As someone who attended Lewin’s 8.01 Physics class I in 1988 (or was it 89?), and then sat in from time to time in subsequent years just for the fun of it, I appreciate the truly apt reference to Lewin’s approach to science.
The stuffed monkey falling from the ceiling who was invariably hit by the arrow appreciates it far less! The monkey was done for – we were absolutely certain…. 🙂 Our ability to skewer him was well within the margin of error in setting up the demonstration…usually…

March 24, 2014 10:04 am

So how again does one distinguish a “false pause” from a real one?

Réaumur
March 24, 2014 10:04 am

“In logic, every declarative statement is assigned a truth-value: 1 (or, in computer programs, –1) for true, 0 for false. ”
Where is -1 used to represent true?

March 24, 2014 10:05 am

Steve from Rockwood,
Global warming has stopped. It may resume. Or not. But right now [and for the past 17+ years] it has stopped.

arthur4563
March 24, 2014 10:07 am

It’s amazing how easily Mann’s claims are shredded. Let’s see if the Scientific American that published his junk science has the honesty to publish this rebuttal. I think they should be advised of this article.

Fly
March 24, 2014 10:08 am

[SNIP “Fly” aka “Aanthanur DC” Aka Daniel C, your fake name and email is not gonna fly here, you’ve been banned for policy violations/bad behavior. Sneaking back in won’t work. – Anthony]

rgbatduke
March 24, 2014 10:09 am

Science: For well over a decade there has been no global warming at all. The pause is not faux, it is real, as Railroad Engineer Pachauri, the IPCC’s joke choice for climate-science chairman, has publicly admitted. And the absence of any global warming for up to a quarter of a century is not “one narrow line of evidence”: it is the heart of the entire debate. The warming that was predicted has not happened.
Verdict: Truth value 0. Mann is here at odds with the IPCC, which – for once – paid heed to the wisdom of its expert reviewers and explicitly abandoned the models, such as that of Mann, which have been consistent only in their relentless exaggeration of the global warming rate.

And, note well, AR5 devotes all of Box 9.2 to this “Faux” pause. In fact, it is well worth directly cut-and-paste quoting the box title:
Box 9.2 | Climate Models and the Hiatus in Global Mean Surface Warming of the Past 15 Years
Mann seems to be at odds by five years even with the IPCC, which is not known for its eagerness to abandon its previous egregious claims for climate sensitivity, and this “hiatus” is far from faux, it is a serious problem, one that they devote an entire box of apologia to. However, this box does not explain what they plan to do as the hiatus continues to stretch out, placing ever greater pressure on their bogus treatment of the statistics of unvalidated GCMs that, in fact, are almost certainly seriously broken. The only visible sign of it is the CYA activity of adding things like paragraph/sections 9.2.2.2 and 9.2.2.3 and box 9.2, and backpedaling gently on their egregious claims of equilibrium climate sensitivity as incoming data continues to force the most plausible range inexorably down and as they are forced to recognize that they have the balance between natural variation and CO_2 forced variation seriously wrong. Hell, they may even have the sign of the total CO_2 feedback wrong; it might end up reducing the total warming expected from direct CO_2 driven increases. In a nonlinear system that is historically remarkably stable, this could hardly even be considered a surprise after the fact if it turns out to be so.
However, there are many signs that “the consensus” is crumbling even as it is being artificially trumped up and inflated by certain media and political groups. Box 9.2 is one of them. The more honest version of figure 1.4 in the leaked draft of AR5 is another. There are plenty of honest scientists in the world, and there are plenty of honest scientists in climate science. Data does indeed talk where theoretical bullshit walks. But it is hard to do anything at all constructive about people who present figures that look scientific without describing their basis or their sources of data or why they differ from related curves that do have a published basis except — call them on it. And even that won’t eliminate the damage done when they “publish” it in a venue that won’t accept any sort of comment or post-publication peer review.
Mann is actually abandoning the IPCC here. They obviously are too conservative and cautious for him. He and Hansen should get together and have a disasterfest somewhere. Mann can threaten 4.5 C warming (which nobody thinks is going to happen any more, but that won’t keep it off of his graph, will it?) and Hansen can chime in with his 5 meter SLR (which nobody thinks is going to happen any more, but the SLR that sane — well, saner — people are now asserting is so tame as to hardly be catastrophic, anywhere from 10 to 18 inches, where we had 9 inches over the last century plus without any cause for alarm or anybody really noticing). And a more realistic prediction might limit that to even less — 3 mm/year is probably an upper bound on what we’ll actually see and that WOULD represent an increase over the average of the last century.
rgb

Frank K.
March 24, 2014 10:17 am

Fun this should come out today. This morning we set yet another all time low temperature record for this date at my location in western New Hampshire (-3 F). Yes that’s right: -3 F on March 24th!! I’m sure other records were similarly broken in the region.

March 24, 2014 10:19 am

I’m perhaps confused, since it seems that this article perpetuates an incorrect allegation about “hide the decline”: that it hid the decline of temperatures.
The “decline” does not refer to a decline in temperature. It refers rather to a decline of a *proxy* for temperature. Specifically, Michael Mann used tree ring histories and a flawed variant of PCA to generate a function that would spit out a temperature if fed historical tree ring data. Because of overfitting, it did well in spitting out the average temperature for years prior to 1960 or so. Past that date, the function generated declining temperatures that did not follow the global average temperature.
This was, in fact, terribly inconvenient; if the function is not producing good outputs in the post 1960 era, how do we know it isn’t similarly failing for the period before anyone was measuring temperatures?
In order to prevent people from noticing this inconvenient truth in a paper published in Nature, Mann famously threw out the more recent outputs of his function and replaced them with actual historical data on the plots. This gave his function the veneer of accuracy.
Thus “Mike’s Nature trick to hide the decline” refers to a trick Michael Mann used in Nature magazine – the trick being to replace inconvenient model outputs with actual instrument data.
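A toy sketch of the kind of substitution the commenter describes (entirely synthetic numbers, purely illustrative, not the actual reconstruction):

```python
# Synthetic "proxy" series that tracks an "instrumental" series until 1960,
# then diverges downward (the "decline"). All values are made up.
years = list(range(1900, 2000))
instrument = [0.005 * (y - 1900) for y in years]
proxy = [v - 0.01 * max(0, y - 1960) for v, y in zip(instrument, years)]

# The splice: keep the proxy before the divergence, substitute instrumental
# data afterwards, so the plotted curve shows no decline at all.
spliced = [p if y < 1960 else i for p, i, y in zip(proxy, instrument, years)]

print(proxy[-1] < instrument[-1])  # True: the divergence exists in the proxy
print(min(spliced) == spliced[0])  # True: the spliced series rises throughout
```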

Monckton of Brenchley
March 24, 2014 10:21 am

In answer to Reaumur, the Boolean operator for truth value returns -1 for true in many computer programs, particularly older programs where the microcircuitry was not as reliable as it is today. In the old 8-bit registers (in the Z80 chip, for instance) the value 1 was stored as a 1 followed by seven zeroes (the binary digits being recorded right to left). If the first bit were defective, a false value might be obtained. So truth was represented as “11111111”, which the chip understood as -1, and falsehood was represented as “00000000”.
In many higher-level programming languages the convention of treating the truth value as -1 was followed. In just about all versions of BASIC, for instance, typing
IF unhappy THEN PRINT “I am sad” ELSE PRINT “I’m a happy little computer!”
would return “I’m a happy little computer!” unless the variable “unhappy” had first been set to a non-zero value, in which event the interpreter would set the truth-value of the protasis to -1 and the program would send “I am sad” to the console.

Mike Bromley the Kurd
March 24, 2014 10:22 am

Steve from Rockwood says:
March 24, 2014 at 9:58 am
Are you a lawyer? Or a scientist? Or neither? 0.01 degrees, although “positive” is 0.01 degrees, nonetheless. Unmeasurable. Only calculable. To say that still means ‘warming’ is to grab your legalese tweezers and pick fly sh*t out of the black pepper, in a most odious manner.

March 24, 2014 10:27 am

I assume that we won’t hear a demand from Mann for a public debate anytime soon. As in never.

Pathway
March 24, 2014 10:28 am

The final score is Mann 0, Pass the sick-bucket, Alice 1

Greg
March 24, 2014 10:28 am

Hide the decline deja vu? Mann’s ‘little white LIE’
Clearly this is more grafting and blending a la “Mike’s Nature trick”. It seems he now has his own personal global temperature dataset to which he makes his own, undocumented “corrections”.
Well since he is a “reluctant” Nobel prize climatologist , why not ?

Tim Obrien
March 24, 2014 10:29 am

Unfortunately it doesn’t matter how you refute them with facts or logic. They only recite and read their own Holy Words and we are all unbelievers to be ignored…

NotTheAussiePhilM
March 24, 2014 10:32 am

Not so much a Faux Pause, more of a Faux Pas …

Rud Istvan
March 24, 2014 10:34 am

For Scientific American, Mike could not even get his own Nature trick right. Had to invent a temperature record. This should be enough to call for retraction, but doubt ScAm cares, since it long since stopped reporting hard science. This reader’s solution some time ago was to cancel a long-standing subscription.

hunter
March 24, 2014 10:35 am

The AGW hypesters are grasping at straws, and leaving the facts behind.
Mann falsely asserts that the step down of sensitivity in the recent IPCC report is due to one thing.
It is due to much more than that.

rogerknights
March 24, 2014 10:40 am

Typo (“d” needed): “. . . less of the observe warming is anthropogenic . . .”
Eh?: “An earlier record, starting in 1973, showed a rapid growth in sea ice until it reached its peak extent in 1970.”
1980 maybe?

March 24, 2014 10:41 am

Science: “Starting in Orwell’s Year (1984), and taking the mean of the five standard global temperature datasets since then”
1979-2002 Apr 280 months +0.153 Cº/decade.
1979-2013 Dec 420 months +0.145 Cº/decade.

In reality:
1979-1998 trend = +0.082 K/decade (warming)*
1998-2014 trend = -0.050 K/decade (cooling)**

And 1979 to 2013’s end averaged +0.125 K/decade warming (not 0.145 K/decade) but with that hiding the turning point.
The above is RSS global lower troposphere temperature data. There was a turning point in the late 1990s for the reasons suggested in the usual link in my name.
Monckton is right that the warming rate is declining (to say the least), but he didn’t show it significantly there. 0.153 K/decade versus 0.145 K/decade would be a trivial difference if it was true rather than junk from the alarmists. The above, however, does show the far greater real difference.
* http://woodfortrees.org/plot/rss/from:1979/to:1998/plot/rss/from:1979/to:1998/trend
** 1998 to up to now: http://woodfortrees.org/plot/rss/from:1998/plot/rss/from:1998/trend
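The per-decade figures traded in this thread are ordinary least-squares slopes fitted to monthly anomaly series; a minimal sketch of that computation on synthetic data (not the RSS series itself):

```python
import numpy as np

def trend_per_decade(monthly_anomalies):
    """Ordinary least-squares slope of a monthly series, in K per decade."""
    t_years = np.arange(len(monthly_anomalies)) / 12.0  # time axis in years
    slope_per_year = np.polyfit(t_years, monthly_anomalies, 1)[0]
    return slope_per_year * 10.0

# Synthetic check: 35 years (420 months) warming at exactly 0.145 K/decade.
t = np.arange(420) / 12.0
series = 0.0145 * t  # noise-free linear warming
print(abs(trend_per_decade(series) - 0.145) < 1e-6)  # True
```

On real, noisy data the fitted slope is sensitive to the chosen start and end points, which is exactly why the thread's 1979-1998 and 1998-now windows give such different answers.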

Editor
March 24, 2014 10:46 am

Funny how Mann keeps referring to the “pre-industrial temperature”.
I wonder why he does not call it “The Little Ice Age temperature”?

Steve from Rockwood
March 24, 2014 10:46 am

Mike Bromley the Kurd says:

March 24, 2014 at 10:22 am
Steve from Rockwood says:
March 24, 2014 at 9:58 am
Are you a lawyer? Or a scientist? Or neither? 0.01 degrees, although “positive” is 0.01 degrees, nonetheless. Unmeasurable. Only calculable. To say that still means ‘warming’ is to grab your legalese tweezers and pick fly sh*t out of the black pepper, in a most odious manner.

Am I a lawyer? Well that was a bit harsh. No, I have a scientific background (professional geoscientist, geophysics).
My point was important given the fine line “warmists” are taking these days even though the global temperature trend for 17 years and counting is essentially flat to declining.
Mann is arguing (incorrectly) that there is no pause, that while the rate of warming has declined, it is still positive. Unfortunately, Monckton repeats Mann’s very statement in attempting to falsify Mann’s argument.
The rate of warming has not merely slowed, as Mann claims: it has gone to zero and may even be negative. If Mann is allowed to keep his “rate of warming has declined” then he still gets to keep his warming.
The rate has not simply declined; it has turned negative and may increase in the negative direction (global cooling). This could be the end of “the last decade was the hottest on record”.

Doug Proctor
March 24, 2014 10:47 am

Steyn is accused by Mann of saying he is a liar. To be a liar is not the same as saying one has lied; it implies a continuing, knowing propagation of falsehood.
A Scientific American article of this level that repeats refuted claims should be evidence that Mann continues to push falsehoods, and therefore is not just lying this time (or some previous time has lied) but is a liar by habit, nature or purpose.

Rob Ricket
March 24, 2014 10:48 am

Anthony,
Willis’ outstanding work aside, I thought the infamous “hide the decline” and “nature trick” referred to hiding a divergence between instrument and proxy records by attaching instrument temperatures (without attribution) to the proxy reconstruction? Accordingly, (as I understand it) the “nature trick” simultaneously selected the warmer of two data sets and disguised a divergence in proxies indicative of an unreliable reconstruction.
If Willis is correct, this latest bit of data tampering is different from the “nature trick” in as much as, it requires adjustment to the instrument record.
REPLY: I can see how you can be confused, but my point is that Mann elevated the last decade+ of surface temperatures, rather than showing them lower. I thought the parallel was apt, YMMV. I’ve updated the first paragraph to make my intent clearer. – Anthony

March 24, 2014 10:48 am

EDIT: A figure in my prior post is off, coming from carelessness in writing too fast. The links work, though.

JJ
March 24, 2014 10:53 am

Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm.”
The peer-reviewed consensus story in 2007 was that to limit warming to 2C, it was necessary to reduce CO2 from 385 to 350 ppm. The global warming alarmist organization 350.org took its name from that very assertion.
Now we’re at 400 with no increase in temps at all.
So, Mann tells us that the new magic peer reviewed consensus figure for the death of civilization is 450 ppm.
Baby steps.

March 24, 2014 10:58 am

Ahh…”faux pause” is a nice pun. Still, it was never so much a pause, as a correction to Mann’s optical illusion of a rise. His trompe l’oeil was chalk-drawn sidewalk art, and the disillusionment came when a few men rudely came along and stood (or imagine a more profane act) in the middle of the piece. The artist’s apoplexy at this denouement is a bit comical, but understandable. A bit of reality interjected into a bit of a scary fantasy. C’est la vie.
http://www.pinterest.com/pin/150026231306910130/

March 24, 2014 10:59 am

EDIT 2 to prior posts:
Part of what is going on is that the least squares fits are very sensitive to start/end points as can be seen from the website’s calculator.
Anyway, for each of the following, one can click on the “raw data” link underneath the plot then scroll down to the bottom to see the calculated trend line slope, though obviously they are all effectively somewhat arbitrary when weather variation is much more than the climate signal:
1979-1998:
http://woodfortrees.org/data/rss/from:1979/to:1998/plot/rss/from:1979/to:1998/trend
1979-now:
http://woodfortrees.org/data/rss/from:1979/plot/rss/from:1979/trend
1998-now:
http://woodfortrees.org/data/rss/from:1998/plot/rss/from:1998/trend

Datatype
March 24, 2014 11:06 am

A good guess at why that data is shifted up may be that if the other data sets were used, the graphic would show temps about to cross over the boundary of the lowest model estimate. Such a graph would highlight that the actual data was well on the way to showing the models have issues. It would be instructive if an example graphic showing what Mann’s graph would look like using those other data sets were created. Not sure if Willis’s tool can do that as well?
REPLY: that is an excellent point. I’ll ask. – Anthony

March 24, 2014 11:06 am

given this chart does it matter if it gets warmer? http://snag.gy/BztF1.jpg
they have to hide the ice age context of the warming and keep the debate within 100 years snapshot to scare people .
anyone looking at that chart would not be worried by any kind of warming. Indeed the cool conditions right now are the anomaly ?
they still have to prove a link to co2.

March 24, 2014 11:07 am

Re: links:
Or switch between data and plot by switching whether the .org part of the links is followed by /data/rss or by /plot/rss.

DirkH
March 24, 2014 11:08 am

Monckton of Brenchley says:
March 24, 2014 at 10:21 am
“In answer to Reaumur, the Boolean operator for truth value returns -1 for true in many computer programs, particularly older programs where the microcircuitry was not as reliable as it is today. In the old 8-bit registers (in the Z80 chip, for instance) the value 1 was stored as a 1 followed by seven zeroes (the binary digits being recorded right to left). If the first bit were defective, a false value might be obtained. So truth was represented as “11111111”, which the chip understood as -1, and falsehood was represented as “00000000”.”
Reliability was not the reason to develop this convention of using -1 == 1111…111 as “true”. Rather, the highest bit of a register serves as the sign in 2’s-complement arithmetic, where 0 = positive number, 1 = negative number; and nearly all processors have an instruction to test the sign of an integer number and jump on negative. So that allowed slightly smaller and faster machine code.

March 24, 2014 11:12 am

Mann expects the “adjustments” to the data sets to reach his figures, and that is why he printed them. A bit premature, but his ego would not allow him to hide that reality.

March 24, 2014 11:19 am

Monckton says
1979-2013 Dec 420 months +0.145 Cº/decade
Henry says
amazing how good this compares to my own result of a finitely small (but balanced!) sample
1980-2011 Dec, +0.013/annum = +0.13 C/decade
(Means Table, bottom
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
Note however, that from 2000
I report
-0.17 C/ decade……
excluding UAH
http://www.woodfortrees.org/plot/hadcrut4gl/from:1987/to:2015/plot/hadcrut4gl/from:2002/to:2015/trend/plot/hadcrut3gl/from:1987/to:2015/plot/hadcrut3gl/from:2002/to:2015/trend/plot/rss/from:1987/to:2015/plot/rss/from:2002/to:2015/trend/plot/hadsst2gl/from:1987/to:2015/plot/hadsst2gl/from:2002/to:2015/trend/plot/hadcrut4gl/from:1987/to:2002/trend/plot/hadcrut3gl/from:1987/to:2002/trend/plot/hadsst2gl/from:1987/to:2002/trend/plot/rss/from:1987/to:2002/trend
It seems I was right
we are globally cooling
until 2040
You can find the reason in the movements of the planets
(I doubt the story about the cosmic rays)

March 24, 2014 11:20 am

I wouldn’t worry, Mann’s graph will be 100% accurate after a few more GISS “adjustments.”

pottereaton
March 24, 2014 11:20 am

Col Mosby says:
March 24, 2014 at 10:27 am
I assume that we won’t hear a demand from Mann for a public debate anytime soon. As in never.
————————–
Mann only debates strawMenn in public.

March 24, 2014 11:22 am

Also, can’t comment on this without linking Goddard.
https://stevengoddard.wordpress.com/2013/12/21/thirteen-years-of-nasa-data-tampering-in-six-seconds/
It is just not remotely plausible that so many record highs from summer 1936 survive even though it was actually much colder then.

March 24, 2014 11:25 am

“Mann only debates strawMenn in public.”
There are two ways to win an argument. The first is to calmly and rationally argue your position, gathering evidence and presenting it in a reasonable way that considers all the tradeoffs of various policy options.
The other is to stamp your feet and scream and denounce and ban and delegitimize.
[…]
This is, of course, one of the uglier aspects of the politicized life. When those with whom you disagree are not just wrong but also evil they and their ideas are unworthy of debating with. They are to be mocked and vitriol is to be heaped upon them—but their arguments are not to be touched. To do so would be to grant them a veneer of validity and run the risk of having their ideas contaminate the public at large.

tom
March 24, 2014 11:27 am

I have also digitised the white line in the Scientific American infographic, and I think Willis Eschenbach is being too kind to Mike Mann by trying to fit the line to known datasets. The data I extracted is about 0.4 degrees C warmer than Willis’s, and bears no relation at all to the known data. Climate data isn’t my thing, but data extraction is. I used the tool at http://arohatgi.info/WebPlotDigitizer/app/ and I’d send my comparative results if there were a way to submit images/spreadsheets.
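Graph digitizers of the kind linked above work by calibrating a linear map from pixel coordinates to data coordinates using two known axis ticks; a minimal sketch with made-up calibration values (not taken from Mann's figure):

```python
def axis_map(pixel_a, value_a, pixel_b, value_b):
    """Linear pixel->data map fixed by two known axis tick positions."""
    scale = (value_b - value_a) / (pixel_b - pixel_a)
    return lambda p: value_a + (p - pixel_a) * scale

# Hypothetical calibration: pixel x=100 is year 1950, x=900 is year 2000;
# pixel y=600 is 0.0 C anomaly, y=100 is 1.0 C (screen y grows downward,
# which the negative scale handles automatically).
x_to_year = axis_map(100, 1950.0, 900, 2000.0)
y_to_temp = axis_map(600, 0.0, 100, 1.0)

# Points traced along the curve, as (pixel_x, pixel_y) pairs:
traced = [(100, 600), (500, 480), (900, 250)]
data = [(x_to_year(px), y_to_temp(py)) for px, py in traced]
print(data)  # years and anomalies recovered from pixel positions
```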

Chuck L
March 24, 2014 11:32 am

jauntycyclist says:
March 24, 2014 at 11:06 am
given this chart does it matter if it gets warmer? http://snag.gy/BztF1.jpg
Or to quote a certain Secretary of State, “What difference does it make?”
Sorry, couldn’t resist.

Village Idiot
March 24, 2014 11:32 am

Sir Chris says:
“…….has arisen from the long pause, now 13 years 2 months in length on the mean of all five datasets”
That this statement is cobblers can be proved simply by looking at the figure above (amateurishly not labelled) just above “Now compare that plot above to this portion…” second to last graph in Tony’s bit.

Greytide
March 24, 2014 11:32 am

Excellent. Science will win in the end; it just needs the dedication of the likes of Monckton of Brenchley to keep plugging away.
A couple of edits needed?
Should this not be in bold:-
Mann: If the world keeps burning fossil fuels at the current rate, it will cross a threshold into environmental ruin by 2036. The “faux pause” could buy the planet a few extra years beyond that date to reduce greenhouse gas emissions and avoid the crossover–but only a few.
Mann: “The dramatic nature of global warming captured world attention in 2001, when the IPCC published a graph that my co-authors and I devised, which became known as the ‘hockey stick’. The shaft of the stick, horizontal and sloping gently downward from left to right, indicated only modest changes in Northern Hemisphere temperature for almost 1,000 years–as far back as our data went.”

Ed Reid
March 24, 2014 11:32 am

Mike Bromley the Kurd @ March 24, 2014 at 10:22 am
Mike, there is no tool which can calculate insignificant figures faster than a computer. If it can’t be measured, it is not significant.

thisisnotgoodtogo
March 24, 2014 11:45 am

Read the tricky description. The white line is not instrumental temperature.
It looks like a hybrid modeled sensitivity.

Mac the Knife
March 24, 2014 11:48 am

This Mikey Mann created faux pas is not an embarrassment to Mikey. It has become his raison d’etre. The world of disreputable climatologists knows no shame. Dishonesty, deceit and deception are their standard operating procedure and they willingly embrace them because they know what is good for all of us. They mean to ‘save the world’ by whatever megalomaniac means are necessary.
That’s why they won’t debate model output and cherry picked data analyses refuted by verifiable data.
That’s why they first turn to ad hominem attack.
That’s why they attempt to exclude climate realists from the media, science journals, and teaching positions.
That’s why they gravitate to teaching positions, at all levels of education.
That’s why we must use this ‘pause/cooling phase’ to maximum advantage, to discredit them at every possible point and turn.

Bill Illis
March 24, 2014 11:50 am

The legend on Mann’s chart says “Northern Hemisphere Surface Temperatures”.
The line does bear some resemblance to Hadcrut4 for the NH (adding 0.45C to the anomalies to approximate starting at Zero).

March 24, 2014 11:52 am

Steve from Rockwood says:
March 24, 2014 at 9:58 am

The rate of warming would have to decline to zero or negative to falsify Mann’s statement (which it has).

You’re not taking into account the modifier Mann used: “… warming continues unabated ..” (emphasis added). It is not necessary to establish that warming has reversed and become cooling to declare Mann’s assertion false — any significant reduction in the rate of warming would do. Of course, warming morphing into cooling would be the most extreme form of abatement.

March 24, 2014 11:54 am

co2ers say they won’t debate for the same reason scientists don’t debate with creationists.
Which means they make an equivalence between scientists debating other scientists and science debating religion?
Which is another way of saying anyone who does not submit to the co2 snapshot warming narrative [decontextualised from any ice age cycle] is just the same as a creationist, i.e. they are not a scientist and not worth anything.
When you contextualise the co2 narrative within ice age cycles, there is nothing to worry about. We have at least 3 deg of warming before we reach the top of the range, which at current rates will be when?
Rather than being frightened of warming that has happened before, while civilisation flourished, we should look forward to it [if it happens]. Rather, we should examine what the climate was in those warm times to see how it will change. Pelican pie, anyone [for those in the UK]?

TomB2
March 24, 2014 11:55 am

Steve from Rockwood, if you were a lawyer you might understand the meaning of the word “unabated.”

rogerknights
March 24, 2014 11:57 am

Time is on our side. We contrarians can ride the recline.

Bruce Cobb
March 24, 2014 11:57 am

Faux Nobel-prizewinner and faux scientist Mann issues his faux science FUD to desperately try to prop up a fast-failing faux hypothesis of manmade warming. He’s the king of faux.

Keith W.
March 24, 2014 12:08 pm

I think we have another “fraudulent” hockey stick from da Mann.

MarkB
March 24, 2014 12:15 pm

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since there doesn’t seem to be any citation given for what the temperature dataset used was. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, and showing the current temperature equal to 1998, which doesn’t make any sense.
Explanation of graph including links to source code and data were given here: http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/
REPLY: Yes, I’ve seen that, but there is a discrepancy, the label on the image is “Historical mean annual temperature” (white)
In http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/ it is written:

Historical Simulations. The model was driven with estimated annual natural and anthropogenic forcing over the years A.D. 850 to 2012. Greenhouse radiative forcing was calculated using the approximation (ref. 8) FGHG = 5.35log(CO2e/280), where 280 parts per million (ppm) is the preindustrial CO2 level and CO2e is the “equivalent” anthropogenic CO2. We used the CO2 data from ref. 9, scaled to give CO2e values 20 percent larger than CO2 alone (for example, in 2009 CO2 was 380 ppm whereas CO2e was estimated at 455 ppm). Northern Hemisphere anthropogenic tropospheric aerosol forcing was not available for ref. 9 so was taken instead from ref. 2, with an increase in amplitude by 5 percent to accommodate a slightly larger indirect effect than in ref. 2, and a linear extrapolation of the original series (which ends in 1999) to extend through 2012.

“Historical mean annual temperature” is NOT the same as “Historical Simulations”. It looks to me like a bait and switch.
– Anthony
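As a side note on the quoted methods: the forcing approximation F = 5.35 log(CO2e/280) uses the natural logarithm (the standard simplified expression). A minimal sketch of the calculation, using only the 2009 figures quoted in the excerpt (CO2 = 380 ppm, CO2e = 455 ppm); this is illustrative, not Mann's actual code:

```python
import math

def ghg_forcing(co2e_ppm, preindustrial_ppm=280.0):
    """Greenhouse radiative forcing in W/m^2 via the logarithmic
    approximation F = 5.35 * ln(CO2e / 280) quoted in the methods."""
    return 5.35 * math.log(co2e_ppm / preindustrial_ppm)

# 2009 values quoted in the excerpt:
print(round(ghg_forcing(455.0), 2))  # CO2e of 455 ppm -> ~2.6 W/m^2
print(round(ghg_forcing(380.0), 2))  # CO2 alone, 380 ppm -> ~1.63 W/m^2
```

The 20 percent CO2e scaling described in the excerpt is what separates the two numbers above.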

David Kleppinger
March 24, 2014 12:16 pm

In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.

Reply to  David Kleppinger
March 25, 2014 5:26 am

Kleppinger – In machine language, 0 is off, and 1 is on. Get down to basics.
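For what it’s worth, the convention in C and most C-descended languages is the reverse of what was stated above: zero tests as false and any nonzero value as true. (The “0 means success” convention belongs to shell exit statuses, which may be the source of the confusion.) A quick check in Python, which follows the C convention:

```python
# Zero is falsy; any nonzero value, including -1, is truthy.
print(bool(0))    # False
print(bool(1))    # True
print(bool(-1))   # True
```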

March 24, 2014 12:27 pm

‘hide the decline’ is more ‘hide the ice age’: the record shows the current cool period is an extreme, and that since the last ice age warmer conditions have been the norm.
http://jonova.s3.amazonaws.com/graphs/lappi/gisp-last-10000-new.png

Bob Koss
March 24, 2014 12:30 pm

Anthony,
In addition to the faux hump he added after 1998, no year after about 1917 is below the zero anomaly line. It appears to me he either used only a single year or two very early in the record as his reference period or arbitrarily put the zero anomaly line where he thought it would be most impressive.

C.M. Carmichael
March 24, 2014 12:30 pm

It is only a “pause” if it resumes its original direction. It may be a plateau, but until it moves one way or the other it has stopped.

thisisnotgoodtogo
March 24, 2014 12:32 pm

Anthony,
Since Mann used a simulated temperature that goes higher than instrumental, then picked the model projection that best matches his simulation, he arrived at the quick rise to 2 degrees.
REPLY: No doubt, but the label for the graph on the white line (implying actual historical global surface temperature) says more about “bait and switch” than it does about anything else. – Anthony

Editor
March 24, 2014 12:37 pm

Datatype says:
March 24, 2014 at 11:06 am

A good guess at why that data is shifted up may be that if the other data sets were used, the graphic would show temps about to cross the boundary of the lowest model estimate. Such a graph would highlight that the actual data was well on the way to showing the models have issues. It would be instructive if an example graphic were created showing what Mann’s graph would look like using those other data sets. Not sure if Willis’s tool can do that as well?
REPLY: that is an excellent point. I’ll ask. – Anthony

Actually, all of the observational results fit within Mann’s floor-to-ceiling range of predictions, as would just about any projection. I suspect he jacked the post-2000 results just so he could claim it is a “faux pause”.
The other oddity is the red-orange-gold-yellow group of lines. Theoretically these are different model runs at different sensitivities. However, they all come together in ~ 1930, and then again in about 1975. At other times, however, they spread out widely … how does that work?
It is also quite bizarre to me that in 1975, all five different climate sensitivities gave about the same answer. 13 years later, the temperature had gone up by 0.6 degrees, and the five models had an intermodel spread of only about 0.2 degrees.
But over the next decade and a half, the temperature barely changed … but the spread in the models by 2013 is a full eight-tenths of a degree … say what? What would cause such a large increase in the inter-model spread from 1998 to 2013, when there is almost no increase in the spread in the models from 1975 to 1998?
Whole thing is a farrago of nonsense if you ask me …
w.

Steve from Rockwood
March 24, 2014 12:44 pm

Watt and TomB2. The essence of Mann’s argument is summarized by Mann himself:

The misunderstanding stems from data showing that during the past decade there was a slowing in the rate at which the earth’s average surface temperature had been increasing. The event is commonly referred to as “the pause,” but that is a misnomer: temperatures still rose, just not as fast as during the prior decade.

Mann’s assertion is false. It hasn’t been a slow-down in the rate of increase (a negative second derivative resulting in a lower first derivative), it has been shut-off (second and first derivatives zero) and in some cases a reversal (negative second and first derivatives).
As for “unabated” if a truck has been speeding toward you reaching a rate of 100 km/hr (first derivative) and the acceleration was increasing (positive second derivative) but that rate went to zero when the truck reached 140 km/hr (zero first and second derivative), it is still coming toward you “unabated”. Only a lawyer would argue otherwise.
Attack Mann where he made his mistake. He believes the rate of increase is still positive (positive first derivative) and indeed his graph seems to show this. His graph doesn’t look like any other data set so perhaps he “showed the incline” with falsified data.

milodonharlani
March 24, 2014 12:50 pm

Bruce Cobb says:
March 24, 2014 at 11:57 am
Faux him!

thisisnotgoodtogo
March 24, 2014 12:52 pm

Not only did Mann say “historical temperature” (though it isn’t), he used North American.
It leads one to wonder whether his CO2 scenarios are North American and projections are therefore for North America only 🙂
It’s mix’n’match whichever shows highest fastest will do!

KNR
March 24, 2014 1:01 pm

If Mann told you the time of day, you’d still check with at least two other people, even if he was standing under a great big clock.
But to be fair, when it comes to artistry, very few beat Mann for BS artistry.

MarkB
March 24, 2014 1:01 pm

REPLY: Yes, I’ve seen that, but there is a discrepancy, the label on the image is “Historical mean annual temperature” (white)
. . .
“Historical mean annual temperature” is NOT the same as “Historical Simulations” It looks to me like a bait and switch.

The colored lines are simulations using the EBM model and various sensitivities. The white line is HADCRUT Northern Hemisphere. It should have been indicated that this is supposed to be a Northern Hemisphere-only analysis. There are, no doubt, things that can be criticized about this plot, but speculating about the process when one has been provided well-commented source code and data is pretty lame.
REPLY: and Putting a Northern Hemisphere plot on a graph labeled “Global temperature rise” is really the ultimate in lameness. Like I said, the whole thing is bait and switch. – Anthony

Daniel H
March 24, 2014 1:02 pm

I noticed a similar distortion of the temperature record on The Weather Underground web site. A couple of weeks ago I was checking my local weather and I noticed a tab called “climate change” which I hadn’t seen before. So I clicked on it and the first thing I noticed was a graph of the “Global Surface Temperature” with a distorted 1998 Super El Nino year. I mean it has obviously been adjusted downward. It’s simply not accurate.
I can’t say I was too surprised given all the adjusting that goes on these days so I didn’t make much of it. But now that I look at it again I realize that it’s very similar to Mann’s chart. Anyway, the source of the data is NOAA, so I’m assuming it’s based on GISS. Check it out:
http://www.wunderground.com/climate/

bw
March 24, 2014 1:29 pm

Recall that HadCRUT4 is a “processed” index, and reflects reality only in the imaginations of the programmers. I doubt that any independent analysis would be able to replicate it. If it can’t be replicated, it is not science.
There are no measured global temps before the satellite era (1979), only individual stations. Rural stations with reliable records show zero warming since records began.
The 1930s were the warmest decade in the “modern” era; both land thermometers and paleo-proxy estimates show this. Anyone claiming current temps are greater than the 1930s needs to show their data.
To support the claim that the 1930s were about the same as today, here is a plot showing the “modern” era temps on the same scale as HadCRUT4, but using BEST land temps. Now add a second plot showing the available satellite data. It is obvious that the land temps since 1980 do not match the satellite-based land temps. The BEST plot is scaled to show a more realistic Y-scale for longer timelines.
http://www.woodfortrees.org/plot/best/scale:3/from:1880/plot/rss-land
If the satellite data are good, why does the BEST data diverge since 1980??

Ed Reid
March 24, 2014 1:29 pm

Data simply are. Data which require “adjustment” are bad data; and, they do not become good data after adjustment, even if they do become good estimates. Infilling does not produce data, though it might produce good estimates. While Rumpelstiltskin was reputed to be able to spin straw into gold, even he could not spin nothing into gold.
It is important to remember, in discussions like this, that we are talking about anomalies which are not as large as the expected errors in the temperature data used to calculate them. Anomalies are being reported to two decimal places when the underlying data is not accurate in the first place to the left of the decimal point.
In reality, we are involved in an argument over data that aren’t and models that don’t. If that appears to be irretrievably silly, it is probably because it is irretrievably silly. There is nowhere that should be more obvious than here.

Editor
March 24, 2014 1:37 pm

Bill Illis says:
March 24, 2014 at 11:50 am

The legend on Mann’s chart says “Northern Hemisphere Surface Temperatures”.
The line does bear some resemblance to Hadcrut4 for the NH (adding 0.45C to the anomalies to approximate starting at Zero).

Huh? Which chart? The chart shown as Figure 1 says:

Historical Mean Annual Temperatures (white)

and that is the line I digitized.
Having said that, however, I agree with you regarding the white line. I just looked at the HadCRUT4 Northern Hemisphere graph, and it’s quite a good match (altho not perfect) to the white line.
w.

P.D. Caldwell
March 24, 2014 1:41 pm

Thanx. This is a keeper critique of MM’s latest propaganda. It should be in every science portfolio.

Svend Ferdinandsen
March 24, 2014 1:45 pm

Really funny:
“think that a single decade is too brief to accurately measure global warming”
It must work both ways, so the warming seen from 1980 to 2000 may be too short to determine if there was any so-called global warming!

March 24, 2014 1:47 pm

How can billions have been bet on such flimsy reasoning? Temps are in a downtrend, we are in a cold period, yet everyone is excited about ‘warming’ like it is a bad thing, when the evidence shows warmer is historically more normal than what we have now.

DAV
March 24, 2014 1:48 pm

Réaumur says:
March 24, 2014 at 10:04 am
Where is -1 used to represent true?
Setting all bits of an integer or character to 1 is often used, which can be interpreted as -1 in two’s complement arithmetic.
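DAV’s point is easy to verify: in two’s complement, an all-ones bit pattern reads as -1. Python’s integers behave like infinite-width two’s complement, so the same identity holds there; a quick check (the 8-bit reinterpretation uses the standard struct module):

```python
import struct

# Flipping every bit of 0 yields -1 (two's complement identity).
assert ~0 == -1

# Explicit 8-bit case: 0xFF (all bits set) reinterpreted as a
# signed byte is -1.
(signed,) = struct.unpack("b", bytes([0xFF]))
print(signed)  # -1
```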

March 24, 2014 1:54 pm

Interesting:
the label on the image is “Historical mean annual temperature” (white)
yet in the description it reads “The estimate that best agrees with the recorded data reflecting the sensitivity of the earth’s climate (white)”
So, is the white line an “estimate that best agrees with the recorded data” or is it “historical mean annual temperature”?
If the latter, which historical recorded data record is it since it doesn’t match any of the widely accepted ones?
If the former, then who here would assume that a Mike Mann “estimate” is reasonable when it doesn’t match observational data?

thisisnotgoodtogo
March 24, 2014 1:55 pm

[DIR] Parent Directory –
[TXT] A_README 27-Feb-2014 15:50 1.7K
[ ] BEST_annual_nh.dat 01-Oct-2013 16:45 8.4K
[ ] GISTEMP2013_NH.dat 28-Jan-2014 10:48 1.4K
[ ] GISTEMP_NH.dat 18-Dec-2013 14:48 1.3K
[ ] HadCRUT4_annual2013_nh.dat 28-Jan-2014 10:52 1.9K
[ ] HadCRUT4_annual_nh.dat 18-Dec-2013 11:14 1.8K
[ ] aerosol.dat 15-Dec-2013 18:08 6.8K
[ ] co2.dat 29-Sep-2013 16:54 21K
[ ] solar.dat 21-Oct-2013 14:37 20K
[ ] volcanic-ammann.dat 29-Jul-2011 16:11 37K
[ ] volcanic-crowley.dat 16-Nov-2003 17:43 32K
[ ] volcanic-robock.dat 12-Jul-2011 16:53 95K

Richard Day
March 24, 2014 2:07 pm

Clearly it’s the FOMI* dataset.
* Figment of Mann’s Imagination.

March 24, 2014 2:21 pm

“…it gave almost 400 times as much weighting to datasets showing the medieval warm period as it did to datasets that did not show it…”
Other way around perhaps?

JJ
March 24, 2014 2:26 pm

Anthony Watts says:
@Willis and the lead in text says “Global temperature rise…”

Not only that, but the rest of Mann’s text refers repeatedly to “… the earth’s climate…” and “… the world…”.
Specifically, he identifies the white line as:

“… recorded data reflecting the sensitivity of the earth’s climate (white)…”

(emphasis mine)
Not merely “bait and switch”. Instead, an explicit lie.
And above, poster Daniel H identifies that there is a similar “disappear the 1998 El Nino” graph being promoted by Weather Underground as part of its coordinated support of the “Year of Climate Change Activism.” That one is clearly labeled “Global Surface Temperature” …

Admad
March 24, 2014 2:31 pm

Colorado Wellington
March 24, 2014 2:41 pm

… “greatest disinformation campaign ever run”.

Just because Dr. Mann has problems with his model projections doesn’t mean he’s not good at projection.

Jeff
March 24, 2014 2:46 pm

I can only say I think he Mann ipulated the data to come up with that graph.
Some people have no shame….

rogerknights
March 24, 2014 3:01 pm

First he disappeared the MWP, now he’s disappeared the Southern Hemisphere. What a magician!

Bill Illis says:
March 24, 2014 at 11:50 am
The legend on Mann’s chart says “Northern Hemisphere Surface Temperatures”.

How apt: In spycraft, “legend” means “cover story.”

rogerknights
March 24, 2014 3:04 pm

Jeff says:
March 24, 2014 at 2:46 pm
I can only say I think he Mann ipulated . . .

Wonderful–let’s add “Mannipulated” to our repertoire.

March 24, 2014 3:10 pm

Science: During the decade February 2005 to January 2014, on the mean of all five datasets, there was a warming of 0.01 Cº, statistically indistinguishable from zero.
This looks like a typo. Should it be February 2004 to be a decade?
However that does not change the basics of the discussion. But if this should be submitted to Scientific American, this typo should be fixed and it should be verified that the 0.01 C applies to February 2004.
I can only do WTI on WFT, but that gives similar low numbers.
Using WTI, WFT gives a negative slope of: slope = -0.0001374 per year for 13 years and 2 months from December 2000 to January 2014.
However for the last 120 months, it is slightly positive at:
slope = 0.000347889 per year
See:
http://www.woodfortrees.org/plot/wti/from:2000.9/plot/wti/from:2000.9/trend/plot/wti/last:120/trend
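The slopes quoted from WoodForTrees are ordinary least-squares trend fits over the chosen window. A minimal sketch of how such a slope comes out of a time series; the data points below are made up for illustration, not WTI values:

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical check: points on the line y = 2x + 1 recover slope 2.0,
# while a perfectly flat series gives a slope of exactly zero.
print(ols_slope([0, 1, 2, 3], [1, 3, 5, 7]))  # 2.0
print(ols_slope([0, 1, 2], [5, 5, 5]))        # 0.0
```

Feeding monthly anomalies in decimal years through the same formula gives the per-year trends WFT reports.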

Editor
March 24, 2014 3:12 pm

Typo? “[The model] gave almost 400 times as much weighting to datasets showing the medieval warm period as it did to datasets that did not show it”.
“Step by inexorable step, the IPCC is being driven to abandon one extremist prediction after another, as real-world observation continues to fall a very long way short of what it had been predicting.” This is a tragedy. It should never have reached this stage, because the very obvious shortcomings of the IPCC analysis should have been picked up and aired by large numbers of scientists right from the beginning. When it’s all over, we need some sort of enquiry into how scientists were so easily bullied into silence (or whatever else might be the explanation), and an attempt to set things up in future to prevent recurrence in this or any other field.
Christopher Monckton – a couple of your points are a stretch, and will give ammunition to your opponents. I suspect that your contempt for them is sufficient that you won’t care, but nevertheless may I respectfully suggest a bit of peer review next time? (no pun intended).

March 24, 2014 3:15 pm

Steve from Rockwood says:
March 24, 2014 at 12:44 pm

As for “unabated” if a truck has been speeding toward you reaching a rate of 100 km/hr (first derivative) and the acceleration was increasing (positive second derivative) but that rate went to zero when the truck reached 140 km/hr (zero first and second derivative), it is still coming toward you “unabated”. Only a lawyer would argue otherwise.

Steve: I think your analogy is off by a derivative. What Mann actually claimed was “Global warming continues unabated.” (I am taking it on faith here that Christopher Monckton has accurately quoted him.) Any normal person, as well as 97% of lawyers, would take the word “warming” to indicate a discernible movement toward higher temperature. Indeed, that is the only interpretation possible in this context, as Mann is arguing against the claim that warming has stopped, which he calls the “faux pause”. Any reduction in the rate of warming is therefore an “abatement”, which Mann claims does not exist.
Your claim that warming has gone negative is stronger than Monckton makes — he just asserts that by all the common data sets net global warming over a period of at least 15 years is statistically indistinguishable from zero. Your analogy creates a false impression of danger (a truck heading at 140 km/hr towards a stationary person), which doesn’t apply to the climate — there is no stationary, fragile target to be “hit” by the crushing mass of a warming climate.
We are arguing relative nits here, but I think Mann’s assertion is effectively refuted by Monckton’s observation; it does not require the stronger claim you make, and I believe support for your claim is much less solid than for Monckton’s.

rogerknights
March 24, 2014 3:16 pm

ScAm will have to allow a letter to be printed, or voluntarily acknowledge, that the chart was mislabeled. Readers will do a double-take and re-assess Mann’s overall credibility in light of that–as well as re-assessing the stringency of warmist pal review. Mann’s typical over-reaching is going to result in a wrist-slap, at a minimum. That’s in the short term.
In the long term, this will be an albatross around his neck forever–one our side can harp on by always referring to him henceforth as Mr. Mannipulator. He has definitively revealed his deceitful character. This is a simple situation with no “outs” for him, not one he can semi-credibly tap-dance away from, like his hockey stick.
(Hmm–maybe he can say ScAm wrote the legend–and ScAm will be his fall guy?)

March 24, 2014 3:26 pm

When I was a kid there was a saying, “Don’t take any wooden nickels.”
It would seem that Mann thinks they have actual value.

March 24, 2014 3:47 pm

‘What needs to be done is to create a graph that shows what this would have looked like had Mann not cherry picked the NH and presented it on a graph with the text “Global temperature rise…”’
Has anyone stated they will be working on this?

David Ball
March 24, 2014 3:52 pm

“Imagine if a climate skeptic made a graph like this. We’d be excoriated.” – Anthony Watts
Some amazing words from our host.
History being made right there.
It doesn’t even mention the creator of this graph. Beautiful.

James Hein
March 24, 2014 3:52 pm

Poor Mr. Mann even Or-ing the values together didn’t give him a True

David in Texas
March 24, 2014 3:52 pm

Something seems amiss with Willis’s “graph digitizer”. On his graph, the inflection point at 2002 is clearly above the 1999 high, but the insert from Scientific American you included in the article clearly shows it being below. I downloaded the Excel file that you included to verify that was the case.

Justus
March 24, 2014 3:53 pm

Climate change proponents remind me of the Farmers’ Almanac. Every year, there is supposed to be the most severe winter in history.

Steve from Rockwood
March 24, 2014 3:56 pm

Alan Watt, Climate Denialist Level 7 says:
March 24, 2014 at 3:15 pm
————————————————————
While we are picking at nits…
T = temperature
dT/dt = rate of change in temperature with respect to time (first derivative)
d2T/dt2 = rate of change in the rate of change in temperature (second derivative)
Mann is claiming that while the second derivative has declined, the first derivative has not – that in fact temperatures continue to rise.
Monckton’s comment was “since the warming rate is declining”. This reads like it refers to a negative second derivative, which has the effect of reducing the first derivative but not necessarily making it zero or negative.
At the risk of seeming very nit pickity, Monckton’s comment does not invalidate Mann’s claim. I think Monckton meant to write “since the warming is declining”. This would invalidate Mann’s claim.
It’s important to pick nits here because that is exactly what Mann is doing. He also produces a graph that seems to support his claim even though the graph doesn’t seem to match any temperature record.
The truck analogy would have been more suitable if I had the truck “accelerating” to 1.0 km/hr and then acceleration ending when the truck had reached the unabated speed of 1.2 km/hr.
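The distinction being argued here, the first derivative (the warming rate) versus the second derivative (the change in that rate), can be made concrete with finite differences. A sketch on a hypothetical series (values in tenths of a degree, chosen only to illustrate a “rise then pause” shape):

```python
def diffs(series):
    """First differences: a discrete first derivative."""
    return [b - a for a, b in zip(series, series[1:])]

# Hypothetical temperatures (tenths of a degree): rising, then flat.
T = [0, 2, 4, 4, 4]
dT = diffs(T)    # first derivative: the warming rate per step
d2T = diffs(dT)  # second derivative: change in the warming rate

print(dT)   # [2, 2, 0, 0]  -- the rate falls to zero (a "pause")
print(d2T)  # [0, -2, 0]    -- negative only where the rate drops
```

Mann’s claim is that the real series looks like a gentler version of this: dT shrinks but stays positive; the dispute is over whether the data actually show dT reaching zero.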

March 24, 2014 4:14 pm

First he hides the decline, now he hides the southern hemisphere. What? It’s “Mike’s SciAm Trick™!” Carbon dioxide heats the northern hemisphere, but reverses polarity in the southern hemisphere and cools it? Tiljander was too small. Real Climate Scientists think bigger than that.

hunter
March 24, 2014 4:16 pm

This will not end well for Dr. Mann, if his hope is to be remembered as a positive force in history or science.

Arno Arrak
March 24, 2014 4:52 pm

I quote Michael Mann: “The rate at which earth’s temperature has been rising eased slightly in the past decade, but the temperature is still rising…” That statement is entirely wrong. I have closely monitored earth temperature for years and can tell you that there has been no global temperature rise for 17 years.
I assume the reader is aware that the temperature rise the author is talking about is greenhouse warming, or more correctly the enhanced greenhouse warming, from carbon dioxide we add to the atmosphere. That must be distinguished from natural warming that may also happen at unpredictable times. One example of such natural temperature rise is the super El Niño of 1998 that came out of nowhere and subsided quickly. It did leave a legacy, though, by causing a short temperature rise in its wake that left the twenty-first century temperatures that followed one third of a degree Celsius higher than the last decades of the twentieth century had been. As a result, all the record temperatures are now twenty-first century temperatures despite the lack of any warming. That makes our century the warmest on record as well as greenhouse-free at the same time.
It is well to recall here that the existence of greenhouse warming was first recognized by James Hansen, who reported this to the US Senate in 1988. He showed a rising temperature curve that started in 1880 and reached a peak in 1988. That was the highest temperature within the last 100 years, he said; there was only a one percent chance that this could happen by chance alone. Hence, it followed that the greenhouse effect was detected. The newly established IPCC took that as a fact, and it is now touted as the cause of dangerous warming to come. The fear that the two-degree threshold may be exceeded by 2036 comes from application of this doctrine to temperature calculations by means of climate models. But did Hansen truly prove the existence of greenhouse warming?
An examination of the temperature graph he supplied to the Senate reveals huge problems with it. He speaks of a hundred-year temperature rise, but the record shows something else. First, the early part of the curve does not count as being caused by the greenhouse effect: according to IPCC AR5, the effect of anthropogenic warming does not become observable until about 1950. Hence, the first seventy years of his temperature curve do not count as greenhouse warming years. Secondly, according to his curve there was no warming from the fifties to the mid-sixties. That also gets deducted from his 100-year curve, leaving just the period from the mid-sixties to 1988 as a warming period he can use as proof of greenhouse warming.
But this is not all. Examination of the satellite temperature record where it overlaps Hansen’s graph shows that there were three El Niño peaks between 1980 and 1988. His temperature graph is too coarse to show their presence because he uses a one-year interval between his data points. On top of that he imposes a 5-year running mean upon his data, so that there is no way to see the true shape of the last part of his curve. His relative peak heights are also wrong. All this makes him think that his last data point is the culmination of his hundred-year warming when in fact it is nothing more exotic than an ordinary El Niño peak, the 1987/88 El Niño to be precise. No way can this misplaced and misidentified datum that marks an El Niño peak be the discovery point of greenhouse warming. He simply did not know what was in his own data and made grandiose claims about it. He should now properly withdraw his claim that he observed greenhouse warming in 1988. In the meantime, the IPCC has gone ahead and used his “discovery” to justify their prediction of a coming greenhouse Armageddon. And this meaningless concept is what Michael Mann is still clinging to against all observations of nature.

Niff
March 24, 2014 5:07 pm

We should all be thankful that Lord Monckton applies himself so diligently to challenging all the falsehoods.
The real question is can we actually PROVE that Mickey KNOWS that these manipulated versions of the truth are false or is he a ‘useful idiot’ or ‘scientific fantasist’?
He’s clearly a faux scientist…..goes without saying.

crakar24
March 24, 2014 5:10 pm

I don’t get this bit: how does LM establish the lower figure of CO2, rather than the 405 given by Mann?
Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm. Ironically, if the world burns significantly less coal, that would lessen CO2 emissions but also reduce aerosols in the atmosphere that block the sun (such as sulfate particulates), so we would have to limit CO2 to below roughly 405 ppm. We are well on our way to surpassing these limits.”
Science: What we are concerned with is not CO2 simpliciter, but CO2-equivalent. CO2 itself contributes only 70% of the anthropogenic enhancement of the greenhouse effect. The (admittedly arbitrary) target of 450 ppmv CO2-equivalent is thus a target of only 315 ppmv CO2 – the concentration that prevailed in 1958. Mann’s suggested target of 405 ppmv CO2e would represent just 284 ppmv CO2. And that would fling us back to the pre-industrial CO2 concentration.
Verdict: Truth value 0. We are not “well on our way to surpassing these limits”: we passed them as soon as the industrial revolution began. The current CO2-equivalent concentration of 398 ppmv already exceeds the pre-industrial 284 ppmv by 40%, yet the world has warmed by only 0.9 Cº since then, and our contribution to that warming may well be 0.33 Cº or less.
Can anyone explain this in more detail for me.
TIA
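In reply to crakar24: Monckton’s 315 and 284 ppmv figures appear to come from simply scaling the CO2e targets by the 70 percent share he assigns to CO2. That reading is an assumption on my part, but it reproduces his quoted numbers:

```python
CO2_SHARE = 0.70  # Monckton's stated CO2 fraction of the anthropogenic enhancement

def co2_from_co2e(co2e_ppmv, share=CO2_SHARE):
    # Assumed reading of Monckton's arithmetic: scale the CO2e
    # target linearly by CO2's share of the forcing.
    return share * co2e_ppmv

print(co2_from_co2e(450))  # 315.0 -- the "concentration that prevailed in 1958"
print(co2_from_co2e(405))  # 283.5 -- roughly his 284 ppmv figure
```

Note that because forcing is logarithmic in concentration (per the 5.35 ln(C/280) approximation quoted elsewhere in the thread), a strict 70-percent-of-forcing calculation would not be linear in ppmv; the linear scaling above is simply the reading that reproduces Monckton’s figures, which may be part of why the passage is confusing.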

Björn
March 24, 2014 5:19 pm

Like Tom I also digitized Mann’s white line, and got a similar result: the average difference from the data in the spreadsheet linked in Anthony’s foreword is (rounded to two digits) 0.38, with a population standard deviation of 0.03 (i.e. the wiggles dance in sync, but do not hug or touch). I used ‘engauge’, a digitizer tool I have come to prefer, and while setting axis points in the pictured graph I had indeed noticed that the Fauxline (a.k.a. Mann’s white line) was somewhat different from all of the usual anomaly temperature datasets, in that its graph rarely drops below zero. I assumed Mann had just added an offset to one of the standard sets, in order to be able to plaster it onto some ‘überheiss’ model simulation results. If that is the case then comparing it to one of the standard sets is no problem, as long as you know the value of the offset used: just add it to the standard set, or subtract it from the Fauxline. But I do not really know how that Mann white line was obtained, and can find no indication in the article of where it comes from, or of what explains the obvious push-up from the standard global anomaly sets. Perhaps the sentence “If the Northern Hemisphere’s temperature … blabla….” is telling us that he is really not using a global anomaly set but only a “half a globe” set to cry wolf; that could also explain the discrepancy Anthony noticed.
But be that as it may, I will not put any more effort into figuring this out. If Willis is around, though, a comment from him on how (or whether) he went about ‘normalizing’ the Fauxline to enable a comparison with the standard anomaly sets would be appreciated.
On the side: if you need to extract data from a graph image, the ‘engauge’ tool is a good application for it, with a good selection of discretization settings that can make pulling a single data line out of a spaghetti-rainbow an almost automatic process. It is my preferred tool for such endeavours. I run it in a Linux environment and do not know whether it is available for other operating systems, but if it is, check it out. Should there be no version for your OS, my second choice would be the OODigitizer. It originally came about as a plugin extension to OpenOffice, and in that incarnation works quite well in LibreOffice too, so if you have either of those installed it is a simple matter to fetch and install it through the add-on manager in those apps. If, like me, you have neither installed, there also exists a standalone version written in Java that runs quite happily on any Java-enabled machine, regardless of OS.

March 24, 2014 5:32 pm

Willis;
Having said that, however, I agree with you regarding the white line. I just looked at the HadCRUT4 Northern Hemisphere graph, and it’s quite a good match (altho not perfect) to the white line.
>>>>>>>>>>>>>
I tried a few different running means and 20 months is a pretty good match, at the wiggle level anyway.
At the giggle level, I ran SH and global for comparison and it is clear why Mann used NH only.
http://www.woodfortrees.org/plot/hadcrut4nh/mean:20/from:1980/plot/hadcrut4sh/mean:20/from:1980/plot/hadcrut4gl/mean:20/from:1980
When I first started following this debate, I couldn’t reconcile my understanding of the physics with the claims being made. Now, I no longer even bother to try. There’s simply no point in this kind of misleading advertising unless your product is of such poor quality that there’s no other way to sell it.

Jim Brock
March 24, 2014 5:42 pm

ckb: In Father Eisele’s class at Springhill we used a metal airplane and a bb gun. The airplane had lasted for lo! those many years. Wonder where it is now some 65 years later.

March 24, 2014 5:43 pm

Steve from Rockwood says:

Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.
The warming “rate” can be “declining” and global warming can still “continue” as long as the rate remains above zero. The rate of warming would have to decline to zero or negative to falsify Mann’s statement (which it has).

You need to look up the meaning of the word “abate”:
http://www.thefreedictionary.com/abate‎
v. a·bat·ed, a·bat·ing, a·bates. v.tr. 1. To reduce in amount, degree, or intensity; lessen
“continues unabated” means: “continues without lessening, continues at the same rate or even higher”, it does not mean “hasn’t stopped”.

Ray Blinn
March 24, 2014 5:59 pm

I think the whole idea of using tree rings to determine temperature is ridiculous. That might be why Mann’s Hockey Stick fails to show the Medieval Warm Period (even without his Nature Trick).
I have three ash trees in my back yard that so far have avoided the emerald ash borer. They were all planted from seedlings about twenty years ago. They are all within fifty feet of each other and get equal sunlight. The trunks of the three trees vary in diameter from about 7″ to 12″. If you would cut them down and try to determine the average temperature per year based on the rings you would come up with three vastly different temperature graphs. There are obviously more factors involved in tree growth than temperature that can be very difficult to factor out of the data set.

rgbatduke
March 24, 2014 6:00 pm

In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.
Dearest David,
I have written, well — a lot — of code, most of it in C.

rgb@lilith|B:1033>cat main.c
#include
main()
{
 if(-1){
   printf("Actually, -1 is true...\n");
 }
 if(0){
    printf("This would be executed if 0 was true...\n");
 } else {
    printf("...but it's not -- -1 (or any nonzero value) is true.\n");
 }
}
rgb@lilith|B:1031>gcc -c main.c; gcc -o main main.o
rgb@lilith|B:1032>./main
Actually, -1 is true...
...but it's not -- -1 (or any nonzero value) is true.

Thus, as Samuel Johnson might say, I refute you.
rgb

rgbatduke
March 24, 2014 6:18 pm

Oh, and your claim isn’t true in the following languages, either: perl, awk, fortran, mostly python, c++, bash.
The code I submitted dropped the brackets around stdio.h, but if you fix the include line to read:
#include <stdio.h>
you can even compile it and run it to test it yourself.
rgb

Adam
March 24, 2014 6:27 pm

The rate of warming has not hit a plateau. It has fallen to zero. It is the temperature that has hit a plateau.

gnomish
March 24, 2014 6:28 pm

yeah, that was weak but it doesn’t make any difference to the thesis- i thought it was a hyphen rather than a monad.
if one looks for sense from monckton, it’s there to be found so consistently that it’s expected.
JNZ is an opcode in various assembly languages. (jump if not zero)
there is no jump.if.minus.one opcode in any of them.
not to put too fine a point on it, but if computers were so error-prone that you needed multiple bits for redundancy in the normal course of events, you’d have no use for such a device.

Steve Garcia
March 24, 2014 6:29 pm

Much here, especially in regard to the one-man Mann claim of the “faux pause”, should label Mann a climate change denier. The climate in our post-LIA period changed (as it periodically does) from rising to not rising, and THAT IS IN THE DATA, not in models or predictions. Hence, Mann is guilty of denying that the climate has changed. The pause is real. Mann’s assertion of a “faux pause” is in contravention of the facts, meaning he is a climate denier.

Steve Garcia
March 24, 2014 6:43 pm

@Ray Blinn –
Yes, tree rings as a temperature-only proxy are a dicey assertion at best. That is what “hide the decline” was all about: Briffa and his tree ring proxies simply did not keep following the rise in CO2, nor the global temps. It was necessary to hide this discrepancy, known as the “Divergence Problem”, which still exists. Tree rings started diverging from the global temps in about 1940, and by 1960 it was readily apparent. As of the “hide the decline” email, the divergence was severe, and it still is.
Biologists – unlike climatologists – use tree rings as proxies for precip. As your own logic will tell you, tree rings cannot be reliably read as proxies for BOTH temp and precip: each group has to ASSUME that the OTHER forcing (temp or precip) is constant. And since we all know that neither one is constant, BOTH groups of scientists are barking up the wrong tree rings.
And now take away the tree rings, and how much temp data exists for the period before about 1880? Ice cores are useless for high-res and for recent centuries – the first few decades are corrupted by recent melt-offs and such so they don’t know exactly where the starting point is. Corals? Not high-res at all. Varves? Probably not bad as a proxy for temps, as long as sufficient knowledge exists about agricultural and other activity around the body of water. But varves are most directly forced by rainfall, not temps.
So without tree rings, they basically got NUTTIN. And they don’t GOT tree rings.
So your thinking is in the right direction. And your assessment of even OTHER forcings is correct, too. Where climatologists assume one factor changing and all others are constant, they are simply retarded in their capacity to apply logic and knowledge to the problem.
Yep, tree rings are really bad as proxies for temps.

GregB
March 24, 2014 6:45 pm

The problem with CO2 is that back radiation is effectively a zero-sum game. Every particle that emitted that back radiation (i.e. the atmosphere) must have cooled by exactly the opposite value of net energy. At the end of the day its effect is zero.

rgbatduke
March 24, 2014 6:59 pm

One final comment. Here is Hadcrut4:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1850/to:2014
There is no possible way to make any smoothed HADCRUT4’s average come out in excess of 0.6 C post 2000. HADCRUT4 on average is roughly 0.5C across the entire interval from 2000 to the present. The figure presented above (NOT by Mann) bounces around 0.6 C and only rarely drops to 0.5 C. One doesn’t do the argument any favors by adding 0.1 C to HADCRUT4 over the last decade.
While we’re at it, GISS LOTI is not, as the graph above also incorrectly shows, almost on top of HADCRUT4 in the 2000-2014 stretch:
http://www.woodfortrees.org/plot/gistemp/from:1850/to:2014/plot/hadcrut4gl/from:1850/to:2014
It is, in fact, almost 0.1 C higher. This is quite puzzling. GISS LOTI supposedly corrects for UHI and HADCRUT4 does not. UHI should, without any question, be a net negative correction in the recent past compared to the remote past (where GISS LOTI and HADCRUT4 are indistinguishable). Yet GISS LOTI is almost always either equal to or greater than HADCRUT4, and the excess strictly grows towards the present. One has to wonder whether or not these guys know the difference between addition and subtraction.
Just a few minor inconsistencies. But they matter. Mann’s little white line is 0.1C over even GISS LOTI, and is a solid 0.2 C over HADCRUT4, which has to be the more reliable of the two given the egregious relative sign error in LOTI.
Although both of them are a joke when compared to a truly global temperature measurement:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1850/to:2014/plot/rss/from:1979/to:2014/plot/gistemp/from:1850/to:2014
RSS lower troposphere shows an anomaly of only 0.2C give or take over an interval where HADCRUT4 increases to 0.5C and GISS LOTI increases to 0.6 C from a more or less common base. A gap of 0.3 to 0.4 C over 35 years. But then, RSS shows basically no warming over its entire range:
http://www.woodfortrees.org/plot/rss/from:1979/to:2014
at this point. Any sane eyeball would pick out a total warming on the order of 0.2 to 0.3 C over 35 years of data, less than 0.1 C/decade, and trending flat to down for half of the period covered. One can fit the data quite nicely (given the obvious uncertainties visible in the range of fluctuation) with a flat segment from 1979 to 1993 or 1994 at an anomaly around -0.1C, a flat segment from 1997 or so to 2014 around 0.2C, and a rapid jump from 1993 to 1997 in between (or any of a number of similar e.g. logistic curve shapes). This is a pure Hurst-Kolmogorov jump.
At this point both GISS LOTI and HADCRUT4 are stuck and they know it. RSS and UAH have already deviated from them so much that the game is up, but true believers can at least hold on to their belief in a lack of bias as long as the gap grows no wider. But LOTI cannot add another 0.1C by hook or by crook at this point, and HADCRUT4 has already read the writing on this particular wall and has solidly split from GISS so that it can retain SOME credibility in case Hansen’s operation tries to continue the game and is called on it in a way nobody can even pretend to ignore. All the global surface records would then be subjected to the kind of careful analysis that IMO they rather fear. What if somebody recomputes all of the means after eliminating all of the changes they made that exaggerate the late stage warming, fixes the UHI problem, and finds that the global records track well with RSS and that maybe half of the warming previously reported was an artifact of the corrections? Disaster! And I don’t mean of the climate kind.
rgb

March 24, 2014 7:04 pm

rgbatduke says:
March 24, 2014 at 6:00 pm

In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.

I can absolutely assure you that is not true for C. Give me 10 minutes to look it up in my copy of Harbison & Steele and I will quote you the section of the standard.
Or you can just compile into machine code the fragment:

int a;
if (a) {
    itsTrue();
} else {
    itsFalse();
}
and examine the branch statements generated.

GregB
March 24, 2014 7:10 pm

All assemblers I’ve seen say -1 = true (all bits set).

rgbatduke
March 24, 2014 7:17 pm

The problem with CO2 is that back radiation is effectively a zero sum game. Exery particle that emitted that back radiation (ie the atmosphere) must have cooled by exactly the opposite value of net energy. At the end of the day it’s effect is zero.
This is simply and obviously untrue. Suppose you are trying to throw a bunch of basketballs out of bounds from the center of a basketball court. Ball boys come and drop basketballs into a big hamper at a steady rate. You throw them out of bounds at a rate that is proportional to the number of basketballs in the hamper, so that (say) if the hamper is half full you match the rate that the boys are putting them in.
Now suppose that a few players come on to the court and try to intercept your throws. Whenever they manage to catch a basketball, about half of the time they throw the ball the rest of the way out and half the time they fire it back into your hamper. The hamper now is getting balls from two sources — the steady rate of ball-boy delivery and a SECOND source that rejects some of your attempts to throw the balls out with some probability. Your rate of removal is monotonically related to the number of the balls in the hamper, and that number will increase until you manage to throw balls out at a net rate equal to the rate at which the ball boys deliver them PLUS the rate that the players pitch them back.
If you replace the basketballs with photons carrying energy, the hamper with the heat capacity of the surface, the ball boys with Mr. Sun, your throw rate with Stefan-Boltzmann blackbody radiation, and the confounded players who catch some of the balls and throw them back with greenhouse gas molecules, you have a rough idea of how the back radiation works. Sure, the players on the court don’t hold on to the balls for more than a second or two before throwing them back or throwing them out, but that doesn’t matter. What matters is that they throw them back at you, and you have to increase the rate at which you throw balls TRYING to get them out to compensate for the rejected attempts and keep up with the rate the ball boys deliver. The only thing that increases the rate at which you work is higher temperature — more balls in the hamper.
(Note to all — yes, one could probably dress this analogy up with cheerleaders who mug the ball boys on the way in and take their balls away — so to speak — to represent albedo, add a group of referees who grab balls from the hamper and carry them through the blocking players to give them a random toss 2/3 of the way to the edge of the court — latent heat transport — and so on, but of course all it is good for is helping people understand that the greenhouse effect is real when they make absurd statements about how it does not or cannot work, it isn’t in any sense a quantitative model. Quantitative models are incredibly difficult to build, because the basketball court in question is planet sized and the players in question are molecule-sized, and out of bounds is the whole bloody universe.)
rgb

thingadonta
March 24, 2014 7:36 pm

The guy (Mann) is an idiot.
In one of his books (the hockey stick wars) he talks about the Club of Rome’s and Ehrlich’s doom-and-gloom predictions from back in the 60s and 70s as having come true, been vindicated, had the last laugh, etc., because in 1992 in Rio a whole bunch of scientists signed a statement about impending disaster, which was also untrue.
According to Mann, a prediction becomes true if someone else he considers important also makes the same prediction, it is irrelevant whether such predictions actually occur in reality. He has taken peer review to an absurd level, something is true if you can simply find someone else to say it.

GregB
March 24, 2014 7:40 pm

rgbatduke, I apologize if this sounds rude, but debate me on the physics rather than an analogy your mind made up as to how it works. Frankly, I thought you missed the corresponding cooling by a country mile. I actually enjoy most of your posts.

rgbatduke
March 24, 2014 7:48 pm

I can absolutely assure you that is not true for C. Give me 10 minutes to look it up in my copy of Harbison & Steele and I will quote you the section of the standard.
Um, did you actually read my comment? I was responding to one far upstream where it was asserted otherwise, and I actually included a code fragment I wrote, compiled, and ran to prove that you and I are, of course, correct. Not only correct, but correct in most of the dominant languages. 0 is false, any nonzero value in any bit is true, although nowadays one has to be a bit more careful with typing than one did in the good old days of K&R C, when an int was an int and not /usr/include/bits/types.h and beyond. All to the good, all to the good, mind you.
I just couldn’t tell — you quoted me quoting somebody else and one might be left with the impression that you thought I asserted that 0 was true in C, when I was rather proving the opposite by the most reliable of methods — an actual program.
You can also do a perl one liner to the same effect:

rgb@lilith|B:1043>perl -e 'if(-1){ print "Hello World\n";}'
Hello World

In bash it is harder to do a one liner but the following fragment works:

#!/bin/sh
if [ 1 ]
then
   echo "Yes"
else
   echo "No"
fi

Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”:
http://en.wikibooks.org/wiki/X86_Assembly/Control_Flow
Rather, one does e.g. cmp (comparison) of two registers and then jmp (jump/branch) based on whatever criterion you like from the comparison. Although it has, I admit, been many moons since I wrote much assembler (not really since the days of the 8088/8087, when if you wanted to do certain things you pretty much did them in assembler, as compilers were horrendously inefficient and memory was tiny and expensive).
But YES YES YES, true is “not false”, and false is 0, in the most common compilers and scripting languages.
rgb

bushbunny
March 24, 2014 7:48 pm

His hypothesis based on tree rings is the worst way to gain knowledge of global temperatures. Bristlecone pines are evergreen; try a deciduous tree and you will get more information. Deciduous trees remain dormant for three months of the year, and if you have a dry spring or summer they won’t grow much.

GregB
March 24, 2014 7:54 pm

Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”
Right about machine code, but virtually all assemblers have codified standards of -1 = true and use it extensively.

ttfn
March 24, 2014 7:56 pm

rgbatduke says:
March 24, 2014 at 6:00 pm
“In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.”
I think it would be more accurate to say that in c anything non-zero is true and 0 is false.
#include "stdio.h"
main() {
    int a = 1 == 1;
    int b = 1 == 0;
    printf("a=%d b=%d\n", a, b);
}
cc a.c
./a.out
a=1 b=0

March 24, 2014 8:06 pm

Whoa.
Whoa whoa whoa.
The effects of a CO2 doubling aren’t felt until 1000 years later?
So if we hit 520ppm we’ll in theory get 2.5°C of warming. But only 1.25°C will happen IN THE FIRST TWO HUNDRED YEARS???
Am I getting this right? Can anybody please confirm?

March 24, 2014 8:08 pm

rgbatduke says:
March 24, 2014 at 7:48 pm

Um, did you actually read my comment?

Um, well, no I didn’t — guilty as charged. Apologies for careless reading.

rgbatduke
March 24, 2014 8:14 pm

rgbatduke, I apologize if this sounds rude but debate me on the physics rather than a “made up” by your mind analogy as to how it works . Frankly I thought you missed the corresponding cooling by a country mile. I actually enjoy most of your posts.
OK, if you want the physics, perhaps you can invest in a copy of Grant Petty’s “A First Course in Atmospheric Radiation” and save me the trouble. Of course, I’m certain that (since you want to debate the physics) you already own a copy and have taken the time to learn quantum mechanics and quantum optics and classical electrodynamics and so on, so that you understand the physics in this book pretty well. In which case, I’m sure you are familiar with section 6.4.3, Simple Radiative Models of the Atmosphere, in particular with the single layer, non-reflecting atmosphere that Petty walks you through, line by line, over the next four pages, to arrive at a steady state surface temperature estimate of:
T_s = \left( \frac{S_0}{4\sigma} \left(\frac{2 - a_{sw}}{2 - a_{lw}}\right) \right)^{1/4}
(Petty’s equation 6.37). This formula explicitly incorporates Kirchhoff’s Law (absorptivity equals emissivity, which you seemed to want to state as “molecules cool by as much as they heat as they absorb and emit photons”, which is entirely irrelevant to the process). In this equation the short wavelength and long wavelength atmospheric absorptivities are explicit parameters and, as noted, symmetric; they are also used as the emissivities in the development of the formula.
In the simple limit of a perfectly transparent atmosphere in the visible light spectrum (short wave) and perfectly opaque atmosphere in the IR (long wave) spectrum, this expression simplifies to:
T_s = 2^{1/4} T_0
where T_0 is the mean surface temperature if there is no atmosphere at all (a_{sw} = a_{lw} = 0). This is the practical maximum heating of a single layer atmosphere model, roughly 1.19 times the vacuum temperature.
This is a simple greenhouse model, and most unlikely to be more than qualitatively correct. It doesn’t include any of the subtlety of Beer’s Law (exponential attenuation of the light according to the optical depth or mean free path of the medium), the modification of this law associated with variable molecular density as one ascends the atmospheric column, the Lorentzian (or not!) shape of the pressure-broadened or Doppler-broadened molecular lines, and so on. It doesn’t include the effects of albedo, aerosols, soot. It is utterly blind to water vapor, ignores both rotation and axial tilt, pretends there are no oceans nor mountains, treats TOA insolation as if it is a constant (it’s not), and doesn’t even specify a heat capacity and try to solve an actual single layer dynamical system of first order ODEs for the temperature, but satisfies itself with looking for the stable steady state where the derivatives vanish.
However, it does suffice to show that the greenhouse effect cannot be argued away by any naive arguments. Your argument (and I mean this entirely respectfully) was a naive, incomplete, and IMO erroneous argument. In particular, it contained no meaningful physics aside from a misstatement of Kirchhoff’s Law. That is why I replied with a physics-free analogy. If you would like to try again to formulate your argument against a greenhouse effect, and this time use actual physics and equations and so on, I’d be happy to debate at that level. Hopefully we can begin with the single layer model (or better yet, the simplified single layer model with transparent SW and opaque LW that Willis once covered on WUWT as a “steel greenhouse”, although he was reinventing a very old wheel as he did so) so that we can focus on comparatively simple mechanisms that have some representation as equations, instead of stating our various opinions about the solutions to insoluble Navier-Stokes equations.
rgb

bushbunny
March 24, 2014 8:22 pm

LOL. Maybe Mikkie will produce the bristlecone pine he used to work out his tree ring theory. Presuming he hasn’t burnt it. That would be easy: then we get another and compare trunks. (Tree, not human!) You don’t need to be an egghead to work out tree rings, although bristlecone pine rings have been used to gauge cosmic bombardment and calibrate C14 dating.

Rob Ricket
March 24, 2014 8:22 pm

rogerknights says:
March 24, 2014 at 3:01 pm
First he disappeared the MWP, now he’s disappeared the Southern Hemisphere. What a magician!
Exactly right Roger! Mann’s ostensible reason for dropping the MWP from reconstructions was a dearth of Southern Hemisphere proxies and supposed localized warming confined to Eurasia. In other words, lack of data was used to justify expunging the MWP from Mann’s reconstruction.
In this latest (and no less egregious) perversion of science; Mann willfully (and not inconveniently) neglects to include perfectly good Southern Hemisphere data in a graph purporting to detail global temperature.
Never trust a liar, a thief, or the Mann!

rgbatduke
March 24, 2014 8:26 pm

I think it would be more accurate to say that in c anything non-zero is true and 0 is false.
And I do, two or three comments up. And the quote is not my words, it is a quote of me quoting somebody else’s words way farther up.
Look, guys, my personal projects source directory is currently around 7 GB in size (and isn’t exhaustive — I have sources in other trees as well). That isn’t all C source (I have coded in fortran, perl, awk, bash, python, pl/1, c, c++, tcl/tk, php, apl, basic, and probably other languages I can’t offhand remember), and it isn’t all C source I personally have written, but the part that is C source that I have written is, really, really big. Probably well over a GB (where of course “written” in many cases means block copy old code and then tweak it — I’m not insane:-). But you are correcting me CORRECTING SOMEBODY ELSE, which is sort of like… er… not correcting me.
I’m just sayin’…
rgb

rk
March 24, 2014 8:32 pm

Looks like our friend M. Steyn has lawyered up big time. No fool he.
http://www.steynonline.com/6201/what-kind-of-fool-am-i#.UzDxUaaDiXs.twitter
So what happens when Mann’s lawyers say, listen Mike, let’s just drop the whole thing…real quiet like. More popcorn please.

Reply to  rk
March 25, 2014 11:23 am

@rk – You are SOOOO right! I suspect that Mikey’s nanny is going to have a messy laundry this week!

ttfn
March 24, 2014 8:37 pm

rgbatduke says:
March 24, 2014 at 8:26 pm
I blame Alan 🙂 I was just adding to the discussion that whether it’s -1 or 1 or 100, true is pretty compiler specific, and usually not 0. Gnu C seems to use true=1. I’m guessing Stallman was trying to save energy or something. I’ll go now.

rgbatduke
March 24, 2014 8:42 pm

Right about machine code but virtually all assemblers have codified standards of -1 = true and use it extensively.:
No arguments from me.
It’s just that one can do a direct comparison and branch (which is all if/then ever really is) without the concept of true at all. “True” is basically an upper-level language construct, but nowadays assembler isn’t as pure a textual rendering of machine language as it was way, way back when I took machine programming (and even then there were assembler macros that imposed a veneer of upper level language sensibility on the low level commands).
And even then, it is (as one other person pointed out) probably more appropriate to say that 0 is false and anything else is true, more than -1 (all 1’s in two’s complement in some register) is true, 0 is false, and 27 is neither. Usually 27 will work just as well as 1 as well as -1 as “not false” on a conditional.
However, we now stray far afield from the original minor correction of Mr. Monckton’s sort-of-correct original statement that caused the original completely incorrect refutation of Monckton’s that I replied to with a refutation, that caused several people to think somehow that I had personally claimed that C treated 0 as true even though the statement I quoted was italicized as usual.
Blame WordPress — Slashdot has buttons that allow one to include and quote with attribution. Goodreads has a bloggish interface that allows previewing and correction and more. Piazza has a built in latex translator that allows previewing and editing, as does Wikipedia. There are tools out there that might better support algebraic threads, quoting, inclusions of graphics and links, and more. WordPress seems to be little more than a text entry box with a limited ability to handle embedded latex and some simple html markup. It’s fine, it works, but there are things out there that work better and more powerfully, maybe.
rgb

GregB
March 24, 2014 8:50 pm

rgbatduke, you can deke and dive all you want with irrelevant claptrap, but I ask you: “Do you really believe, despite all of the physics out there, that somehow CO2 is amplifying (the only way to heat) energy coming in from the sun?” No, this isn’t possible physically. You can change a temperature profile, but “in energy” is going to equal “out energy”, and balancing happens mostly at the speed of light except where it intercepts our atmosphere for brief moments.

gnomish
March 24, 2014 9:07 pm

if you’re talking logic- which is binary, then you have only 2 choices to represent the only 2 possible choices.
if you are talking about a bit comparison, you’re working with one bit vs one bit.
nobody uses 8 bits to represent a logical state. but the processor has a built in circuit to test for zero.
when a processor does a comparison for equality, it is performing a logical computation:
if (A and B) or (not (A or B)) for many bits in parallel.
you know the truth table for that is not gonna have anything but ones and zeros, right?
anyway, the issue is phony. 1 and 0 are abstract convention for representing a logical state, which is binary and not a definition of any particular numerical value.

March 24, 2014 9:08 pm

GregB;
You can change a temperature profile but “in energy” is going to equal “out energy” and balancing happens mostly at the speed of light except where it intercepts our atmosphere for brief moments.
>>>>>>>>>>>>>>>>>>>>
Greg, you’ve been given an analogy and the specific physics by RGB. They’re both accurate to the extent possible in a blog post. Your own explanation is wrong for the simple reason that you are confusing two very different concepts. Yes, CO2 molecules absorb and emit and the net is zero. But it isn’t that net energy that is the issue; it is the energy flux that is the issue. The process of absorbing and emitting changes the flux at any given point, not the net energy in and out of the system (unless we’re talking about the transition between equilibrium states, which we aren’t).
Consider a slightly different analogy, that of a damn that forms a lake. The water flowing into the lake does so at exactly the same rate as water flows over the damn. Raise the damn a few feet. There’s a period when no water flows (between equilibrium states). But once equilibrium is reached again, the water flowing in does so at the exact same rate it did before, and it goes over the damn at exactly the same rate that it did before. But there’s no argument that the lake is deeper.
I suggest you apologize to rgb. You asked for the physics, he gave you what you asked for, and you called it claptrap. Either you didn’t understand a word of what he said, or you just weren’t interested in the facts in the first place.

March 24, 2014 9:10 pm

I cannot believe I misspelled dam that many times in a row. Dam spell checkers ought to know which dam damn I dam well meant.

JJ
March 24, 2014 9:44 pm

davidmhoffer says:
I cannot believe I misspelled dam that may times in a row. Dam spell checkers ought to know which dam damn I dam well meant.

And which ewer your using to make you’re point, huh?
🙂

March 24, 2014 9:44 pm

Réaumur says March 24, 2014 at 10:04 am
“In logic, every declarative statement is assigned a truth-value: 1 (or, in computer programs, –1) for true, 0 for false. ”
Where is -1 used to represent true?

Perhaps when the variable is viewed arithmetically: that is the *arithmetic value* of the internal integer (positive and negative *whole* number) representation, with the number “-1” stored internally in “two’s complement” form, in which all bits within the ‘referenced’ byte (or word, or double word, etc.) variable (ostensibly a BOOL in this case) are set to “1” for a value of -1.
The way this works, is: For an integer byte (8 bits, 7…0) value, the MSB (bit 7) represents “-128”. The next MSB (bit 6) represents +64 and so on down to the LSB (bit 0) which represents “+1”. Set all bits high, sum the values they represent, and one gets the sum “-1”.
Depending on the processor language and compiler, for a BOOL (Boolean value) variable, YMMV.
.
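_Jim’s bit-weight reading can be checked with a short Python sketch (my addition, not his): bit 7 carries weight -128, and the remaining bits carry their usual positive weights.

```python
def twos_complement_value(byte):
    """Value of an 8-bit pattern read as a signed two's-complement integer,
    by summing bit weights: bit 7 is worth -128, bits 6..0 are +64 .. +1."""
    weights = [-128, 64, 32, 16, 8, 4, 2, 1]
    return sum(w for w, bit in zip(weights, f"{byte:08b}") if bit == "1")

# All bits set sums to -128 + 127 = -1, matching the "all ones means -1" point.
assert twos_complement_value(0xFF) == -1
assert twos_complement_value(0x80) == -128
assert twos_complement_value(0x7F) == 127
```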

GregB
March 24, 2014 9:48 pm

davidmhoffer, I don’t give a dam. I actually wanted to get to the point you are making. You call it flux, I call it temperature profile; I believe that a profile allows one to measure where the heat is coming from. Yes, energy can be delayed on its exit from the atmosphere, but for every watt delayed there is a correspondingly greater thirst from above. The 3 K temperature of space exerts an absolute damping mechanism on energy build-up.

March 24, 2014 9:49 pm

rk says:
March 24, 2014 at 8:32 pm
Looks like our friend M. Steyn has lawyered up big time. No fool he.
http://www.steynonline.com/6201/what-kind-of-fool-am-i#.UzDxUaaDiXs.twitter
======================
This is a huge development.
Mr Steyn, if you’re reading this, don’t insult the judges, just educate them. I’ve been there ….
… and won.
The legal win is way more important for the world than your personal ability to pee further than Michael Mann’s half an inch.

bushbunny
March 24, 2014 10:01 pm

I think Mark is rubbing more salt in the wound and hoping to irritate Mann. Just get your own bristlecone pine and demonstrate how it is a faulty way to do research. Get an archaeologist who knows how to do it. Tree growth is governed by environmental conditions and rain levels. As bristlecone pines are evergreen, they can live in temperate areas rather than tropical ones, because they need a lot of rain to grow and are naturally slow growers anyway, although long-lived. They were used to calibrate or recalibrate the C14 (carbon-14) dating methodology, as the planet has been receiving varying levels of C13 and C14 over the years, which come from outer space or atmospheric atom-bomb explosions. That is why you will see, on any dating of organic artifacts, a plus or minus on the date proposed.

rogerknights
March 24, 2014 10:03 pm

Steve Garcia says:
March 24, 2014 at 6:29 pm
Much here, especially in regard to the one-man Mann claim of the “faux pause”, should label Mann as a climate change denier. The climate in our post-LIA period changed (as it periodically does) from rising to not rising, and THAT IS IN THE DATA, not in models or predictions. Hence, Mann is guilty of denying that the climate has changed its very changing. The pause is real. Mann’s assertion of “faux pause” is in contravention of the facts – meaning he is a climate denier.

Hence my earlier posting:

Here’s a subtle but very pointed poster that should be put on 1000 billboards across the world:
Image—A hockey stick with its shaft slanting upwards & to the right and its blade flat.
It’s transparently overlaid on a graph of the running mean of GASTA averaged from five sources.
Caption—”Who’s in Denial Now?”
Make that 10,000 billboards.

Bryan
March 24, 2014 10:04 pm

I am interested in how closely the model-generated curves match the historical data. I’m thinking they have to be curve-fit. I don’t know a lot about these Energy Balance Models, but from reading the article it looks like the numbers they have to play with are the solar constant (S) and forcing from volcanic aerosols (which they call a). They briefly described how they varied S. It sounded like it took solar cycles into account, but I couldn’t tell if they also gave themselves leeway to play with it. Even if the changes in “S” were tied to actual cycles, that still leaves “a” to play with. Has anyone looked at the data for any sign that the “S” and “a” data come from anything other than curve fitting? If either of them were chosen purely to force the model runs to follow the white line through history, then readers might get an incorrect impression that these models impressively match historical temperatures by using actual data concerning solar output and volcano activity, combined with actual CO2 data.

March 24, 2014 10:07 pm

re: DirkH says March 24, 2014 at 11:08 am
… Rather, the highest bit of a register serves as sign in 2′s-complement arithmetic.
One sees this in print in various places, and if one presses it into practice one finds oneself at odds with the practical implementation of numbers in Two’s Complement form.
If, instead, one treats the MSB as the minimum value representable in the memory allocation ‘chunk’ (e.g. -128 for a byte, -32768 for a 16-bit word, etc.) or *register* one is working with, then ‘things’ work out in a much more straightforward fashion; for human interpretation it is a simple matter of ‘summing’ the bit representations together to arrive at a value.
Then there is the practical matter of summing and subtracting these values as performed at the ‘hardware’ level (where normally *today* your compiler or interpreter ‘hides’ intimate register-level manipulations of bits in the CPU core) … all this becomes VERY apparent when writing and debugging assembly-level apps, targeted for a nominal ‘8-bit’ (there are some 16 bit ops using pairs of registers; addressing computations involve 16 bits for instance) CPU like a Z80 (some of the happiest days of my life, I might add.)
.

March 24, 2014 10:09 pm

GregB;
You call it flux, I call it temperature profile, i believe that a profile allows one to measure where the heat is coming from..
>>>>>>>>>>>>>>>
How many times in a row do you intend to conflate two completely different things? Until and unless you understand the difference, you’re going to have trouble understanding the explanation. If you were right, the earth would not be warmer than the moon. But it is. Venus would not be warmer than Mercury. But it is. Putting on an extra blanket would not keep you warmer at night. But it does. You are wrong, and demonstrably so. I suggest that you read the following:
http://wattsupwiththat.com/2011/05/07/visualizing-the-greenhouse-effect-light-and-heat/
http://wattsupwiththat.com/2011/03/29/visualizing-the-greenhouse-effect-molecules-and-photons/
http://wattsupwiththat.com/2011/02/28/visualizing-the-greenhouse-effect-atmospheric-windows/
You either want to understand how the physics works, in which case those are excellent starting points, or you want to cling to a belief system. If your belief system is correct, then millions of engineers the world over are designing everything from Easy-Bake Ovens to nuclear reactors that don’t work. You’ll find that those things do in fact work, and they are based on precisely the same physics that RGB pointed you to, and which is detailed at a less technical level in those three articles.

GregB
March 24, 2014 10:56 pm

davidmhoffer,
What I am saying about temperature profile relates directly to flux. It’s a derivative – think about it. It is simply a different way of looking at it and possibly simplifying the variables. There is no doubt that dragging a photon of heat energy on the back of a molecule will translate into our environment better than a photon on its way to outer space. The mathematical fact remains that a molecule giving up a certain amount of energy will cool by an “exactly equivalent amount of energy”. Try and deny this — I double dare you. I also double dare you to say, “It isn’t from the atmosphere that this cooling back radiation emanates.”

Nik
March 25, 2014 12:08 am

What if…. we had a magic wand to reduce CO2 and attain an ideal global temperature and this wand was given to the warmistas? Do they have such “ideal” levels or would they fight each other tooth and nail to establish, each his own, preconceived notions of what is “best for the planet”?

rgbatduke
March 25, 2014 12:54 am

You can deke and dive all you want with irrelevant claptrap, but I ask you, “Do you really believe, despite all of the physics out there, that somehow CO2 is amplifying (only way to heat) energy coming in from the sun?” No, this isn’t possible physically. You can change a temperature profile but “in energy” is going to equal “out energy”, and balancing happens mostly at the speed of light except where it intercepts our atmosphere for brief moments.
Thereby demonstrating my point. All you are capable of grasping is indeed a simple analogy, if the best you can do when you ask me to reply with physics is call the physics I reply with “irrelevant claptrap” and then try to put words in my mouth about belief in non-existent CO_2 “amplification” (whatever that might be, however much it is the “only way to heat”).
Look, why not admit that you don’t understand one thing about radiation physics? It’s really ok not to — you are in the position of 99.99% of the world’s humans. But please try to understand — there are Real Physics Books that do, in fact, go through the physics a line at a time, written by and read by people that do understand what they are writing and reading and who can in turn reproduce it, stuff that has been repeatedly tested experimentally to where nobody really doubts the general ideas any more, even though some of what is written is only known to some approximation or in some idealized context.
I don’t believe “despite” all of the physics out there that CO_2 is “somehow” contributing to a greenhouse effect. I understand how CO_2 differentially interacts with incoming (shortwave) and outgoing (longwave) radiation to produce a relative surface heating compared to what one would have in a CO_2-free perfectly transparent atmosphere. Understanding — especially when it is quantitative and can be derived, if only in an oversimplified model — is not really the same thing as blind belief or the use of words you do not understand, out of context.
One of the things you clearly do not understand is the idea of detailed balance. The rate at which energy is received by the surface (incident power) has to balance the rate at which energy is lost by the surface. In an imaginary atmosphere that is perfectly transparent to the short wavelength energy of sunlight which falls onto a perfect absorber surface, the first rate is determined by the intensity of sunlight only. However, the rate at which the surface loses heat is determined by its temperature. The wavelengths at which the energy is lost by the surface are not the same as the incoming energy — they are long wavelength IR radiation. If the atmosphere above is opaque to LWIR — a perfect absorber/emitter of LWIR — it will on average redirect 1/2 of the outgoing LWIR back towards the surface. The rate that the surface receives energy is then SW visible sunlight (unchanged) PLUS this back radiation, 1/2 the outgoing surface radiation.
In order for the surface to be in detailed balance and lose energy to outgoing radiation at the same rate it receives it from the mix of sunlight and back radiation from the atmosphere, it has to be at a higher temperature than it would be if there were no back radiation. It is really very simple.
We can work through the equations a line at a time if you want. But I’m going to have to invoke a “Willis clause” on our discussion. YOU asked for me to discuss the physics with you and not give you simple analogies that would help you see why your assertions that CO_2 has no effect, or a cooling effect, are wrong (quite aside from the direct spectrographic data that demonstrates that CO_2 without any question at all creates a greenhouse effect, reproduced in Petty’s previously mentioned book). If you want to discuss the physics, that’s fine. I’m a Ph.D. physicist. I make a living teaching physics. I write physics textbooks. I’m happy to teach you the physics you need to know to understand that the greenhouse effect is real and is responsible for a substantial net warming of the Earth’s surface compared to what it would be with identical albedo and emissivity with either no atmosphere or with a perfectly transparent atmosphere. That doesn’t mean GCMs are right, that CO_2 will lead to catastrophe, or anything of the sort.
It does mean that it is simply silly to assert that CO_2 is not a greenhouse gas and does not produce any greenhouse effect warming of the planetary surface, especially backed by a verbal, not even qualitative or conceptual argument containing phrases that nobody has ever even heard in this context. You’ll have to decide for yourself if it is silly to argue with a physics professor about physics when you don’t really know any physics (or, I’ll guess, calculus or higher math). But if you want my further attention, it will come when you provide some actual algebra and argument founded on actual laws of physics, not statements about energy escaping “at the speed of light” that have very little to do, really, with the process involved in general or with detailed balance in particular. The process is quite adequately modelled by basketballs at the speed of basketballs. Backradiation is real, because we can directly measure it with simple tools. It is added to equally directly measurable sunlight. Outgoing surface radiation is real, and we can directly measure it. Outgoing radiation at the top of the atmosphere is real, and we can directly measure it. When one does simple bookkeeping on these experimentally observed rates, one concludes that the existence of backradiation from greenhouse gases in bands where the atmosphere is optically opaque with a short mean free path causes net differential warming of the surface in order for the surface to remain in detailed balance.
It is really quite simple.
rgb
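rgb’s one-opaque-layer, detailed-balance argument can be checked with back-of-envelope numbers. A minimal Python sketch (my addition, assuming an illustrative globally averaged absorbed sunlight of 239 W/m² and a single perfectly LWIR-opaque layer — the standard textbook idealization, not rgb’s exact figures):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temps(absorbed_solar):
    """Surface temperature with no atmosphere vs. with one perfectly
    LWIR-opaque layer, from detailed balance."""
    # No atmosphere: the surface radiates exactly what it absorbs.
    t_bare = (absorbed_solar / SIGMA) ** 0.25
    # One opaque layer: the layer radiates S upward and S downward, so the
    # surface must shed 2S and sits at 2**0.25 times the bare temperature.
    t_surface = (2 * absorbed_solar / SIGMA) ** 0.25
    return t_bare, t_surface

t_bare, t_surface = equilibrium_temps(239.0)
print(f"bare: {t_bare:.0f} K, one opaque layer: {t_surface:.0f} K")
```

With these numbers the bare surface sits near 255 K and the single-layer surface near 303 K: same energy in, same energy out at the top, yet a warmer surface, which is the point of the dam/lake analogy.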

richardscourtney
March 25, 2014 3:19 am

GregB:
Your post at March 24, 2014 at 10:56 pm is addressed to davidmhoffer and says in total

What I am saying about temperature profile relates directly to flux. It’s a derivative – think about it. It is simply a different way of looking at it and possibly simplifying the variables. There is no doubt that dragging a photon of heat energy on the back of a molecule will translate into our environment better than a photon on it’s way to outer space. The mathematical fact remains that a molecule giving up a certain amount of energy will cool by an “exactly equivalent amount of energy”. Try and deny this — I double dare you.. I also double dare you to say, ” It isn’t from the atmosphere that this cooling back radiation emanates from”.

Allow me to help you to grasp why you are failing to understand what davidmhoffer is explaining to you.
All apples are fruit. Not all fruit are apples.
Similarly,
All heat is energy. Not all energy is heat.
There is no “photon of heat energy”.
A photon is a quantum of electromagnetic energy. It has an energy which is related to its wavelength.
The heat of a gas is kinetic energy of its molecules. It is expressed as the temperature of the gas which is a function of the average (i.e. RMS) speed of the gas molecules.
A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or rotational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.
In other words, you are plain wrong when you write

The mathematical fact remains that a molecule giving up a certain amount of energy will cool by an “exactly equivalent amount of energy”.

The mathematical and physical facts are that the molecule does not “cool” by any amount when it gives up vibrational or rotational energy because they are not heat.
I hope that helps.
Richard
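For readers who want numbers behind the photon discussion, a small illustrative calculation (my addition, not part of Richard’s comment): the energy of a photon in CO2’s 15-micron bending band, compared with the mean translational kinetic energy of an air molecule at 288 K.

```python
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

photon_energy = H * C / 15e-6   # E = h*c/lambda for a 15 micron photon
mean_ke = 1.5 * K_B * 288.0     # (3/2) k T per molecule at 288 K

print(f"15 um photon: {photon_energy:.2e} J, mean KE at 288 K: {mean_ke:.2e} J")
```

The two energies are the same order of magnitude (roughly 1.3e-20 J vs 6e-21 J), which is why the 15-micron band matters at terrestrial temperatures at all.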

Rob Ricket
March 25, 2014 4:31 am

Richard C. regarding:
“A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or rotational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.”
I’m just a layman who has completed some light reading on physics. I am confused by your explanation, which seems to be partially correct. Is it not true that photon bombardment/absorption of sufficient intensity will induce electrons to jump orbits at discrete intervals/quanta? Isn’t the vibration of a material kinetic energy? Is the vibration of boiling water greater or less than that of non-boiling water?

Chris Wright
March 25, 2014 4:49 am

“the IPCC published a graph that my co-authors and I devised….”
That’s an interesting word. It smacks of a Freudian slip. Scientists can certainly devise theories, there’s nothing wrong with that. But can you ‘devise’ data? Can you ‘devise’ a graph that shows data?
According to Mann, apparently you can. Of course, anyone can devise – i.e. make up – anything they like. But if it’s passed off as real data in a scientific paper, then it’s fraudulent, pure and simple.
Perhaps climate scientists should make greater use of empirical scientific data rather than ‘devising’ stuff.
Chris

DirkH
March 25, 2014 5:06 am

_Jim says:
March 24, 2014 at 10:07 pm
“re: DirkH says March 24, 2014 at 11:08 am
… Rather, the highest bit of a register serves as sign in 2′s-complement arithmetic.
One sees this in print, in various places, and if one presses that into practice one finds oneself at odds with the practical implementation of numbers in Two’s Complement form.
If, instead, one treats the MSB as the minimum value representable in the memory allocation ‘chunk’ (e.g. -128 for a byte,. -32768 for a 16-bit word, etc) or *register* one is working with, then ‘things’ work out in a much more straight-forward fashion; For human interpretation it is a simply matter of ‘summing’ the bit representations together to arrive at a value.”
These are just two ways of looking at it, Jim. No difference on the binary function level. And no, I don’t find myself at odds with the practical implementation when doing 2’s complement arithmetic. It’s all implemented exactly as one would expect it to.

techgm
March 25, 2014 5:18 am

SciAm ceased being “scientific” (and “American”) 25 years ago.

fadingfool
March 25, 2014 6:06 am

rgb – if back radiation actually existed and could be directly measured, then we could harness it. Given that I have no “back radiation”-powered flashlight, I suspect it is not a physical force and more likely only an artefact of misguided thinking.
Given that H2O dominates within the entire climate zone (up to an altitude where there is no weather), explain again how CO2 absorbing IR (which doesn’t have the energy to increase the kinetic energy of the CO2 molecule – dipole moment change is the mechanism in CO2) somehow “warms” the lower atmosphere via molecular collisions at an altitude of reduced molecular collisions? How does this top-down mechanism work when the temperature profile is bottom-up?

David Ball
March 25, 2014 6:16 am

All this talk, yet CO2 keeps going up, and temperature does not.
It really is quite simple.

March 25, 2014 6:24 am

These are just two ways of looking at it, Jim. No difference on the binary function level.
Not everything one reads on the internet is true; the above is just one more case.
There is a form of ‘binary’ notation where the MSB is used as the ‘sign’ bit, called “sign-magnitude” or “sign and magnitude” representation (such as was used on early IBM computers like the IBM 7090), but making this assumption in Two’s Complement notation is a case of messed-up thinking (and has screwed up MORE than one student’s understanding of how Two’s Complement works). Also, sign-magnitude form should not be confused with One’s Complement representation, which shares similarities with Two’s Complement form.
Take this 8-bit binary value: 1000 0000 as a number in Two’s Complement form … what is the value of this number in base 10 *IF* the MSB is simply taken to represent the “sign” bit as DirkH contends?
I proffer that DirkH IS thinking in terms of Sign-Magnitude, which results in this series of ‘numbers’ in binary form vs Two’s Complement form:
 Dec    Two's Complement    Sign-Magnitude
 127       0111 1111           0111 1111
  +2       0000 0010           0000 0010
  +1       0000 0001           0000 0001
   0       0000 0000           0000 0000   (S-M also possible: 1000 0000)
  -1       1111 1111           1000 0001
  -2       1111 1110           1000 0010
-127       1000 0001           1111 1111
-128       1000 0000           no representation possible
As one can see, the MSB represents a value of negative 128 (-128) vs a simple sign bit. Simply calling the MSB in a Two’s Complement number a sign bit leads to confusion, especially for those (like students) new to the ‘handling’ and representation of numbers in the various binary forms.
.
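The table above can be reproduced programmatically; a small Python sketch (my addition) of the two decoding rules:

```python
def from_twos_complement(byte):
    """Decode an 8-bit pattern as a two's-complement signed integer."""
    return byte - 256 if byte & 0x80 else byte

def from_sign_magnitude(byte):
    """Decode an 8-bit pattern as sign-magnitude: MSB is the sign,
    the low 7 bits are the magnitude."""
    magnitude = byte & 0x7F
    return -magnitude if byte & 0x80 else magnitude

assert from_twos_complement(0b11111111) == -1
assert from_sign_magnitude(0b10000001) == -1
assert from_twos_complement(0b10000000) == -128
assert from_sign_magnitude(0b10000000) == 0   # "negative zero" in sign-magnitude
```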

richardscourtney
March 25, 2014 6:26 am

Rob Ricket:
Thankyou for your request for clarification at March 25, 2014 at 4:31 am where you write

Richard C. regarding:

“A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or rotational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.”

I’m just a layman who has completed some light reading on physics. I am confused by your explanation, which seems to be partially correct. Is it not true that photon bombardment/absorption of sufficient intensity will induce electrons to jump orbits at discrete intervals/quanta? Isn’t the vibration of a material kinetic energy? Is the vibration of boiling water greater or less than that of non-boiling water?

OK. What I wrote is completely correct, but it was not intended to be (and could not be) a complete explanation of all the issues.
Please note that my post at March 25, 2014 at 3:19 am said and tried to explain the importance of

All apples are fruit. Not all fruit are apples.
Similarly,
All heat is energy. Not all energy is heat.
There is no “photon of heat energy”.
A photon is a quantum of electromagnetic energy. It has an energy which is related to its wavelength.
The heat of a gas is kinetic energy of its molecules. It is expressed as the temperature of the gas which is a function of the average (i.e. RMS) speed of the gas molecules.
A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or rotational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.

Your question about boiling water is not relevant to that explanation: water is a liquid – not a gas – and is only boiling in trivial amounts (e.g. at hot springs) around the globe.
All atoms absorb and release photons of appropriate energy. As you say, they do this by inducing electron shell jumps within the atom. But these atomic effects are trivially small in the atmosphere. Indeed, they are so very, very small compared to the molecular effects of greenhouse gas (GHG) molecules that these atomic effects are usually ignored.
GHG molecules such as H2O and CO2 can store energy by vibrating and rotating parts of their structures.
A CO2 molecule
O-C-O
can vibrate by changing the ‘angle’ between its oxygen atoms, and this vibration energy is internal to the molecule so does not alter its speed relative to other molecules in the air. Hence, this vibrational energy has no direct effect on the temperature of the gas.
Oxygen (O2) and nitrogen (N2) molecules are not GHG molecules because they cannot store energy by vibrating and rotating parts of their structures.
O-O and N-N each has no ‘angle’ to vibrate.
I hope that provides the needed clarification.
Richard

crakar24
Reply to  richardscourtney
March 25, 2014 3:03 pm

Thank you Richard for your comments; your explanation makes things clearer for me. So what you are saying is that a GHG will absorb and release IR energy without any heat exchange, because the kinetic energy of the GHG is not altered. However, if this GHG collides with N2 etc. (non-GHG), the energy imparted onto the non-GHG molecule will affect its kinetic energy and thus be seen as heat.
Is this explanation close to reality?
Cheers
Crakar24

DR
March 25, 2014 6:56 am

@bw
I’ve been asking the same question for years. Why do satellite and surface air temperature (SAT) records diverge from 1980 forward? The greenhouse effect hypothesis specifically states the troposphere should be warming at a significantly higher rate than the surface because it is “trapping heat”, yet observations show just the opposite.
Where is the hot spot?

March 25, 2014 6:56 am

re: GregB says March 24, 2014 at 7:54 pm
Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”
I simply used an LSB ‘bit’ set in bytes assigned as flags while doing assembler on a Z80, with a value initialized quickly in “A” (the Accumulator) as:
XOR A . . ; Clears reg A (only 4 clk “T” states and a 1 byte op code)
INC A . . . ; Sets LSBit in reg A (only 4 clk “T” states and a 1 byte op code)
LD (mem_loc_flg),A . . ; Store the initialized value to the memory location for the flag
Intrinsic in many operations with Reg A (the ‘Accumulator’) such as loads from memory are comparisons with (or against) zero, checks of the so-called “sign” bit (MSB) position resulting in various flag bits (in the Flag register) being set as a result of these intrinsic compares; subsequent checks of those flags (e.g. “JNZ”, “JP cc,nn” etc) can then effect ‘program flow’ (branching on various conditions). Keeping these intrinsic compares in mind is paramount when trying to write tight, fast-executing code … these kinds of features must have driven the first compiler-writers nuts too! One can also see why some compilers might result in tighter or faster-executing code as well, depending how much/how many features of the ‘native’ CPU the compiler is ‘aware’ of.
.
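_Jim’s LSB-flag idiom translates to bitmask flags in a higher-level language; a hypothetical Python analogue (my addition, with an invented flag name for illustration):

```python
FLAG_READY = 0x01  # the LSB, like the bit left set by XOR A / INC A

flags = 0              # XOR A: clear the flag byte
flags |= FLAG_READY    # INC A from zero: set the LSB
assert flags & FLAG_READY       # test the flag (a JR NZ-style branch condition)

flags &= ~FLAG_READY   # clear the flag again
assert not flags & FLAG_READY
```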

Leo Smith
March 25, 2014 7:40 am

Does anyone take Mann seriously any more?

DirkH
March 25, 2014 7:48 am

Nik says:
March 25, 2014 at 12:08 am
“What if…. we had a magic wand to reduce CO2 and attain an ideal global temperature and this wand was given to the warmistas? Do they have such “ideal” levels or would they fight each other tooth and nail to establish, each his own, preconceived notions of what is “best for the planet”?”
They do not answer when asked for what the ideal temperature would be. Because they don’t care. We just asked one:
http://notrickszone.com/2014/03/22/sks-hiroshima-bomb-heat-clock-fraud-claim-2-1-billion-climate-ground-zeros-yet-cant-find-a-single-one-of-them/#comment-926047

catweazle666
March 25, 2014 7:52 am

“Not even wrong”…

March 25, 2014 7:54 am

fadingfool says:
March 25, 2014 at 6:06 am
rgb – if back radiation actually existed and could be directly measured then we could harness it. Given as I have no “back radiation” powered flashlight
>>>>>>>>>>>>>>>>>
It can be directly measured and has been directly measured; I’ve linked upthread to three articles that include not only the explanation but the actual measurements at various places on earth. Your back-radiation-powered flashlight doesn’t exist because it isn’t cost effective to build one. I expect that you don’t have a clock powered by a potato either, which has nothing to do with it being possible to build one.

March 25, 2014 8:00 am

fadingfool;
somehow “warms” the lower atmosphere via molecular collisions at an altitude of reduced molecular collisions?
>>>>>>>>>>>>
Some photons that would otherwise have escaped to space are absorbed by CO2 molecules and then emitted again in a random direction. The direction being random, some portion of them are emitted downward from whence they came. Where in that explanation do you see anything about molecular collisions? Again, read the material and the explanations for what they say rather than what you think they say.

Rob Ricket
March 25, 2014 8:20 am

Richard,
The civil discourse is appreciated, as is your assertion regarding the difference in absorption rates amongst liquids and gases. Certainly, your statement below runs afoul of Charles Law, in as much as atmospheric mass creates a constant pressure. In relative terms, the distances at which electrons orbit are humongous (scientific term) relative to the proton. There are collisions (hence movement) proportionate to temperature.
“can vibrate by changing the ‘angle’ between its oxygen atoms, and this vibration energy is internal to the molecule so does not alter its speed relative to other molecules in the air. Hence, this vibrational energy has no direct effect on the temperature of the gas.”

Knight who says ni!
March 25, 2014 8:33 am

Looks like a man making global warming there

richardscourtney
March 25, 2014 8:45 am

Rob Ricket:
I wrote

GHG molecules such as H2O and CO2 can store energy by vibrating and rotating parts of their structures.
A CO2 molecule
O-C-O
can vibrate by changing the ‘angle’ between its oxygen atoms, and this vibration energy is internal to the molecule so does not alter its speed relative to other molecules in the air. Hence, this vibrational energy has no direct effect on the temperature of the gas.
Oxygen (O2) and nitrogen (N2) molecules are not GHG molecules because they cannot store energy by vibrating and rotating parts of their structures.
O-O and N-N each has no ‘angle’ to vibrate.

At March 25, 2014 at 8:20 am you have replied to that saying

Certainly, your statement below runs afoul of Charles Law, in as much as atmospheric mass creates a constant pressure.

Say what!?
Charles’ Law is a special case of the ideal gas law. It applies to an ideal gas held at constant pressure, allowing only the volume (V) and temperature (T) to change, and can be stated as
V1/T1 = V2/T2
I fail to see how my correct, true and accurate statements run “afoul of Charles Law”. Indeed, I fail to see any relevance of Charles’ Law, which applies to alterations of volume and temperature, but molecular absorption does not alter temperature and/or volume.
Richard
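For concreteness, Charles’ law worked numerically at constant pressure (illustrative numbers, my addition):

```python
def charles_v2(v1, t1_kelvin, t2_kelvin):
    """Charles' law at constant pressure: V/T is constant, so
    V2 = V1 * (T2 / T1), with temperatures in kelvin."""
    return v1 * t2_kelvin / t1_kelvin

# One litre of ideal gas warmed from 273.15 K to 373.15 K at constant pressure
# expands to about 1.37 litres.
v2 = charles_v2(1.0, 273.15, 373.15)
print(f"{v2:.3f} L")
```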

fadingfool
March 25, 2014 8:55 am

Dmhoffer: – no, but I did have a potato-powered radio (all these Hiroshimas per second must be good for something). I’ve yet to see downward IR emissions able to do any work, and if it can’t do work it isn’t power.
Again, even if the IR photon is re-emitted downward (this is not AGW theory, by the way, as current AGW theory has the energy transmitted to non-GHG gases via collisions), this would be absorbed by either CO2 or H2O in a temperature-neutral dipole moment change.
I have argued this before and have yet to get a consistent answer – given the triple state of H2O in the thermosphere, where does the temperature increase come from, and hence why would anyone expect to find the increase in energy in the temperature data (rather than in the thermosphere extent and H2O triple-point ratio)?

neasdenparade
March 25, 2014 9:07 am

Without wanting to state the obvious, sometimes the simplest picture needs to be spelled out in single sentences to be isolated.
Looking at his white line, he has gone on the record revising existing temperature with no scientific basis.
That means Michael Mann, with no other defence, has altered the known figures without any known justification, and like Oscar Pistorius, unless he can show a legal defence is now no better than any of the mafia. That’s it, he has openly cheated and unless he can prove otherwise, because he put the fake figures out, he is a cheat. No one else known on earth has ever produced such figures, he can provide no source, and even if he could does he really believe he can then go and prove ALL THE OTHER OFFICIAL SETS ARE WRONG AS WELL?
Unless he both provides a defence for his alteration and then demonstrates why all the world’s official sets were never reading high enough MICHAEL MANN HAS MADE UP HIS OWN DATA, AND AS A RESULT HAS LEFT THE REALM OF SCIENCE. That is the only possible way (unless he can both provide a defence AND demonstrate all the other data read too low) the graph can be interpreted. The chocolate is gone and is all over his face and hands.

Mike M
March 25, 2014 9:14 am

Would it be a worthwhile experiment to compare heating two buckets (white plastic) of cold water with an IR lamp from the top – one with only water and the other with carbonated water?
Assuming the plain water one heats enough to be detected, won’t the one with the CO2 heat up more, proving that the CO2 did not re-radiate the IR back out – it only helped heat the water itself?
My conundrum is that if the CO2 did “re-radiate” IR back out, then an even higher concentration of it should result in even more “re-radiation” back out resulting in even less heat energy going into the mixture and therefore less warming?

richardscourtney
March 25, 2014 9:22 am

Mike M:
You ask at March 25, 2014 at 9:14 am

Would it be a worthwhile experiment to compare heating two buckets (white plastic) of cold water with an IR lamp from the top – one with only water and the other with carbonated water?

It would be pointless. The radiative properties of CO2 are very different when the CO2 is free molecules in air and dissolved molecules in water.
Richard

Rob Ricket
March 25, 2014 9:22 am

Richard C.,
Of the three forms of heat transfer, convection is germane to our discussion, inasmuch as heat is transmitted through a gas…in the case of a typical kitchen range, the transmission gas is air. In an electric range, photons (in the infrared band) are emitted from the heating element. The photons induce the electrons to jump (mysteriously and instantaneously move) to outer orbits. Inside this fixed space the propensity for molecular collisions must increase with temperature as the electron orbit of each molecule expands.
I guess I’m having difficulty coming to terms with your assertion that increases in molecular vibration occur independent of a congruent increase in molecular collisions.
“I fail to see how my correct, true and accurate statements run “afoul of Charles Law”. Indeed, I fail to see any relevance of Charles’ Law which applies to alterations to volume and temperature but molecular absorption does not alter temperature and/or pressure.”

March 25, 2014 9:45 am

Bottom line: they are going to keep moving the goalposts until they believe they have scored.

richardscourtney
March 25, 2014 9:58 am

Rob Ricket:
I am copying all your post at March 25, 2014 at 9:22 am so it is clear that I am answering in context.

Richard C.,
Of the three forms of heat transfer, convection is germane to our discussion, inasmuch as heat is transmitted through a gas…in the case of a typical kitchen range, the transmission gas is air. In an electric range, photons (in the infrared band) are emitted from the heating element. The photons induce the electrons to jump (mysteriously and instantaneously move) to outer orbits. Inside this fixed space the propensity for molecular collisions must increase with temperature as the electron orbit of each molecule expands.
I guess I’m having difficulty coming to terms with your assertion that increases in molecular vibration occur independent of a congruent increase in molecular collisions.
“I fail to see how my correct, true and accurate statements run “afoul of Charles Law”. Indeed, I fail to see any relevance of Charles’ Law which applies to alterations to volume and temperature but molecular absorption does not alter temperature and/or pressure.”

Yet again you ignore my repeated explanation that while all heat is energy not all energy is heat.
There is no “heat transfer” when a GHG molecule absorbs an IR photon; none, zilch, nada. There is an absorption of electromagnetic energy by the molecule. And there is no temperature change because there is no heat transfer.
As I explained to you, in the atmosphere electron shell absorption by atoms is so trivial that it has no relevance when compared to the GHG molecular absorption.
Electromagnetic energy is converted to heat energy (i.e. is thermalised) when it is absorbed by solids and liquids.
There is a collisional effect in the atmosphere, but it is not what I was discussing and it is nothing like your strange imagining. I explain it as follows.
An excited GHG molecule may collide with another molecule e.g. a nitrogen molecule. In this case, the GHG molecule may transfer its internal (i.e. vibrational or rotational) energy to the other molecule. But the receiving molecule cannot store the energy internally so it accelerates and, thus, increases the temperature of the gas. Hence, the collision has thermalised the energy which was stored in the GHG molecule.
In the lower atmosphere most de-excitation of GHG molecules occurs collisionally and not radiatively (i.e. not by emitting photons). But so what?
Richard
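As a back-of-envelope illustration of the collisional thermalisation described above, here is a Python sketch estimating the temperature change when one absorbed 15 micrometre photon’s energy is shared as translational kinetic energy among N molecules. The wavelength and N are illustrative assumptions of mine, not figures from the comment:

```python
import math

# Energy of one 15 um photon absorbed by a GHG molecule and then thermalised
# by collision, spread among N air molecules as translational kinetic energy.
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
kB = 1.381e-23  # Boltzmann constant, J/K

E_photon = h * c / 15e-6          # ~1.3e-20 J per photon
N = 1e9                           # molecules sharing the energy (illustrative)
dT = E_photon / (1.5 * N * kB)    # from U = (3/2) N kB T
print(E_photon, dT)               # a single photon warms the parcel negligibly
```

A single photon is of course a negligible warming of any macroscopic parcel; it is the aggregate flux that matters.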

March 25, 2014 10:11 am

I have to make the question again, is the literature certain (or as certain as climate science can be) about the time it takes for warming to kick in? At one point in the article Monckton says only half of the warming happens in the first 200 years. The rest may happen over the following 1000-3000 years.
Politicians have set this nonsense 2ºC limit, which when compared to pre-industrial times means we “only” have 1.1ºC left of warming before mega-disaster happens. I always knew it was a matter of decades, but now it seems to be a matter of *centuries*.
If true this takes the absurdity of the whole DAGW bandwagon to another level. And I wonder how many in the public know this. 0.01%, maybe?
Of course it’s extremely convenient for the usual suspects that it will take so much time for warming to kick in: they can always claim the thing hasn’t been disproved, therefore the money should keep flowing.

Bart
March 25, 2014 10:13 am

rgbatduke says:
March 25, 2014 at 12:54 am
Yes, a lot of people seem to think conservation of energy implies conservation of energy flows. They fail to appreciate the dynamical nature of this system.
However, after giving some thought to it, I realized what I believe to be a substantial oversight in the standard litany. I’d be interested in your thoughts. I wrote it up here. Richard C. et al., would appreciate your taking a look, too.

richardscourtney
March 25, 2014 10:35 am

Bart:
re your request to me at March 25, 2014 at 10:13 am, I have looked and I agree.
There are two issues; viz.
(a) the existence of the radiative greenhouse effect (GHE)
and
(b) the magnitude of enhanced GHE (i.e. climate sensitivity).
I have been trying to help people who dispute (a) because they do not understand the realities pointed out and referenced by davidmhoffer.
You say of (b)

Good question. It doesn’t add up. I’ve come to the conclusion that radiative exchanges are simply not dominant.

At present levels of atmospheric GHG concentrations, I strongly agree. This is because the logarithmic effect of GHG concentration results in little effect from increase to radiative exchanges induced by increase to existing atmospheric GHG concentrations.
Viscount Monckton agrees, too. In his above article he says

Broadly speaking, the IPCC expects this century’s warming to be equivalent to that from a doubling of CO2 concentration. In that event, 1 Cº is the warming we should expect from a CO2 doubling, and the only sense in which the 1.5 Cº lower bound of the IPCC’s interval of climate-sensitivity estimates is “unrealistic” is that it is probably somewhat too high.

And I have repeatedly written that on WUWT and elsewhere saying the following.
The feedbacks in the climate system are negative and, therefore, any effect of increased CO2 will probably be too small to discern because natural climate variability is much, much larger. This concurs with the empirically determined values of low climate sensitivity.
Empirical – n.b. not model-derived – determinations indicate climate sensitivity is less than 1.0°C for a doubling of atmospheric CO2 equivalent. This is indicated by the studies of
Idso from surface measurements
http://www.warwickhughes.com/papers/Idso_CR_1998.pdf
and Lindzen & Choi from ERBE satellite data
http://www.drroyspencer.com/Lindzen-and-Choi-GRL-2009.pdf
and Gregory from balloon radiosonde data
http://www.friendsofscience.org/assets/documents/OLR&NGF_June2011.pdf
Indeed, because climate sensitivity is less than 1.0°C for a doubling of CO2 equivalent, it is physically impossible for the man-made global warming to be large enough to be detected (just as the global warming from UHI is too small to be detected). If something exists but is too small to be detected then it only has an abstract existence; it does not have a discernible existence that has effects (observation of the effects would be its detection).
Richard
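The logarithmic effect of CO2 concentration mentioned above can be sketched with the common approximation dF = 5.35 ln(C/C0) W/m^2. That formula is an assumed textbook relation, not one cited in the comment:

```python
import math

# Sketch of the logarithmic CO2 forcing relation: each successive increment
# of concentration adds less forcing than the last.
def forcing(c, c0=280.0):
    """Approximate radiative forcing (W/m^2) relative to concentration c0 (ppm)."""
    return 5.35 * math.log(c / c0)

print(forcing(380) - forcing(280))  # forcing added by first 100 ppm above 280
print(forcing(480) - forcing(380))  # next 100 ppm: a smaller increment
print(forcing(560))                 # a doubling: ~3.7 W/m^2
```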

Rob Ricket
March 25, 2014 10:35 am

Richard C.,
I’m afraid we are going to have to agree to disagree and this post shall be my last on the subject; lest (as always happens with repeated counter-posts) the discussion degenerate into name calling.
You may say I’m wrong. No problem. I leave you to ponder these two statements from your previous posts.
“Indeed, I fail to see any relevance of Charles’ Law which applies to alterations to volume and temperature but molecular absorption does not alter temperature and/or pressure.”
“But the receiving molecule cannot store the energy internally so it accelerates and, thus, increases the temperature of the gas.”
Rob

Resourceguy
March 25, 2014 10:37 am

You know it’s faux science when those who are otherwise quite capable of reading and understanding the implications of multidecadal climate cycles choose not to.

mellyrn
March 25, 2014 10:48 am

@rgb — I’m really trying to get this; I don’t mean to be obtuse. But in your basketball story, suppose the other players are just as likely to intercept the balls thrown into the hamper by the ball boys, and either throw the balls back out or into the hamper? Also, the extra players notoriously move closer to the edges of the court whenever they intercept a ball from either source (warmed gases rise), and become -slightly- more likely to toss the ball out of the court.
Seems to me that CO2 is going to act just as effectively on incoming IR as it does on outgoing IR. I make the analogy of a cave with a big window. The window lets light in and, of course, light gets reflected back out. So let’s sprinkle bits of silver into the glass of the window so it will catch that outgoing light and bounce it back in, in order to brighten up our cave. It will do that, of course — at the expense of blocking light coming in in the first place.
Why and how does CO2 act on outgoing IR but not on incoming? I am unable to imagine a way in which it could specialize like that. I don’t pretend my lack of imagination constitutes an argument, which is why this is a real question.

mellyrn
March 25, 2014 10:54 am

@rgb — there is no incoming longwave?

DirkH
March 25, 2014 11:04 am

mellyrn says:
March 25, 2014 at 10:48 am
“Why and how does CO2 act on outgoing IR but not on incoming? I am unable to imagine a way in which it could specialize like that. I don’t pretend my lack of imagination constitutes an argument, which is why this is a real question.”
Consider the atmosphere to be a dense fog in the absorption / re-emission band of CO2.
The only question then becomes: what amount of IR in this band (two bands actually, 15 micrometer and 4 micrometer IIRC) comes from the sun, and what amount comes from Earth’s surface (integrated over all frequencies of the respective absorption/re-emission band)?
http://en.wikipedia.org/wiki/File:Solar_Spectrum.png
CO2 : Figure 5 here:
http://www.wag.caltech.edu/home/jang/genchem/infrared.htm
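DirkH’s question (how much radiation in the CO2 band comes from the sun versus Earth’s surface) can be estimated with the Planck law. A sketch, in which the solar dilution factor, (solar radius / 1 AU) squared, is an assumption of mine rather than anything stated in the comment:

```python
import math

# Compare 15 um spectral radiance of a 5778 K sun and a 288 K surface,
# then scale the solar term by the geometric dilution at Earth's distance.
h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Planck spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return (2 * h * c**2 / lam**5) / math.expm1(h * c / (lam * kB * T))

lam = 15e-6
ratio = planck(lam, 5778.0) / planck(lam, 288.0)   # per unit emitting area
dilution = (6.96e8 / 1.496e11) ** 2                # (R_sun / 1 AU)^2
print(ratio * dilution)  # solar 15 um flux at Earth relative to surface emission
```

On these numbers the solar contribution in the 15 µm band at Earth is well under one percent of the surface’s own emission, which is why the band is treated as dominated by outgoing radiation.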

richardscourtney
March 25, 2014 11:04 am

Rob Ricket:
Your post at March 25, 2014 at 10:35 am says in total

Richard C.,
I’m afraid we are going to have to agree to disagree and this post shall be my last on the subject; lest (as always happens with repeated counter-posts) the discussion degenerate into name calling.
You may say I’m wrong. No problem. I leave you to ponder these two statements from your previous posts.
“Indeed, I fail to see any relevance of Charles’ Law which applies to alterations to volume and temperature but molecular absorption does not alter temperature and/or pressure.”
“But the receiving molecule cannot store the energy internally so it accelerates and, thus, increases the temperature of the gas.”

I can see why your experience is that your conversations “always” degenerate “into name calling”.
You have taken out of context two different statements I made about two different things; i.e.
(a) absorption of a photon by a GHG molecule
and
(b) acceleration of a non-GHG molecule by a collision
Then you have claimed there is “something” to “ponder” about them.
There is nothing to “ponder” about those statements: they are simply true.
Richard

March 25, 2014 11:28 am

fadingfool says:
March 25, 2014 at 8:55 am
>>>>>>>>>>>>>>>
I didn’t see your potato powered radio, therefore as far as I am concerned, it doesn’t exist.
Do you understand the fallacy of that logic?
I pointed you to three articles that go into considerable detail on how the whole process works, and they INCLUDE the actual measured downward LW energy flux that you insist doesn’t exist. It HAS been measured, it CAN do work, it DOES do work, we can even measure the work it does.
But you can’t get past the difference between storing energy from a photon in a molecule and changing the direction that the photon is travelling in. You won’t read the text book that RGB pointed you at, you won’t read the articles I pointed you at, and you dismiss out of hand the fact that millions of engineers design products every day worldwide based on the exact same physics as those text books and articles.
So I am dropping this thread because there is clearly no point in trying to explain what is essentially first year physics to people who have not studied it and refuse to do so when given the opportunity.

Mike M
March 25, 2014 11:32 am

Mike M: You ask at March 25, 2014 at 9:14 am : Would it be a worthwhile experiment to compare heating two buckets (white plastic) of cold water with an IR lamp from the top – one with only water and the other with carbonated water?
————————–
richardscourtney says: March 25, 2014 at 9:22 am: “It would be pointless. The radiative properties of CO2 are very different when the CO2 is free molecules in air and dissolved molecules in water.”
—————————-
richardscourtney says: March 25, 2014 at 9:58 am: An excited GHG molecule may collide with another molecule e.g. a nitrogen molecule. In this case, the GHG molecule may transfer its internal (i.e. vibrational or rotational) energy to the other molecule. But the receiving molecule cannot store the energy internally so it accelerates and, thus, increases the temperature of the gas. Hence, the collision has thermalised the energy which was stored in the GHG molecule.
————————-
“and, thus, increases the temperature of the gas.” … So I conclude that you agree in principle with what I’m looking at, but not with the experiment I thought might reveal it?
I certainly submit the radiative properties of CO2 in water are different, (I never said they were like air), but by how much? What would happen? How about carbon soot particles in water versus none? Can you think of an experiment to illustrate what you stated – “and, thus, increases the temperature of the gas.”?
If more CO2 does indeed go toward a little more heating of the atmosphere, then that heat did NOT get “re-radiated” back down to the ground at all, and that, IMO, blows away a huge chunk of ERB theory assumptions in regard to AGW.

Bart
March 25, 2014 11:41 am

richardscourtney says:
March 25, 2014 at 10:35 am
Yes, but the point is, not all of the heat transfer is radiative and, while the so-called GHGs do act as insulators, they also act as radiators. Just as in an automobile engine, if you can get the heat to the radiator via more efficient means than radiation, then you will actually get cooling.
The talk of radiation only misses a large part of the system.
HenryP says:
March 25, 2014 at 10:50 am
It is rather long, and I may have missed something, but it seems still to be talking mostly of radiation only. The Earth has significant internal heat transfer mechanisms other than just radiation. That is the point I am trying to make.
If the GHG layer were separated from the surface of the Earth by pure vacuum, then the all radiation discussions would make sense. But, for heat flux balance at the surface, one also has to take into account the very significant heat transfer from convective flows which carry surface heat directly to the radiating layers for dissipation to space.

Bob Kutz
March 25, 2014 11:44 am

So . . . y’all are accusing Mann of having falsified a graph by using a data source that is not indicated in the description of the graph and not revealed in the data reference either?
I am completely shocked at the allegation.
Is there no level of deception this man will refuse to stoop to?
Has he no shame? Has he . . . oh, wait.
(And for the record; this is humorous satire, not a direct allegation of fraudulent behavior, Dr. Mann. As such, it is protected free speech rather than say, libel.)

Craig C
March 25, 2014 11:50 am

Mann: “The graph became a lightning rod in the climate change debate, and I, as a result, reluctantly became a public figure.”
Gee, that means that Stein can comment publicly about him. Hmm?

richardscourtney
March 25, 2014 11:50 am

Mike M:
re your post addressed to me at March 25, 2014 at 11:32 am.
You ask

I certainly submit the radiative properties of CO2 in water are different, (I never said they were like air), but by how much? What would happen? How about carbon soot particles in water versus none? Can you think of an experiment to illustrate what you stated – “and, thus, increases the temperature of the gas.”?

I twice attempted an experiment to quantify what you suggest.
But the real world varies from place to place. A result obtained from a single experiment would be open to dispute.
Idso conducted his eight “natural experiments” which derived his climate sensitivity values from a variety of environments. I think you would like to read his paper which is here
http://www.warwickhughes.com/papers/Idso_CR_1998.pdf
And the GCMs exist as an attempt to emulate all the complexities which occur in the atmosphere.
Regards
Richard

Bart
March 25, 2014 11:51 am

mellyrn says:
March 25, 2014 at 10:48 am
“Seems to me that CO2 is going to act just as effectively on incoming IR as it does on outgoing IR.”
It would, but most of the incoming sunlight energy is contained above the IR band. This energy heats the ground, but the ground radiates back relative to its temperature, and the Planck distribution peaks in the IR.
So, you essentially have something like a diode acting on current in an electrical circuit – the GHG layer allows sunlight to enter essentially unimpeded, but outgoing radiation is impeded.
But, the diode is shorted, because the heat “current” has other avenues than radiation to flow. That is the point I am trying to get across.
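Bart’s point that sunlight peaks above the IR while the ground re-emits in the IR can be illustrated with Wien’s displacement law; a minimal sketch using standard textbook temperatures:

```python
# Wien's displacement law: the emission peak wavelength scales as 1/T.
b = 2.898e-3  # Wien's displacement constant, m K

def peak_wavelength_um(T):
    """Peak emission wavelength (micrometres) of a blackbody at T kelvin."""
    return b / T * 1e6

print(peak_wavelength_um(5778.0))  # ~0.5 um: sunlight peaks in the visible
print(peak_wavelength_um(288.0))   # ~10 um: Earth's emission peaks in the IR
```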

March 25, 2014 11:59 am

fadingfool says:
March 25, 2014 at 8:55 am
http://climaterealists.com/?action=report&uid=8100&id=9004
Thought you might enjoy a picture of the “work” CO2 can do.

March 25, 2014 12:00 pm

@Bart (&others)
True, I don’t know anything about the effects other than GH
it just becomes too complicated if you have to take all factors into account, but your analogy of the radiator works for me.
I only speak about the GH effect as there are some here that actually do not believe that it exists.
To prove that it exists:
here in southern Africa we clearly have it much warmer on a cloudy winter’s night than on a winter’s night when it is not cloudy.
But even within that GH effect I truly believe that most have not understood that GHGs also cool the atmosphere. For example, the way we identify CO2 on other planets is by its absorption (and subsequent re-radiation) in a region of the near-infrared spectrum.
That means energy from the sun hits the CO2 and is sent back to space, leading to a cooling effect (on the atmosphere of the planet).
For comprehensive proof that CO2 is (also) cooling the atmosphere by re-radiating sunshine, see here:
http://www.iop.org/EJ/article/0004-637X/644/1/551/64090.web.pdf?request-id=76e1a830-4451-4c80-aa58-4728c1d646ec
They measured this re-radiation from CO2 as it bounced back to earth from the moon. So the direction was sun-earth (day)-moon(unlit by sun) -earth (night). Follow the green line in fig. 6, bottom. Note that it already starts at 1.2 um, then one peak at 1.4 um, then various peaks at 1.6 um and 3 big peaks at 2 um. You can see that it all comes back to us via the moon in fig. 6 top & fig. 7. Note that even methane cools the atmosphere by re-radiating in the 2.2 to 2.4 um range.

Bart
March 25, 2014 12:11 pm

HenryP says:
March 25, 2014 at 12:00 pm
“For comprehensive proof that CO2 is (also) cooling the atmosphere by re-radiating sunshine, see here:”
I like. Thanks for the link.
FWIW, I gave a mathematical description of why I think it is likely that the convective-radiative cycle should push the system to near zero CO2-to-temperature sensitivity here. And, near zero sensitivity is essentially what we see.
Aside from the long term trend and the ~60 year cycle, there is little change in temperatures since at least 1880 (the temperature record gets rapidly more uncertain before then – really, anything before 1900 is pretty dicey), thus little evidence of any response to rising CO2 at all in the surface temperatures.

March 25, 2014 12:15 pm

I believe there must be a typo in Lord Monckton’s text quoted here:
Science: The Arctic has not lost as much sea ice as had been thought. In the 1920s and 1930s there was probably less sea ice in the Arctic than there is today. The decline in sea ice is small in proportion to the seasonal variability, as the graph from the University of Illinois shows. And the part of the satellite record that is usually cited began in 1979. An earlier record, starting in 1973, showed a rapid growth in sea ice until it reached its peak extent in 1970.
When did the earlier satellite record actually start? It’s hard to see a “peak in 1970” when the data starts in 1973. Did the two dates get swapped, did the earlier satellite data start before 1973, or was the peak date misstated?
Otherwise, thanks for the point-by-point analysis.

March 25, 2014 12:32 pm

Bart says
anything before 1900 is pretty dicey
Henry says
I would say anything before 1940 is pretty dicey
because
a) show me a calibration certificate for a thermometer from before 1940
b) from 1970 measurements are done automatically, recorded every second, compared to the previous manual readings taken 4 times a day (if you are lucky)
c) accuracy of measurement has much improved (thermocouples) since 1970
So comparing data from before 1940 with data from 1970-2014 is like comparing apples with pears…
hence I only believe in my own results….
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
a true analysis of those results (and anyone duplicating them) will show that it will be cooling until 2040

Bart
March 25, 2014 12:32 pm

Bart says:
March 25, 2014 at 11:51 am
“This energy heats the ground, but the ground radiates back relative to its temperature, and the Planck distribution peaks in the IR.”
There is one more point I’d like to make relating to this. The radiation from the Earth is not genuine blackbody. In fact, there is a large gap right at where the peak should be, as in Figure 3.5 here.
AGW proponents like to claim that the gap is produced by the atmospheric absorption, and is readily apparent proof of the power of GHGs in blocking surface IR emissions.
But, I was giving this some thought, because it seemed to gainsay my suspicion that direct convective heating of the GHGs could cancel the radiative GHE, and I realized this: What does a spectrum with a huge chunk of red taken out look like to the eye? Well, it looks blue. And, the Earth is largely… blue, because of the oceans. And, the oceans look blue whether you are looking at the blue marble from space, or from the edge of a dock.
So, this gap is not necessarily indicative of atmospheric absorption in the IR. It mostly just means the oceans are blue.

mellyrn
March 25, 2014 12:45 pm

@ DirkH: Thank you; you’re very kind and those are good, clear figures. It looks as though there is very little 15um and 4um coming from the sun, or at least arriving at/near the surface.
But there wouldn’t need to be much, I’m thinking? Whatever IS there, the CO2 acts on it just as it does on the 15’s and 4’s heading out, limited only by how much CO2 there is.
It still seems that it must block as much as it traps. I could see the diode thing (@Bart), if there were NO 15’s or 4’s from the Sun.

pottereaton
March 25, 2014 1:42 pm

Re Michael Mann: Mark Steyn has hired three new lawyers and they sound fearsome:
http://www.steynonline.com/6201/what-kind-of-fool-am-i
Maybe he’ll get that $30 million after all.

richardscourtney
March 25, 2014 3:44 pm

crakar24:
re your question at March 25, 2014 at 3:03 pm.
Yes, your explanation is the reality as it is currently understood.
I commend the posts of Roger Brown (rgbatduke) and David Hoffer (davidmhoffer) for further introduction to the subject.
Richard

Bart
March 25, 2014 5:52 pm

crakar24 says:
March 25, 2014 at 3:03 pm
“So what you are saying is a GHG will absorb and release IR energy without any heat exchange because the kinetic energy of the GHG is not altered.”
Specifically, translational kinetic energy. See here.
mellyrn says:
March 25, 2014 at 12:45 pm
“I could see the diode thing (@Bart), if there were NO 15′s or 4′s from the Sun.”
The ground does not have to be excited by photons at 15 or 4 to emit at 15 or 4. See here.

crakar24
March 25, 2014 6:09 pm

Bart says:
March 25, 2014 at 5:52 pm
Thanks for the link Bart; at this point I am operating without a net as it is a long way from my field of expertise (electronics). I kind of understood what the wiki article states, so do you think my above statements to Richard are accurate? If so, then that’s good enough for me for now.
I will do some further reading on the subject and try to get a greater understanding of the process.
Thanks again
Crakar24

Robert_G
March 25, 2014 8:51 pm

Richardcourtney and rgbatduke thanks for the great discussions.
Since the conversation has taken a turn from “hide the decline and pass the sick-bucket, Alice,” I have a physics question for you Richard.
Since you say temperature is a reflection of only the translational movement of molecules (atoms): if one were able to reach absolute zero, would rotational and vibrational “intra-molecular movements” still be allowed (along with whatever is going on with the internal structure of the atoms themselves)? And could this “extra-thermal” energy theoretically be somehow further reduced to get a lower total energy state below the threshold of absolute zero? I’m guessing there is some quantum uncertainty principle involvement (e.g., momentum-position) that prevents the ultimate perfection of this process.
I apologize in advance if this is utter nonsense, but I’m intrigued.

Patrick
March 25, 2014 10:37 pm

Danger zone in 22 years? How many times have we heard similar claims? All failed! 22 years, not long to wait.

richardscourtney
March 25, 2014 11:27 pm

Robert_G:
re your question at March 25, 2014 at 8:51 pm.
Your question about 0 K has no relevance to this thread, but it concerns an imaginary state, and Mann’s ‘white line’ is also imaginary, so I will address it.
Firstly, matter is energy (E=mc^2), so if there were any matter at 0 K then there would be energy at 0 K.
By definition, 0 K would exist when there was no activity. In other words, at 0 K any matter could not exist as a gas and there could be no transfer of energy between states.
Richard

fadingfool
March 26, 2014 2:25 am

– you linked to a photo of frost in the shade with an assertion as an explanation?
– potato-powered radios used to be a standard experiment for children interested in physics and engineering. I was not, as you seem to think, getting into sophistry. Yes, I have studied physics (admittedly only to A-level, with Maths and Chemistry – the easy 3); I continued with Mathematics at university. So rather than getting hot under the collar, try a cogent argument that doesn’t treat IR of a limited (15 micron IIRC) spectrum as phlogiston.

March 26, 2014 3:55 am

An extraordinarily important question has not been raised yet. Mann asserts that enhanced global warming continues. He believes that a 2C rise in some 20 years is coming. Fine.
But since this means a median yearly global temperature rise of about 0.1C, where is all that energy coming from? Where are the early indicators that this energy rise is manifest? Merely arm-waving and asserting this rapid rate of global temperature increase is one thing – demonstrating that such an increase is both plausible and reasonable is quite another.
As we all know, relying on already falsified GCMs is no use – they don’t work. What makes Mann at SciAm real science? And not just another Sci-Scam?

rgbatduke
March 26, 2014 6:07 am

rgb – if back radiation actually existed and could be directly measured then we could harness it. Given as I have no “back radiation” powered flashlight I suspect it is likely to not be a physical force and more likely only an artefact of misguided thinking.
Backradiation does exist, we can directly measure it, but we cannot “harness” it, outside of the tiny currents it sets up in suitable photocells. We have a hard enough time harnessing sunlight at a reasonable efficiency (and some small fraction of the energy yielded by solar cells comes from back radiation — small because the frequencies are wrong not because of vastly different intensities).
One basic problem with harnessing random energy sources is the pesky second law. One has to have a substantial temperature difference between two reservoirs in order to be able to interpose a heat engine in between and run it. Back radiation is locally at more or less the same temperature as the outgoing radiation from the surface. And please, do not tell me that this means that it cannot “warm” the surface — it is part of an energy budget that eventually determines the temperature of the surface. The surface ends up warmer with it than it does without it, but net energy/heat consistently flows from warmer to cooler and the second law is quite happy with e.g. 1 layer models or steel greenhouses.
To the person who lamented the difference between heat and energy — don’t feel bad if you are a bit confused, it is a difficult concept. I’ll make it even worse. If you have a jar of N monoatomic non-interacting molecules at a fixed temperature, they have a total internal kinetic energy U = 3/2 N kT = \sum_i 1/2 m_i v_i^2. This internal kinetic energy is not heat. One can do work on the contents of the jar by changing its volume with a piston: W = \int P dV. This work is not heat. Heat is internal energy that flows (spontaneously) in or out of the jar through its boundary.
This is expressed in the first law:
\Delta Q_in = \Delta U_of + \Delta W_by
In words: Heat flowing into the jar is split between increasing the internal energy of the gas in the jar and the gas in the jar doing work. All three terms can have either sign, but the sum must remain consistent. Note also that we can never speak of the “heat content of the jar” — we can only talk about its total internal energy.
To a physicist, heat is a kind of random energy in flow. It is closely tied to entropy:
\Delta S = \Delta Q/T
which is a direct measure of the (irreversible) change in the disorder of a system as heat flows in or out.
That doesn’t mean that we are always particularly careful in our usage of the words — I’ll sometimes speak of the heat in a jar, and we all speak of the heat capacity of jars of gas — but when we get down to equations or use the concepts, we sharpen up.
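A standard textbook case makes the distinction concrete (assuming an ideal gas): compress the jar slowly at constant temperature. The internal energy does not change at all, yet heat flows:

```latex
% Isothermal compression of n moles of ideal gas from V_1 to V_2 < V_1.
% Temperature, and hence U, is unchanged; the work done BY the gas is
% negative, and an equal amount of heat flows OUT of the jar.
\Delta U = 0
\quad\Rightarrow\quad
\Delta Q_{\text{in}} = \Delta W_{\text{by}}
  = \int_{V_1}^{V_2} P\,dV
  = nRT \ln\frac{V_2}{V_1} < 0
```

So energy left the jar as heat even though its “internal kinetic energy” never changed.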
Hope this helps.
rgb

Bart
March 26, 2014 6:18 am

Robert_G says:
March 25, 2014 at 8:51 pm
“…if one were able to reach absolute zero, would rotational and vibrational “intra- molecular movements” still be allowed…”
No, because of the equipartition theorem.
“…I’m guessing there is some quantum uncertainty principle involvement…”
Indeed. See Zero-Point Energy.

Crusty the Ex-Clown
March 26, 2014 10:17 am

Uh, Dr. Mann, could we discuss faux principal components analysis, please?

Neo
March 27, 2014 1:47 pm

It looks like the National Science Foundation has been handing out grants for some unorthodox research projects, according to House Republicans.
This includes $700,000 in funding for a climate change musical.

http://dailycaller.com/2014/03/26/feds-spent-700000-on-a-climate-change-musical/
The Music Mann?
With a capital T and that rhymes with P and that stands for Phool….or Phraud?

March 27, 2014 2:23 pm

There was a great animated graphic a few years ago, putting the variation in temps in context from near term (20 years) out through a few thousand years. I am looking to find that, and find out what went into that. That graphic was, for me anyway, a critical factor in changing my outlook. Does any one have a link to this? Or a page explaining it? Thanks!

Reply to  Joe Landman
March 27, 2014 6:31 pm

@joe
Do a search on j storrs hall here on wuwt
He had some graphs on that, that convinced me

Cramer
March 29, 2014 1:50 am

Michael Mann’s “little white line” is calculated as follows:
Raw northern hemisphere temperature data (see links below):
nhBEST
nhHadCRUT4
Adjustment Calculations:
stdBEST = StdDev [nhHadCRUT4 (1850-2011)]
stdHadCRUT = StdDev [nhHadCRUT4 (1850-2011)]
baselinepre = Average [nhBEST (1750-1849)]
varadjBEST = nhBEST*(stdHadCRUT/stdBEST)-baselinepre
ModMeanHadCRU = Average [nhHadCRUT4 (1961-1990)]
ModMeanBEST = Average [nhHadCRUT4 (1961-1990)]
“little white line” = nhHadCRUT4 – ModMeanHadCRU + ModMeanBEST
This is shown in Mann’s MATLAB code:
http://www.meteo.psu.edu/holocene/public_html/supplements/EBMProjections/MatlabCode/EBMProjection.m
See lines 52 to 84; the “little white line” is calculated on line 84.
Mann’s nhBEST file:
http://www.meteo.psu.edu/holocene/public_html/supplements/EBMProjections/Data/BEST_annual_nh.dat
Mann’s nhHadCRUT4
http://www.meteo.psu.edu/holocene/public_html/supplements/EBMProjections/Data/HadCRUT4_annual_nh.dat
nhBEST data (see 3rd column — approx average of 12 monthly anomalies):
http://berkeleyearth.lbl.gov/auto/Regional/TAVG/Text/northern-hemisphere-TAVG-Trend.txt
nhHadCRUT4 data (see last column):
http://www.cru.uea.ac.uk/cru/data/temperature/HadCRUT4-nh.dat

Cramer
March 29, 2014 1:58 am

CORRECTION (to my March 29, 2014 at 1:50 am comment):
ModMeanBEST = Average [nhHadCRUT4(1961-1990)]
should have been
ModMeanBEST = Average [varadjBEST(1961-1990)]
where varadjBEST is variance adjusted nhBEST.

Cramer
March 29, 2014 2:06 am

ANOTHER CORRECTION (to my March 29, 2014 at 1:50 am comment):
stdBEST = StdDev [nhHadCRUT4 (1850-2011)]
should have been
stdBEST = StdDev [nhBEST (1850-2011)]
[Sorry for the errors. Please correct them during moderation if you can.]
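Putting the steps from my 1:50 am comment together (with both corrections applied), a rough Python equivalent of those calculations might look like the sketch below; the year/value lists are placeholders standing in for Mann’s annual NH BEST and HadCRUT4 anomaly files, not the real data:

```python
# Sketch of the "little white line" recipe, with both corrections applied.
# Inputs are parallel lists of annual anomaly values and their years.
from statistics import mean, pstdev

def _window(values, years, lo, hi):
    """Values whose year falls in [lo, hi]."""
    return [v for y, v in zip(years, values) if lo <= y <= hi]

def little_white_line(nh_best, best_years, nh_hadcrut4, had_years):
    # stdBEST / stdHadCRUT over 1850-2011 (corrected: stdBEST uses nhBEST)
    std_best = pstdev(_window(nh_best, best_years, 1850, 2011))
    std_had = pstdev(_window(nh_hadcrut4, had_years, 1850, 2011))
    # baselinepre = mean of BEST over the pre-industrial period 1750-1849
    baseline_pre = mean(_window(nh_best, best_years, 1750, 1849))
    # varadjBEST = variance-adjusted, pre-industrial-baselined BEST
    varadj_best = [v * (std_had / std_best) - baseline_pre for v in nh_best]
    # 1961-1990 means (corrected: ModMeanBEST uses varadjBEST)
    mod_mean_had = mean(_window(nh_hadcrut4, had_years, 1961, 1990))
    mod_mean_best = mean(_window(varadj_best, best_years, 1961, 1990))
    # Shift HadCRUT4 onto the variance-adjusted BEST (pre-industrial) baseline
    return [v - mod_mean_had + mod_mean_best for v in nh_hadcrut4]
```

The net effect is a constant upward shift of the HadCRUT4 series onto a pre-industrial baseline.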

Cramer
March 29, 2014 2:35 pm

HenryP,
The HadCRUT anomalies you provided from woodfortrees.org (or even straight from Hadley Center) are calculated relative to a base period of 1961-1990. The 2 degrees Celsius “threshold” is relative to preindustrial temperatures (pre-1850). You have to adjust your anomalies upward. Mann calculates this adjustment to be 0.45674 degrees Celsius from 1750-1849 variance-adjusted BEST temperatures. So your HadCRUT4 graph that looks to have recent temperatures at about 0.55 C (relative to 1961-1990) should be at 1.0 C (relative to pre-1850).
The additional 0.15 degrees C (above 1.0 C) as seen in the “Faux Pause” section of Mann’s graph is because it’s northern hemisphere temperatures. Mann says this in the graph description (see 1st sentence under “Danger Zone in 22 Years”):
“If the Northern Hemisphere’s surface temperatures rise more than two degrees Celsius above preindustrial levels (baseline),…”
I ran Mann’s MATLAB code using global HadCRUT4 and global BEST temperatures in place of northern hemisphere temperatures. The global temperature projections (sensitivity = 3) from his model cross the 2 degree Celsius “threshold” in 2041 instead of 2036.
Note: 2041 is before the 2046 northern hemisphere projection with sensitivity = 2.5.
It’s a simple energy balance model. Play around with it by entering your own assumptions. Mann even gives the different sensitivities of 1.5, 2.0, 2.5, 3.0, and 4.5 in his graph. These give 2.0 degree Celsius “threshold” dates of 2093, 2063, 2046, 2036, and 2021, respectively.

March 29, 2014 6:43 pm

@Cramer
Your analysis is simply wrong for the following reasons
1) ACCURACY
mercury thermometers were used before 1940, with 0.2 error, and I have challenged anyone to bring me an official re-calibration certificate from a thermometer used before 1940. I have not yet seen one. After 1970 we started using calibrated thermocouples with <0.05 error
2) NUMBER OF MEASUREMENTS
before 1940 they took 4 or 6 measurements a day, if you were lucky, compared to the continuous recording now, which is like every second of the day…
So comparing data from before 1940 with data from after 1970 is not comparing apples with apples. It is rather something like comparing apples with pears.
I reckon that with the introduction of modern techniques for determining average temperature, the results could easily have shifted upwards by about 0.2 degrees C. This is significant in any analysis.
In my wood for trees analysis I have looked only at data from 1987-2015. They all show a peak of warming being reached around 1998 and a cooling trend from the millennium. I had expected to find this because my own 3 data sets all show that we are cooling from around 2000. Here it is:
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
Note the specific sampling procedure followed to attain global representation.
Note with me that my sample for means (middle table, bottom, blue figures) reports the results of a warming rate of 0.13 C/decade from 1980 and 0.14 C/decade from 1990 which is very close to the values reported by others. It therefore seems most probable that my reported global cooling rate of -0.17C per decade from 2000 is also more or less correct.
Looking at the first table (red figures, bottom) we see maximum temperatures dropping at a rate best described by a binomial (quadratic) fit, like somebody throwing a ball.
The best fit I could make for this drop was
http://blogs.24.com/henryp/2012/10/02/best-sine-wave-fit-for-the-drop-in-global-maximum-temperatures/
because if it were projected from the binomial we would be falling into an ice age.
If you scroll down , you will note that in one station I had good data for maxima going down to 1942. Maxima is a much better proxy for evaluating energy coming in, as it is not affected by the change in the number of measurements.
Setting the speed of warming/cooling out against time, you get acceleration, or, in this case, deceleration, in degrees C / t2. When looking at that plot for the first time, it was as if God Himself gave me a revelation. The curve looks exactly like the speed of a thrown object plotted against time. My results suggest that earth is most likely on an 87 or 88 year A-C wave, the so-called Gleissberg solar/weather cycle, with ca. 44 years of warming followed by 44 years of cooling.
Indeed, I hope that this is the best fit for my data, because any of the other best fits that I could think of, would have us end up in much more global cooling. Other investigations confirm the very existence of the Gleissberg solar/weather cycle.
http://www.nonlin-processes-geophys.net/17/585/2010/npg-17-585-2010.html
http://www.nasa.gov/vision/earth/lookingatearth/nilef-20070319.html
Note that the results of my plot suggest that this global cooling already started in 1995 as far as energy-in is concerned and will last until ca. 2038. Also, from the look at my tables, it looks earth’s energy stores are depleted now and average temperatures on earth will probably fall by as much as what the maxima are falling now. I estimate this is about -0.3K in the next 8 years and a further -0.2 or -0.3K from 2020 until 2038. By that time we will be back to where we were in 1950, more or less…
The above are my own results. Three data sets of mine, maxima, means and minima showed that it is globally cooling.
From wood for trees we can see another 4 global data sets showing that it is cooling from the new millennium.
Yet, that is not all of it. We are cooling from the top latitudes downwards, so if you measure in the middle latitudes you would not find much change (due to more energy coming free from more condensation).
My own results show that it has been cooling significantly in Alaska, at a rate of -0.55 per decade since 1998.
http://oi40.tinypic.com/2ql5zq8.jpg
That is almost one whole degree C since 1998. And it seems NOBODY is telling the poor farmers there that it is not going to get any better.
NASA also admits now that antarctic ice is increasing significantly:
http://wattsupwiththat.com/2013/10/22/nasa-announces-new-record-growth-of-antarctic-sea-ice-extent/#more-96133
So, all in all, Mr Cramer, I have 7 global data sets showing we are globally cooling. My own data sets suggest that we will be cooling until at least 2038, and probably a few years after that as well.
I am sure you will feel the cooler weather soon.

Cramer
March 29, 2014 10:53 pm

HenryP,
You said, “your analysis is simply wrong for the following reasons.”
I made no analysis. I simply provided FACTS. I did NOT endorse Mann’s analysis. I made NO CLAIMS about climate change. I simply informed you that (1) the HadCRUT anomaly data you referenced at woodfortrees.org is relative to a base period of 1961-1990, (2) that Mann had to change this to a pre-industrial base period, and (3) that Mann used Northern Hemisphere temperature anomalies. These are FACTS.
The first series you referenced was the HadCRUT4 global mean (monthly). You select anomalies from 1987 to 2015. Look at the data. The first data point is 0.104 degrees Celsius at 1987.0. The last data point is 0.506 degrees Celsius at 2014.0. These are temperature ANOMALIES. This is relative to the average temperature anomaly for the 30 year period from 1961 to 1990. In the data set this period of 360 anomalies will average to zero.
You can see this for yourself. Try this exercise:
Pull up the entire HadCRUT4 monthly anomalies from January 1850 to January 2014 at woodfortrees.org. This is 1,968 data points. Put this in Excel and calculate a 30 year trailing moving average (360 data points). The moving average will start at December 1879 with a value of -0.30131 and run to Jan 2014 with a value of 0.324603. Plot this. You will see that the moving average time series remains negative until it passes through zero at December 1990, then it will turn and remain positive. The 30 yr trailing moving average for Dec 1990 is -0.00051 and for Jan 1991 is 0.000125.
It doesn’t matter if you believe that any of these temperature anomalies are correct. These are the FACTS of the HadCRUT4 data set you referenced. IT’S MATH.
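The same Excel exercise can be written in a few lines of code. Here is a minimal Python sketch of a right-aligned (trailing) moving average; the anomaly list would be the monthly HadCRUT4 values ordered from Jan 1850 onward, not included here:

```python
# Trailing moving average, as in the 360-month Excel exercise: the first
# output value corresponds to the window-th input month.
def trailing_moving_average(anoms, window=360):
    out = []
    running = 0.0
    for i, v in enumerate(anoms):
        running += v
        if i >= window:
            running -= anoms[i - window]  # drop the value that left the window
        if i >= window - 1:
            out.append(running / window)
    return out
```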

March 29, 2014 11:54 pm

cramer says
Pull up the entire HadCRUT4 monthly anomalies from January 1850 to January 2014 at woodfortrees.org. This is 1,968 data points. Put this in Excel and calculate a 30 year trailing moving average (360 data points). The moving average will start at December 1879 with a value of -0.30131 and run to Jan 2014 with a value of 0.324603. Plot this. You will see that the moving average time series remains negative until it passes through zero at December 1990, then it will turn and remain positive. The 30 yr trailing moving average for Dec 1990 is -0.00051 and for Jan 1991 is 0.000125.
henry says
although we are using linear equations to determine the speed of warming, we also know that the distribution behind the figures is non-linear. I detected the nature of this non-linearity and know what it is like.
Other studies confirm my own finding
e.g.
Persistence of the Gleissberg 88-year solar cycle over the last ˜12,000 years: Evidence from cosmogenic isotopes
Peristykh, Alexei N.; Damon, Paul E.
Journal of Geophysical Research (Space Physics), Volume 108, Issue A1, pp. SSH 1-1, CiteID 1003, DOI 10.1029/2002JA009390
Among other longer-than-22-year periods in Fourier spectra of various solar-terrestrial records, the 88-year cycle is unique, because it can be directly linked to the cyclic activity of sunspot formation. Variations of amplitude as well as of period of the Schwabe 11-year cycle of sunspot activity have actually been known for a long time and a ca. 80-year cycle was detected in those variations. Manifestations of such secular periodic processes were reported in a broad variety of solar, solar-terrestrial, and terrestrial climatic phenomena. Confirmation of the existence of the Gleissberg cycle in long solar-terrestrial records as well as the question of its stability is of great significance for solar dynamo theories. For that perspective, we examined the longest detailed cosmogenic isotope record—INTCAL98 calibration record of atmospheric 14C abundance. The most detailed precisely dated part of the record extends back to ˜11,854 years B.P. During this whole period, the Gleissberg cycle in 14C concentration has a period of 87.8 years and an average amplitude of ˜1‰ (in Δ14C units). Spectral analysis indicates in frequency domain by sidebands of the combination tones at periods of ≈91.5 ± 0.1 and ≈84.6 ± 0.1 years that the amplitude of the Gleissberg cycle appears to be modulated by other long-term quasiperiodic process of timescale ˜2000 years. This is confirmed directly in time domain by bandpass filtering and time-frequency analysis of the record. Also, there is additional evidence in the frequency domain for the modulation of the Gleissberg cycle by other millennial scale processes.
end quote
so why on earth would I look at a 30-year running average?
there may be a number of other cycles, like the PDO, AMO, moon cycles and volcanic cycles etc., that influence the weather, but the all-important one is the amount of energy being let through the door, i.e. the atmosphere, as this is the one that prescribes to the others how to behave.
Sorry if I misunderstood, but you seemed to be giving an understanding for Mann’s analysis, thereby endorsing it or at least partially agreeing with it. All I am saying is that you cannot compare data from before 1940 with data from after 1970 and I wanted to give you an understanding of what the non-linear behavior of the weather on earth really looks like, as calculated/estimated by me from accumulated data after 1970.
http://blogs.24.com/henryp/2013/04/29/the-climate-is-changing/
best wishes
henry

Cramer
March 30, 2014 1:08 am

HenryP says: “so why on earth would I look at a 30 yrs running average?”
Do you understand that if you calculate the average of HadCRUT4 data from 1961 to 1990 (30 annual data points or 360 monthly data points), it will average to zero? Do you understand why this is? It seems you do not, because you keep talking about exogenous data like solar cycles, lunar cycles, etc.
HenryP says: “but you seemed to be giving an understanding for Mann’s analysis, thereby endorsing it or at least partially agreeing with it.”
So, if I understand an analysis that means I’m endorsing it or agreeing with it? That is not logical. One must understand the analysis before they can agree or disagree with it. If someone doesn’t understand it, their opinion (whether in agreement or disagreement) is worthless.
HenryP says: “All I am saying is that you cannot compare data from before 1940 with data from after 1970.”
Then why did you reply to my comment? That has nothing to do with my posting of the calculations in Mann’s MATLAB code. I posted that because the subject of this blog post was questioning where the “little white line” came from. If police were questioning people about a crime and I was a witness, I would inform the police what I knew. That does not mean I’m endorsing the crime. Anthony Watts posted the graph from Michael Mann and his own graphs that include data from 1850 to 2011. All these graphs are wrong if you cannot compare data before 1940 with data after 1970. Your first comment was on March 24. Why didn’t you bring this issue up earlier???

March 30, 2014 9:11 am

@cramer
average Hadcrut 4 data from 1960-1990 is not a straight line?
http://www.woodfortrees.org/plot/hadcrut4gl/from:1927/to:2015/plot/hadcrut4gl/from:2002/to:2015/trend/plot/hadcrut3gl/from:1987/to:2015/plot/hadcrut3gl/from:2002/to:2015/trend/plot/rss/from:1987/to:2015/plot/rss/from:2002/to:2015/trend/plot/hadsst2gl/from:1987/to:2015/plot/hadsst2gl/from:2002/to:2015/trend/plot/hadcrut4gl/from:1960/to:1990/trend/plot/hadcrut3gl/from:1987/to:2002/trend/plot/hadsst2gl/from:1987/to:2002/trend/plot/rss/from:1987/to:2002/trend/plot/hadcrut4gl/from:1930/to:1960/trend
either way, the above plot shows more or less what I know is the underlying distribution, namely a general warming trend obvious from 1927 until 1998. This is exactly what we could expect to see, naturally, from the 88-year Gleissberg cycle, where 1927 was a turning point; at the same time, we do not fully know what the average global temperature was, exactly, before 1927,
As I said, due to
poor accuracy of thermometers
and
different methods of data collection.
The problem I have is that people do not (want to?) understand that global warming is over now and that global cooling has begun. We are globally cooling from around 2000.
Note that it really was very cold in the 1940s… The Dust Bowl drought 1932-1939 was one of the worst environmental disasters of the Twentieth Century anywhere in the world. Three million people left their farms on the Great Plains during the drought and half a million migrated to other states, almost all to the West. http://www.ldeo.columbia.edu/res/div/ocp/drought/dust_storms.shtml
I find that as we are moving back up from the deep end of the 88-year sine wave, there will be a standstill in the change of the speed of cooling, neither accelerating nor decelerating, at the bottom of the wave; therefore, naturally, there will also be a lull in pressure difference at those >[40] latitudes, where the Dust Bowl drought took place, meaning: no wind and no weather (read: rain).
2014-88=1926. We have about 6 or 7 years before the beginning of the droughts at the higher latitudes.

Cramer
March 30, 2014 11:53 am

HenryP says: “average Hadcrut 4 data from 1960-1990 is not a straight line?”
Straight line? Where did you get this? I never claimed it was a straight line. I said it averages to zero. There is a reason for this. 1961 to 1990 is the base period. This was chosen as a base period for exactly the reason you give: “poor accuracy” of earlier data, such as pre-industrial data. However, you do not seem to understand this. Otherwise you would acknowledge the facts of my previous comments rather than go off on tangents that have nothing to do with what I have said.
HenryP says: “The problem I have is that people do not (want to?) understand that global warming is over now and that global cooling has begun.”
Why are you preaching this to me? I did not make any claim or suggestion that this was not true. Please remove the @cramer from your comments and direct your comments to someone that is actually claiming global warming is continuing with or without pause. Otherwise, if you continue to misrepresent what I have said, I will have to continue to defend myself. I have only given facts. If you want to dispute the facts (e.g. I actually did make a mistake in describing Mann’s MATLAB code), please do so.

March 30, 2014 12:25 pm

Cramer says: I said it averages to zero.
Henry says
if that were the case, it (=a linear regression) should be a straight line (between 1960-1990)
So, clearly, WFT is using original data. You cannot “correct” data. The data is the data. If you “correct” the data you have something else.
Otherwise, I am glad that we both agree that we have started to cool globally.
That helps.

March 30, 2014 12:34 pm

Henry said
if that were the case, it (=a linear regression) should be a straight line (between 1960-1990)
Henry says: this should read
if that were the case, it (=a linear regression) should be a straight line with zero slope i.e. no incline or decline (between 1960-1990)

Cramer
March 30, 2014 1:54 pm

HenryP says: “if that were the case, it (=a linear regression) should be a straight line with zero slope i.e. no incline or decline (between 1960-1990)”
That is not true. What is the average of this data series: -2, -1, 0, 1, 2? It is a straight line with slope = 1, and it averages to zero. Add the five points and divide by five. Why not just take an average of the 360 data points (Jan 1961 to Dec 1990) in HadCRUT4? Add the 360 points and divide by 360. It is equal to -0.00051. Since it’s not exactly zero (due to rounding error), this is why I wanted you to calculate the “trailing moving average,” because you can then see this cross zero at Dec 1990. Surely you know how to use a spreadsheet, if you are calculating sine waves. Do you? Do you know what a moving average is? Maybe I assumed you did, when you might not.
HenryP says: “So, clearly, WFT is using original data.”
I never claimed WFT was NOT using the original data. It’s the same data from the Hadley Center. The original data is calculated with a baseline of 1961-1990. WFT is using this same data.
HenryP says: “You cannot “correct” data.”
I never said anything about “correcting” data. “Correcting” implies the original data is wrong. Changing the baseline of temperature anomalies is not “correcting” it. You seem to believe these temperature anomalies are ABSOLUTE temperatures. They are NOT. They have to be calculated relative to a baseline. In the graph used above to compare different data, they used a baseline of Jan 1951 to Dec 1980 (although that looks to be a typo — but they did specify it because it is important to do so). I could adjust the data to a baseline of 1945 to 1965. It does not make the data wrong if that is specified. I weigh 170 lbs. If I start saying that I weigh 77.11 kg, does that mean I am not giving my correct weight?
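In code, changing the baseline of an anomaly series is nothing more than subtracting one constant; a minimal sketch with made-up numbers:

```python
# Re-baseline anomalies: subtract the mean over the chosen base period.
# The anomalies are unchanged relative to each other; only the zero moves.
def rebaseline(anoms, years, base_lo, base_hi):
    base = [v for y, v in zip(years, anoms) if base_lo <= y <= base_hi]
    offset = sum(base) / len(base)
    return [v - offset for v in anoms]
```

By construction the re-based series averages to zero over its own base period, which is exactly why HadCRUT4 averages to zero over 1961-1990.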
“Otherwise, I am glad that we both agree that we have started to cool globally.”
There you go making assumptions again. I no more believe in your Gleissberg sine wave theory than I believe in Mann’s 2036 prediction. I’m a skeptic. Those who believe they know the truth (warming or cooling) are not skeptics. When the heat capacity of the oceans is 1000 times the heat capacity of the atmosphere, that’s where the answer lies — and it has not been figured out yet. No warming from the mid-1940s to the mid-1970s (when we were emitting a lot of CO2), warming from the mid-1970s to late-1990s, then no warming again. These periods do correspond with ENSO activity (El Niño vs La Niña predominance). See heat capacity here:
http://wattsupwiththat.com/2011/04/06/energy-content-the-heat-is-on-atmosphere-vs-ocean/

March 31, 2014 9:47 am

Cramer says
I no more believe in your Gleissberg sine wave theory, than I believe in Mann’s 2036 prediction.
Henry says
it is not theory
it is fact
my own results merely confirm it

Cramer
March 31, 2014 1:35 pm

HenryP says: “it is not theory it is fact”
It is a fact that global temperatures will decrease by 0.6 deg C by 2038?

April 1, 2014 8:51 am

Cramer says
(re-stating Henry’s claim)
It is a fact that global temperatures will decrease by 0.6 deg C by 2038?
Henry says
According to my own data set, we are already down by ca. -0.2 since 2000.
The other 4 major global data sets make this about -0.1 since 2002, on average
Now, as I said before, the major cooling actually is happening from the top latitudes downward,
so between [60-70] it is cooling more
For example, my own results show that it has been cooling significantly in Alaska, at a rate of -0.055 per annum since 1998 (average of 10 weather stations in Alaska)
http://oi40.tinypic.com/2ql5zq8.jpg
That is almost 1 degree C since 1998.
So,
all I am saying is that by 2040 temperatures will be back, more or less, to where we were in 1950
It seems to me that around 1950, it was -0.1?
http://www.woodfortrees.org/plot/hadcrut4gl/from:1927/to:2015/plot/hadcrut4gl/from:2002/to:2015/trend/plot/hadcrut3gl/from:1987/to:2015/plot/hadcrut3gl/from:2002/to:2015/trend/plot/rss/from:1987/to:2015/plot/rss/from:2002/to:2015/trend/plot/hadsst2gl/from:1987/to:2015/plot/hadsst2gl/from:2002/to:2015/trend/plot/hadcrut4gl/from:1960/to:1990/trend/plot/hadcrut3gl/from:1987/to:2002/trend/plot/hadsst2gl/from:1987/to:2002/trend/plot/rss/from:1987/to:2002/trend/plot/hadcrut4gl/from:1930/to:1960/trend
In 2000 we were +0.5 or +0.6
So, what is not to understand of my claim for a drop of 0.6 by 2038?

Cramer
April 1, 2014 3:53 pm

HenryP says: “So, what is not to understand of my claim for a drop of 0.6 by 2038?”
Now you’re calling “it” a claim. What were you calling a fact? [And don’t tell me “Gleissberg cycles” because I explicitly said “your Gleissberg sine wave theory.”]
What’s not to understand? Why did you cherry-pick Nome and Anchorage Intl Airport as a proxy for upper latitudes? Did you look at the “Alaska” data from 1978 to 1999 (be sure to cherry-pick those dates)? The 1978 to 1999 temperatures have a downward trend that is about as steep as your 1998-2013 data. Should the 1978 to 1999 trend be upward or downward according to your “sine wave theory?”
There are also other upper-latitude locations such as Greenland and Iceland (not just Nome and Anchorage Intl Airport). Have you ever heard of gridded data? If you are going to cherry-pick Alaska, at least use gridded data:
http://berkeleyearth.lbl.gov/auto/Regional/TAVG/Text/alaska-TAVG-Trend.txt
http://www.ncdc.noaa.gov/temp-and-precip/alaska/tmp/mon/0
What’s not to understand? Why did you cherry-pick the Gleissberg Cycle? Do you even understand that the Gleissberg cycle is simply a cycle in the amplitude modulation of the Schwabe Cycle? It has little effect on solar forcing. Can you provide an estimate of the variation in solar forcing due to the Gleissberg Cycle?

April 1, 2014 8:18 pm

Cramer says
What’s not to understand? Why you cherry picked Nome and Anchorage Intl Airport as a proxy for upper latitudes? Did you look at the “Alaska” data from 1978 to 1999 (be sure to cherry pick those dates)? The 1978 to 1999 temperatures have a downward trend that about as much as your 1998-2013 data.
Henry says
I did no such thing. I only chose to show the linear regression lines of those two stations as otherwise the picture becomes too cluttered. I chose 10 weather stations in Alaska precisely from a source other than Berkeley, namely tutiempo.net.
Of the 10 stations observed, only one station showed an upward trend, namely Barrow, and there I put a question mark at one particular result that does not fit in with the other 9.
Nevertheless, to determine the average trend, I took the 10 slopes (including Barrow) and averaged these, to give me an average slope of -0.055 degrees C per annum, or -0.55 degrees C/decade, since 1998.
I chose 1998 as my starting point as most data sets (including my own), seem to agree that earth was at its warmest point in 1998, i.e. when its output was at its maximum.
As far as Greenland and Iceland are concerned: most stations are on the coast, which gives a false impression, due to warmer currents that can carry warmth for decades after cooling has started (just like Barrow, perhaps).
If you chose to believe Berkeley, and not me, that is your choice.
Note my results for the Anchorage Army base station going back to 1942 (for maxima only)
2nd graph, below the global graph, here
http://blogs.24.com/henryp/2012/10/02/best-sine-wave-fit-for-the-drop-in-global-maximum-temperatures/
hence I know that the cooling in Alaska will continue until 2038.
There is a small variation within the TSI when measured over time, especially in the UV sector.
A small difference in the distribution of light coming from the sun affects the production of the ozone, peroxides and nitrogenous oxides lying at the TOA, and these deflect more radiation to space when there is more of them. The higher the latitude, the more pronounced this effect at the TOA, apparently; e.g. note the difference between the Anchorage and the global graph for the speed of warming.
Every place on earth is on its own particular Gleissberg wave (of 88 years)

Cramer
April 1, 2014 9:57 pm

“hence I know that the cooling in Alaska will continue until 2038.”
Alaska started cooling in 1978. That’s 60 years to 2038 (a 120-year cycle). You used Anchorage Intl Airport and Nome Airport for your linear trends in your graph. 1978 to 1999 had a steeper negative trend than 1998-2013 (about double for Anchorage):
Anchorage Intl Airport (1978-1999): -0.047 K/yr
Anchorage Intl Airport (1998-2013): -0.024 K/yr
Anchorage Intl Airport (1978-2013): -0.007 K/yr
Nome Airport (1978-1999): -0.081 K/yr
Nome Airport (1998-2013): -0.080 K/yr
Nome Airport (1978-2013): -0.027 K/yr
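Trends like those above are just ordinary least-squares slopes of annual temperature against year; a minimal sketch of that calculation (the station values themselves would come from the files linked earlier, not this toy data):

```python
# Ordinary least-squares slope (deg C per year) of temperature vs. year.
def ols_slope(years, temps):
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    num = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den
```

Note how sensitive the answer is to the start and end years chosen, which is the whole cherry-picking point.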
“Every place on earth is on its own particular Gleissberg wave (of 88 years)”
The actual Gleissberg solar cycle peaked about 1960 (max sunspots during Schwabe Cycle 19).
So I guess every place on earth can have its own cycle by having different peaks and different durations. Alaska temperatures peaked in 1978 with a 120-year cycle. Some cycles are 88 years, others are 30 years, 40 years, … 120 years, etc. Some peaks occur in 1960 with the peak of Gleissberg. Others peak in 1978. Others in 1998.

April 1, 2014 11:09 pm

Cramer says
Anchorage Intl Airport (1978-1999): -0.047 K/yr
Anchorage Intl Airport (1998-2013): -0.024 K/yr
Anchorage Intl Airport (1978-2013): -0.007 K/yr
Henry says
those results do not compare well with my results (from tutiempo.net) for Anchorage airport
0.0114 K/yr 1973-2012
0.0104 K/yr 1980-2012
0.0060 K/yr 1990-2012
-0.1234 K/yr 2000-2012
(see 2nd part, 2nd table – means, marked Anchorage a/p)
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
Clearly you can see a declining warming trend over time, which turned negative before the new millennium and can be fitted with a binomial (quadratic) with high correlation,
but hopefully my sine wave is the correct best fit, as otherwise we might fall into an ice age.
Perhaps, if you understand the seriousness of the problem here, you could try to see how I obtained these particular results? I explained this at the beginning of my tables and gave an example, e.g. New York, Kennedy a/p
Note that it is better to look at maxima (first table) as a proxy for energy coming through as here you will find a lot less noise due to error and weather.

bushbunny
April 1, 2014 11:17 pm

Do you think Alaska is a good example? You have reduced daylight (on the edge of the Arctic circle and the land of the midnight sun). You cannot provide much dairy produce because of lack of pastures, although moose is eaten. You are naturally cold anyway. Fishing is great. And I’d love to go on one of those cruises to see the glaciers etc. You are more adapted to cold and alpine conditions than most of America. And close to those pesky volcanoes.

April 1, 2014 11:36 pm

@bushbunny
I took samples of the temperature from everywhere. Alaska is a good example of what is happening globally. At the lower latitudes you get more rain and condensation, which compensates for energy loss: when water condenses it releases energy. However, at the +[40] latitudes you will get droughts as a result of global cooling, and ironically, it might get warmer….
NASA also admits now that antarctic ice is increasing significantly.
http://wattsupwiththat.com/2013/10/22/nasa-announces-new-record-growth-of-antarctic-sea-ice-extent/#more-96133
I am merely re-stating that the global cooling is happening from the top [90] latitudes downward, and you might be confused into believing that earth is still warming, by those whose business depends on the carbon scare.

Editor
April 1, 2014 11:40 pm

Cramer says:
April 1, 2014 at 9:57 pm

Alaska started cooling in 1978. That’s 60 years to 2038 (120 year cycle). You used Anchorage Intl Airport and Nome Airport for your linear trend in your graph. 1978 to 1999 had a steeper negative trend than 1998-2013 (about double for Anchorage):

I have no idea what you mean by “Alaska started cooling in 1978”. Using what metric? 10 year trailing trends? 20 year centered trends? And using which dataset?
Also, was 1978 the point when one of those (or something like it) went below zero? Or was 1978 the point when one of those (or something like it) peaked and started decreasing?
You can see the reason for my lack of comprehension.
w.

Cramer
April 2, 2014 12:45 am

HenryP says: “(see 2nd part, 2nd table – means, marked Anchorage a/p)”
Your 2nd table is for 2000-2011 (Last 12 yrs). Your graph is for 1998-2013. I used 1998-2013 (YOU CHOSE THE DATES!)
Your data from tutiempo.net gives -0.01 K/yr for 1978 to 2013 (more of a decline than my -0.007 K/yr).
My data came from:
http://berkeleyearth.lbl.gov/auto/Stations/TAVG/Text/43378-TAVG-Data.txt
http://www.arh.noaa.gov/cliMap/akClimate.php
Your tutiempo.net is missing 2002 and 2005. 2013 is 3.7 deg C vs BEST 4.144 deg C. The other years are close. What data source did you use to fill in the missing data? Or did you fudge it?
HenryP says: “Perhaps, if you understand the seriousness of the problem here, you could try and see how I obtained these particular results? I explained this”
What seriousness??? This is very easy stuff. This is not rocket science, HenryP. You have been sounding as if you have just learned this recently and do not have any formal education in science, engineering, or math. Or you have a Dunning–Kruger bias.
Do some work and update your tables. Do some work and calculate the 1998-2013 trend. You chose those dates, not I. You were lazy when going through the HadCRUT data; and now you are being lazy again.

Cramer
April 2, 2014 1:10 am
April 2, 2014 6:27 am

Cramer says
Your tutiempo.net is missing 2002 and 2005. 2013 is 3.7 deg C vs BEST 4.144 deg C. The other years are close. What data source did you use to fill in the missing data? Or did you fudge it?
Henry says
I don’t fix or fudge results. I explained the sampling technique including on how to fill in missing data before the main tables begin.
My main tables merely showed me that there was serious cooling happening in Alaska, evident from the results at both Anchorage Airport and Anchorage Air Force Base since the beginning of the millennium. Both stations showed that it was warming until around 1998 or so.
I subsequently looked at 8 more stations in Alaska, where most data only go back to 1995.
All results together show that it has been cooling significantly in Alaska, at a rate of -0.055 K per annum since 1998 (that is the average of 10 weather stations in Alaska)
http://oi40.tinypic.com/2ql5zq8.jpg
That is almost -1.0 degree C since 1998.
The work involved is first-year statistics,
but I have not seen anyone doing a trend of the speed of warming against time.
Apparently you have to be a genius like me to see that it drops like the curve of a thrown object.
Hence the reason I am 100% confident that we will be cooling globally until 2038

Cramer
April 2, 2014 3:29 pm

HenryP says: “I don’t fix or fudge results.”
HenryP wrote on his blog: “take the average of that particular month’s preceding year and year after”
This is fudging it. When you continue to say things like this, it shows your background. What do you think fudging means?
Your fudging method is not even robust. You should have at least attempted to include month-over-month comparisons, not only year-over-year comparisons. I just read on this blog that Chicago had the coldest winter on record (Dec-Mar). Comparing Feb-2014 to Jan-2014 can give you more info than simply comparing Feb-2014 to Feb-2013 and Feb-2015 (if Feb-2015 existed). But it is still best to find another temperature record for Chicago if you are missing Feb-2014. That can be adjusted for bias (both level and variation) between data sets.
Your TuTiempo data is missing 2002 and 2005 for all ten of your Alaska data sets. This should have been a red flag for you.
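[Note: a minimal sketch of the bias-adjusted infill Cramer alludes to, assuming simple mean/variance matching over an overlap period; the data values below are invented for illustration.]

```python
# Fill a gap in one station record from a second record, after adjusting
# for level (mean offset) and variation (std ratio) over the overlap.
from statistics import mean, stdev

def fill_from_reference(target_overlap, ref_overlap, ref_value_at_gap):
    """Map a reference reading into the target record's level and spread."""
    scale = stdev(target_overlap) / stdev(ref_overlap)
    return mean(target_overlap) + scale * (ref_value_at_gap - mean(ref_overlap))

# Toy overlap: the reference runs ~0.5 K warm with the same variability,
# so a reference reading of 4.2 maps to ~3.7 in the target record.
target = [3.5, 4.0, 2.5, 3.0]
ref    = [4.0, 4.5, 3.0, 3.5]
print(round(fill_from_reference(target, ref, 4.2), 2))
```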
HenryP says: “I subsequently looked at 8 more stations in Alaska where most data only go back to 1995.”
THIS IS NOT TRUE. TuTiempo.net gives more complete data from 1973 onwards. Seven out of ten of your Alaska stations are not missing a single year from 1973-1999. Dawson and Tanana are each missing only one month from 1978-1999. Nenana is the only station missing a significant amount of data.
DO THE WORK!
Here are the results using TuTiempo.net data:
Anchorage Intl Airport (1978-1999): -0.045 K/yr
Anchorage Intl Airport (1998-2013): -0.037 K/yr
Anchorage Intl Airport (1978-2013): -0.010 K/yr
Nome Airport (1978-1999): -0.104 K/yr
Nome Airport (1998-2013): -0.065 K/yr
Nome Airport (1978-2013): -0.039 K/yr
Ten AK Stations Avg (1978-1999): -0.025 K/yr
Ten AK Stations Avg (1998-2013): -0.052 K/yr
Ten AK Stations Avg (1978-2013): +0.001 K/yr
Again, both Anchorage and Nome have steeper rates of temperature decline for 1978-1999 than for 1998-2013. This is a simple lesson in cherry picking.
Here is the data:
Anchorage Intl Airport (1978-2013) =
4.6,4.1,2.3,3.9,1.3,3.0,3.8,2.3,3.5,3.6,3.2,2.3,1.8,2.8,2.1,4.2,2.6,3.1,1.7,3.6,3.2,1.6,3.5,3.0,4.0,4.1,3.8,4.2,2.3,2.8,1.6,2.6,3.1,2.8,1.7,3.7
Nome Airport (1978-2013) =
0.1,-1.0,-1.5,-0.5,-1.6,-0.5,-3.6,-2.6,-2.0,-2.2,-2.1,-2.6,-3.1,-1.7,-4.3,-0.9,-3.3,-1.8,-2.7,-1.8,-1.6,-5.2,-1.2,-3.2,-0.8,-1.3,-0.6,-1.2,-3.2,-1.2,-4.0,-3.3,-2.7,-2.6,-4.6,-2.0
Average of Ten AK Stations (1978-2013) =
-0.75,-1.54,-2.28,-0.31,-3.08,-1.93,-2.72,-2.48,-1.96,-1.35,-1.61,-2.22,-2.84,-1.82,-3.02,-0.31,-2.34,-1.58,-3.39,-1.34,-0.87,-3.38,-1.42,-1.75,-0.39,-0.93,-1.09,-0.65,-2.24,-1.41,-2.87,-2.00,-1.33,-1.86,-3.45,-1.54
[Note: be careful of line breaks if you copy this data.]
Here’s the missing data points that I used:
Anchorage Intl Airport (2002,2005) = 4.0, 4.2
Nome Airport (2002,2005) = -0.8, -1.2
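[Note: Cramer's three Anchorage trends can be reproduced from the annual means he posts above with a plain least-squares fit. A sketch in Python; the `trend` helper is illustrative, not code from the thread.]

```python
# Cramer's posted Anchorage Intl Airport annual means, 1978-2013
anchorage = [4.6,4.1,2.3,3.9,1.3,3.0,3.8,2.3,3.5,3.6,3.2,2.3,1.8,2.8,2.1,4.2,
             2.6,3.1,1.7,3.6,3.2,1.6,3.5,3.0,4.0,4.1,3.8,4.2,2.3,2.8,1.6,2.6,
             3.1,2.8,1.7,3.7]

def trend(years, temps):
    """Ordinary least-squares slope of temps against years, in K/yr."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    sxx = sum((x - mx) ** 2 for x in years)
    return sxy / sxx

years = list(range(1978, 2014))

print(round(trend(years[:22], anchorage[:22]), 3))   # 1978-1999: -0.045
print(round(trend(years[20:], anchorage[20:]), 3))   # 1998-2013: -0.037
print(round(trend(years, anchorage), 3))             # 1978-2013: -0.01
```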
HenryP says: “Apparently you have to be a genius like me to see that it drops like the curve of a thrown object. Hence the reason I am 100% confident that we will be cooling globally until 2038.”
Yes, this confirms your arrogance (Dunning–Kruger bias). No, you don’t have to be a genius; even a ten-year-old child can understand and recognize the curve of a thrown object.

April 3, 2014 7:21 am

Let us deal with the various issues separately, in several posts
First of all re. to my applied correction for missing data
Cramer says
your fudging method is not even robust. You should have at least attempted to include month-over-month comparisons, not only year-over-year comparisons. I just read on this blog that Chicago had the coldest winter on record (Dec-Mar). Comparing Feb-2014 to Jan-2014 can give you more info than simply comparing Feb-2014 to Feb-2013 and Feb-2015 (if Feb-2015 existed). But it is still best to find another temperature record for Chicago if you are missing Feb-2014. That can be adjusted for bias (both level and variation) between data sets.
Your TuTiempo data is missing 2002 and 2005 for all ten of your Alaska data sets. This should have been a red flag for you.
Henry says
Starting with your last remark: remember that the whole year’s data is shown as missing, but usually only part of one particular month’s daily data was actually missing.
For the years in question, the other 11 months’ data were available, so I collected those 11 months separately. In a month with missing daily data, I looked at how many daily values I had. If there were more than 15 days of daily data, I took the average for the month. If there were fewer than 15 days of daily data, and let us say that was for November 2002, I looked at November 2001 and November 2003 and took the average of those two months as the figure for November 2002. I then proceeded to calculate the average yearly temperature for 2002 from the 12 months of 2002.
Now, if I understand you correctly, what I hear you say is that, in the above example, I should rather have taken October 2002 and December 2002 and calculated the average of those, to fill in as the result for November 2002.
I can honestly say that I did not think of doing that as the variations within months can be quite dramatic, especially in the arctic. This course of action is debatable. I am not saying your method is better than mine or that mine is better. I honestly don’t think it will make such a big difference as the true amount of “fudging” error is in any case diluted by 12, seeing that we had the true exact daily data of the remaining 11 months of the year.
I hope you agree with me on that?
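[Note: the infill rule HenryP describes above can be sketched in a few lines of Python; the function name and arguments are illustrative only, not taken from his tables.]

```python
# HenryP's rule: if a month has more than 15 daily readings, average what is
# there; otherwise substitute the average of the same month in the preceding
# and following year.
def monthly_mean(daily, same_month_prev_year, same_month_next_year):
    if len(daily) > 15:
        return sum(daily) / len(daily)
    return (same_month_prev_year + same_month_next_year) / 2

# e.g. a November 2002 with only 10 daily values falls back on the average
# of Nov 2001 and Nov 2003 rather than the 10 days actually observed:
print(monthly_mean([2.0] * 10, 4.0, 6.0))  # 5.0, not 2.0
```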

April 3, 2014 8:53 am

Cramer says
DO THE WORK!
henry says
Clearly, I don’t have to do any more work once I know what is happening.
But I invite you to repeat my results.
Here are my results for Anchorage airport (again)
0.0114 K/yr 1973-2012 (that means 1973 to 2012 – not including 2012)
0.0104 K/yr 1980-2012
0.0060 K/yr 1990-2012
-0.1234 K/yr 2000-2012
Here are my results for Elmendorf Air Force Base in Anchorage:
0.0245 K/yr 1973-2012
0.0193 K/yr 1980-2012
0.0220 K/yr 1990-2012
-0.1785 K/yr 2000-2012
The last figures, 2000-2012 in Alaska, seemed a bit steep to me (the cooling, I mean), so I looked at 10 stations in Alaska, getting an average result of
-0.055 K/yr 1998-2013
Now, going off the means,
here are my results for maxima (the average for 47 weather stations, spread equally NH and SH, and 70/30 at sea / inland):
First table, bottom
0.036 K/yr 1974 -2012
0.028 K/yr 1980-2012
0.015 K/yr 1990-2012
-0.013 K/yr 2000-2012
Now, my dear Cramer, anyone who knows a little bit of stats would see a clear pattern emerging here.
I did a linear fit, setting the speed of warming out against time, on those 4 results for the drop in the speed of global maximum temps,
and ended up with y = 0.0018x - 0.0314, with r² = 0.96.
I was at least 95% sure (max) temperatures were falling.
On the same maxima data, a polynomial fit of 2nd order, i.e. parabolic, gave me
y = -0.000049x² + 0.004267x - 0.056745
r² = 0.995
That is very high, showing a natural relationship, like the trajectory of somebody throwing a ball…
Projecting the above parabolic fit backward (5 years) showed a turn in the curve happening around 40 years ago. You always have to be careful with forward and backward projection, but you can do so with such a high correlation (0.995).
Ergo: the final curve must be a sine wave fit, with another turn happening somewhere at the bottom…
http://blogs.24.com/henryp/2012/10/02/best-sine-wave-fit-for-the-drop-in-global-maximum-temperatures/
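[Note: HenryP's quoted linear fit can be reproduced if his x-variable is taken to be the period length in years (38, 32, 22, 12); he does not state this explicitly, so it is an inference. A sketch:]

```python
# OLS fit of warming speed (K/yr) against period length in years.
# The x values (span lengths) are an assumption, inferred from the quoted fit.
spans = [38, 32, 22, 12]             # 1974-2012, 1980-2012, 1990-2012, 2000-2012
speeds = [0.036, 0.028, 0.015, -0.013]

n = len(spans)
mx, my = sum(spans) / n, sum(speeds) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(spans, speeds))
sxx = sum((x - mx) ** 2 for x in spans)
syy = sum((y - my) ** 2 for y in speeds)

slope = sxy / sxx
intercept = my - slope * mx
r2 = sxy ** 2 / (sxx * syy)
print(round(slope, 4), round(intercept, 4), round(r2, 2))  # 0.0018 -0.0314 0.96
```

This recovers y = 0.0018x - 0.0314 with r² = 0.96 as quoted; Cramer's objection below is about what that fit means, not its arithmetic.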
Now tell me,
do you honestly still think I am stupid?
Dunning–Kruger bias, you say?

April 3, 2014 9:15 am

Perhaps I should tell you why I started to avoid Anglo-Saxon stations (BEST etc.).
I wrote a report on that as well,
but you can query me on it if you are interested.
(Clearly you work for one of those institutes that I don’t trust.)

Cramer
April 3, 2014 3:49 pm

HenryP says: “Now, if I understand you correctly, what I hear you say is that, in the above example, I should rather have taken October 2002 and December 2002.”
No, you did not understand me correctly. I said, “you should have at least attempted to INCLUDE month-over-month comparisons, NOT ONLY year-over-year comparisons.” It was not an either-or; it was both. You should use ALL the data available. First, you should be using other data sets to check for inconsistencies and to fill in missing data. There are USHCN, GHCN, GISS, and BEST, to name a few. Tutiempo is missing 2002 and 2005 at all 10 of your AK stations. This is a problem with Tutiempo, not the weather stations. Did you ever compare tutiempo to BEST raw station data? The monthly data is close to exact. BEST even has a QC column. Notice that failed=1 for Nov 2013 in the Anchorage data (last data point).
HenryP says: “Now,my dear Cramer, anyone who knows a little bit of stats,would see a clear pattern emerging here,”
You have introduced severe autocorrelation into your time series. “Anyone who knows a little bit of stats” would notice this pattern:
12/38 = 32%
12/32 = 38%
12/22 = 55%
12/12 = 100%
Your last 12 data points from 2000-2011 (“not including 2012”?) represent a larger and larger proportion of your data. Try your regression analysis with this data:
1, 0.05
2, 0.10
3, 0.15
4, 0.20
5, 0.18
This data is clearly linear from 1 to 4 with a slope of 0.05. The last point then drops to 0.18 with a slope of -0.02.
slope(1 to 5) = 0.036
slope(2 to 5) = 0.029
slope(3 to 5) = 0.015
slope(4 to 5) = -0.02
This is very close to your results (maxima for 47 stations).
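[Note: Cramer's toy series and window slopes check out; a sketch reproducing them, with an illustrative `slope` helper that is not code from the thread.]

```python
# Cramer's toy series: linear (slope 0.05) for points 1-4, then a drop.
xs = [1, 2, 3, 4, 5]
ys = [0.05, 0.10, 0.15, 0.20, 0.18]

def slope(x, y):
    """Ordinary least-squares slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Overlapping windows that all end at the last point, as in HenryP's table:
for start in range(4):
    print(xs[start], round(slope(xs[start:], ys[start:]), 3))
# 0.036, 0.029, 0.015, -0.02: the same "declining speed of warming" shape,
# manufactured purely by the shared endpoint.
```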
Your analysis is erroneous. R^2 is meaningless for your regression. Why not take your analysis to someone with expertise in statistics? They will tell you the same thing that I have. Knowing how to do a regression in Excel does not mean you have much knowledge of statistics.
HenryP says: “Perhaps I should tell you why I started to avoid anglo saxon stations (BEST etc)”
That doesn’t make sense. Do you actually believe tutiempo and BEST have their own stations at Anchorage airport?
HenryP says: “clearly you work for one of those institutes that I don’t trust.”
It’s an irrational bias on your part. Someone could form an irrational Spanish bias because tutiempo.net is missing a lot of data from 2002 and 2005. That’s why you need to look at the data from multiple sources, not just one.
Here’s the BEST vs tutiempo average temperatures for Anchorage Airport in 2012:
Month, BEST, TuTiempo
1, -15.932, -15.9
2, -3.670, -3.6
3, -6.131, -6.1
4, 3.685, 3.8
5, 7.629, 7.6
6, 12.37, 12.5
7, 13.324, 13.3
8, 13.708, 13.8
9, 9.357, 9.4
10, 1.238, 1.3
11, -7.069, -7.1
12, -8.823, -8.8
So if tutiempo is missing April 2012 (3.8 deg C) you would rather fudge your own number than use that evil BEST data?
tutiempo Apr 2011 = 3.1
tutiempo Apr 2013 = -1.3
average = 0.9
BEST Apr 2012 = 3.685
You need to leave your emotions of hate out of your analysis.
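[Note: the size of the error in that one infilled month, diluted across the annual mean, is simple arithmetic; the division by 12 below is an editorial illustration, not Cramer's calculation.]

```python
# Infill by averaging Apr 2011 and Apr 2013 vs the BEST value for Apr 2012
fudged = (3.1 + (-1.3)) / 2         # 0.9 deg C
best = 3.685                        # BEST Apr 2012
monthly_error = best - fudged       # ~2.8 deg C in that single month
annual_error = monthly_error / 12   # diluted across the yearly mean
print(round(monthly_error, 1), round(annual_error, 2))  # 2.8 0.23
```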

April 4, 2014 6:24 am

Cramer says
So if tutiempo is missing April 2012 (3.8 deg C) you would rather fudge your own number than use that evil BEST data?
tutiempo Apr 2011 = 3.1
tutiempo Apr 2013 = -1.3
average = 0.9
BEST Apr 2012 = 3.685
You need to leave your emotions of hate out of your analysis.
Henry says
I needed a source of data, as unbiased as possible and as global as possible,
with minima, to see if there was any man-made warming,
and with maxima, to see what the natural pattern of warming was.
I chose tutiempo and dealt with the missing-data problem as stated, rightly (by me) or wrongly (according to you).
Quite honestly, I don’t think it is going to make much of a difference to the outcome,
as the other 11 months of data in 2002 and 2005 were available.
Even in the example that you give, which I assume was cherry-picked,
it did not affect the final year-average by more than 0.3 degrees C, which is not much if you look at the variation in the average yearly temperature in Anchorage.

April 4, 2014 6:49 am

Cramer says
Your analysis is erroneous. R^2 is meaningless for your regression. Why not take your analysis to someone with expertise in statistics? They will tell you the same thing that I have. Knowing how to do a regression in Excel does not mean you have much knowledge of statistics.
Henry says
(remember we are looking at maxima here)
if the average speed of warming (on a randomly selected, balanced global sample) is found by me
to be as follows:
0.036 K/yr 1974 -2012
0.028 K/yr 1980-2012
0.015 K/yr 1990-2012
-0.013 K/yr 2000-2012
during the periods indicated,
you honestly don’t see that there must be a strong relationship between time and the speed of warming?
whether you take it as linear or binomial or as a sine wave (we really must hope the latter to be true),
surely anyone who knows a little bit about statistics would be able to do a test and see that the correlation R^2 is significant at the 95% confidence level.
The problem is that at the universities there are too many scientists who refuse to accept these results because of “public opinion”; hence nobody, including you, is doing the (little) work required to repeat my results.
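[Note: the naive significance test HenryP appeals to can be written out. The critical value t(0.05 two-sided, df = 2) = 4.303 is hardcoded from standard tables. As Cramer argues below, the four points are not independent, so the test's premise does not hold here; this only shows what the arithmetic would say if they were.]

```python
import math

# Naive t-test for the significance of a correlation with n = 4 points:
# t = r * sqrt((n - 2) / (1 - r^2)), compared against t(0.05, df = n - 2).
r2, n = 0.96, 4
t = math.sqrt(r2) * math.sqrt((n - 2) / (1 - r2))
print(round(t, 2), t > 4.303)  # 6.93 True
```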

Cramer
April 4, 2014 4:59 pm

HenryP says: “you honestly don’t see that there must be a strong relationship between time and the speed of warming?”
It appears you did not read what I wrote in my previous comment. Your Y-values are not independent of each other. If you want to see a relationship between time and the speed of warming you should fit the original data to a polynomial or do a segmented linear regression:
1974-1979
1980-1989
1990-1999
2000-2012
Segment it how you want (equal or non-equal), but your segments cannot overlap.
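[Note: the non-overlapping segmented fit Cramer suggests can be sketched on synthetic data; the series below is invented, with known slopes, so the recovered segment trends can be checked exactly.]

```python
# Synthetic series: +0.03 K/yr until 1999, then -0.05 K/yr, with no noise.
def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

years = list(range(1974, 2013))
temps = [0.03 * (yr - 1974) if yr < 2000 else
         0.03 * (1999 - 1974) - 0.05 * (yr - 1999) for yr in years]

# Non-overlapping segments, as suggested; each is fitted independently.
for lo, hi in [(1974, 1979), (1980, 1989), (1990, 1999), (2000, 2012)]:
    xs = [yr for yr in years if lo <= yr <= hi]
    ys = [t for yr, t in zip(years, temps) if lo <= yr <= hi]
    print(lo, hi, round(ols_slope(xs, ys), 3))
```

Because no segment shares points with another, each fitted slope reflects only its own period, unlike overlapping windows that all end at the same last point.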
HenryP says: “surely anyone who knows a little bit about statistics would be able to do a test and see that the correlation R^2 is significant on the 95% confidence level.”
Your R^2 = 0.96 (or R^2=0.995 for polynomial fit) is meaningless because your regression is meaningless. Learn about the assumptions required for regression. Learn how you can still have an R^2 with a high value (>0.95), but you still have a poor fit or a bad model.
HenryP says: “The problem is that at the universities there are too many scientists who refuse to accept these results because of ‘public opinion’; hence nobody, including you, is doing the (little) work required to repeat my results.”
Yes, you are correct. You did “(little) work.” It’s not your results. It’s your methods. I did the work. You did not do the work (including educating yourself). You refused to investigate the criticisms. You simply kept repeating (cutting-and-pasting) the little work that you did over a year ago. I understand your results better than you do. You did not even understand how to correctly average your ten Alaska weather stations. You drew a question mark on the Barrow weather station, not understanding that it is a North Slope location and that its results are consistent with other North Slope locations such as Deadhorse/Prudhoe, Wainwright, and Barter Island, which all continue to warm (oi40.tinypic.com/2ql5zq8.jpg). But you did not like that it was continuing to warm, because it went completely against your thesis: “the major cooling actually is happening from the top latitudes downward.” You didn’t want to believe that Anchorage might be affected by the Pacific Decadal Oscillation, but then you eliminated Greenland and Iceland from your analysis because of effects from the Gulf Stream and weather stations being on the coast. I guess it is okay to include Anchorage when the PDO is cooling it, but not okay when the PDO is warming it. You also compared Greenland to Barrow because both are warming.
HenryP says: “I don’t think it is going to make much of a difference to the outcome,”
I agree. When your analysis methods are scientifically flawed, the quality of your data does not make much of a difference.
HenryP says: “Even in the example that you give, which I assume was cherry picked,…”
Yes, you are correct. It was cherry-picked, but in the opposite direction of your implied assumption. I chose April 2012 because it was the worst month (the greatest difference between BEST and tutiempo data), not the best.

April 5, 2014 12:05 am

Cramer says
If you want to see a relationship between time and the speed of warming you should fit the original data to a polynomial or do a segmented linear regression:
1974-1979
1980-1989
1990-1999
2000-2012
Segment it how you want (equal or non-equal), but your segments can not overlap.
Henry says
Ok, I will eat humble pie here and say that you are probably right in cutting it up that way; in fact, I think I will cut it up into the actual Schwabe solar cycles that have been apparent. I still have all the files, and I could now also include the results for 2012 and 2013.
The only problem is that at the moment I do not have much time available for this. This is just my hobby.
Having said that, however, I still think that this exercise will not change my analysis that much.
All the major data sets say that we started cooling from just before the new millennium, just as my results predicted.
http://www.woodfortrees.org/plot/hadcrut4gl/from:1987/to:2015/plot/hadcrut4gl/from:2002/to:2015/trend/plot/hadcrut3gl/from:1987/to:2015/plot/hadcrut3gl/from:2002/to:2015/trend/plot/rss/from:1987/to:2015/plot/rss/from:2002/to:2015/trend/plot/hadsst2gl/from:1987/to:2015/plot/hadsst2gl/from:2002/to:2015/trend/plot/hadcrut4gl/from:1987/to:2002/trend/plot/hadcrut3gl/from:1987/to:2002/trend/plot/hadsst2gl/from:1987/to:2002/trend/plot/rss/from:1987/to:2002/trend
In the latter half of your comment you accuse me of bias.
I think I have gone out of my way to explain my sampling procedure to obtain a globally representative sample.
Note that I was more interested in the pattern of warming being allowed through the atmosphere than in the PDO and AMO etc.
I know that earth has an intricate system of storing energy for years, hence the survival of the planet for such a long time.
However, if you say that I should also have looked at those places on the warmer Gulf Stream, then I say that you should remember that there is a lag between energy-in and energy-out. Counting back 88 years, i.e. 2013 - 88, we are in 1925.
Now look at some eyewitness reports of the ice back then:
http://wattsupwiththat.com/2008/03/16/you-ask-i-provide-november-2nd-1922-arctic-ocean-getting-warm-seals-vanish-and-icebergs-melt/
Sound familiar? Back then, in 1922, they had seen that the Arctic ice melt was due to the warmer Gulf Stream waters. However, by 1950 all that same “lost” ice had frozen back. I therefore predict that all the lost Arctic ice will also come back in 2020-2035, as happened from 1935-1950. Antarctic ice is already increasing.