Foreword by Anthony Watts
An essay by Monckton of Brenchley follows, but first I wanted to draw attention to this graphic from Dr. Mann’s recent Scientific American article. In the infamous “hide the decline” episode revealed by Climategate, concerning the modern-day ending portion of the “hockey stick”, Mann has been accused of using “Mike’s Nature Trick” to hide the decline in modern (proxy) temperatures by splicing on the surface record. In this case, the little white line from his SciAm graphic shows how “the pause” is labeled a “faux pause” (a little play on words), and how the pause is elevated above past surface temperatures.
Source: http://www.scientificamerican.com/sciam/assets/Image/articles/earth-will-cross-the-climate-danger-threshold-by-2036_large.jpg

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since no citation is given for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, while showing the current temperature equal to 1998, which doesn’t make any sense.
So, over the weekend I asked Willis Eschenbach to use his “graph digitizer” tool (which he has used before) to turn Mann’s little white line into numerical data, and he happily obliged.
Here is the result when Mann’s little white line is compared and matched to two well known surface temperature anomaly datasets:
What is most interesting is that Mann’s “white line” shows a notable difference during the “pause” from HadCRUT4 and GISS LOTI. Why would our modern era of “the pause” be the only place where a significant divergence exists? It’s like “hide the decline” déjà vu.
The digitized data from Mann’s white line is available here: Manns_white_line_digitized.xlsx
As of this writing, we don’t know what dataset was used to create Mann’s white line of surface temperature anomaly, or the base period used. On the SciAm graphic it simply says “Source: Michael E. Mann” on the lower right.
It isn’t the GISS land ocean temperature index (LOTI), which starts in 1880. And it doesn’t appear to be HadCRUT4 either. Maybe it is BEST, but without the data going back to 1750? That isn’t likely either, since BEST pretty much matches the other datasets, and while Mann’s graphic above peaks out at above 1°C, none of those datasets reaches higher than 0.7°C. What’s up with that?
Now compare the plot above to this portion of Dr. Mann’s SciAm plot, noting the recent period of surface temperature and the 1°C reference line, which I extended from the Y axis:
I’m reminded of Dr. Mann’s claims about climate skeptics in this video: http://www.linktv.org/video/9382/inside-the-climate-wars-a-conversation-with-michael-mann
At 4:20 in the video, Dr. Mann claims that US climate skeptics are part of the “greatest disinformation campaign ever run”. If his position is so strong and pure, why then do we see silly things like this graph given with an elevated ending of global surface temperature (in contrast to 5 other datasets) and not a single data source citation given?
UPDATE: Mark B writes in comments:
Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since no citation is given for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, while showing the current temperature equal to 1998, which doesn’t make any sense.
Explanation of graph including links to source code and data were given here: http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/
REPLY: Yes, I’ve seen that, but there is a discrepancy: the label on the image is “Historical Mean Annual Temperature” (white).
In http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/ it is written:
Historical Simulations. The model was driven with estimated annual natural and anthropogenic forcing over the years A.D. 850 to 2012. Greenhouse radiative forcing was calculated using the approximation (ref. 8) FGHG = 5.35log(CO2e/280), where 280 parts per million (ppm) is the preindustrial CO2 level and CO2e is the “equivalent” anthropogenic CO2. We used the CO2 data from ref. 9, scaled to give CO2e values 20 percent larger than CO2 alone (for example, in 2009 CO2 was 380 ppm whereas CO2e was estimated at 455 ppm). Northern Hemisphere anthropogenic tropospheric aerosol forcing was not available for ref. 9 so was taken instead from ref. 2, with an increase in amplitude by 5 percent to accommodate a slightly larger indirect effect than in ref. 2, and a linear extrapolation of the original series (which ends in 1999) to extend through 2012.
“Historical Mean Annual Temperature” is NOT the same as “Historical Simulations”. It looks to me like a bait and switch.
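For reference, the forcing approximation quoted in that methods passage is straightforward to evaluate. A minimal sketch (note that the “log” in the quoted formula is the natural logarithm, as is standard for this approximation; the example values come from the quoted passage itself):

```python
import math

def ghg_forcing(co2e_ppm, preindustrial_ppm=280.0):
    """Greenhouse radiative forcing in W/m^2, using the logarithmic
    approximation quoted above: 5.35 * ln(CO2e / 280)."""
    return 5.35 * math.log(co2e_ppm / preindustrial_ppm)

# Worked example from the quoted passage: CO2e estimated at 455 ppm in 2009.
print(round(ghg_forcing(455), 2))   # roughly 2.6 W/m^2
```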
UPDATE2: Note that the lead-in text says “Global temperature rise…”
But in comments, Willis and Bill Illis have worked out that the white line represents only half the planet, the Northern Hemisphere. The white line is HadCRUT NH value, not global.
Obviously we can’t take statements such as the lead-in text saying “global” at face value. Imagine if a climate skeptic had made a graph like this. We’d be excoriated.
What needs to be done is to create a graph showing what this would have looked like had Mann not cherry-picked the NH and presented it under the heading “Global temperature rise…”.
==============================================================
Mann’s ‘False Hope’ is false hype
By Christopher Monckton of Brenchley
The legendary Dr Walter Lewin, Professor of Physics at MIT, used to introduce his undergraduate courses by saying that every result in physics depends ultimately on measurement; that mass, distance, and time are its three fundamental physical units; that every observation in these and all of their derivative units is subject to measurement uncertainty; and that every result in physics, if only for this reason, is to some degree uncertain.
Contrast this instinctual humility of the true physicist with the unbecoming and, on the evidence to date, unjustifiable self-assurance of the surprisingly small band of enthusiasts who have sought to tell us there is a “climate crisis”. Not the least among these is Michael Mann, perpetrator of the Hokey-Stick graph that wrought the faux abolition of the medieval warm period.
In logic, every declarative statement is assigned a truth-value: 1 (or, in computer programs, –1) for true, 0 for false. Let us determine the truth-values of various assertions made by Mann, in a recent article entitled False Hope, published in the propaganda-sheet Scientific American.
Mann’s maunderings and meanderings will be in bold face, followed by what science actually says in Roman face, and the verdict: Truth-value 1, or truth-value 0?
Mann: “Global warming continues unabated.”
Science: Starting in Orwell’s Year (1984), and taking the mean of the five standard global temperature datasets since then, the rate of warming has changed as follows:
1979-1990 Aug 140 months +0.080 Cº/decade.
1979-2002 Apr 280 months +0.153 Cº/decade.
1979-2013 Dec 420 months +0.145 Cº/decade.
The slowdown in the global warming rate has arisen from the long pause, now 13 years 2 months in length on the mean of all five datasets (assuming that HadCRUT4, which is yet to report, shows a result similar to the drop in global temperatures reported by the other four datasets).
Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.
Mann: “… during the past decade there was a slowing in the rate at which the earth’s average surface temperature had been increasing. The event is commonly referred to as “the pause,” but that is a misnomer: temperatures still rose, just not as fast as during the prior decade.”
Science: During the decade February 2005 to January 2014, on the mean of all five datasets, there was a warming of 0.01 Cº, statistically indistinguishable from zero.
Verdict: Truth-value 0. Temperatures did not rise in any statistically significant sense, and the increase was within the measurement uncertainty in the datasets, so that we do not know there was any global warming at all over the decade. Here, Walter Lewin’s insistence on the importance of measurement uncertainty is well demonstrated.
Mann: “In response to the data, the IPCC in its September 2013 report lowered one aspect of its prediction for future warming.”
Science: In 2013 the IPCC reduced the lower bound of its 2007 equilibrium climate-sensitivity interval from 2 Cº to 1.5 Cº warming per CO2 doubling, the value that had prevailed in all previous Assessment Reports. It also reduced the entire interval of near-term projected warming from [0.4, 1.0] Cº to [0.3, 0.7] Cº. Furthermore, it abandoned its previous attempts at providing a central estimate of climate sensitivity.
Verdict: Truth value 0. The IPCC did not lower only “one aspect of its prediction for future warming” but several key aspects, abandoning the central prediction altogether.
Mann: “If the world keeps burning fossil fuels at the current rate, it will cross a threshold into environmental ruin by 2036. The ‘faux pause’ could buy the planet a few extra years beyond that date to reduce greenhouse gas emissions and avoid the crossover–but only a few.”
Science: Mann is asserting that on the basis of some “calculations” he says he has done, the world will face “environmental ruin” by 2036 or not long thereafter. However, Mann has failed to admit any uncertainty in his “calculations” and consequently in his predictions.
Verdict: Truth-value 0. Given the ever-growing discrepancy between prediction and observation in the models, and Mann’s own disastrous record in erroneously abolishing the medieval warm period by questionable statistical prestidigitation, the uncertainty in his predictions is very large, and a true scientist would have said so.
Mann: “The dramatic nature of global warming captured world attention in 2001, when the IPCC published a graph that my co-authors and I devised, which became known as the ‘hockey stick’. The shaft of the stick, horizontal and sloping gently downward from left to right, indicated only modest changes in Northern Hemisphere temperature for almost 1,000 years–as far back as our data went.”
Science: The Hokey-Stick graph falsely eradicated both the medieval warm period and the little ice age. At co2science.org, Dr. Craig Idso maintains a database of more than 1000 papers demonstrating by measurement (rather than modeling) that the medieval warm period was real, was near-global, and was at least as warm as the present just about everywhere. McIntyre & McKitrick showed the graph to be erroneous, based on multiple failures of good statistical practice. The medieval warm period and the little ice age are well attested in archaeology, history, architecture, and art. It was the blatant nonsense of the Hokey Stick that awoke many to the fact that a small academic clique was peddling unsound politics, not sound science.
Verdict: Truth value 0. Once again, Mann fails to refer to the uncertainties in his reconstructions, and to the many independent studies that have found his methods false and his conclusions erroneous. Here, he takes a self-congratulatory, nakedly partisan stance that is as far from representing true science as it is possible to go.
Mann: “The upturned blade of the stick, at the right, indicated an abrupt and unprecedented rise since the mid-1800s.”
Science: The graph, by confining the analysis to the northern hemisphere, overstated 20th-century global warming by half. Mann says the rise in global temperatures, shown on the graph as 1.1 Cº over the 20th century, is “unprecedented”. However, the Central England Temperature Record, the world’s oldest, showed a rise of 0.9 Cº in the century from 1663 to 1762, almost entirely preceding the industrial revolution, compared with an observed rate of just 0.7 Cº over the 20th century. The CETR is a good proxy for global temperature change. In the 120 years to December 2013 it showed a warming rate within 0.01 Cº of the warming rate taken as the mean of the three global terrestrial datasets.
Verdict: Truth value 0. The warming of the 20th century was less than the warming for the late 17th to the late 18th centuries.
Mann: “The graph became a lightning rod in the climate change debate, and I, as a result, reluctantly became a public figure.”
Science: For “lightning-rod” read “laughing-stock”. For “reluctantly” read “enthusiastically”. For “public figure” read “vain and pompous charlatan who put the ‘Ass’ in ‘Assessment Report’”.
Verdict: Pass the sick-bucket, Alice.
Mann: “In its September 2013 report, the IPCC extended the stick back in time, concluding that the recent warming was likely unprecedented for at least 1400 years.”
Science: The IPCC is here at odds with the published scientific literature. In my expert review of the pre-final draft of the Fifth Assessment Report, I sent the IPCC a list of 450 papers in the reviewed literature that demonstrated the reality of the warm period. The IPCC studiously ignored it. Almost all of the 450 papers are unreferenced in the IPCC’s allegedly comprehensive review of the literature. I conducted a separate test using the IPCC’s own methods, by taking a reconstruction of sea-level change over the past 1000 years, from Grinsted et al. (2009), and comparing it with the schematic in the IPCC’s 1990 First Assessment Report showing the existence and prominence of both the medieval warm period and the little ice age. The two graphs are remarkably similar, indicating the possibility that the sea-level rise in the Middle Ages was caused by the warmer weather then, and that the fall in the Little Ice Age was caused by cooler weather. The sea-level reconstruction conspicuously does not follow a Hokey-Stick shape.
Verdict: Truth value 0. The IPCC has misrepresented the literature on this as on other aspects of climate science. There are of course uncertainties in any 1000-year reconstruction, but if Grinsted et al. have it right then perhaps Mann would care to explain how it was that sea level rose and fell by as much as 8 inches either side of today’s rather average value if there was no global warming or cooling to cause the change?
Mann: “Equilibrium climate sensitivity is shorthand for the amount of warming expected, given a particular fossil-fuel emissions scenario.”
Science: Equilibrium climate sensitivity is a measure of the global warming to be expected in 1000-3000 years’ time in response to a doubling of CO2 concentration, regardless of how that doubling came about. It has nothing to do with fossil-fuel emissions scenarios.
Verdict: Truth value 0. Mann may well be genuinely ignorant here (as elsewhere).
Mann: “Because the nature of these feedback factors is uncertain, the IPCC provides a range for ECS, rather than a single number. In the September report … the IPCC had lowered the bottom end of the range. … The IPCC based the lowered bound on one narrow line of evidence: the slowing of surface warming during the past decade – yes, the faux pause.”
Science: For well over a decade there has been no global warming at all. The pause is not faux, it is real, as Railroad Engineer Pachauri, the IPCC’s joke choice for climate-science chairman, has publicly admitted. And the absence of any global warming for up to a quarter of a century is not “one narrow line of evidence”: it is the heart of the entire debate. The warming that was predicted has not happened.
Verdict: Truth value 0. Mann is here at odds with the IPCC, which – for once – paid heed to the wisdom of its expert reviewers and explicitly abandoned the models, such as that of Mann, which have been consistent only in their relentless exaggeration of the global warming rate.
Mann: “Many climate scientists – myself included – think that a single decade is too brief to accurately measure global warming and that the IPCC was unduly influenced by this one, short-term number.”
Science: Overlooking the split infinitive, the IPCC was not “unduly influenced”: it was, at last, taking more account of evidence from the real world than of fictitious predictions from the vast but inept computer models that were the foundation of the climate scare. Nor was the IPCC depending upon “one short-term number”.
James Hansen of NASA projected 0.5 C°/decade global warming as his “business-as-usual” case in testimony before Congress in 1988. The IPCC’s 1990 First Assessment Report took Hansen’s 0.5 C°/decade as its upper bound. It projected 0.35 C°/decade as its mid-range estimate, and 0.3 C°/decade as its best estimate.
The pre-final draft of the 2013 Fifth Assessment Report projected 0.23 C°/decade as its mid-range estimate, but the published version reduced this value to just 0.13 C°/decade – little more than a quarter of Hansen’s original estimate of a quarter of a century previously.
Observed outturn has been 0.08 Cº/decade since 1901, 0.12 C°/decade since 1950, 0.14 C°/decade since 1990, and zero since the late 1990s.
Three-quarters of the “climate crisis” predicted just 24 years ago has not come to pass. The Fifth Assessment Report bases its near-term projections on a start-date of 2005. The visible divergence of the predicted and observed trends since then is remarkable.
It is still more remarkable how seldom in the scientific journals the growing discrepancy between prediction and observation is presented or discussed.
Verdict: Truth value 0. Step by inexorable step, the IPCC is being driven to abandon one extremist prediction after another, as real-world observation continues to fall a very long way short of what it had been predicting.
Mann: “The accumulated effect of volcanic eruptions during the past decade, including the Icelandic volcano with the impossible name, Eyjafjallajökull, may have had a greater cooling effect on the earth’s surface than has been accounted for in most climate model simulations. There was also a slight but measurable decrease in the sun’s output that was not taken into account in the IPCC’s simulations.”
Science: So the models failed to make proper allowance for, still less to predict, what actually happened in the real world.
Verdict: Truth value 0. Eyjafjallajökull caused much disruption, delaying me in the United States for a week (it’s an ill wind …), but it was a comparatively minor volcanic eruption whose signature in the temperature record cannot be readily distinguished from the La Niña cooling following the El Niño at the beginning of 2010. The discrepancy between the models’ predictions and observed reality can no longer be plausibly dismissed in this way, and the IPCC knows it.
Mann: “In the latter half of the decade, La Niña conditions persisted in the eastern and central tropical Pacific, keeping global surface temperatures about 0.1 degree C colder than average …”
Science: There were La Niña (cooling) events in 1979, 1983, 1985, 1989, 1993, 1999, 2004, and 2008 – the last being the only La Niña in the second half of the noughties. There were, however, two El Niño (warming) events: in 2007 and 2010.
Verdict: Truth value 0. There is very little basis in the observed record for what Mann says. He is looking for a pretext – any pretext – rather than facing the fact that the models have been programmed to exaggerate future global warming.
Mann: “Finally, one recent study suggests that incomplete sampling of Arctic temperatures led to underestimation of how much the globe actually warmed.”
Science: And that “study” has been debunked. The numerous attempts by meteorological agencies around the world to depress temperatures in the early 20th century to make the centennial warming rate seem larger than it is have far outweighed any failure to measure temperature change in one tiny region of the planet.
Verdict: Truth value 0. Increasingly, as the science collapses, the likes of Mann will resort in desperation to single studies, usually written by one or another of the remarkably small clique of bad scientists who have been driving this silly scare. Meanwhile, the vrai pause continues. As CO2 concentrations increase, the Pause is not likely to continue indefinitely. But it is now clear that the rate at which the world will warm will be considerably less than the usual suspects have predicted.
Mann: “When all the forms of evidence are combined, they point to a most likely value for ECS that is close to three degrees C.”
Science: The IPCC has now become explicit about not being explicit about a central estimate of climate sensitivity. Given that two-thirds of Mann’s suggested 3 Cº value depends upon the operation over millennial timescales of temperature feedbacks that Mann himself admits are subject to enormous uncertainties; given that not one of the feedbacks can be directly measured or distinguished by any empirical method either from other feedbacks or from the forcings that triggered it; and given that non-radiative transports are woefully represented in the models, there is no legitimate scientific basis whatsoever for Mann’s conclusion that a 3 Cº climate sensitivity is correct.
Verdict: Truth value 0. What Mann is careful not to point out is that the IPCC imagines that only half of the warming from a doubling of CO2 concentration will arise in the next 200 years; the rest will come through only over 1000-3000 years. Now, at current emission rates a doubling of the pre-industrial 280 ppmv CO2 will not occur for 80 years. However, 0.9 Cº of warming has already occurred since 1750, leaving only another 0.6 Cº to occur by 2280, on the assumption that all of the 0.9 Cº was manmade. And that is if Mann and the models are right.
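The arithmetic in that paragraph can be checked in a few lines, taking the paragraph’s own assumptions at face value (an ECS of 3 Cº per CO2 doubling, only half of it arriving within roughly 200 years, and 0.9 Cº of warming since 1750 treated as entirely manmade):

```python
# A check of the arithmetic above, on the paragraph's own assumptions.
ecs = 3.0                 # assumed equilibrium climate sensitivity, C per doubling
near_term_fraction = 0.5  # fraction of equilibrium warming arriving by ~2280
observed_since_1750 = 0.9 # warming since 1750, assumed wholly manmade

remaining_by_2280 = ecs * near_term_fraction - observed_since_1750
print(round(remaining_by_2280, 1))  # the 0.6 C "still to come" figure in the text
```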
Mann: “And as it turns out, the climate models the IPCC actually used in its Fifth Assessment Report imply an even higher value of 3.2 degrees C.”
Science: The 2007 Fourth Assessment Report said there would be 3.26 Cº warming at equilibrium after a CO2 doubling. But the 2013 Fifth Report said no such thing. It has fallen commendably silent.
Verdict: Truth value 0. Mann is, yet again, at odds with the IPCC, which has now begun to learn that caution is appropriate in the physical sciences.
Mann: “The IPCC’s lower bound for ECS, in other words, probably does not have much significance for future world climate–and neither does the faux pause.”
Science: This is pure wishful thinking on Mann’s part. In all Assessment Reports except the Fourth, the IPCC chose 1.5 Cº as its lower bound for equilibrium climate sensitivity to doubled CO2 concentration. In the Fourth it flirted briefly with 2 Cº, but abandoned that value when faced with the real-world evidence that Mann sneeringly dismisses as “the faux pause”.
Verdict: Truth value 0. Calling the vrai pause “the faux pause” is a faux pas.
Mann: “What would it mean if the actual equilibrium climate sensitivity were half a degree lower than previously thought? Would it change the risks presented by business-as-usual fossil-fuel burning? How quickly would the earth cross the critical threshold?”
Science: But what is the “critical threshold”? Mann fails to define it. Is there some value for global mean surface temperature that is the best of all temperatures in the best of all possible worlds? If so, Mann’s hypothesis can only be tested if he enlightens us on what that ideal temperature is. He does not do so.
Verdict: Truth value 0. In the absence of a clear and scientifically justified statement of an ideal temperature, plus a further justified statement that a given departure from that ideal temperature would be dangerous, there is no case for a “critical threshold”. Furthermore, there is at present little empirical basis for a global warming of more than 1 Cº over the coming century.
Mann: “Most scientists concur that two degrees C of warming above the temperature during preindustrial time would harm all sectors of civilization–food, water, health, land, national security, energy and economic prosperity.”
Science: No survey of scientists to determine whether they “concur” as to the 2 Cº above pre-industrial temperature that Mann considers on no evidence to be the “critical threshold” has been conducted. Even if such a survey had been conducted – and preferably conducted by someone less accident-prone than the absurd Cook and Nutticelli – that would tell us nothing about the scientific desirability or undesirability of such a “threshold”: for science is not done by consensus, though totalitarian politics is. And it was totalitarian politicians, not scientists, who determined the 2 Cº threshold, on no evidence, at one of the interminable paid holidays in exotic locations known as UN annual climate conferences.
Verdict: Truth value 0. There is no scientific basis for the 2 Cº threshold, and Mann does not really attempt to offer one.
Mann: “Although climate models have critics, they reflect our best ability to describe how the climate system works, based on physics, chemistry and biology.”
Science: Mann’s own model that contrived the Hokey-Stick graph shows what happens when a model is constructed with insufficient attention to considerations that might point against the modeler’s personal preconceptions. The model used a highly selective subset of the source data; it excluded hundreds of papers demonstrating the inconvenient truth that the medieval warm period existed; it gave almost 400 times as much weighting to datasets showing the medieval warm period as it did to datasets that did not show it; and the algorithm that drew the graph would draw Hokey Sticks even if random red noise rather than the real data were used. The problem with any model of a sufficiently complex object is that there are too many tunable parameters, so that the modeler can – perhaps unconsciously – predetermine the output. To make matters worse, intercomparison tends to institutionalize errors throughout all the models. Besides, since the climate behaves as a chaotic object, modeling its evolution beyond around ten days ahead is not possible. We can say (and without using a model) that if we add plant-food to the air it will be warmer than if we had not done so; but (with or without a model) we cannot say with any reliability how much warming is to be expected.
Verdict: Truth value 0. Models have their uses, but as predictors of long-term temperature trends they are, for well-understood reasons, valueless.
Mann: “And they [the models] have a proved track record: for example, the actual warming in recent years was accurately predicted by the models decades ago.”
Science: Here is Hansen’s 1988 prediction of how much global warming should have occurred since then, according to his “GISS Model E”.
The trend shown by Hansen is +0.5 Cº per decade. The outturn since 1988, however, was just 0.15 Cº per decade, less than one-third of what Hansen described as his “business-as-usual” case. Models’ projections have been consistently exaggerated:
Verdict: Truth value 0. The models have consistently and considerably exaggerated the warming of recent decades. The next graph shows a series of central projections, compared with the observed outturn to date, extrapolated to 2050. This is not a picture of successful climate prediction. It is on the basis of these failed predictions that almost the entire case for alarm about the climate is unsoundly founded.
Mann: “I ran the model again and again, for ECS values ranging from the IPCC’s lower bound (1.5 Cº) to its upper bound (4.5 Cº). The curves for an equilibrium climate sensitivity of 2.5 Cº and 3 Cº fit the instrument readings most closely. The curves for a substantially lower (1.5 Cº) and higher (4.5 Cº) sensitivity did not fit the recent instrumental record at all, reinforcing the notion that they are not realistic.”
Science: Legates et al. (2013) established that only 0.3% of abstracts of 11,944 climate science papers published in the 21 years 1991-2011 explicitly stated that we are responsible for more than half of the 0.69 Cº global warming of recent decades. Suppose that 0.33 Cº was our contribution to global warming since 1950, and that CO2 concentration in that year was 305 ppmv and is now 398 ppmv. Then the radiative forcing from CO2 that contributed to that warming was 5.35 ln(398/305) = 1.42 Watts per square meter. Assuming that the IPCC’s central estimate of 713 ppmv CO2 by 2100 is accurate, the CO2 forcing from now to 2100 will be 5.35 ln(713/398), or 3.12 W m⁻². On the assumption that the ratio of CO2 forcing to that from other greenhouse gases will remain broadly constant, and that temperature feedbacks will have exercised 44/31 of the multiplying effect seen to date, the manmade warming to be expected by 2100 on the basis of the 0.33 Cº warming since 1950 will be 3.12/1.42 × 0.33 × 44/31 = 1 Cº. Broadly speaking, the IPCC expects this century’s warming to be equivalent to that from a doubling of CO2 concentration. In that event, 1 Cº is the warming we should expect from a CO2 doubling, and the only sense in which the 1.5 Cº lower bound of the IPCC’s interval of climate-sensitivity estimates is “unrealistic” is that it is probably somewhat too high.
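That chain of arithmetic can be reproduced directly. A minimal sketch, using only the concentrations, the 0.33 Cº contribution, and the 44/31 feedback multiplier assumed in the paragraph above:

```python
import math

def co2_forcing(c_new, c_old):
    # Logarithmic CO2 forcing approximation used throughout the piece, W/m^2
    return 5.35 * math.log(c_new / c_old)

f_1950_to_now = co2_forcing(398, 305)  # about 1.42 W/m^2
f_now_to_2100 = co2_forcing(713, 398)  # about 3.12 W/m^2
manmade_since_1950 = 0.33              # C, the assumed human contribution
feedback_scaling = 44 / 31             # feedback multiplier assumed in the text

warming_by_2100 = (f_now_to_2100 / f_1950_to_now
                   * manmade_since_1950 * feedback_scaling)
print(round(warming_by_2100, 2))       # about 1 C, as the paragraph concludes
```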
Verdict: Truth value 0. Here, as elsewhere, Mann appears unaware of the actual evolution of global temperatures during the post-1950 era, when we might in theory have exercised some warming influence. There has been less warming than They thought, and – on the basis of the scientific consensus established by Legates et al. – less of the observed warming is anthropogenic than They thought.
Mann: “To my wonder, I found that for an ECS of 3 Cº, our planet would cross the dangerous warming threshold of two degrees C in 2036, only 22 years from now. When I considered the lower ECS value of 2.5 Cº, the world would cross the threshold in 2046, just 10 years later.”
Science: Mann here perpetrates one of the fundamental errors of the climate-extremists. He assumes that the prediction of a climate model is subject to so little uncertainty that it constitutes a fact. This statement is one of a series by true-believers saying we have only x years to Save The Planet by shutting down the West. Ex-Prince Chazza has done it. Al Gore has done it. The UN did it big-time by saying in 2005 that there would be 50 million climate refugees by 2010. There weren’t.
Verdict: Truth value 0. Extreme warming that has been predicted does not become a fact unless it comes to pass. If you want my prediction, it won’t. And that’s a fact.
Mann: “So even if we accept a lower equilibrium climate sensitivity value, it hardly signals the end of global warming or even a pause. Instead it simply buys us a little bit of time – potentially valuable time – to prevent our planet from crossing the threshold.”
Science: No one is suggesting that the Pause will continue indefinitely. Theory as well as observation suggests otherwise. However, a Pause that had not occurred could not “buy us a little bit of time”. Mann’s mention of “buying us a little bit of time” is, therefore, an admission that the Pause is real, as all of the temperature datasets show.
Verdict: Truth value 0. A low enough climate sensitivity will allow temperatures to remain stable for decades at a time, during periods when natural factors tending towards global cooling temporarily overwhelm the warming that would otherwise occur.
Mann: “These findings have implications for what we all must do to prevent disaster.”
Science: Warming of 3 Cº would not be a “disaster”. Even the bed-wetting Stern Review of 2006 concluded that warming of 3 Cº over the 21st century would cost as little as 0-3% of global GDP. But at present we are heading for more like 1 Cº. And even the IPCC has concluded that less than 2 Cº warming compared with 1750, which works out at 1.1 Cº compared with today, will be net-beneficial.
Verdict: Truth value 0. There is no rational basis for any suggestion that our adding CO2 to the atmosphere at the predicted rate, reaching 713 ppmv by 2100, will be anything other than beneficial.
Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm. Ironically, if the world burns significantly less coal, that would lessen CO2 emissions but also reduce aerosols in the atmosphere that block the sun (such as sulfate particulates), so we would have to limit CO2 to below roughly 405 ppm. We are well on our way to surpassing these limits.”
Science: What we are concerned with is not CO2 simpliciter, but CO2-equivalent. CO2 itself contributes only 70% of the anthropogenic enhancement of the greenhouse effect. The (admittedly arbitrary) target of 450 ppmv CO2-equivalent is thus a target of only 315 ppmv CO2 – the concentration that prevailed in 1958. Mann’s suggested target of 405 ppmv CO2e would represent just 284 ppmv CO2. And that would fling us back to the pre-industrial CO2 concentration.
Verdict: Truth value 0. We are not “well on our way to surpassing these limits”: we passed them as soon as the industrial revolution began. The current CO2-equivalent concentration of 398 ppmv already exceeds the pre-industrial 284 ppmv by 40%, yet the world has warmed by only 0.9 Cº since then, and our contribution to that warming may well be 0.33 Cº or less.
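The CO2-equivalent arithmetic used here and above can be checked in a few lines, taking the essay's stated assumption that CO2 itself supplies 70% of the anthropogenic greenhouse enhancement (a minimal sketch, not an endorsement of the figures):

```python
# CO2 implied by a CO2-equivalent target, on the essay's assumption
# that CO2 contributes 70% of the anthropogenic enhancement.
CO2_FRACTION = 0.70

def co2_from_co2e(co2e_ppmv):
    """CO2 concentration (ppmv) implied by a CO2-equivalent target."""
    return CO2_FRACTION * co2e_ppmv

print(co2_from_co2e(450))  # ~315 ppmv, the 1958 concentration
print(co2_from_co2e(405))  # ~283.5 ppmv, roughly the pre-industrial 284
print(co2_from_co2e(350))  # ~245 ppmv, below the pre-industrial level
```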
Mann: “Some climate scientists, including James E. Hansen, former head of the NASA Goddard Institute for Space Studies, say we must also consider slower feedbacks such as changes in the continental ice sheets.”
Science: The IPCC already takes changes in ice-sheets into account. It says that in the absence of “dynamical ice flow” that cannot happen, the Greenland ice sheet would not disappear “for millennia”. And there is no prospect of losing ice from the vast ice sheet of East Antarctica, which is at too high an altitude or latitude to melt. Even the West Antarctic Ice Sheet, which has lost some ice, is proving more robust than the usual suspects had thought. Sea level, according to the GRACE gravitational anomaly satellites, has been falling (Peltier et al., 2009). During the eight years of ENVISAT’s operation, from 2004-2012, sea level rose at a scary 1.3 inches per century.
Verdict: Truth value 0. There is no reason to suppose the major ice sheets will disintegrate on timescales of less than millennia.
Mann: “Hansen and others maintain we need to get back down to the lower level of CO2 that existed during the mid-20th century–about 350 ppm.”
Science: 350 ppmv is, again, CO2-equivalent. That implies 245 ppmv, a value well below the pre-industrial 280 ppmv. At 180 ppmv, plants and trees become dangerously starved of CO2. Flinging CO2 concentration back to that value would reduce CO2 fertilization and hence crop yields drastically, and would do major damage to the rain-forests.
Mann: “In the Arctic, loss of sea ice and thawing permafrost are wreaking havoc on indigenous peoples and ecosystems.”
Science: The Arctic has not lost as much sea ice as had been thought. In the 1920s and 1930s there was probably less sea ice in the Arctic than there is today. The decline in sea ice is small in proportion to the seasonal variability, as the graph from the University of Illinois shows. And the part of the satellite record that is usually cited began in 1979. An earlier record, starting in 1973, showed a rapid growth in sea ice until it reached its peak extent in 1979. Indigenous peoples, like the polar bears, prefer warmer to colder weather. And almost all ecosystems also prefer warmer to colder weather.
Verdict: Truth value 0. The decline in sea ice in the Arctic is far more of a benefit than a loss.
Mann: “In low-lying island nations, land and freshwater are disappearing because of rising sea levels and erosion.”
Science: On the contrary, detailed studies show not only that low-lying island nations are not sinking beneath the waves, but that their territory is in many cases expanding. The reason is that corals grow to meet the light. As sea level rises, the corals grow and there is no net loss of territory. Also, sea level rises less in mid-ocean, where the islands are, than near the continental coasts. And sea level has scarcely been rising anyway. According to Grinsted et al., it was 8 inches higher in the medieval warm period than it is today.
Verdict: Truth value 0. If the world were once again to become as warm as it was in the Middle Ages, perhaps sea level would rise by about 8 inches. And that is all.
Mann: “Let us hope that a lower climate sensitivity of 2.5 degrees C turns out to be correct. If so, it offers cautious optimism. It provides encouragement that we can avert irreparable harm to our planet. That is, if–and only if–we accept the urgency of making a transition away from our reliance on fossil fuels for energy.”
Science: Mann is here suggesting that a climate sensitivity of 3 Cº would be disastrous, but that 2.5 Cº would not. The notion that as little as 0.5 Cº would make all the difference is almost as preposterous as the notion that climate sensitivity will prove to be as high as 2.5 Cº. As we have seen, on the assumption that less than half of the warming since 1950 was manmade, climate sensitivity could be as low as 1 Cº – a value that is increasingly finding support in the peer-reviewed literature.
Verdict: Truth value 0. The central error made by Mann and his ilk lies in their assumption that models’ predictions are as much a fact as observed reality. However, observed climate change has proven far less exciting in reality than the previous predictions of Mann and others had led us to expect. The multiple falsehoods and absurdities in his Scientific American article were made possible only by the sullen suppression by the Press of just how little of what has been predicted is happening in the real climate. In how many legacy news media have you seen the Pause reported at all? But it will not be possible for the mainstream organs of propaganda to conceal from their audiences forever the inconvenient truth that even the most recent, and much reduced, projections of the silly climate models are proving to be egregious exaggerations.
![earth-will-cross-the-climate-danger-threshold-by-2036_large[1]](http://wattsupwiththat.files.wordpress.com/2014/03/earth-will-cross-the-climate-danger-threshold-by-2036_large1.jpg?resize=640%2C423&quality=83)

![land-and-ocean-other-results-1950-large[1]](http://wattsupwiththat.files.wordpress.com/2014/03/land-and-ocean-other-results-1950-large1.png?resize=640%2C492&quality=75)

GregB:
Your post at March 24, 2014 at 10:56 pm is addressed to davidmhoffer and says in total
Allow me to help you to grasp why you are failing to understand what davidmhoffer is explaining to you.
All apples are fruit. Not all fruit are apples.
Similarly,
All heat is energy. Not all energy is heat.
There is no “photon of heat energy”.
A photon is a quantum of electromagnetic energy. It has an energy which is related to its wavelength.
The heat of a gas is kinetic energy of its molecules. It is expressed as the temperature of the gas which is a function of the average (i.e. RMS) speed of the gas molecules.
A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or rotational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.
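The wavelength–energy relation invoked in this comment can be made concrete with a short sketch (the 15 µm wavelength is an assumed example, chosen because it is CO2's main bending-mode absorption band; it is not a figure the commenter gives):

```python
# Photon energy E = h*c/lambda: a photon's energy is fixed by its
# wavelength, as the comment above states.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

def photon_energy_joules(wavelength_m):
    """Energy of a single photon of the given wavelength (metres)."""
    return h * c / wavelength_m

print(photon_energy_joules(15e-6))  # ~1.32e-20 J for a 15 micron IR photon
```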
In other words, you are plain wrong when you write
The mathematical and physical facts are that the molecule does not “cool” by any amount when it gives up vibrational or rotational energy because they are not heat.
I hope that helps.
Richard
Richard C. regarding:
“A gas molecule does not gain kinetic energy – so does NOT warm – when it absorbs a photon. The photon provides the molecule with vibrational or translational energy. And the molecule does NOT “cool” when it loses that vibrational or rotational energy because its kinetic energy is not affected.”
I’m just a layman who has completed some light reading on physics. I am confused by your explanation, which seems to be partially correct. Is it not true that photon bombardment/absorption of sufficient intensity will induce electrons to jump orbit at discrete intervals/quanta? Isn’t vibration of material kinetic energy? Is the vibration of boiling water greater or less than that of non-boiling water?
“the IPCC published a graph that my co-authors and I devised….”
That’s an interesting word. It smacks of a Freudian slip. Scientists can certainly devise theories, there’s nothing wrong with that. But can you ‘devise’ data? Can you ‘devise’ a graph that shows data?
According to Mann, apparently you can. Of course, anyone can devise – i.e. make up – anything they like. But if it’s passed off as real data in a scientific paper, then it’s fraudulent, pure and simple.
Perhaps climate scientists should make greater use of empirical scientific data rather than ‘devising’ stuff.
Chris
_Jim says:
March 24, 2014 at 10:07 pm
“re: DirkH says March 24, 2014 at 11:08 am
… Rather, the highest bit of a register serves as sign in 2′s-complement arithmetic.
One sees this in print, in various places, and if one presses that into practice one finds oneself at odds with the practical implementation of numbers in Two’s Complement form.
If, instead, one treats the MSB as the minimum value representable in the memory allocation ‘chunk’ (e.g. -128 for a byte, -32768 for a 16-bit word, etc.) or *register* one is working with, then ‘things’ work out in a much more straight-forward fashion. For human interpretation it is a simple matter of ‘summing’ the bit representations together to arrive at a value.”
These are just two ways of looking at it, Jim. No difference on the binary function level. And no, I don’t find myself at odds with the practical implementation when doing 2’s complement arithmetic. It’s all implemented exactly as one would expect.
SciAm ceased being “scientific” (and “American”) 25 years ago.
rgb – if back radiation actually existed and could be directly measured then we could harness it. Given as I have no “back radiation” powered flashlight I suspect it is likely to not be a physical force and more likely only an artefact of misguided thinking.
Given as H2O dominates within the entire climate zone (up to an altitude where there is no weather) explain again how CO2 absorbing IR (that doesn’t have the energy to increase the kinetic energy of the CO2 molecule – dipole moment change is the mechanism in CO2) – somehow “warms” the lower atmosphere via molecular collisions at an altitude of reduced molecular collisions? How does this top down mechanism work when the temperature profile is bottom up?
All this talk, Yet Co2 keeps going up, and temperature does not.
It really is quite simple.
These are just two ways of looking at it, Jim. No difference on the binary function level.
Not everything one reads on the internet is true; the above is just one more case.
There is a form of ‘binary’ notation where the MSB is used as the ‘sign’ bit, called “sign-magnitude” or “sign and magnitude” representation (such as was used on early IBM computers like the IBM 7090), but making this assumption in Two’s Complement notation is a case of messed-up thinking (and has screwed up MORE than one student’s understanding of how Two’s Complement works). Also, sign-magnitude form should not be confused with One’s Complement representation, which shares similarities with Two’s Complement form.
Take this 8-bit binary value: 1000 0000 as a number in Two’s Complement form … what is the value of this number in base 10 *IF* the MSB is simply taken to represent the “sign” bit as DirkH contends?
I proffer that DirkH IS thinking in terms of Sign-Magnitude, which results in this series of ‘numbers’ in binary form vs Two’s Complement form:
| Dec | Binary (Two’s Complement) | Binary (Sign-Magnitude) |
|------|---------------------------|-------------------------|
| 127 | 0111 1111 | 0111 1111 |
| +2 | 0000 0010 | 0000 0010 |
| +1 | 0000 0001 | 0000 0001 |
| 0 | 0000 0000 | 0000 0000 (also possible in S-M: 1000 0000) |
| -1 | 1111 1111 | 1000 0001 |
| -2 | 1111 1110 | 1000 0010 |
| -127 | 1000 0001 | 1111 1111 |
| -128 | 1000 0000 | No representation possible |
As one can see, the MSB represents a value of negative 128 (-128) rather than a simple sign bit. Simply calling the MSB in a Two’s Complement number a sign bit leads to confusion, especially for those (like students) new to the ‘handling’ and representation of numbers in the various binary forms.
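In code terms, the distinction drawn above can be sketched as follows (Python is used purely for illustration; the fixed 8-bit width is an assumption):

```python
# Interpreting the same 8-bit pattern under two's-complement vs
# sign-magnitude rules. In two's complement the MSB carries a weight
# of -128; in sign-magnitude it is a bare sign flag.
def twos_complement(bits):
    """Value of an 8-bit pattern with the MSB weighted -128."""
    value = int(bits, 2)
    return value - 256 if value >= 128 else value

def sign_magnitude(bits):
    """Value of an 8-bit pattern with the MSB as a sign flag."""
    magnitude = int(bits[1:], 2)
    return -magnitude if bits[0] == "1" else magnitude

print(twos_complement("10000000"))  # -128: the MSB contributes its full weight
print(sign_magnitude("10000000"))   # 0: "negative zero" collapses to 0
print(twos_complement("11111111"))  # -1
print(sign_magnitude("11111111"))   # -127
```

Note that sign-magnitude cannot express -128 in eight bits, exactly as the table above shows.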
.
Rob Ricket:
Thank you for your request for clarification at March 25, 2014 at 4:31 am where you write
OK. What I wrote is completely correct, but it was not intended to be (and could not be) a complete explanation of all the issues.
Please note that my post at March 25, 2014 at 3:19 am said and tried to explain the importance of
Your question about boiling water is not relevant to that explanation: water is a liquid – not a gas – and is only boiling in trivial amounts (e.g. at hot springs) around the globe.
All atoms absorb and release photons of appropriate energy. As you say, they do this by inducing electron shell jumps within the atom. But these atomic effects are trivially small in the atmosphere. Indeed, they are so very, very small compared to the molecular effects of greenhouse gas (GHG) molecules that these atomic effects are usually ignored.
GHG molecules such as H2O and CO2 can store energy by vibrating and rotating parts of their structures.
A CO2 molecule
O-C-O
can vibrate by changing the ‘angle’ between its oxygen atoms, and this vibrational energy is internal to the molecule so does not alter its speed relative to other molecules in the air. Hence, this vibrational energy has no direct effect on the temperature of the gas.
Oxygen (O2) and nitrogen (N2) molecules are not GHG molecules because they cannot store energy by vibrating and rotating parts of their structures.
O-O and N-N each has no ‘angle’ to vibrate.
I hope that provides the needed clarification.
Richard
Thank you Richard for your comments, your explanation makes things clearer for me. So what you are saying is that a GHG will absorb and release IR energy without any heat exchange because the kinetic energy of the GHG is not altered. However, if this GHG collides with N2 etc. (non-GHG), the energy imparted onto the non-GHG molecule will affect its kinetic energy and thus be seen as heat.
Is this explanation close to reality?
Cheers
Crakar24
@bw
I’ve been asking the same question for years. Why do satellites and SAT diverge from 1980 forward? The greenhouse effect hypothesis specifically states the troposphere should be warming at a significantly higher rate than the surface because it is “trapping heat”, yet observations show just the opposite.
Where is the hot spot?
re: GregB says March 24, 2014 at 7:54 pm
Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”
I simply used an LSB ‘bit’ set in bytes assigned as flags while doing assembler on a Z80, with a value initialized quickly in “A” (the Accumulator) as:
XOR A . . ; Clears reg A (only 4 clk “T” states and a 1 byte op code)
INC A . . . ; Sets LSBit in reg A (only 4 clk “T” states and a 1 byte op code)
LD mem_loc_flg,A . . ; Put init’d value to memory location for flag
Intrinsic in many operations with Reg A (the ‘Accumulator’) such as loads from memory are comparisons with (or against) zero, checks of the so-called “sign” bit (MSB) position resulting in various flag bits (in the Flag register) being set as a result of these intrinsic compares; subsequent checks of those flags (e.g. “JNZ”, “JP cc,nn” etc) can then effect ‘program flow’ (branching on various conditions). Keeping these intrinsic compares in mind is paramount when trying to write tight, fast-executing code … these kinds of features must have driven the first compiler-writers nuts too! One can also see why some compilers might result in tighter or faster-executing code as well, depending how much/how many features of the ‘native’ CPU the compiler is ‘aware’ of.
.
Does anyone take Mann seriously any more?
Nik says:
March 25, 2014 at 12:08 am
“What if…. we had a magic wand to reduce CO2 and attain an ideal global temperature and this wand was given to the warmistas? Do they have such “ideal” levels or would they fight each other tooth and nail to establish, each his own, preconceived notions of what is “best for the planet”?”
They do not answer when asked for what the ideal temperature would be. Because they don’t care. We just asked one:
http://notrickszone.com/2014/03/22/sks-hiroshima-bomb-heat-clock-fraud-claim-2-1-billion-climate-ground-zeros-yet-cant-find-a-single-one-of-them/#comment-926047
“Not even wrong”…
fadingfool says:
March 25, 2014 at 6:06 am
rgb – if back radiation actually existed and could be directly measured then we could harness it. Given as I have no “back radiation” powered flashlight
>>>>>>>>>>>>>>>>>
It can be directly measured, has been directly measured, I’ve linked upthread to three articles that include not only the explanation but the actual measurements at various places on earth. Your back radiation powered flashlight doesn’t exist because it isn’t cost effective to build one. I expect that you don’t have a clock powered by a potato either, which has nothing to do with it being possible to build one.
fadingfool;
somehow “warms” the lower atmosphere via molecular collisions at an altitude of reduced molecular collisions?
>>>>>>>>>>>>
Some photons that would otherwise have escaped to space are absorbed by CO2 molecules and then emitted again in a random direction. The direction being random, some portion of them are emitted downward from whence they came. Where in that explanation do you see anything about molecular collisions? Again, read the material and the explanations for what they say rather than what you think they say.
Richard,
The civil discourse is appreciated, as is your assertion regarding the difference in absorption rates amongst liquids and gases. Certainly, your statement below runs afoul of Charles’ Law, in as much as atmospheric mass creates a constant pressure. In relative terms, the distances at which electrons orbit are humongous (scientific term) relative to the proton. There are collisions (hence movement) proportionate to temperature.
“can vibrate by changing the ‘angle’ between its oxygen molecules, and this vibration energy is internal to the molecule so does not alter its speed relative to other molecules in the air. Hence, this vibrational energy has no direct effect on the temperature of the gas.”
Looks like a man making global warming there
Rob Ricket:
I wrote
At March 25, 2014 at 8:20 am you have replied to that saying
Say what!?
Charles’ Law is a special case of the ideal gas law. It applies to ideal gases held at constant pressure, allowing only the volume (V) and temperature (T) to change, and can be stated as
V1/T1 = V2/T2
I fail to see how my correct, true and accurate statements run “afoul of Charles’ Law”. Indeed, I fail to see any relevance of Charles’ Law, which applies to alterations of volume and temperature, but molecular absorption of a photon does not of itself alter temperature and/or pressure.
Richard
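For readers following this exchange, Charles’ Law can be sketched numerically (illustrative values only; temperatures must be in kelvin):

```python
# Charles' Law: at constant pressure, V/T is constant for an ideal gas,
# so V2 = V1 * T2/T1.
def charles_v2(v1, t1, t2):
    """Volume after an isobaric temperature change (temperatures in K)."""
    return v1 * t2 / t1

v2 = charles_v2(1.0, 273.15, 373.15)  # warm 1 L of gas from 0 degC to 100 degC
print(f"{v2:.4f} L")  # ~1.3661 L
```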
Dmhoffer: – no, but I did have a potato-powered radio (all these Hiroshimas per second must be good for something), yet I have yet to see downward IR emissions able to do any work, and if it can’t do work it isn’t power.
Again, even if the IR photon is re-emitted downward (this is not AGW theory, by the way, as current AGW theory has the energy transmitted to non-GHG gases via collisions), this would be absorbed by either CO2 or H2O in a temperature-neutral dipole moment change.
I have argued this before and have yet to get a consistent answer – given the triple state of H2O in the thermosphere where does the temperature increase come from and hence why would anyone expect to find the increase in energy in the temperature data (rather than the thermosphere extent and H2O triple point ratio)?
Without wanting to state the obvious, sometimes the simplest picture needs to be spelled out in single sentences to be isolated.
Looking at his white line, he has gone on the record revising existing temperature with no scientific basis.
That means Michael Mann, with no other defence, has altered the known figures without any known justification and, like Oscar Pistorius, unless he can show a legal defence, is now no better than any of the mafia. That’s it: he has openly cheated and, unless he can prove otherwise, because he put the fake figures out, he is a cheat. No one else known on earth has ever produced such figures, he can provide no source, and even if he could, does he really believe he can then go and prove ALL THE OTHER OFFICIAL SETS ARE WRONG AS WELL?
Unless he both provides a defence for his alteration and then demonstrates why all the world’s official sets were never reading high enough MICHAEL MANN HAS MADE UP HIS OWN DATA, AND AS A RESULT HAS LEFT THE REALM OF SCIENCE. That is the only possible way (unless he can both provide a defence AND demonstrate all the other data read too low) the graph can be interpreted. The chocolate is gone and is all over his face and hands.
Would it be a worthwhile experiment to compare heating two buckets (white plastic) of cold water with an IR lamp from the top – one with only water and the other with carbonated water?
Assuming the plain water one heats enough to be detected, won’t the one with the CO2 heat up more, proving that the CO2 did not re-radiate the IR back out – it only helped heat the water itself?
My conundrum is that if the CO2 did “re-radiate” IR back out, then an even higher concentration of it should result in even more “re-radiation” back out resulting in even less heat energy going into the mixture and therefore less warming?
Mike M:
You ask at March 25, 2014 at 9:14 am
It would be pointless. The radiative properties of CO2 are very different when the CO2 is free molecules in air and dissolved molecules in water.
Richard
Richard C.,
Of the three forms of heat transfer, convection is germane to our discussion, in as much as, heat is transmitted through a gas…in the case of a typical kitchen range, the transmission gas is air. In an electric range, photons (in the infrared band) are emitted from the heating element. The photons induce the electrons to jump (mysteriously and instantaneously move) to outer orbits. Inside this fixed space the propensity for molecular collisions must increase with temperature as the electron orbit of each molecule expands.
I guess I’m having difficulty coming to terms with your assertion that increases in molecular vibration occur independently of a congruent increase in molecular collisions.
“I fail to see how my correct, true and accurate statements run “afoul of Charles Law”. Indeed, I fail to see any relevance of Charles’ Law which applies to alterations to temperature and pressure but molecular absorbtion does not alter temperature and/or pressure.”
Bottom line: they are going to keep moving the goalposts until they believe they have scored.