Hide the decline deja vu? Mann's 'little white line' as 'False Hope' may actually be false hype

Foreword by Anthony Watts 

An essay by Monckton of Brenchley follows, but first I wanted to draw attention to this graphic from Dr. Mann’s recent Scientific American article. In the infamous “hide the decline” episode revealed by Climategate, concerning the modern-day ending portion of the “hockey stick”, Mann has been accused of using “Mike’s Nature Trick” to hide the decline in modern proxy temperatures by splicing on the instrumental surface record. In this case, the little white line in his SciAm graphic shows how “the pause” is labeled a “faux pause” (a little play on words), and how the pause is elevated above past surface temperatures.

earth-will-cross-the-climate-danger-threshold-by-2036_large[1]

Source: http://www.scientificamerican.com/sciam/assets/Image/articles/earth-will-cross-the-climate-danger-threshold-by-2036_large.jpg

Zoom of section of SciAm’s graph from Dr. Mann. The 1°C line was added for reference.

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since there doesn’t seem to be any citation for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, and showing the current temperature equal to 1998, which doesn’t make any sense.

So, over the weekend I asked Willis Eschenbach to use his “graph digitizer” tool (which he has used before) to turn Mann’s little white line into numerical data, and he happily obliged.

Here is the result when Mann’s little white line is compared and matched to two well known surface temperature anomaly datasets:

mann_falsehope_vs_GISS-HAD4

What is most interesting is that Mann’s “white line” shows a notable difference from HadCRUT4 and GISS LOTI during the “pause”. Why would our modern era of “the pause” be the only place where a significant divergence exists? It’s like “hide the decline” deja vu.

The digitized data for Mann’s white line is available here: Manns_white_line_digitized (.xlsx)

As of this writing, we don’t know what dataset was used to create Mann’s white line of surface temperature anomaly, or the base period used. On the SciAm graphic it simply says “Source: Michael E. Mann” on the lower right.

It isn’t the GISS land-ocean temperature index (LOTI), which starts in 1880. And it doesn’t appear to be HadCRUT4 either. Maybe it is BEST, but without the data going back to 1750? That isn’t likely either, since BEST pretty much matches the other datasets, and while Mann’s graphic above peaks out above 1°C, none of those datasets hits higher than 0.7°C. What’s up with that?

land-and-ocean-other-results-1950-large[1]

Now compare that plot above to this portion of Dr. Mann’s SciAm plot, noting the recent period of surface temperature and the 1°C reference line, which I extended from the Y axis:

Manns_white_line_extended_1C

I’m reminded of Dr. Mann’s claims about climate skeptics in this video: http://www.linktv.org/video/9382/inside-the-climate-wars-a-conversation-with-michael-mann

At 4:20 in the video, Dr. Mann claims that US climate skeptics are part of the “greatest disinformation campaign ever run”. If his position is so strong and pure, why then do we see silly things like this graph, with an elevated ending of global surface temperature (in contrast to 5 other datasets) and not a single data-source citation given?

UPDATE: Mark B writes in comments:

Looking at the SciAm graphic (see zoom at right), something didn’t seem right, especially since there doesn’t seem to be any citation for which temperature dataset was used. And oddly, the graphic shows Mann’s little white line peaking significantly warmer than the 1998 super El Niño, and showing the current temperature equal to 1998, which doesn’t make any sense.

Explanation of graph including links to source code and data were given here: http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/

REPLY: Yes, I’ve seen that, but there is a discrepancy: the label on the image is “Historical Mean Annual Temperature” (white)

In http://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/ it is written:

Historical Simulations. The model was driven with estimated annual natural and anthropogenic forcing over the years A.D. 850 to 2012. Greenhouse radiative forcing was calculated using the approximation (ref. 8) FGHG = 5.35log(CO2e/280), where 280 parts per million (ppm) is the preindustrial CO2 level and CO2e is the “equivalent” anthropogenic CO2. We used the CO2 data from ref. 9, scaled to give CO2e values 20 percent larger than CO2 alone (for example, in 2009 CO2 was 380 ppm whereas CO2e was estimated at 455 ppm). Northern Hemisphere anthropogenic tropospheric aerosol forcing was not available for ref. 9 so was taken instead from ref. 2, with an increase in amplitude by 5 percent to accommodate a slightly larger indirect effect than in ref. 2, and a linear extrapolation of the original series (which ends in 1999) to extend through 2012.
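The logarithmic forcing approximation quoted above is straightforward to check numerically. A minimal sketch (the function name and rounding are mine, not part of the SciAm methodology):

```python
import math

def ghg_forcing(co2e_ppm, preindustrial_ppm=280.0):
    """Greenhouse radiative forcing in W/m^2, using the quoted
    approximation F_GHG = 5.35 * ln(CO2e / 280)."""
    return 5.35 * math.log(co2e_ppm / preindustrial_ppm)

# The quoted 2009 example: CO2 was 380 ppm, CO2e estimated at 455 ppm
# (CO2e scaled to be about 20 percent larger than CO2 alone).
print(round(ghg_forcing(455.0), 2))  # about 2.6 W/m^2
```

For reference, a straight doubling of CO2e (560 ppm against 280 ppm) gives 5.35 ln 2 ≈ 3.7 W/m², the familiar per-doubling figure.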

“Historical Mean Annual Temperature” is NOT the same as “Historical Simulations”. It looks to me like a bait and switch.

UPDATE2: Note the lead in text says “Global temperature rise…”

But in comments, Willis and Bill Illis have worked out that the white line represents only half the planet: the Northern Hemisphere. The white line is the HadCRUT NH value, not a global one.

Obviously we can’t take such statements as the lead in text saying “global” at face value. Imagine if a climate skeptic made a graph like this. We’d be excoriated.

What needs to be done is to create a graph showing what this would have looked like had Mann not cherry-picked the NH and then presented it under the text “Global temperature rise…”.

==============================================================

Mann’s ‘False Hope’ is false hype

By Christopher Monckton of Brenchley

The legendary Dr Walter Lewin, Professor of Physics at MIT, used to introduce his undergraduate courses by saying that every result in physics depended ultimately on measurement; that mass, distance, and time are its three fundamental physical units; that every observation in these and all of their derivative units is subject to measurement uncertainty; and that every result in physics, if only for this reason, is to some degree uncertain.

Contrast this instinctual humility of the true physicist with the unbecoming and, on the evidence to date, unjustifiable self-assurance of the surprisingly small band of enthusiasts who have sought to tell us there is a “climate crisis”. Not the least among these is Michael Mann, perpetrator of the Hokey-Stick graph that wrought the faux abolition of the medieval warm period.

In logic, every declarative statement is assigned a truth-value: 1 (or, in computer programs, –1) for true, 0 for false. Let us determine the truth-values of various assertions made by Mann, in a recent article entitled False Hope, published in the propaganda-sheet Scientific American.

Mann’s maunderings and meanderings will be in bold face, followed by what science actually says in Roman face, and the verdict: Truth-value 1, or truth-value 0?

Mann: “Global warming continues unabated.”

Science: Starting in Orwell’s Year (1984), and taking the mean of the five standard global temperature datasets since then, the rate of warming has changed as follows:

Jan 1979 – Aug 1990 (140 months): +0.080 Cº/decade.

Jan 1979 – Apr 2002 (280 months): +0.153 Cº/decade.

Jan 1979 – Dec 2013 (420 months): +0.145 Cº/decade.

The slowdown in the global warming rate has arisen from the long pause, now 13 years 2 months in length on the mean of all five datasets (assuming that HadCRUT4, which is yet to report, shows a result similar to the drop in global temperatures reported by the other four datasets).

Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.

Mann: “… during the past decade there was a slowing in the rate at which the earth’s average surface temperature had been increasing. The event is commonly referred to as “the pause,” but that is a misnomer: temperatures still rose, just not as fast as during the prior decade.”

Science: During the decade February 2005 to January 2014, on the mean of all five datasets, there was a warming of 0.01 Cº, statistically indistinguishable from zero.

Verdict: Truth value 0. Temperatures did not rise in any statistically significant sense, and the increase was within the measurement uncertainty in the datasets, so that we do not know there was any global warming at all over the decade. Here, Walter Lewin’s insistence on the importance of measurement uncertainty is well demonstrated.

Mann: “In response to the data, the IPCC in its September 2013 report lowered one aspect of its prediction for future warming.”

Science: In 2013 the IPCC reduced the lower bound of its 2007 equilibrium climate-sensitivity interval from 2 Cº to 1.5 Cº warming per CO2 doubling, the value that had prevailed in all previous Assessment Reports. It also reduced the entire interval of near-term projected warming from [0.4, 1.0] Cº to [0.3, 0.7] Cº. Furthermore, it abandoned its previous attempts at providing a central estimate of climate sensitivity.

Verdict: Truth value 0. The IPCC did not lower only “one aspect of its prediction for future warming” but several key aspects, abandoning the central prediction altogether.

Mann: “If the world keeps burning fossil fuels at the current rate, it will cross a threshold into environmental ruin by 2036. The ‘faux pause’ could buy the planet a few extra years beyond that date to reduce greenhouse gas emissions and avoid the crossover – but only a few.”

Science: Mann is asserting that on the basis of some “calculations” he says he has done, the world will face “environmental ruin” by 2036 or not long thereafter. However, Mann has failed to admit any uncertainty in his “calculations” and consequently in his predictions.

Verdict: Truth-value 0. Given the ever-growing discrepancy between prediction and observation in the models, and Mann’s own disastrous record in erroneously abolishing the medieval warm period by questionable statistical prestidigitation, the uncertainty in his predictions is very large, and a true scientist would have said so.

Mann: “The dramatic nature of global warming captured world attention in 2001, when the IPCC published a graph that my co-authors and I devised, which became known as the ‘hockey stick’. The shaft of the stick, horizontal and sloping gently downward from left to right, indicated only modest changes in Northern Hemisphere temperature for almost 1,000 years–as far back as our data went.”

Science: The Hokey-Stick graph falsely eradicated both the medieval warm period and the little ice age. At co2science.org, Dr. Craig Idso maintains a database of more than 1000 papers demonstrating by measurement (rather than modeling) that the medieval warm period was real, was near-global, and was at least as warm as the present just about everywhere. McIntyre & McKitrick showed the graph to be erroneous, based on multiple failures of good statistical practice. The medieval warm period and the little ice age are well attested in archaeology, history, architecture, and art. It was the blatant nonsense of the Hokey Stick that awoke many to the fact that a small academic clique was peddling unsound politics, not sound science.

Verdict: Truth value 0. Once again, Mann fails to refer to the uncertainties in his reconstructions, and to the many independent studies that have found his methods false and his conclusions erroneous. Here, he takes a self-congratulatory, nakedly partisan stance that is as far from representing true science as it is possible to go.

Mann: “The upturned blade of the stick, at the right, indicated an abrupt and unprecedented rise since the mid-1800s.”

Science: The graph, by confining the analysis to the northern hemisphere, overstated 20th-century global warming by half. Mann says the rise in global temperatures, shown on the graph as 1.1 Cº over the 20th century, is “unprecedented”. However, the Central England Temperature Record, the world’s oldest, showed a rise of 0.9 Cº in the century from 1663 to 1762, almost entirely preceding the industrial revolution, compared with an observed rate of just 0.7 Cº over the 20th century. The CETR is a good proxy for global temperature change. In the 120 years to December 2013 it showed a warming rate within 0.01 Cº of the warming rate taken as the mean of the three global terrestrial datasets.

Verdict: Truth value 0. The warming of the 20th century was less than the warming for the late 17th to the late 18th centuries.

clip_image002

Mann: “The graph became a lightning rod in the climate change debate, and I, as a result, reluctantly became a public figure.”

Science: For “lightning-rod” read “laughing-stock”. For “reluctantly” read “enthusiastically”. For “public figure” read “vain and pompous charlatan who put the ‘Ass’ in ‘Assessment Report’”.

Verdict: Pass the sick-bucket, Alice.

Mann: “In its September 2013 report, the IPCC extended the stick back in time, concluding that the recent warming was likely unprecedented for at least 1400 years.”

Science: The IPCC is here at odds with the published scientific literature. In my expert review of the pre-final draft of the Fifth Assessment Report, I sent the IPCC a list of 450 papers in the reviewed literature that demonstrated the reality of the warm period. The IPCC studiously ignored it. Almost all of the 450 papers are unreferenced in the IPCC’s allegedly comprehensive review of the literature. I conducted a separate test using the IPCC’s own methods, by taking a reconstruction of sea-level change over the past 1000 years, from Grinsted et al. (2009), and comparing it with the schematic in the IPCC’s 1990 First Assessment Report showing the existence and prominence of both the medieval warm period and the little ice age. The two graphs are remarkably similar, indicating the possibility that the sea-level rise in the Middle Ages was caused by the warmer weather then, and that the fall in the Little Ice Age was caused by cooler weather. The sea-level reconstruction conspicuously does not follow a Hokey-Stick shape.

clip_image004

Verdict: Truth value 0. The IPCC has misrepresented the literature on this as on other aspects of climate science. There are of course uncertainties in any 1000-year reconstruction, but if Grinsted et al. have it right then perhaps Mann would care to explain how it was that sea level rose and fell by as much as 8 inches either side of today’s rather average value if there was no global warming or cooling to cause the change?

Mann: “Equilibrium climate sensitivity is shorthand for the amount of warming expected, given a particular fossil-fuel emissions scenario.”

Science: Equilibrium climate sensitivity is a measure of the global warming to be expected in 1000-3000 years’ time in response to a doubling of CO2 concentration, regardless of how that doubling came about. It has nothing to do with fossil-fuel emissions scenarios.

Verdict: Truth value 0. Mann may well be genuinely ignorant here (as elsewhere).

Mann: “Because the nature of these feedback factors is uncertain, the IPCC provides a range for ECS, rather than a single number. In the September report … the IPCC had lowered the bottom end of the range. … The IPCC based the lowered bound on one narrow line of evidence: the slowing of surface warming during the past decade – yes, the faux pause.”

Science: For well over a decade there has been no global warming at all. The pause is not faux, it is real, as Railroad Engineer Pachauri, the IPCC’s joke choice for climate-science chairman, has publicly admitted. And the absence of any global warming for up to a quarter of a century is not “one narrow line of evidence”: it is the heart of the entire debate. The warming that was predicted has not happened.

Verdict: Truth value 0. Mann is here at odds with the IPCC, which – for once – paid heed to the wisdom of its expert reviewers and explicitly abandoned the models, such as that of Mann, which have been consistent only in their relentless exaggeration of the global warming rate.

Mann: “Many climate scientists – myself included – think that a single decade is too brief to accurately measure global warming and that the IPCC was unduly influenced by this one, short-term number.”

Science: Overlooking the split infinitive, the IPCC was not “unduly influenced”: it was, at last, taking more account of evidence from the real world than of fictitious predictions from the vast but inept computer models that were the foundation of the climate scare. Nor was the IPCC depending upon “one short-term number”.

James Hansen of NASA projected 0.5 C°/decade global warming as his “business-as-usual” case in testimony before Congress in 1988. The IPCC’s 1990 First Assessment Report took Hansen’s 0.5 C°/decade as its upper bound. It projected 0.35 C°/decade as its mid-range estimate, and 0.3 C°/decade as its best estimate.

The pre-final draft of the 2013 Fifth Assessment Report projected 0.23 C°/decade as its mid-range estimate, but the published version reduced this value to just 0.13 C°/decade – little more than a quarter of Hansen’s original estimate of a quarter of a century previously.

Observed outturn has been 0.08 Cº/decade since 1901, 0.12 C°/decade since 1950, 0.14 C°/decade since 1990, and zero since the late 1990s.

Three-quarters of the “climate crisis” predicted just 24 years ago has not come to pass. The Fifth Assessment Report bases its near-term projections on a start-date of 2005. The visible divergence of the predicted and observed trends since then is remarkable.

clip_image006

It is still more remarkable how seldom in the scientific journals the growing discrepancy between prediction and observation is presented or discussed.

Verdict: Truth value 0. Step by inexorable step, the IPCC is being driven to abandon one extremist prediction after another, as real-world observation continues to fall a very long way short of what it had been predicting.

Mann: “The accumulated effect of volcanic eruptions during the past decade, including the Icelandic volcano with the impossible name, Eyjafjallajökull, may have had a greater cooling effect on the earth’s surface than has been accounted for in most climate model simulations. There was also a slight but measurable decrease in the sun’s output that was not taken into account in the IPCC’s simulations.”

Science: So the models failed to make proper allowance for, still less to predict, what actually happened in the real world.

Verdict: Truth value 0. Eyjafjallajökull caused much disruption, delaying me in the United States for a week (it’s an ill wind …), but it was a comparatively minor volcanic eruption whose signature in the temperature record cannot be readily distinguished from the La Niña cooling following the El Niño at the beginning of 2010. The discrepancy between the models’ predictions and observed reality can no longer plausibly be dismissed on grounds such as these, and the IPCC knows it.

Mann: “In the latter half of the decade, La Niña conditions persisted in the eastern and central tropical Pacific, keeping global surface temperatures about 0.1 degree C colder than average …”

Science: There were La Niña (cooling) events in 1979, 1983, 1985, 1989, 1993, 1999, 2004, and 2008 – the last being the only La Niña in the second half of the noughties. There were, however, two El Niño (warming) events: in 2007 and 2010.

Verdict: Truth value 0. There is very little basis in the observed record for what Mann says. He is looking for a pretext – any pretext – rather than facing the fact that the models have been programmed to exaggerate future global warming.

Mann: “Finally, one recent study suggests that incomplete sampling of Arctic temperatures led to underestimation of how much the globe actually warmed.”

Science: And that “study” has been debunked. The numerous attempts by meteorological agencies around the world to depress temperatures in the early 20th century to make the centennial warming rate seem larger than it is have far outweighed any failure to measure temperature change in one tiny region of the planet.

Verdict: Truth value 0. Increasingly, as the science collapses, the likes of Mann will resort in desperation to single studies, usually written by one or another of the remarkably small clique of bad scientists who have been driving this silly scare. Meanwhile, the vrai pause continues. As CO2 concentrations increase, the Pause is not likely to continue indefinitely. But it is now clear that the rate at which the world warms will be considerably less than the usual suspects have predicted.

Mann: “When all the forms of evidence are combined, they point to a most likely value for ECS that is close to three degrees C.”

Science: The IPCC has now become explicit about not being explicit about a central estimate of climate sensitivity. Given that two-thirds of Mann’s suggested 3 Cº value depends upon the operation over millennial timescales of temperature feedbacks that Mann himself admits are subject to enormous uncertainties; given that not one of the feedbacks can be directly measured or distinguished by any empirical method either from other feedbacks or from the forcings that triggered it; and given that non-radiative transports are woefully represented in the models, there is no legitimate scientific basis whatsoever for Mann’s conclusion that a 3 Cº climate sensitivity is correct.

Verdict: Truth value 0. What Mann is careful not to point out is that the IPCC imagines that only half of the warming from a doubling of CO2 concentration will arise in the next 200 years. The rest will only come through over 1000-3000 years. Now, at current emission rates a doubling of the pre-industrial 280 ppmv CO2 will not occur for 80 years. However, 0.9 Cº warming has already occurred since 1750, leaving only another 0.6 Cº warming to occur by 2280, on the assumption that all of the 0.9 Cº was manmade. And that is if Mann and the models are right.

Mann: “And as it turns out, the climate models the IPCC actually used in its Fifth Assessment Report imply an even higher value of 3.2 degrees C.”

Science: The 2007 Fourth Assessment Report said there would be 3.26 Cº warming at equilibrium after a CO2 doubling. But the 2013 Fifth Report said no such thing. It has fallen commendably silent.

Verdict: Truth value 0. Mann is, yet again, at odds with the IPCC, which has now begun to learn that caution is appropriate in the physical sciences.

Mann: “The IPCC’s lower bound for ECS, in other words, probably does not have much significance for future world climate–and neither does the faux pause.”

Science: This is pure wishful thinking on Mann’s part. In all Assessment Reports except the Fourth, the IPCC chose 1.5 Cº as its lower bound for equilibrium climate sensitivity to doubled CO2 concentration. In the Fourth it flirted briefly with 2 Cº, but abandoned that value when faced with the real-world evidence that Mann sneeringly dismisses as “the faux pause”.

Verdict: Truth value 0. Calling the vrai pause “the faux pause” is a faux pas.

Mann: “What would it mean if the actual equilibrium climate sensitivity were half a degree lower than previously thought? Would it change the risks presented by business-as-usual fossil-fuel burning? How quickly would the earth cross the critical threshold?”

Science: But what is the “critical threshold”? Mann fails to define it. Is there some value for global mean surface temperature that is the best of all temperatures in the best of all possible worlds? If so, Mann’s hypothesis can only be tested if he enlightens us on what that ideal temperature is. He does not do so.

Verdict: Truth value 0. In the absence of a clear and scientifically justified statement of an ideal temperature, plus a further justified statement that a given departure from that ideal temperature would be dangerous, there is no case for a “critical threshold”. Furthermore, there is at present little empirical basis for a global warming of more than 1 Cº over the coming century.

Mann: “Most scientists concur that two degrees C of warming above the temperature during preindustrial time would harm all sectors of civilization–food, water, health, land, national security, energy and economic prosperity.”

Science: No survey of scientists to determine whether they “concur” as to the 2 Cº above pre-industrial temperature that Mann considers on no evidence to be the “critical threshold” has been conducted. Even if such a survey had been conducted – and preferably conducted by someone less accident-prone than the absurd Cook and Nutticelli – that would tell us nothing about the scientific desirability or undesirability of such a “threshold”: for science is not done by consensus, though totalitarian politics is. And it was totalitarian politicians, not scientists, who determined the 2 Cº threshold, on no evidence, at one of the interminable paid holidays in exotic locations known as UN annual climate conferences.

Verdict: Truth value 0. There is no scientific basis for the 2 Cº threshold, and Mann does not really attempt to offer one.

Mann: “Although climate models have critics, they reflect our best ability to describe how the climate system works, based on physics, chemistry and biology.”

Science: Mann’s own model that contrived the Hokey-Stick graph shows what happens when a model is constructed with insufficient attention to considerations that might point against the modeler’s personal preconceptions. The model used a highly selective subset of the source data; it excluded hundreds of papers demonstrating the inconvenient truth that the medieval warm period existed; it gave almost 400 times as much weighting to datasets showing the medieval warm period as it did to datasets that did not show it; and the algorithm that drew the graph would draw Hokey Sticks even if random red noise rather than the real data were used.

The problem with any model of a sufficiently complex object is that there are too many tunable parameters, so that the modeler can – perhaps unconsciously – predetermine the output. To make matters worse, intercomparison tends to institutionalize errors throughout all the models.

Besides, since the climate behaves as a chaotic object, modeling its evolution beyond around ten days ahead is not possible. We can say (and without using a model) that if we add plant-food to the air it will be warmer than if we had not done so; but (with or without a model) we cannot say with any reliability how much warming is to be expected.

Verdict: Truth value 0. Models have their uses, but as predictors of long-term temperature trends they are, for well-understood reasons, valueless.

Mann: “And they [the models] have a proved track record: for example, the actual warming in recent years was accurately predicted by the models decades ago.”

Science: Here is Hansen’s 1988 prediction of how much global warming should have occurred since then, according to his “GISS Model E”.

clip_image008

The trend shown by Hansen is +0.5 Cº per decade. The outturn since 1988, however, was just 0.15 Cº per decade, less than one-third of what Hansen described as his “business-as-usual” case. Models’ projections have been consistently exaggerated:

clip_image010

Verdict: Truth value 0. The models have consistently and considerably exaggerated the warming of recent decades. The next graph shows a series of central projections, compared with the observed outturn to date, extrapolated to 2050. This is not a picture of successful climate prediction. It is on the basis of these failed predictions that almost the entire case for alarm about the climate is unsoundly founded.

clip_image012

Mann: “I ran the model again and again, for ECS values ranging from the IPCC’s lower bound (1.5 Cº) to its upper bound (4.5 Cº). The curves for an equilibrium climate sensitivity of 2.5 Cº and 3 Cº fit the instrument readings most closely. The curves for a substantially lower (1.5 Cº) and higher (4.5 Cº) sensitivity did not fit the recent instrumental record at all, reinforcing the notion that they are not realistic.”

Science: Legates et al. (2013) established that only 0.3% of abstracts of 11,944 climate science papers published in the 21 years 1991-2011 explicitly stated that we are responsible for more than half of the 0.69 Cº global warming of recent decades. Suppose that 0.33 Cº was our contribution to global warming since 1950, and that CO2 concentration in that year was 305 ppmv and is now 398 ppmv. Then the radiative forcing from CO2 that contributed to that warming was 5.35 ln(398/305) = 1.42 Watts per square meter. Assuming that the IPCC’s central estimate of 713 ppmv CO2 by 2100 is accurate, the CO2 forcing from now to 2100 will be 5.35 ln(713/398), or 3.12 W m⁻². On the assumption that the ratio of CO2 forcing to that from other greenhouse gases will remain broadly constant, and that temperature feedbacks will have exercised 44/31 of the multiplying effect seen to date, the manmade warming to be expected by 2100 on the basis of the 0.33 Cº warming since 1950 will be 3.12/1.42 × 0.33 × 44/31 = 1 Cº. Broadly speaking, the IPCC expects this century’s warming to be equivalent to that from a doubling of CO2 concentration. In that event, 1 Cº is the warming we should expect from a CO2 doubling, and the only sense in which the 1.5 Cº lower bound of the IPCC’s interval of climate-sensitivity estimates is “unrealistic” is that it is probably somewhat too high.
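The arithmetic in the paragraph above can be reproduced step by step. A minimal sketch of the calculation exactly as stated (variable names are mine; the 0.33 Cº attribution and the 44/31 feedback multiplier are taken from the text, not independently verified):

```python
import math

def co2_forcing(c_new, c_old):
    # IPCC-style logarithmic approximation, result in W/m^2
    return 5.35 * math.log(c_new / c_old)

f_past = co2_forcing(398.0, 305.0)    # CO2 forcing, 1950 to present
f_future = co2_forcing(713.0, 398.0)  # CO2 forcing, present to 2100

# Scale the assumed 0.33 C anthropogenic warming since 1950 by the
# forcing ratio and by the 44/31 feedback multiplier used in the text.
warming_2100 = (f_future / f_past) * 0.33 * (44.0 / 31.0)

print(round(f_past, 2))        # about 1.42 W/m^2
print(round(f_future, 2))      # about 3.12 W/m^2
print(round(warming_2100, 2))  # about 1 C
```

This reproduces the 1.42, 3.12, and ≈1 Cº figures given in the text.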

Verdict: Truth value 0. Here, as elsewhere, Mann appears unaware of the actual evolution of global temperatures during the post-1950 era when we might in theory have exercised some warming influence. There has been less warming than They thought, and – on the basis of the scientific consensus established by Legates et al. – less of the observed warming is anthropogenic than They thought.

Mann: “To my wonder, I found that for an ECS of 3 Cº, our planet would cross the dangerous warming threshold of two degrees C in 2036, only 22 years from now. When I considered the lower ECS value of 2.5 Cº, the world would cross the threshold in 2046, just 10 years later.”

Science: Mann here perpetrates one of the fundamental errors of the climate-extremists. He assumes that the prediction of a climate model is subject to so little uncertainty that it constitutes a fact. This statement is one of a series by true-believers saying we have only x years to Save The Planet by shutting down the West. Ex-Prince Chazza has done it. Al Gore has done it. The UN did it big-time by saying in 2005 that there would be 50 million climate refugees by 2010. There weren’t.

Verdict: Truth value 0. Extreme warming that has been predicted does not become a fact unless it comes to pass. If you want my prediction, it won’t. And that’s a fact.

Mann: “So even if we accept a lower equilibrium climate sensitivity value, it hardly signals the end of global warming or even a pause. Instead it simply buys us a little bit of time – potentially valuable time – to prevent our planet from crossing the threshold.”

Science: No one is suggesting that the Pause will continue indefinitely. Theory as well as observation suggests otherwise. However, a Pause that had not occurred could not “buy us a little bit of time”. Mann’s mention of “buying us a little bit of time” is, therefore, an admission that the Pause is real, as all of the temperature datasets show.

Verdict: Truth value 0. A low enough climate sensitivity will allow temperatures to remain stable for decades at a time, during periods when natural factors tending towards global cooling temporarily overwhelm the warming that would otherwise occur.

Mann: “These findings have implications for what we all must do to prevent disaster.”

Science: Warming of 3 Cº would not be a “disaster”. Even the bed-wetting Stern Review of 2006 concluded that warming of 3 Cº over the 21st century would cost as little as 0-3% of global GDP. But at present we are heading for more like 1 Cº. And even the IPCC has concluded that less than 2 Cº warming compared with 1750, which works out at 1.1 Cº compared with today, will be net-beneficial.

Verdict: Truth value 0. There is no rational basis for any suggestion that our adding CO2 to the atmosphere at the predicted rate, reaching 713 ppmv by 2100, will be anything other than beneficial.

Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm. Ironically, if the world burns significantly less coal, that would lessen CO2 emissions but also reduce aerosols in the atmosphere that block the sun (such as sulfate particulates), so we would have to limit CO2 to below roughly 405 ppm. We are well on our way to surpassing these limits.”

Science: What we are concerned with is not CO2 simpliciter, but CO2-equivalent. CO2 itself contributes only 70% of the anthropogenic enhancement of the greenhouse effect. The (admittedly arbitrary) target of 450 ppmv CO2-equivalent is thus a target of only 315 ppmv CO2 – the concentration that prevailed in 1958. Mann’s suggested target of 405 ppmv CO2e would represent just 284 ppmv CO2. And that would fling us back to the pre-industrial CO2 concentration.

Verdict: Truth value 0. We are not “well on our way to surpassing these limits”: we passed them as soon as the industrial revolution began. The current CO2-equivalent concentration of 398 ppmv already exceeds the pre-industrial 284 ppmv by 40%, yet the world has warmed by only 0.9 Cº since then, and our contribution to that warming may well be 0.33 Cº or less.
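The CO2e-to-CO2 conversion used above appears to be a straight 70% scaling of the CO2-equivalent figure (450 → 315, 405 → 284). A minimal sketch of that arithmetic as I read it from the essay – an illustration of the quoted numbers, not a radiative-forcing calculation:

```python
# Apparent arithmetic in the essay: CO2 alone supplies 70% of the
# anthropogenic greenhouse enhancement, so each CO2e target is scaled
# by 0.7 to get its CO2-only counterpart.  This merely reproduces the
# quoted numbers; it is not a forcing calculation.

CO2_SHARE = 0.70  # stated share of the anthropogenic enhancement

def co2_from_co2e(co2e_ppmv, share=CO2_SHARE):
    """Scale a CO2-equivalent concentration to its CO2-only part."""
    return co2e_ppmv * share

print(round(co2_from_co2e(450), 1))  # → 315.0 (the essay's 1958-level figure)
print(round(co2_from_co2e(405), 1))  # → 283.5 (rounds to the essay's 284)
```

Note that 405 × 0.7 is 283.5, which the essay rounds to the pre-industrial 284 ppmv.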

Mann: “Some climate scientists, including James E. Hansen, former head of the NASA Goddard Institute for Space Studies, say we must also consider slower feedbacks such as changes in the continental ice sheets.”

Science: The IPCC already takes changes in ice sheets into account. It says that in the absence of “dynamical ice flow”, which cannot happen, the Greenland ice sheet would not disappear “for millennia”. And there is no prospect of losing ice from the vast ice sheet of East Antarctica, which is at too high an altitude or latitude to melt. Even the West Antarctic Ice Sheet, which has lost some ice, is proving more robust than the usual suspects had thought. Sea level, according to the GRACE gravitational-anomaly satellites, has been falling (Peltier et al., 2009). During the eight years of ENVISAT’s operation, from 2004 to 2012, sea level rose at a scary 1.3 inches per century.

Verdict: Truth value 0. There is no reason to suppose the major ice sheets will disintegrate on timescales of less than millennia.

Mann: “Hansen and others maintain we need to get back down to the lower level of CO2 that existed during the mid-20th century–about 350 ppm.”

Science: 350 ppmv is, again, CO2-equivalent. That implies 245 ppmv, a value well below the pre-industrial 280 ppmv. At 180 ppmv, plants and trees become dangerously starved of CO2. Flinging CO2 concentration back to that value would reduce CO2 fertilization and hence crop yields drastically, and would do major damage to the rain-forests.

Mann: “In the Arctic, loss of sea ice and thawing permafrost are wreaking havoc on indigenous peoples and ecosystems.”

Science: The Arctic has not lost as much sea ice as had been thought. In the 1920s and 1930s there was probably less sea ice in the Arctic than there is today. The decline in sea ice is small in proportion to the seasonal variability, as the graph from the University of Illinois shows. And the part of the satellite record that is usually cited began in 1979. An earlier record, starting in 1973, showed a rapid growth in sea ice until it reached its peak extent in 1979. Indigenous peoples, like the polar bears, prefer warmer to colder weather. And almost all ecosystems also prefer warmer to colder weather.

[Graph: University of Illinois Arctic sea-ice extent]

Verdict: Truth value 0. The decline in sea ice in the Arctic is far more of a benefit than a loss.

Mann: “In low-lying island nations, land and freshwater are disappearing because of rising sea levels and erosion.”

Science: On the contrary, detailed studies show not only that low-lying island nations are not sinking beneath the waves, but that their territory is in many cases expanding. The reason is that corals grow to meet the light. As sea level rises, the corals grow and there is no net loss of territory. Also, sea level rises less in mid-ocean, where the islands are, than near the continental coasts. And sea level has scarcely been rising anyway. According to Grinsted et al., it was 8 inches higher in the medieval warm period than it is today.

Verdict: Truth value 0. If the world were once again to become as warm as it was in the Middle Ages, perhaps sea level would rise by about 8 inches. And that is all.

Mann: “Let us hope that a lower climate sensitivity of 2.5 degrees C turns out to be correct. If so, it offers cautious optimism. It provides encouragement that we can avert irreparable harm to our planet. That is, if–and only if–we accept the urgency of making a transition away from our reliance on fossil fuels for energy.”

Science: Mann is here suggesting that a climate sensitivity of 3 Cº would be disastrous, but that 2.5 Cº would not. The notion that as little as 0.5 Cº would make all the difference is almost as preposterous as the notion that climate sensitivity will prove to be as high as 2.5 Cº. As we have seen, on the assumption that less than half of the warming since 1950 was manmade, climate sensitivity could be as low as 1 Cº – a value that is increasingly finding support in the peer-reviewed literature.

Verdict: Truth value 0. The central error made by Mann and his ilk lies in their assumption that models’ predictions are as much a fact as observed reality. However, observed climate change has proven far less exciting in reality than the previous predictions of Mann and others had led us to expect. The multiple falsehoods and absurdities in his Scientific American article were made possible only by the sullen suppression by the Press of just how little of what has been predicted is happening in the real climate. In how many legacy news media have you seen the Pause reported at all? But it will not be possible for the mainstream organs of propaganda to conceal from their audiences forever the inconvenient truth that even the most recent, and much reduced, projections of the silly climate models are proving to be egregious exaggerations.

Arno Arrak
March 24, 2014 4:52 pm

I quote Michael Mann: “The rate at which earth’s temperature has been rising eased slightly in the past decade, but the temperature is still rising…” That statement is entirely wrong. I have closely monitored earth temperature for years and can tell you that there has been no global temperature rise for 17 years. I assume the reader is aware that the temperature rise the author is talking about is greenhouse warming, or more correctly, the enhanced greenhouse warming from carbon dioxide we add to the atmosphere. That must be distinguished from natural warming that may also happen at unpredictable times. One example of such natural temperature rise is the super El Nino of 1998 that came out of nowhere and subsided quickly. It did leave a legacy, though, by causing a short temperature rise in its wake that raised the twenty-first-century temperatures that followed one third of a degree Celsius above the last decades of the twentieth century. As a result, all the record temperatures are now twenty-first-century temperatures despite the lack of any warming. That makes our century the warmest on record as well as greenhouse-free at the same time. It is well to recall here that the existence of greenhouse warming was first recognized by James Hansen, who reported this to the US Senate in 1988. He showed a rising temperature curve that started in 1880 and reached a peak in 1988. That was the highest temperature within the last 100 years, he said. There was only a one percent chance that this could happen by chance alone. Hence, it followed that the greenhouse effect had been detected. The newly established IPCC took that as a fact, and it is now touted as the cause of dangerous warming to come. The fear that the two-degree threshold may be exceeded by 2036 comes from application of this doctrine to temperature calculations by means of climate models. But did Hansen truly prove the existence of greenhouse warming?
An examination of the temperature graph he supplied to the Senate reveals huge problems with it. He speaks of a hundred-year temperature rise, but the record shows something else. First, the early part of the curve does not count as being caused by the greenhouse effect. According to IPCC AR5, the effect of anthropogenic warming does not become observable until about 1950. Hence, the first seventy years of his temperature curve do not count as greenhouse warming years. Secondly, according to his curve there was no warming from the fifties to the mid-sixties. That also gets deducted from his 100-year curve, leaving just the period from the mid-sixties to 1988 as a warming period he can use as proof of greenhouse warming. But this is not all. Examination of the satellite temperature record where it overlaps Hansen’s graph shows that there were three El Nino peaks between 1980 and 1988. His temperature graph is too coarse to show their presence because he uses a one-year interval between his data points. On top of that he imposes a 5-year running mean upon his data, so that there is no way to see the true shape of the last part of his curve. His relative peak heights are also wrong. All this makes him think that his last data point is the culmination of his hundred-year warming, when in fact it is nothing more exotic than an ordinary El Nino peak, the 1987/88 El Nino to be precise. No way can this misplaced and misidentified datum that marks an El Nino peak be the discovery point of greenhouse warming. He simply did not know what was in his own data and made grandiose claims about it. He should now properly withdraw his claim that he observed greenhouse warming in 1988. In the meantime, the IPCC has gone ahead and used his “discovery” to justify their prediction of a coming greenhouse Armageddon. And this meaningless concept is what Michael Mann is still clinging to against all observations of nature.

Niff
March 24, 2014 5:07 pm

We should all be thankful that Lord Monckton applies himself so diligently to challenging all the falsehoods.
The real question is can we actually PROVE that Mickey KNOWS that these manipulated versions of the truth are false or is he a ‘useful idiot’ or ‘scientific fantasist’?
He’s clearly a faux scientist…..goes without saying.

crakar24
March 24, 2014 5:10 pm

I don’t get this bit: how does LM establish the lower figure for CO2, rather than Mann’s 405?
Mann: “If we are to limit global warming to below two degrees C forever, we need to keep CO2 concentrations far below twice preindustrial levels, closer to 450 ppm. Ironically, if the world burns significantly less coal, that would lessen CO2 emissions but also reduce aerosols in the atmosphere that block the sun (such as sulfate particulates), so we would have to limit CO2 to below roughly 405 ppm. We are well on our way to surpassing these limits.”
Science: What we are concerned with is not CO2 simpliciter, but CO2-equivalent. CO2 itself contributes only 70% of the anthropogenic enhancement of the greenhouse effect. The (admittedly arbitrary) target of 450 ppmv CO2-equivalent is thus a target of only 315 ppmv CO2 – the concentration that prevailed in 1958. Mann’s suggested target of 405 ppmv CO2e would represent just 284 ppmv CO2. And that would fling us back to the pre-industrial CO2 concentration.
Verdict: Truth value 0. We are not “well on our way to surpassing these limits”: we passed them as soon as the industrial revolution began. The current CO2-equivalent concentration of 398 ppmv already exceeds the pre-industrial 284 ppmv by 40%, yet the world has warmed by only 0.9Cº since then, our contribution to that warming may well be 0.33 Cº or less.
Can anyone explain this in more detail for me.
TIA

Björn
March 24, 2014 5:19 pm

Like Tom I also digitized Mann’s white line, and get a similar result: the average difference to the data in the spreadsheet linked in Anthony’s foreword is 0.38 (rounded to two digits), with a population standard deviation of 0.03 (i.e. the wiggles dance in sync, but do not hug or touch). I used ‘engauge’, a digitizer tool I have come to prefer, and I had indeed noticed when I was setting axis points in the pictured graph that the Fauxline (a.k.a. Mann’s white line) was somewhat different from all of the usual temperature-anomaly datasets, in that its graph rarely drops below zero, and assumed Mann had just added an offset to one of the standard sets in order to be able to plaster it onto some ‘überheiss’ model simulation results. If that is the case, then comparing it to one of the standard sets is no problem as long as you know the value of the offset used: just add it to the standard set or subtract it from the fauxline. But I do not really know how that Mann white line is obtained, and can find no indication in the article of where it comes from, or of what explains the obvious push-up from the standard global anomaly sets, though perhaps the sentence “If the Northern Hemisphere’s temperature … blabla….” is telling us that he is really not using a global anomaly set but only a “half a globe” set to cry wolf; it could also be an explanation of the discrepancy Anthony noticed.
But be that as it may, I will not put any more effort into figuring this out; but if Willis is around, a comment from him on how (or whether) he went about “normalizing” the fauxline to enable a comparison with the standard anomaly sets would be appreciated.
On the side: if you need to extract data from a graph image, the ‘engauge’ tool is a good application for it, as it has a good selection of discretization settings that can make it an almost automatic process to pull a single data line out of a spaghetti-rainbow. It is my preferred tool for such endeavours. I run it in a Linux environment and do not know if it is available for other operating systems, but if it is, check it out. And should there be no version available for your OS, my second choice would be the standalone version of OODigitizer. It originally came about as a plugin extension to OpenOffice, and in that incarnation works quite well in LibreOffice too, so if you have either of those installed it’s a simple matter to fetch and install it through the add-on manager in those apps; but if you, like me, have neither installed, there exists a standalone version written in Java that runs quite happily on any Java-enabled machine regardless of which OS is running.
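The comparison Björn describes – a constant offset (the mean difference between the two aligned series) plus the population standard deviation of the residuals as a measure of how tightly the wiggles track – can be sketched as follows. The four-point series here are invented placeholders, not the digitized values:

```python
import math

# Sketch of the comparison described above: the mean difference between
# two aligned series gives the constant offset, and the population
# standard deviation of the residuals measures how tightly the wiggles
# track each other.  The series below are invented placeholders.

def offset_and_spread(a, b):
    """Return (mean difference, population std dev of the differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    return mean, math.sqrt(var)

white_line = [0.72, 0.75, 0.70, 0.78]  # placeholder digitized values
dataset    = [0.34, 0.36, 0.33, 0.39]  # placeholder anomaly values

offset, spread = offset_and_spread(white_line, dataset)
print(round(offset, 2), round(spread, 3))
```

Subtracting the computed offset from the digitized line (or adding it to the dataset) then puts the two series on a common baseline for comparison.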

March 24, 2014 5:32 pm

Willis;
Having said that, however, I agree with you regarding the white line. I just looked at the HadCRUT4 Northern Hemisphere graph, and it’s quite a good match (altho not perfect) to the white line.
>>>>>>>>>>>>>
I tried a few different running means and 20 months is a pretty good match, at the wiggle level anyway.
At the giggle level, I ran SH and global for comparison and it is clear why Mann used NH only.
http://www.woodfortrees.org/plot/hadcrut4nh/mean:20/from:1980/plot/hadcrut4sh/mean:20/from:1980/plot/hadcrut4gl/mean:20/from:1980
When I first started following this debate, I couldn’t reconcile my understanding of the physics with the claims being made. Now, I no longer even bother to try. There’s simply no point in this kind of misleading advertising unless your product is of such poor quality that there’s no other way to sell it.
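The 20-month running mean used in the woodfortrees plots above is straightforward to reproduce for any monthly series; a generic sketch (not tied to the woodfortrees data):

```python
# Centered running mean of the kind used in the woodfortrees plots above.
# Generic sketch; window=20 matches the 20-month smoothing mentioned.

def running_mean(series, window=20):
    """Return the moving average of a monthly series.

    Points with fewer than `window` samples available are dropped,
    mirroring woodfortrees' behaviour of shortening the smoothed series.
    """
    if window > len(series):
        return []
    out = []
    for i in range(len(series) - window + 1):
        out.append(sum(series[i:i + window]) / window)
    return out

# Tiny sanity check: a flat series smooths to itself.
print(running_mean([0.5] * 24, window=20))  # → [0.5, 0.5, 0.5, 0.5, 0.5]
```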

Jim Brock
March 24, 2014 5:42 pm

ckb: In Father Eisele’s class at Springhill we used a metal airplane and a bb gun. The airplane had lasted for lo! those many years. Wonder where it is now some 65 years later.

March 24, 2014 5:43 pm

Steve from Rockwood says:

Verdict: Truth-value 0. Mann’s statement that global warming “continues unabated” is false, since the warming rate is declining.
The warming “rate” can be “declining” and global warming can still “continue” as long as the rate remains above zero. The rate of warming would have to decline to zero or negative to falsify Mann’s statement (which it has).

You need to look up the meaning of the word “abate”:
http://www.thefreedictionary.com/abate‎
v. a·bat·ed, a·bat·ing, a·bates. v.tr. 1. To reduce in amount, degree, or intensity; lessen
“continues unabated” means: “continues without lessening, continues at the same rate or even higher”, it does not mean “hasn’t stopped”.

Ray Blinn
March 24, 2014 5:59 pm

I think the whole idea of using tree rings to determine temperature is ridiculous. That might be why Mann’s Hockey Stick fails to show the Medieval Warm Period (even without his Nature Trick).
I have three ash trees in my back yard that so far have avoided the emerald ash borer. They were all planted from seedlings about twenty years ago. They are all within fifty feet of each other and get equal sunlight. The trunks of the three trees vary in diameter from about 7″ to 12″. If you would cut them down and try to determine the average temperature per year based on the rings you would come up with three vastly different temperature graphs. There are obviously more factors involved in tree growth than temperature that can be very difficult to factor out of the data set.

rgbatduke
March 24, 2014 6:00 pm

In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.
Dearest David,
I have written, well — a lot — of code, most of it in C.

rgb@lilith|B:1033>cat main.c
#include <stdio.h>
main()
{
 if(-1){
   printf("Actually, -1 is true...\n");
 }
 if(0){
    printf("This would be executed if 0 was true...\n");
 } else {
    printf("...but it's not -- -1 (or any nonzero value) is true.\n");
 }
}
rgb@lilith|B:1031>gcc -c main.c; gcc -o main main.o
rgb@lilith|B:1032>./main
Actually, -1 is true...
...but it's not -- -1 (or any nonzero value) is true.

Thus, as Samuel Johnson might say, I refute you.
rgb

rgbatduke
March 24, 2014 6:18 pm

Oh, and your claim isn’t true in the following languages, either: perl, awk, fortran, mostly python, c++, bash.
The code I submitted dropped the brackets around stdio.h, but if you fix the include line to read:
#include <stdio.h>
you can even compile it and run it to test it yourself.
rgb

Adam
March 24, 2014 6:27 pm

The rate of warming has not hit a plateau. It has fallen to zero. It is the temperature that has hit a plateau.

gnomish
March 24, 2014 6:28 pm

yeah, that was weak but it doesn’t make any difference to the thesis- i thought it was a hyphen rather than a monad.
if one looks for sense from monckton, it’s there to be found so consistently that it’s expected.
JNZ is an opcode in various assembly languages. (jump if not zero)
there is no jump.if.minus.one opcode in any of them.
not to put too fine a point on it, but if computers were so error prone you needed multiple bits for redundancy in the normal course of events- you’d have no use for such a device.

Steve Garcia
March 24, 2014 6:29 pm

There is much here, especially as regards the one-man Mann claim of the “faux pause”, that should label Mann as a climate change denier. The climate in our post-LIA period changed (as it periodically does) from rising to not rising, and THAT IS IN THE DATA, not in models or predictions. Hence, Mann is guilty of denying that the climate has changed its very changing. The pause is real. Mann’s assertion of a “faux pause” is in contravention of the facts – meaning he is a climate denier.

Steve Garcia
March 24, 2014 6:43 pm

Blinn –
Yes, tree rings as a temperature-only proxy are dicey at best. That is what “hide the decline” was all about: Briffa and his tree ring proxies simply did not keep following the rise in CO2, nor the global temps. It was necessary to hide this discrepancy – known as the “Divergence Problem” – which still exists. Tree rings started diverging from the global temps in about 1940, and by 1960 it was readily apparent. As of the “hide the decline” email, the divergence was severe – and still is.
Biologists – unlike climatologists – use tree rings as proxies for precip. As your own logic will tell you, tree rings cannot be reliably read as proxies for BOTH temp and precip. BOTH groups of scientists have to ASSUME that the OTHER forcing (temp or precip) is constant. And since we all know that neither one is constant, BOTH groups of scientists are barking up the wrong tree rings.
And now take away the tree rings, and how much temp data exists for the period before about 1880? Ice cores are useless for high-res and for recent centuries – the first few decades are corrupted by recent melt-offs and such so they don’t know exactly where the starting point is. Corals? Not high-res at all. Varves? Probably not bad as a proxy for temps, as long as sufficient knowledge exists about agricultural and other activity around the body of water. But varves are most directly forced by rainfall, not temps.
So without tree rings, they basically got NUTTIN. And they don’t GOT tree rings.
So your thinking is in the right direction. And your assessment of even OTHER forcings is correct, too. Where climatologists assume one factor changing and all others are constant, they are simply retarded in their capacity to apply logic and knowledge to the problem.
Yep, tree rings are really bad as proxies for temps.

GregB
March 24, 2014 6:45 pm

The problem with CO2 is that back radiation is effectively a zero-sum game. Every particle that emitted that back radiation (i.e. the atmosphere) must have cooled by exactly the opposite value of net energy. At the end of the day its effect is zero.

rgbatduke
March 24, 2014 6:59 pm

One final comment. Here is Hadcrut4:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1850/to:2014
There is no possible way to make any smoothed HADCRUT4’s average come out in excess of 0.6 C post 2000. HADCRUT4 on average is roughly 0.5C across the entire interval from 2000 to the present. The figure presented above (NOT by Mann) bounces around 0.6 C and only rarely drops to 0.5 C. One doesn’t do the argument any favors by adding 0.1 C to HADCRUT4 over the last decade.
While we’re at it, GISS LOTI is not, as the graph above also incorrectly shows, almost on top of HADCRUT4 in the 2000-2014 stretch:
http://www.woodfortrees.org/plot/gistemp/from:1850/to:2014/plot/hadcrut4gl/from:1850/to:2014
It is, in fact, almost 0.1 C higher. This is quite puzzling. GISS LOTI supposedly corrects for UHI and HADCRUT4 does not. UHI should, without any question, be a net negative correction in the recent past compared to the remote past (where GISS LOTI and HADCRUT4 are indistinguishable). Yet GISS LOTI is almost always either equal to or greater than HADCRUT4, and the excess strictly grows towards the present. One has to wonder whether or not these guys know the difference between addition and subtraction.
Just a few minor inconsistencies. But they matter. Mann’s little white line is 0.1C over even GISS LOTI, and is a solid 0.2 C over HADCRUT4, which has to be the more reliable of the two given the egregious relative sign error in LOTI.
Although both of them are a joke when compared to a truly global temperature measurement:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1850/to:2014/plot/rss/from:1979/to:2014/plot/gistemp/from:1850/to:2014
RSS lower troposphere shows an anomaly of only 0.2C give or take over an interval where HADCRUT4 increases to 0.5C and GISS LOTI increases to 0.6 C from a more or less common base. A gap of 0.3 to 0.4 C over 35 years. But then, RSS shows basically no warming over its entire range:
http://www.woodfortrees.org/plot/rss/from:1979/to:2014
at this point. Any sane eyeball would pick out a total warming on the order of 0.2 to 0.3 C over 35 years of data, less than 0.1 C/decade, and trending flat to down for half of the period covered. One can fit the data quite nicely (given the obvious uncertainties visible in the range of fluctuation) with a flat segment from 1979 to 1993 or 1994 at an anomaly around -0.1C, a flat segment from 1997 or so to 2014 around 0.2C, and a rapid jump from 1993 to 1997 in between (or any of a number of similar e.g. logistic curve shapes). This is a pure Hurst-Kolmogorov jump.
At this point both GISS LOTI and HADCRUT4 are stuck, and they know it. RSS and UAH have already deviated from them so much that the game is up, but true believers can at least hold on to their belief in a lack of bias as long as the gap grows no wider. But LOTI cannot add another 0.1 C by hook or by crook at this point, and HADCRUT4 has already read the writing on this particular wall and has solidly split from GISS so that it can retain SOME credibility in case Hansen’s operation tries to continue the game and is called on it in a way nobody can even pretend to ignore. All the global surface records would then be subjected to the kind of careful analysis that IMO they rather fear. What if somebody recomputes all of the means after eliminating all of the changes they made that exaggerate the late-stage warming, fixes the UHI problem, and finds that the global records track well with RSS and that maybe half of the warming previously reported was an artifact of the corrections? Disaster! And I don’t mean of the climate kind.
rgb
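The flat-segment-plus-jump description of the RSS record above amounts to a simple piecewise model. The sketch below is my own formulation of that eyeball fit, using the stated levels (−0.1 C and 0.2 C) and break years (1993 and 1997) as parameters; it is not a formal regression:

```python
# Piecewise anomaly model for the flat-jump-flat eyeball fit described
# above: flat at the low level, a linear ramp through the jump window,
# then flat at the high level.  Levels and break years are the comment's
# stated values, not fitted ones.

def step_fit(year, low=-0.1, high=0.2, jump_start=1993.0, jump_end=1997.0):
    """Return the modelled anomaly (deg C) for a given year."""
    if year <= jump_start:
        return low
    if year >= jump_end:
        return high
    frac = (year - jump_start) / (jump_end - jump_start)  # position in the ramp
    return low + frac * (high - low)

print(step_fit(1985.0), round(step_fit(1995.0), 2), step_fit(2005.0))  # → -0.1 0.05 0.2
```

A logistic transition could be substituted for the linear ramp, as the comment notes, without changing the flat-flat character of the fit.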

March 24, 2014 7:04 pm

rgbatduke says:
March 24, 2014 at 6:00 pm

In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.

I can absolutely assure you that is not true for C. Give me 10 minutes to look it up in my copy of Harbison & Steele and I will quote you the section of the standard.
Or you can just compile into machine code the fragment:

int a;
if (a) {
    itsTrue();
} else {
    itsFalse();
}
and examine the branch statements generated.

GregB
March 24, 2014 7:10 pm

All assemblers I’ve seen say -1 = true (all bits set).

rgbatduke
March 24, 2014 7:17 pm

The problem with CO2 is that back radiation is effectively a zero-sum game. Every particle that emitted that back radiation (i.e. the atmosphere) must have cooled by exactly the opposite value of net energy. At the end of the day its effect is zero.
This is simply and obviously untrue. Suppose you are trying to throw a bunch of basketballs out of bounds from the center of a basketball court. Ball boys come and drop basketballs into a big hamper at a steady rate. You throw them out of bounds at a rate that is proportional to the number of basketballs in the hamper, so that (say) if the hamper is half full you match the rate that the boys are putting them in.
Now suppose that a few players come on to the court and try to intercept your throws. Whenever they manage to catch a basketball, about half of the time they throw the ball the rest of the way out and half the time they fire it back into your hamper. The hamper now is getting balls from two sources — the steady rate of ball-boy delivery and a SECOND source that rejects some of your attempts to throw the balls out with some probability. Your rate of removal is monotonically related to the number of the balls in the hamper, and that number will increase until you manage to throw balls out at a net rate equal to the rate at which the ball boys deliver them PLUS the rate that the players pitch them back.
If you replace the basketballs with photons carrying energy, the hamper with the heat capacity of the surface, the ball boys with Mr. Sun, your throw rate with Stefan-Boltzmann blackbody radiation, and the confounded players who catch some of the balls and throw them back with greenhouse gas molecules, you have a rough idea of how the back radiation works. Sure, the players on the court don’t hold on to the balls for more than a second or two before throwing them back or throwing them out, but that doesn’t matter. What matters is that they throw them back at you, and you have to increase the rate at which you throw balls TRYING to get them out to compensate for the rejected attempts and keep up with the rate the ball boys deliver. The only thing that increases the rate at which you work is higher temperature — more balls in the hamper.
(Note to all — yes, one could probably dress this analogy up with cheerleaders who mug the ball boys on the way in and take their balls away — so to speak — to represent albedo, add a group of referees who grab balls from the hamper and carry them through the blocking players to give them a random toss 2/3 of the way to the edge of the court — latent heat transport — and so on, but of course all it is good for is helping people understand that the greenhouse effect is real when they make absurd statements about how it does not or cannot work, it isn’t in any sense a quantitative model. Quantitative models are incredibly difficult to build, because the basketball court in question is planet sized and the players in question are molecule-sized, and out of bounds is the whole bloody universe.)
rgb
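The hamper-and-ball-boys analogy above is, at bottom, a discrete rate-balance model, and it can be sketched numerically. The parameters below are invented for illustration (they correspond to nothing physical); the point is only that adding interceptors raises the steady-state hamper count:

```python
# Toy rate-balance model of the basketball analogy: balls arrive at a
# fixed rate, are thrown out at a rate proportional to the hamper count,
# and a fraction of throws are intercepted and returned.  All parameter
# values are invented for illustration.

def equilibrium_count(inflow, k, return_frac, steps=10_000, dt=0.01):
    """Evolve the hamper count until outflow balances inflow plus returns.

    inflow      -- balls per unit time from the ball boys (the Sun)
    k           -- throw-out rate per ball in the hamper
    return_frac -- fraction of thrown balls intercepted and returned (GHGs)
    """
    n = 0.0
    for _ in range(steps):
        escaped = k * n * (1 - return_frac)  # balls that actually leave
        n += dt * (inflow - escaped)         # net change in hamper count
    return n

# No interceptors: the steady state is inflow / k.
n0 = equilibrium_count(inflow=10.0, k=0.5, return_frac=0.0)
# Half the throws returned: the hamper must hold twice as many balls
# before the escape rate again matches the inflow.
n1 = equilibrium_count(inflow=10.0, k=0.5, return_frac=0.5)
print(round(n0, 2), round(n1, 2))
```

Analytically the steady state is inflow / (k · (1 − return_frac)), so the run with half the throws returned settles at twice the count – the analogue of a warmer surface.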

thingadonta
March 24, 2014 7:36 pm

The guy-Mann- is an idiot.
In one of his books (the hockey stick wars) he talks about the Club of Rome and Ehrlich’s past doom-and-gloom predictions back in the 60s and 70s as coming true, and vindicated, and having the last laugh etc., because in 1992 in Rio a whole bunch of scientists signed a statement about impending disaster – which was also untrue.
According to Mann, a prediction becomes true if someone else he considers important also makes the same prediction, it is irrelevant whether such predictions actually occur in reality. He has taken peer review to an absurd level, something is true if you can simply find someone else to say it.

GregB
March 24, 2014 7:40 pm

rgbatduke, I apologize if this sounds rude, but debate me on the physics rather than an analogy made up by your mind as to how it works. Frankly, I thought you missed the corresponding cooling by a country mile. I actually enjoy most of your posts.

rgbatduke
March 24, 2014 7:48 pm

I can absolutely assure you that is not true for C. Give me 10 minutes to look it up in my copy of Harbison & Steele and I will quote you the section of the standard.
Um, did you actually read my comment? I was responding to one far upstream where it was asserted otherwise, and I actually included a code fragment I coded, compiled, and ran to prove it that you and I are, of course, correct. Not only correct, but correct in most of the dominant languages. 0 is false, any nonzero value in any bit is true, although nowadays one has to be a bit more careful with typing than one did in the good old days of K&R C when an int was an int and not /usr/include/bits/types.h and beyond. All to the good, all to the good, mind you.
I just couldn’t tell — you quoted me quoting somebody else and one might be left with the impression that you thought I asserted that 0 was true in C, when I was rather proving the opposite by the most reliable of methods — an actual program.
You can also do a perl one liner to the same effect:

rgb@lilith|B:1043>perl -e 'if(-1){ print "Hello World\n";}'
Hello World

In bash it is harder to do a one liner but the following fragment works:

#!/bin/sh
if [ 1 ]
then
   echo "Yes"
else
   echo "No"
fi

Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”:
http://en.wikibooks.org/wiki/X86_Assembly/Control_Flow
Rather one does e.g. cmp (comparison) of two registers and then jmp (jump/branch) based on whatever criterion you like from the comparison. Although it has, I admit, been many moons since I wrote much assembler (not really since the days of the 8088/8087, when if you wanted to do certain things you pretty much did them in assembler, as compilers were horrendously inefficient and memory was tiny and expensive).
But YES YES YES, true is “not false”, and false is 0, in the most common compilers and scripting languages.
rgb

bushbunny
March 24, 2014 7:48 pm

His hypothesis based on tree rings is the worst way to gain knowledge of global temperatures. Bristlecone pines are evergreen; try a deciduous tree and you will get more information, as they remain dormant for three months of the year: if you have a dry spring or summer, they won’t grow much.

GregB
March 24, 2014 7:54 pm

Ditto awk, ditto fortran, etc. It is more difficult to extend into assembler as assembler/machine code doesn’t really manage the concept of “true”
Right about machine code, but virtually all assemblers have codified standards of -1 = true and use it extensively.

ttfn
March 24, 2014 7:56 pm

rgbatduke says:
March 24, 2014 at 6:00 pm
“In a good number of modern languages, especially anything based on the C language, 0 is true and anything else is false. The exact opposite of the example given by Monckton.”
I think it would be more accurate to say that in c anything non-zero is true and 0 is false.
#include "stdio.h"
main() {
    int a = 1 == 1;
    int b = 1 == 0;
    printf("a=%d b=%d\n", a, b);
}
cc a.cc
./a.out
a=1 b=0
