Science at work: “Tuning may be a way to fudge the physics”

A collection of fudge from The Team, sweet!

ClimateGate FOIA grepper! – Email 636

Solution 1: fudge the issue. Just accept that we are Fast-trackers and can therefore get away with anything.

[Hat tip: M. Hulme]

Email 5175-Tom Wigley – 2004

In any simple global formula, there should be at least two clearly identifiable sources of uncertainty. One is the sensitivity (d(melt)/dT) and the other is the total available ice. In the TAR, the latter never comes into it in their analysis (i.e., the ‘derivation’ of the GSIC formula) — but my point is that it *does* come in by accident due to the quadratic fudge factor. The total volume range is 5-32cm, which is, at the very least, inconsistent with other material in the chapter (see below). 5cm is clearly utterly ridiculous.

Email 5054, Colin Harpham, UEA, 2007

I will press on with trying to work out why the temperature needs a ‘fudge factor’ along with the poorer modelling for winter.

Email 1461, Milind Kandlikar, 2004

With GCMs the issue is different. Tuning may be a way to fudge the physics. For example, understanding of clouds or aerosols is far from complete – so (ideally) researchers build the “best” model they can within the constraints of physical understanding and computational capacity. Then they tweak parameters to provide a good approximation to observations. It is this context that all the talk about “detuning” is confusing. How does one speak of “detuning” using the same physical models as before? A “detuned” model merely uses a different set of parameters that match observations – it not hard to find multiple combinations of parameters that give the similar model outputs (in complex models with many parameters/degrees of freedom) So how useful is a detuned model that uses old physics? Why is this being seen as some sort of a breakthrough?
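Kandlikar's parameter-degeneracy point, that many different parameter combinations can reproduce the same output, is easy to illustrate. Here is a minimal, purely hypothetical Python sketch (the "model" is a toy; nothing here comes from the email or from any real GCM):

```python
# Toy illustration of parameter degeneracy: two different parameter
# sets give identical "model" output, so tuning against observations
# cannot tell them apart. Purely hypothetical; not from any GCM.

def toy_model(x, a, b):
    # The output depends only on the product a*b, so (a, b) is degenerate.
    return a * b * x

xs = [0, 1, 2, 3]
observations = [0.0, 6.0, 12.0, 18.0]

fit_1 = [toy_model(x, a=2.0, b=3.0) for x in xs]  # one "tuning"
fit_2 = [toy_model(x, a=1.0, b=6.0) for x in xs]  # a different "tuning"

# Both parameter combinations match the observations equally well.
assert fit_1 == fit_2 == observations
```

A real GCM has dozens of loosely constrained parameters, so the degeneracy Kandlikar describes is far larger than this two-parameter toy.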

Email 1047, Briffa, 2005

We had to remove the reference to “700 years in France” as I am not sure what this is , and it is not in the text anyway. The use of “likely” , “very likely” and my additional fudge word “unusual” are all carefully chosen where used.

Email 723, Elaine Barrow, UEA, 1997

Either the scale needs adjusting, or we need to fudge the figures

Briffa_sep98 code

;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
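For readers who don't run IDL, here is a rough Python rendering of that snippet. One assumption is flagged up front: in the fuller sep98 file the adjustment array is interpolated to yearly values and added to the tree-ring series, whereas only the array construction appears above.

```python
# A minimal Python rendering of the IDL snippet above. Assumption:
# the fuller sep98 code interpolates these adjustments to yearly
# values and ADDS them to the series; only the array is shown here.

yrloc = [1400] + [1904 + 5 * i for i in range(19)]   # findgen(19)*5.+1904
valadj = [v * 0.75 for v in                          # the "fudge factor"
          [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
           0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]]

assert len(yrloc) == len(valadj), "Oooops!"          # the IDL sanity check

def yearly_adjustment(year):
    """Linearly interpolate valadj at a given year (like IDL's interpol)."""
    for i in range(len(yrloc) - 1):
        if yrloc[i] <= year <= yrloc[i + 1]:
            frac = (year - yrloc[i]) / (yrloc[i + 1] - yrloc[i])
            return valadj[i] + frac * (valadj[i + 1] - valadj[i])
    return valadj[-1]

# The adjustment is flat at zero early on, dips mid-century,
# then climbs to 2.6 * 0.75 = 1.95 for the most recent decades.
print(round(yearly_adjustment(1994), 2))   # → 1.95
```

Note the shape: small negative corrections around the 1930s–1950s, then a steep positive ramp for the post-1960 decades, which is exactly where the "decline" sits.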

h/t to Tom Nelson

Maybe it isn’t fudge, but a social issue. Robert Bradley writes:

 Here is my favorite quotation:

“[Model results] could also be sociological: getting the socially acceptable answer.”

 – Gerald North (Texas A&M) to Rob Bradley (Enron), June 20, 1998.

See “Gerald North on Climate Modeling Revisited (re Climategate 2.0)”: http://www.masterresource.org/2011/11/gerald-north-on-climate-modeling-revisited-re-climategate-2-0/


50 thoughts on “Science at work: “Tuning may be a way to fudge the physics””

  1. That is the problem with crap math like statistics. You can say whatever you want and then somehow, that makes it real. Statistics is like saying: a plus or minus some + b plus or minus some = an exact c. It ain’t there. Close enough for BS math.

  2. Ben & Jerry mark their product so that you know it is fudge. The same cannot be said for the team where it is less than tasteful.

  3. When will these “scientists” admit that their work and the GCMs are best guess approximations (SWAG) to backmatch observations and have practically zero predictive value. If multiple variable/coefficient combination variations can yield the same hind cast, what makes them think:

    They know the correct variable/coefficient combination?
    That the coefficients aren’t also variables in response to internal & external system factors?
    Their models have any predictive values?

    This isn’t science, it’s guesswork, pure and simple.

    Bill Yarber

  4. A damning indictment of the efficacy of models. The underlying physics of the system is not sufficiently known or understood to enable a useful model to be written. We are just left with GIGO.

  5. This outrages my scientific sensibilities. We knew they were doing such things, or suspected from the math and data, as it was pretty obvious; but seeing them outright planning and conniving… This perversion of science, their positions, and the trust we placed in them…

    It doesn’t matter if they are right or wrong–what matters is what they DID was wrong, deceitful, and utterly unscientific. The ends never justify the means.

    Or, to put it in a quote I very much like:
    “As soon as men decide that all means are permitted to fight an evil, then their good becomes indistinguishable from the evil that they set out to destroy.”
    -Christopher Dawson

  6. Ben And Jerry’s is perfect as they do promote progressive values and support the global warming fraud.

    So far I have asked several politicians up here in Canada if they support hide-the-decline science as valid work for imposing Carbon taxes.

    Crickets never chirped so loud.

  7. David Larsen says:
    November 30, 2011 at 9:11 am
    That is the problem with crap math like statistics.

    I disagree that statistics qualifies as “crap math”. However, it is a problem with crap math like first principles modelling for which the first principles are incapable of providing the “correct” answer. That’s when copious amounts of fudge factors get stirred into the mixture.

  8. When it comes to climate science isn’t “simple global formula” an oxymoron?
    “model that uses old physics” I knew that “new math” was invented for schools a decade or few ago but I don’t remember the memo about “new physics” replacing “old physics”, although in the case of these charlatans it is more likely new psychics replacing old physics.

  9. We’ll call it “Climate Sudoku”. Just plug in random numbers till all the boxes are filled and check to see if the solution fits the previous puzzles, ignore the ones that don’t fit at all and average the ones that sort of do. With this “data” you can predict all future sudoku puzzles.

  10. Their biggest fudge factor of all is the assumption
    that there is any diagnostic utility in a spatial average
    of well-massaged max/min air-temp readings.

    After all, their big bugaboo here is that Earth’s IR radiation into space
    will be smothered by those eeville Satanic Gasses.

    In fact, Earth’s IR emission spectrum varies greatly
    both temporally and spatially,
    as well as directionally,
    but most important of all
    is that it is a NON-equilibrium phenomenon,
    so that IR radiation does not thermodynamically act
    via its spectral integral,
    let alone its spatial or temporal integral.

    Rather, each wavelength at each direction and at each altitude
    interacts separately with each atmospheric constituent
    in a totally local and immediate manner, molecule by molecule.
    Physically, a mathematical integral of heat flows
    requires teleconnections for there to be any spatial integration
    or requires accumulators such as the oceans for temporal.
    Most long-term trends are regional rather than global
    (unless you’re talking continent-scale glaciers)

    In acknowledgement of these elementary radiation-facts,
    these self-anointed climate cognoscenti
    could at least utilize the average of the absolute temps to the fourth power,
    especially considering it would be higher than the linear average.

  11. Damnit, I like fudge. I move we use “weasel” instead.

    Not even the ferret-fixated could find those chicken-coop-robbing mustelid vermin appealing.

  12. I wonder if Mr. Hulme had a chance to check out JC’s recent post on personality types and how they relate to climate science and communications – http://judithcurry.com/2011/09/27/climate-scientists-are-different-from-the-general-public/ My guess (with a 66% probability in my SWAG) is that he would come out as an ENFJ. A brief summary of this personality type http://www.davidmarkley.com/personality/enfj.htm noted a concern about manipulation- “ENFJs are the benevolent ‘pedagogues’ of humanity. They have tremendous charisma by which many are drawn into their nurturant tutelage and/or grand schemes. Many ENFJs have tremendous power to manipulate others with their phenomenal interpersonal skills and unique salesmanship. But it’s usually not meant as manipulation — ENFJs generally believe in their dreams, and see themselves as helpers and enablers, which they usually are.”

    A couple of quotes from Eric Hoffer’s The True Believer (http://en.wikiquote.org/wiki/Eric_Hoffer) come to mind after contemplating the most recent posts at WUWT:

    “Part Three: United Action and Self-Sacrifice
    Failure in the management of practical affairs seems to be a qualification for success in the management of public affairs.
    The real “haves” are they who can acquire freedom, self-confidence, and even riches without depriving others of them. They acquire all of these by developing and applying their potentialities. On the other hand, the real “have nots” are they who cannot have aught except by depriving others of it. They can feel free only by diminishing the freedom of others, self-confident by spreading fear and dependence among others, and rich by making others poor.”

  13. Briffa_sep98 code

    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
    ;
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor
    if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'

    Did anyone ever compute this fudge-factor on Excel?? Sorry, I forget. This looks like the correction needed to slow down the pre-1950 global temp rise and turn the 1950-1970 fall into continuing rise and give a bit extra-extra on top of UHI extra for the last results.

  14. PS: Steve’s “hide the Decline 2.0” find adds fuel to my belief that it’s the Soon and Baliunas story that will get legs and be the downfall of the Team fraudsters.

  15. From 0850.txt

    Tim Barnett says in 2007:

    “the actual forcing data is a must. right now we have some famous models that all agree surprisely well with 20th obs, but whose forcing is really different. clearly, some tuning or very good luck involved. I doubt the modeling world will be able to get away with this much longer….so let’s preempt any potential problems.”

  16. If you tune a simplified model to mimic GCMs, what would happen if the GCMs were wrong as well?

    Email 1604.txt Jonathan Gregory of the Met Office writes:

    “The test of the tool is that it can reliably reproduce the results of several coupled GCMs, both for temperature change and thermal expansion, for different kinds of scenarios (e.g. 1%, historical forcing, stabilisation). This is a stringent requirement. I do not think that a 2D climate model is necessarily preferable to a 1D climate model for this task. In fact, the more elaborate the simple
    model – the more physics it has got in it – the less amenable it will probably be to being tuned to reproduce GCM results. It is essential that this calibration can be demonstrated to be good, because in view (a) it is the GCMs which are basically providing the results. The tool must not be allowed to go its own way. It is not necessary to interpret the tuning parameters of a simple model in terms of the physics of the real climate system when it is being used for this purpose. It is not necessarily the case that a simple climate model is the best choice anyway.”

    And what might happen if you use the hockey stick to tune a GCM?

    0166.txt Max Beran of Didcot asks of Keith Briffa:

    “>The climate models, bless’em, indicate a temperature increase of the order
    >of less than 5 to more than 10 standard deviations by the 2080s. Accepting
    >the robustness of the sensitivities implicit in the Hockey stick
    >reconstruction (much used to tune and confirm GCM behaviour), that suggests
    >to me that we can anticipate a similar order of growth in tree ring width
    >and density?”

    and

    “>So at what point does the tree ring to temperature sensitivity break down?
    >And what might its impacts be on the hockey stick and through that the GCM
    >tuning? Have there been other periods when your post-1940 reversal occurred
    >perhaps due to macroclimatic forces? Could these also account for the
    >discrepancy between the hockey stick and what we thought we used to know
    >about the climate since 1000 AD?”

    Is there any evidence to support this claim that the Hockey Stick was used to tune the models?

  17. Has ANYONE thought to go ask Dr Richard Muller about these emails?

    I wonder if he knows yet he was working with fudged data…

  18. I find the terminology distressing. As it can lead to talk of fudge packing. Then will come Nick or Joel or the Common Sewer Rat stopping by to say “Not that there’s anything wrong with that.” And I don’t particularly like Seinfeld.

  19. “a detuned model that uses old physics” … Ah, yes, the “old physics”. Proper physics, which was actually based on detailed observation, measurement and interpretation of factual data about the world. You can see why they wouldn’t want anything to do with that.

  20. I will press on with trying to work out why the temperature needs a ‘fudge factor’ along with the poorer modelling for winter.

    Email 1461, Milind Kandlikar, 2004

    With GCMs the issue is different. Tuning may be a way to fudge the physics. For example, understanding of clouds or aerosols is far from complete – so (ideally) researchers build the “best” model they can within the constraints of physical understanding and computational capacity. Then they tweak parameters to provide a good approximation to observations. It is this context that all the talk about “detuning” is confusing. How does one speak of “detuning” using the same physical models as before? A “detuned” model merely uses a different set of parameters that match observations – it not hard to find multiple combinations of parameters that give the similar model outputs (in complex models with many parameters/degrees of freedom) So how useful is a detuned model that uses old physics? Why is this being seen as some sort of a breakthrough?

    This email on fudging the physics and ‘de-tuning’ the models and wondering why bringing back real physics, “old physics”, is seen as some sort of breakthrough is from 2004, the piece from Glassman helps put it into their modelling context history, the GCM’s.

    ==============================================

    CO2 ACQUITTAL
    Rocket Scientist’s Journal

    by Jeffrey A. Glassman, PhD
    Revised 11/16/09.

    “C. GREENHOUSE CATASTROPHE MODELS (GCMs)
    Since the industrial revolution, man has been dumping CO2 into the atmosphere at an accelerating rate. However the measured increase in the atmosphere amounts to only about half of that manmade CO2. This is what National Geographic called, “The Case of the Missing Carbon”. Appenzeller [2004].

    Climatologists claim that the increases in CO2 are manmade, notwithstanding the accounting problems. Relying on their greenhouse gas theory, they convinced themselves, and the vulnerable public, that the CO2 causes global warming. What they did next was revise their own embryonic global climate models, previously called GCMs, converting them into greenhouse gas, catastrophe models. The revised GCMs were less able to replicate global climate, but by manual adjustments could show manmade CO2 causing global warming within a few degrees and a fraction!

    The history of this commandeering is documented in scores of peer-reviewed journal articles and numerous press releases by the sanctified authors. Three documents are sufficient for the observations here, though reading them is rocket science. (An extensive bibliography on climate, complete with downloadable documents, covering the peer-reviewed literature and companion articles by peer-published authors is available on line from NASA at http://pubs.giss.nasa.gov/.) The three are Hansen, et al., [1997], Hansen, et al., [2002], and Hansen, et al., [2005]. Among Hansen’s many co-authors is NASA’s Gavin Schmidt, above. He is a frequent contributor to the peer–reviewed literature, and he is responsible for a readable and revealing blog unabashedly promoting AGW. http://www.realclimate.org/.

    The three peer-reviewed articles show that the Global Climate Models weren’t able to predict climate in 1997. They show that in the next five years, the operators decoupled their models from the ocean and the sun, and converted them into models to support the greenhouse gas catastrophe. They have since restored some solar and ocean effects, but it is a token and a concession to their critics. The GCMs still can’t account for even the little ice age, much less the interglacial warming.

    All by themselves, the titles of the documents are revealing. The domain of the models has been changed from the climate in general to the “interannual and decadal climate”. In this way Hansen et al. placed the little ice age anomaly outside the domain of their GCMs. Thus the little ice age anomaly was no longer a counterexample, a disproof. The word “forcing” appears in each document title. This is a reference to an external condition Hansen et al. impose on the GCMs, and to which the GCMs must respond. The key forcing is a steadily growing and historically unprecedented increase in atmospheric CO2. “Efficacy” is a word coined by the authors to indicate how well the GCMs reproduce the greenhouse effect they want.

    In the articles, Hansen et al. show the recent name change from Global Climate Models to Global Circulation Models, a revision appropriate to their abandonment of the goal to predict global climate. The climatologists are still engaged in the daunting and heroic task of making the GCMs replicate just one reasonable, static climate condition, a condition they can then perturb with a load of manmade CO2. The accuracy and sensitivity of their models is no longer how well the models fit earth’s climate, but how well the dozens of GCM versions track one another to reproduce a certain, preconceived level of Anthropogenic Global Warming. This suggests that the models may still be called GCMs, but now standing for Greenhouse Catastrophe Models.

    In these GCMs, the CO2 concentration is not just a forcing, a boundary condition to which the GCM reacts, but exclusively so. In the GCMs, no part of the CO2 concentration is a “feedback”, a consequence of other variables. The GCMs appear to have no provision for the respiration of CO2 by the oceans. They neither account for the uptake of CO2 in the cold waters, nor the exhaust of CO2 from the warmed and CO2–saturated waters, nor the circulation by which the oceans scrub CO2 from the air. Because the GCMs have been split into loosely–coupled atmospheric models and primitive ocean models, they have no mechanism by which to reproduce the temperature dependency of CO2 on water temperature evident in the Vostok data.[*]

    GCMs have a long history. They contain solid, well-developed sub-models from physics. These are the bricks in the GCM structure. Unfortunately, the mortar won’t set. The operators have adjusted and tuned many of the physical relationships to reproduce a preconceived, desired climate scenario. There is no mechanism left in the models by which to change CO2 from a forcing to a feedback.

    Just as the presence of measurable global warming does not prove anthropogenic global warming, the inclusion of some good physics does not validate the GCMs. They are no better than the underlying conjecture, and may not be used responsibly to demonstrate runaway greenhouse effects. Science and ethics demand validation before prediction. That criterion was not met before the climatologists used their models to influence public opinion and public policy.

    The conversion of the climate models into greenhouse catastrophe models was exceptionally poor science. It is also evidence of the failure of the vaunted peer review process to protect the scientific process.

    =====================

    [*]“Because the GCMs have been split into loosely–coupled atmospheric models and primitive ocean models, they have no mechanism by which to reproduce the temperature dependency of CO2 on water temperature evident in the Vostok data.”

    This refers back to the section before on Vostok:


    B. CARBON DIOXIDE SHOULD NO LONGER DRIVE PUBLIC POLICY
    The discovery that the Vostok CO2 record is an effect of the oceanic solubility pump has profound effects on the science and on public policy.

    Over those 420,000 years, warm ocean water has regulated the concentration of CO2 by release of this gas into the atmosphere. Because there is no trace of build–up of CO2 from forest fires, volcanoes, or the oceans themselves, cold waters must be scrubbing CO2 out of the air. Since there is no difference between manmade and natural CO2, anthropogenic CO2 is sure to meet the same fate.

    To the extent that the analyst’s Vostok temperature trace represents a global atmosphere temperature, so does the concentration of CO2. Thus, CO2 is a proxy for global temperature, and attempting to control global temperatures by regulating anthropogenic CO2 is unfounded, futile, and wasteful.”

  21. To be fair to Hulme he ends with

    “So this is the situation as seen by me right now. I guess Solution 1 would
    not pass decent Nature reviewers and Solution 3 may never materialise. If
    people want Solution 2 then I can ask Grubler for what he can give us.

    Mike”

  22. The epistemic structure for GCM modeler’s thought processes:

    a) without scientifically validated observations in nature, modelers start by positing (a priori) there is significant/ alarming/ concernist AGW by CO2 by fossil fuel

    b) make models that show your ‘a priori’ premise

    c) fudge models to compensate for lack of conformance to reality

    d) when iterative fudging doesn’t bring your models in conformance with observations of nature then attack the observations and do gatekeeping to block MSM interest in the observations that invalidate GCMs.

    e) take the above a) through d) as perfect justification for a lot more funding for GCMs

    f) go to bank to deposit earnings

    John

  23. ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
    yrloc=[1400,findgen(19)*5.+1904]
    valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
    2.6,2.6,2.6]*0.75 ; fudge factor

    so basically they just multiplied the last few data points with a linearly increasing multiplier. Bravo fellas

  24. Simeon, from how I read it, those values are not multiplied. Multiplication by zero gives what?

    Exactly.

    Those are values ADDED.
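The arithmetic behind this exchange is easy to check: the 0.75 scales the adjustment array itself, and on comment 24’s reading the resulting values are then added to the series; multiplying the series by them would instead wipe out every early value, since the first entries are zero. A quick sketch (the flat stand-in series is hypothetical, purely for illustration):

```python
# Check of the add-vs-multiply question: the 0.75 multiplies the
# adjustment ARRAY; applying the result by addition vs multiplication
# to a flat stand-in series behaves very differently.
valadj = [v * 0.75 for v in
          [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
           0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]]

series = [1.0] * 20                       # a flat stand-in series

added = [s + a for s, a in zip(series, valadj)]
multiplied = [s * a for s, a in zip(series, valadj)]

assert multiplied[:5] == [0.0] * 5        # multiplication zeroes early data
assert added[:5] == [1.0] * 5             # addition leaves it untouched
```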

Comments are closed.