Still chasing consensus – on building a climate consensus model

Statistics research could build consensus around climate predictions

Philadelphia, PA—Vast amounts of data related to climate change are being compiled by research groups all over the world. These many and varied sources yield different climate projections; hence the need to combine information across data sets to arrive at a consensus regarding future climate estimates.

In a paper published last December in the SIAM Journal on Uncertainty Quantification, authors Matthew Heaton, Tamara Greasby, and Stephan Sain propose a statistical hierarchical Bayesian model that consolidates climate change information from observation-based data sets and climate models.

“The vast array of climate data—from reconstructions of historic temperatures and modern observational temperature measurements to climate model projections of future climate—seems to agree that global temperatures are changing,” says author Matthew Heaton. “Where these data sources disagree, however, is by how much temperatures have changed and are expected to change in the future.  Our research seeks to combine many different sources of climate data, in a statistically rigorous way, to determine a consensus on how much temperatures are changing.”

Using a hierarchical model, the authors combine information from these various sources to obtain an ensemble estimate of current and future climate along with an associated measure of uncertainty. “Each climate data source provides us with an estimate of how much temperatures are changing.  But, each data source also has a degree of uncertainty in its climate projection,” says Heaton. “Statistical modeling is a tool to not only get a consensus estimate of temperature change but also an estimate of our uncertainty about this temperature change.”
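The paper's actual hierarchical model is considerably richer, but the core idea Heaton describes (weighting each source's estimate by how certain it is) can be sketched with simple inverse-variance weighting; the numbers below are hypothetical, not from the paper:

```python
# Minimal sketch, NOT the authors' model: pool independent estimates
# by inverse-variance weighting, the simplest way to get both a
# consensus estimate and a consensus uncertainty.

def pool_estimates(estimates, std_errs):
    """Combine independent, unbiased estimates m_i with standard
    errors s_i. Returns the pooled mean and its standard error.
    (Independence and unbiasedness are strong assumptions that the
    paper's hierarchical model relaxes.)"""
    weights = [1.0 / s ** 2 for s in std_errs]
    total = sum(weights)
    mean = sum(w * m for w, m in zip(weights, estimates)) / total
    return mean, (1.0 / total) ** 0.5

# Hypothetical temperature-change estimates (deg C) from three sources:
mean, se = pool_estimates([2.0, 4.0, 3.1], [0.5, 1.0, 0.8])
```

With these made-up inputs the pooled estimate lands between the individual estimates, and its standard error is smaller than any single source's — which is the sense in which combining sources sharpens the consensus.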

The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).

Regional analysis for climate change assessment. Image credit: Melissa Bukovsky, National Center for Atmospheric Research (NCAR/IMAGe)

Observation-based data sets, which focus mainly on local and regional climate, are obtained by taking raw climate measurements from weather stations and interpolating them onto a grid defined over the globe. This allows the final data product to provide an aggregate measure of climate rather than being restricted to individual weather data sets. Such data sets are restricted to current and historical time periods. Another source of information related to observation-based data sets is reanalysis data, in which numerical model forecasts and weather station observations are combined into a single gridded reconstruction of climate over the globe.
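As a loose illustration of the gridding step (a toy sketch, not the procedure behind any real data product, which also involves interpolation, quality control, and masking), station measurements can be binned into grid cells and averaged:

```python
from collections import defaultdict

def grid_average(stations, cell_size=2.0):
    """Average point measurements (lat, lon, value) into cells of
    cell_size degrees, yielding an aggregate gridded product rather
    than individual station records."""
    cells = defaultdict(list)
    for lat, lon, value in stations:
        key = (int(lat // cell_size), int(lon // cell_size))
        cells[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in cells.items()}

# Hypothetical station readings: the first two fall in the same cell.
grid = grid_average([(0.5, 0.5, 10.0), (1.5, 1.5, 12.0), (3.0, 3.0, 20.0)])
```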

GCMs are computer models which capture physical processes governing the atmosphere and oceans to simulate the response of temperature, precipitation, and other meteorological variables in different scenarios. While a GCM portrayal of temperature would not be accurate to a given day, these models give fairly good estimates for long-term average temperatures, such as 30-year periods, which closely match observed data. A big advantage of GCMs over observed and reanalyzed data is that GCMs are able to simulate climate systems in the future.

RCMs are used to simulate climate over a specific region, as opposed to global simulations created by GCMs. Since climate in a specific region is affected by the rest of the earth, atmospheric conditions such as temperature and moisture at the region’s boundary are estimated by using other sources such as GCMs or reanalysis data.

By combining information from multiple observation-based data sets, GCMs, and RCMs, the model obtains an estimate, and a measure of uncertainty, for the average temperature and temporal trend, as well as for the variability of seasonal average temperatures. The model was used to analyze average summer and winter temperatures for the Pacific Southwest, Prairie and North Atlantic regions (seen in the image above)—regions that represent three distinct climates. Because the climate models would be expected to behave differently in each of these regions, data from each region were considered individually and the model was fit to each region separately.

“Our understanding of how much temperatures are changing is reflected in all the data available to us,” says Heaton. “For example, one data source might suggest that temperatures are increasing by 2 degrees Celsius while another source suggests temperatures are increasing by 4 degrees. So, do we believe a 2-degree increase or a 4-degree increase? The answer is probably ‘neither’ because combining data sources together suggests that increases would likely be somewhere between 2 and 4 degrees. The point is that no single data source has all the answers. And only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”

While most previous such work focuses on mean or average values, the authors acknowledge that climate in the broader sense encompasses averages, trends, variation between years, and extreme events. Hence the hierarchical Bayesian model used here simultaneously considers the average, the linear trend, and the interannual variability (variation between years). Many previous models also assume independence between climate models, whereas this paper accounts for commonalities shared by various models—such as physical equations or fluid dynamics—and models the correlations between data sets.
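The three quantities considered jointly can be illustrated with ordinary least squares on a yearly series; this is a simplified stand-in for the paper's Bayesian treatment, and the data below are synthetic:

```python
import statistics

def summarize_series(years, temps):
    """Least-squares estimates of the three quantities treated
    jointly: the average, the linear trend (per year), and the
    interannual variability (std. dev. of the detrended residuals)."""
    n = len(years)
    ybar = sum(years) / n
    tbar = sum(temps) / n
    slope = (sum((y - ybar) * (t - tbar) for y, t in zip(years, temps))
             / sum((y - ybar) ** 2 for y in years))
    residuals = [t - (tbar + slope * (y - ybar)) for y, t in zip(years, temps)]
    return tbar, slope, statistics.pstdev(residuals)

# Synthetic series: an exact 0.03 C/year trend with no interannual noise.
years = list(range(1980, 2010))
avg, trend, variability = summarize_series(
    years, [10.0 + 0.03 * (y - 1995) for y in years])
```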

“While our work is a good first step in combining many different sources of climate information, we still fall short in that we still leave out many viable sources of climate information,” says Heaton. “Furthermore, our work focuses on increases/decreases in temperatures, but similar analyses are needed to estimate consensus changes in other meteorological variables such as precipitation. Finally, we hope to expand our analysis from regional temperatures (say, over just a portion of the U.S.) to global temperatures.”

To read other SIAM Nuggets, which explain current high-level research involving applications of mathematics in popular-science terms, go to http://connect.siam.org/category/siam-nuggets/

Source article:

Modeling Uncertainty in Climate Using Ensembles of Regional and Global Climate Models and Multiple Observation-Based Data Sets

Matthew J. Heaton, Tamara A. Greasby, and Stephan R. Sain

SIAM/ASA Journal on Uncertainty Quantification, 1(1), 535–559 (Online publish date: December 17, 2013). The source article is available for free access at the link above through December 31, 2014.

About the authors:

Matthew Heaton is currently an assistant professor in the department of statistics at Brigham Young University, but the majority of this work was done while the author was a postgraduate scientist at the National Center for Atmospheric Research. Tamara Greasby is a postdoctoral researcher and Stephan Sain is a scientist at the Institute for Mathematics Applied to the Geosciences (IMAGe) at the National Center for Atmospheric Research (NCAR).

This entry was posted in 97% consensus, Climate data, Modeling.

125 Responses to Still chasing consensus – on building a climate consensus model

  1. jbenton2013 says:

    Oh God, have we not already had to put up with industrial-scale stupidity from cartoonist Cook’s consensus without this? There is so much wrong with this approach I can’t even be bothered to post a critique. Don’t these people realise just how stupid they look to statisticians?

  2. Steven Strittmatter says:

    Since when does consensus mean anything in science?

  3. Peter Miller says:

    Advanced GIGO for geeks.

    Like all current climate modelling, it starts with the premise that CAGW theory is correct and global temperatures will rise by 2-4 degrees C over the next century.

  4. Eustace Cranch says:

    Exactly, Peter. Decide the answer ahead of time and find a way to make the models produce it.

    Richard Feynman is either screaming or laughing his a– off right now.

  5. Ted Getzel says:

    This sounds like a movie speech crafted by Mel Brooks.

  6. Steve from Rockwood says:

    So much effort for a 3% gain.

  7. A.D. Everard says:

    My guess is that they are trying to get everyone to sing from the same page. Currently there are massive contradictions flying about, which rather shoots the notion of consensus in the foot.

  8. M. Hastings says:

    While a GCM portrayal of temperature would not be accurate to a given day, these models give fairly good estimates for long-term average temperatures, such as 30-year periods, which closely match observed data.

    Since when have the GCM’s closely matched observed data for 30 years?

  9. Gamecock says:

    They can’t even predict the past.

  10. Richard M says:

    Let me help them out. It’s cooling now and will continue to cool for at least another couple of decades.

  11. Alan Robertson says:

    Wait a minute… I thought there already was a consensus.

  12. HGW xx/7 says:

    It’s getting to the point where if you wanted me to explain where the environmental establishment went wrong, I wouldn’t even know where to start. It would be like trying to explain the concept of love; it baffles even the most loquacious of us. It has lost any semblance of trying to “save the planet” and now simply exists to continue its existence.

    That’s “sustainability” for ya!

  13. Cynical Scientst says:

    Statistics is a tool which one should attempt to use minimally. It is like making sausage. The output may be in more useful form, but it has been altered and adulterated with artificial ingredients. Every step by which data is processed destroys information and adds things to the data that were not originally there.

    By the time you get to grand statistical amalgamations of the statistical output of multiple studies and models based on the input of still other studies you have gone way beyond the realm of processed sausage and are eating pure plastic.

  14. markstoval says:

    “They can’t even predict the past.”

    Funniest comment in a while. I’ll be stealing that one! :-)

  15. RH says:

    And I thought they ran out of ways to fudge data.

  16. JohnWho says:

    “The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).”

    I would like them to focus only on observation-based data. Un-fooled-around-with data would be best. Data that does not fill in points that for whatever reason were not obtained. I suspect they will get different projections based on different chosen start dates.

    Most likely they would discover that their projections would somewhat continue the warming/cooling cycles that we see since the end of the LIA.

    How can one determine when that cyclic warming will end without knowing all the causes of the cyclic warming?

    Also, I have a question:

    Aren’t models supposed to be based on data/observations? Will their new, as proposed by the paper, model(s) be based partly/mostly on current models?

  17. DirkH says:

    “Furthermore, our work focuses on increases/decreases in temperatures, but similar analyses are needed to estimate consensus changes in other meteorological variables such as precipitation.”

    This sounds like they don’t trust their own prediction of a wet Saudi Arabia.

  18. Col Mosby says:

    The everlasting delusion of statisticians that they can mine gold from worthless ore simply by applying statistics to a giant boatload of crappy data and model estimates. Bayesian stats are included, no less, which aren’t even statistics, simply people giving their own guesses and weighting their belief in same. Maybe someday it will dawn on these innocents that model output and projections are not data.

  19. Sweet Old Bob says:

    NCAR has a fever and it’s contagious?
    Symptoms include delusions?

  20. TheLastDemocrat says:

    One of these models is the most accurate. It might be accurate by chance, or because it is fairly well estimating some of the important variables.

    Averaging the models is a way to ensure your model is poor by straying away from the best model.

  21. Latitude says:

    Data from these many and varied sources results in different climate projections; hence, the need arises to combine information across data sets to arrive at a consensus regarding future climate estimates…..

    You’re going to take bad data….average it….and get what??

    “Last fall the Climate Prediction Center of the National Oceanic and Atmospheric Administration predicted that temperatures would be above normal from November through January across much of the Lower 48 states………………”

    http://www.businessweek.com/articles/2014-02-18/the-official-forecast-of-the-u-dot-s-dot-government-never-saw-this-winter-coming

  22. David Wells says:

    I needed something to read whilst my graphics driver updates; I wish now that I had gone to the kitchen, found a very large knife and slit my throat. I am having an email and phone debate with Myles Allen and Kevin Anderson about RCPs, which are projections made by three different computer models of CO2 emissions, and currently we are above the highest, RCP 8.5, the worst-case scenario where we all get to fry next Tuesday. No matter how specific I get about the fact that it has not warmed for 17 years and 5 months (it’s flat), they tell me I am drawing the line in the wrong place. Kevin Anderson advises Ed Davey, DECC Secretary of State, on mitigation policies, which is what Connie Hedegaard is spending 30 trillion euros on to save 2 hundredths of 1C. Myles Allen conducts jam-jar laboratory experiments on CO2 and makes extrapolations which are then picked up by computer modellers whose projections are based on those simplistic experiments without any exposure to the real world. I have just read the IPCC pdf, and the time it must have taken to write 17 pages of hyperbole to justify the insignificance of what they are about, and which guys like Kevin Anderson believe, is beyond belief. They just read what other people write, and if it’s peer reviewed they just accept it as fact; they don’t want to look closely or question it, because it could mean they won’t have a job in six months. If it is not a conspiracy then it is certainly collusion. I watched a Horizon programme on placebos today: when a consultant at a meeting about the placebo effect said he was going to do some trials, all of the other doctors in the room called him a bastard sceptic and a heretic for daring to challenge the orthodoxy. He was proven right; there is such a thing as the placebo effect. But I don’t think I will live long enough to see these green bastards out of a job.

  23. Oldseadog says:

    JohnWho:
    Exactly.

  24. Kevin Kilty says:

    Why must we arrive at a consensus? Why wouldn’t we be better served to let people do the best job they think possible at analysis and projection, and then plan accordingly? Some may plan poorly while others plan better, but there wouldn’t be the risk of everyone planning poorly and identically, ending up like the lemmings.

  25. A.D. Everard says:

    Alan Robertson says:
    February 19, 2014 at 1:28 pm

    Wait a minute… I thought there already was a consensus.

    *

    Yes, but people are getting confused about what the consensus is because it’s all getting out of control. Someone must have decided that they needed a consensus of the consensus. And then everyone will know what it is. :)

  26. Max Hugoson says:

    Signal THEORY…there is a point at which NOISE dominates INFORMATION.

    I have YET to hear anyone talk about SIGNAL THEORY. (Oh, I got it… too much NOISE out there, no SIGNAL.)

  27. Jeff F says:

    @Cynical Scientst: What I didn’t realize, until interacting with many scientists who work in the environmental sciences, is that they firmly believe that cause and effect can be linked through statistics alone. The physics-based connection doesn’t seem to be all that important. Worse, if you try to argue the problems associated with a black-box statistical approach, you will be chastised for not knowing anything about statistics. Worse still, if you happen to be an engineer, you will be chastised because, apparently, engineers only understand deterministic systems. Thankfully, I have a PhD in statistical data analysis with an electrical engineering degree, so I simply ignore the ad-hominem attacks.

    However, there seems to be a problem with how we are educating some students of the sciences. They are abusing the idea that correlation, with all other factors known, is equivalent to causation. This is essentially impossible to establish in complex systems. Worse again, after they mistakenly assume that causation has been established through the use of statistics, everything that they observe a posteriori becomes evidence. Bad news all around.

  28. BioBob says:

    I would truly like to see how they combine data “in a statistically rigorous way” with data drawn from a daily sample size of ONE, non-random, non-replicated grab sample!!!

    Amaze me….

  29. more soylent green! says:

    Spoiler alert — You don’t get “data” from a climate model! You don’t get data from any model. Model outputs are not facts.

  30. Neil says:

    @Latitude,
    Love the article (http://www.businessweek.com/articles/2014-02-18/the-official-forecast-of-the-u-dot-s-dot-government-never-saw-this-winter-coming) and especially love this little quote:

    ‘Climatologists are trying to use their big miss this winter as a learning experience’. Really? There’s stuff to learn? But this is Settled Science.

  31. John Shade says:

    Looks like a long-winded but potentially face-saving exercise for those ready to distance themselves from the increasingly Baroque and increasingly embarrassing GCMs. If that is what it will take to calm things down a bit on both the policy and the science fronts, then it is good news.

  32. Dung says:

    Why is it wrong to shoot people? It is blatantly obvious that some people are seriously in need of shooting (twice just to make sure) ^.^

  33. Lloyd Martin Hendaye says:

    Seeking qualitative vs. quantitative rationalization of inherently deficient GCMs is a first-order exercise in turd-polishing. No-one is or can be “expert on the future”, and “error bars” pretending otherwise are naught but academic snares, delusions.

    “Climate research” is a classificatory discipline akin to botany. If and when a Mendelian genetic analog appears, benighted catalogers will be the last to know.

  34. Steve from Rockwood says:

    Gamecock says:
    February 19, 2014 at 1:20 pm

    They can’t even predict the past.
    ———————————————
    Haha. That’s because they keep changing it!

  35. DirkH says:

    Steve from Rockwood says:
    February 19, 2014 at 2:28 pm
    “Gamecock says:
    February 19, 2014 at 1:20 pm
    They can’t even predict the past.
    ———————————————
    Haha. That’s because they keep changing it!”

    The past is easy to predict. The past always cools over time.

  36. Gail Combs says:

    Science by a committee of politicians and parasites — OH, How Wonderful.

    Good grief can’t they even READ:

    The IPCC actually said in the Science Report in TAR:

    “in climate research and modeling we should recognise that we are dealing with a complex non linear chaotic signature and therefore that long-term prediction of future climatic states is not possible”

    IPCC 2001 section 4.2.2.2 page 774

    

    So they are going to ignore the fact climate is ‘a complex non linear chaotic signature’

    Instead:
    We’ll toss eye of newt and toe of frog, wool of bat and tongue of dog, adder’s fork and blind-worm’s sting, lizard’s leg and owlet’s wing; into the charmèd computer all shall go, chanting incantations statistical while stirring well. We’ll call it Gore-bull Worming, this charm of trouble, this bubbling broth of he!!.

    (Apologies to the Bard.)

  37. Berényi Péter says:

    authors [...] propose a statistical hierarchical Bayesian model that consolidates climate change information from observation-based data sets and climate models

    Nuff said (“The term hierarchical model is sometimes considered a particular type of Bayesian network, but has no formal definition”). For starters: anything goes.

    Just to make it absolutely clear, Bayesian statistics is a way to build a formidable castle on quicksand by formalizing stubborn prejudices as priors, while “hierarchical” is a buzzword making this process as convoluted, opaque and intractable as possible.

    On the plus side, the method was developed to handle marketing issues, to which class “climate change” belongs.

  38. Adam says:

    “statistical hierarchical Bayesian model”

    This idea will definitely work because the simple ideas are always closer to nature and there is nothing more simple and easy to understand and less convoluted than a good old “statistical hierarchical Bayesian model”. Honestly guv, it is not BS, it is a really simple way at getting to the predefined truth – that global temperatures have been soaring over the past few decades and urgent action is required NOW to save the planet. For our children! [/sarc]

  39. Curious George says:

    “only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”

    Two basic errors here:
    1. Not all data sources are equally reliable. You need a good crystal ball to assign weights.
    2. Even in the best case, it will only tell us how the temperatures WERE changing.

  40. Londo says:

    Better throw in some climate models in there or else they might just get it right by accident. We can’t have that, can we?
    Science by committee, at its worst.

  41. Adam says:

    Gamecock says:
    February 19, 2014 at 1:20 pm
    They can’t even predict the past.

    Nice one!

  42. Gail Combs says:

    BioBob says: @ February 19, 2014 at 2:04 pm

    I would truly like to see how they combine data “in a statistically rigorous way” with data drawn from a daily sample size of ONE, non-random, non-replicated grab sample!!!
    >>>>>>>>>>>>>>>>
    Yes one of my favorite nits to pick.

  43. Stuart Elliot says:

    It’s subprime mortgage thinking, applied to climate models. Over here we have a dog turd, but if I put it in a bag with dozens of other dog turds it becomes a reliable low risk investment, er, model.

    It will probably turn out the same way as subprime, with perpetrators getting rich and victims still not sure how the people “in charge” let it happen.

  44. knr says:

    It’s not unheard of: there were in fact various meetings to decide which gospels should stay in and which stay out of the Bible, and much rewriting of scripture to suit the political purposes of the church. These people are just carrying on in that ‘fine tradition’; we already have St Gore with the ‘prophet’ Mann, whose words should never be questioned, calls for the prosecution of ‘unbelievers’ and hatred for ‘heretics’, so this is hardly a big step. It’s just the next logical step.

  45. Londo says:

    Come to think of it, there is this joke about how many Frenchmen it takes to defend Paris, and the answer is that the number is not known; it hasn’t been tried. I suspect the remake of this for the future would be how many climate scientists it takes to solve the climate problem, and from this post it very much feels like that number is still not known; it hasn’t been tried.
    Now the thing is that France had all that modern (military) equipment at the time but didn’t use it, and that pretty much resembles climate science. All that stuff is only being used to scare people, but beyond that it’s not doing much good.

  46. george e. smith says:

    A wonderful new map of the world to be sure. Looks like the USA is the actual center of the universe.

    I wonder how many orders of magnitude of Nyquist under-sampling THIS landmark study has?

    So we have a whole new raft of statistical prestidigitation of climate data that was already known, but that helps nowt in foretelling what happens tomorrow.

  47. K-Bob says:

    This is kind of like putting a bunch of turds in a blender, setting it on puree, and seeing what happens. It’ll look a lot like the stuff I get when I eat at a buffet-style all-you-can-eat restaurant. I don’t get it: wouldn’t this just reproduce the same ensemble of bad models that we have already?

  48. rogerthesurf says:

    I believe that these activities, such as chasing consensus and arguing over theories which are unsupported empirically, are all an orchestrated diversion and smoke screen.
    The real danger comes from the United Nations and UN Agenda 21 which is the parent of AGW.
    Is the UN a dangerous and malignant bureaucracy? My research points to this.
    There is no conspiracy as everything is clearly described and published on official websites etc. All we need to do is read it!
    Here is an example on Agenda 21 policy on land.
    See my highlights in red on page 8.

    http://thedemiseofchristchurch.files.wordpress.com/2014/02/unitednations-conference-on-human-settlements_habitat1.pdf

    No wonder property rights are being abused in my city under the guise of an earthquake recovery.
    Ever considered the Agenda 21 wetlands policy? It works like this. 1. Persuade a farmer to part with a swampy area of his farm and label it a ‘wetlands restoration’. 2. Do nothing and let the wetland ‘recover’, which means the original drainage will block and the wetland/swamp will spread further onto the farmer’s land. 3. If the farmer in desperation clears the drainage problem, then fine or gaol him. 4. Observing that there are now further spreading ‘wetlands’ encroaching on said farmer’s land, well, take that off him too.

    http://www.stuff.co.nz/marlborough-express/news/8790787/Farmer-guilty-of-damaging-Rarangi-wetland

    Note that this guy once owned the land in question, it is not mentioned how it was transferred to the government. Also willows are mentioned. Willows are not natural in New Zealand and used to be regarded as pests because they block waterways.
    In my country, maybe more than 50% of farms are comprised of more than 90% drained swamp.

    Check out the warped thinking on page 12. Re: educating men or women!

    http://thedemiseofchristchurch.files.wordpress.com/2014/02/education-for-sustainable-development-toolkit.pdf

    My blog has more real life examples!
    Please visit both posts.

    http://www.thedemiseofchristchurch.com

    Cheers

    Roger

  49. george e. smith says:

    I take it that a “Bayesian Model” is a German way to make Aspirin.

    Come to think of it, I could use an aspirin right about now after reading this “peer reviewed” (?) paper. I’ll go with the Monsanto cheap aspirin.

    And they don’t need to try and convince me that their statistical mathematics is rigorous. I assume that, a priori (Latin words, I don’t understand), about virtually any branch of mathematics. It’s the mischief they get up to with it that grabs my attention.

  50. Adam says:

    Climate models are a good idea and they do work as far as they are designed to.

    What is a climate model? It is an attempt to take the known historical data, put it on a grid, apply all of the physics that we *can compute*, and see whether we can make predictions about the future.

    Climate models do work.

    We can make very good predictions out to a couple of weeks and excellent 24 hour predictions on a local scale.

    BUT.

    1. We cannot test predictions on a global scale because we do not even have the global data-collection system in place to collect the empirical data that we are trying to predict to anywhere near the precision being spoken about.

    2. We cannot (and will never) compute the small scale processes, which are vital and which cannot be ignored and cannot be “averaged”, due to lack of computational power.

    3. The equations themselves tell us that predictions beyond a few weeks, on any spatial scale, are COMPLETE NONSENSE because of the error due to the chaotic nature of the nonlinearities. Any real mathematician will tell you that. Just show him the equations and tell him how closely spaced in space and time you have measured the historical data fields. He will tell you that your forward run integrations will get less and less accurate into the future and that after a week (at best!) the uncertainty in your measurement will completely swamp the “signal”.

    On point number three: where are all of the Professors of Mathematics? Why are they so silent? Why don’t they pipe up and say something about the insanity of pumping billions of dollars into climate predictions? Even if you try to predict the end positions of 10 snooker balls which start out spaced by 1 foot in a line, with the first one being struck into the next and so on, you will fail. Why? Because you cannot measure the starting locations of the balls and the strike point of the cue well enough for the error in that measurement not to swamp the “signal”. How can we expect to predict where the energy and moisture concentrations in the ocean and atmosphere will end up 20 years from now? How? Why do we think it is even possible?

    The thing that has gone wrong in climate science is that the people who are now modelling the climate do not understand the Mathematical facts about the equations they are modelling. The first set of people to write GCM’s did understand. They were Mathematicians who understood chaos theory and the limitations of what could be predicted. I.e. they knew that the best they could achieve were very good weather models.

    But now we have people who are called “Climate Scientists” doing the work, and they do not understand the fundamental mathematics of the equations they are modelling. It is beyond their capacity. Because as Professor Lindzen said recently, “Climate science is for second-raters”. They do not understand that it is IMPOSSIBLE to predict global temperature out to many years or decades.

    I REPEAT AT THE TOP OF MY VOICE: IT IS IMPOSSIBLE TO PREDICT TEMPERATURE OUT TO MANY YEARS OR DECADES, THE EQUATIONS TELL US THAT.
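    The error growth described above can be seen in miniature in the classic Lorenz system; the following sketch (forward-Euler integration with a hypothetical one-part-in-a-million perturbation of the starting point) is an illustration of the sensitivity argument, not anything from the paper:

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps=30000):
    """Integrate for steps * dt = 30 model time units."""
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = run((1.0, 1.0, 1.0))
b = run((1.0 + 1e-6, 1.0, 1.0))  # start perturbed by one millionth
separation = sum(abs(p - q) for p, q in zip(a, b))
```

    By the end of the run the tiny initial difference has grown by many orders of magnitude, even though both trajectories remain bounded on the attractor.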

  51. Ralph Kramden says:

    “these models give fairly good estimates for long-term average temperatures, such as 30-year periods, which closely match observed data”. Is this supposed to be funny? I guess it was intended as a joke?

  52. more soylent green! says:

    Adam says:
    February 19, 2014 at 3:12 pm
    Climate models are a good idea and they do work as far as they are designed to.

    What is a climate model: They are an attempt to take the known historical data put it on a grid and apply all of the physics that we *can compute*, and to see whether we can make predictions about the future.

    Climate models do work.

    We can make very good predictions out to a couple of weeks and excellent 24 hour predictions on a local scale…..

    Are those climate models or weather models? How does “climate” apply to a 24-hour period?

    While I do agree wholeheartedly with your conclusions, your opening statement needs some work.

  53. Katherine says:

    While a GCM portrayal of temperature would not be accurate to a given day, these models give fairly good estimates for long-term average temperatures, such as 30-year periods, which closely match observed data.

    Yeah? Then they should start by throwing out the GCMs that don’t match the current 15-year plateau in global temperature.

  54. Peter Foster says:

    Nothing that contains the results of climate models could possibly pass as a substitute for data taken by observation. This is just a fudge.
    I would have far more faith in, say, two dozen representative records from rural stations around the world that have records of over 100 years, that meet the required standard for a met station, and that have not been affected by UHI. Just raw data only, no need for any corrections, even for time of day: simply plot the changes or anomalies over the last century and average them.

    There are so many fudges in the current data from HadCRU, GISS, etc. as to make the whole process very questionable.

  55. Rick K says:

    Well, climate science keeps advancing here in the 21st Century (it just feels like the 13th).
    We’ve gone from:
    GIGO (Garbage In, Garbage Out) to…
    MIMO (Models In, Models Out).

    Somebody shoot me…

  56. Bottom line: They want to MODEL actual temperature measurements. What’s wrong with this idea? Anybody, no need to raise your hand; hint: notice the capital letters in the preceding sentence. This is nothing but an admission that the measurements themselves are inadequate to the point of uselessness. (And Bayesian probability? That is itself an uncertain modelling. In order for Bayesian statistics to prove itself, in any application, it has to reduce to a basic, ordinary probability description, and of course that means it is essentially irrelevant from the very start. It is merely an attempt to pull oneself up by one’s own bootstraps, when one really doesn’t know enough to make intelligent statistical judgments.)

    But even more fundamentally: You cannot reach consensus with criminals. You HAVE to clean house before you can make any REAL progress. But the lukewarmers have so far determinedly ignored this basic reality, to try to curry favor with the criminal system. You are aiding and abetting deluded academics and an insane and tyrannical political agenda, an unprecedented subornation of all of our institutions, and a sullying of the name of science that will take generations to remove (if our civilization lasts that long). And every time you do so, shamelessly and cluelessly, you only convince yourselves further, sink ever deeper, in your false point of view.

  57. pat says:

    16 versions of DENIERS in a single post from Revkin! is that a record per-word count?

    19 Feb: NYT Dot Earth: Andrew C. Revkin: A Look at the ‘Shills,’ ‘Skeptics’ and ‘Hobbyists’ Lumped Together in Climate Denialism
    In his contribution to the recent discussion of Secretary of State John Kerry’s speech on global warming, David Victor of the University of California, San Diego, included this line:
    [T]he whole climate science and policy community is spending too much time thinking about the denialists and imagining that if we could just muzzle or convince these outliers that policy would be different. That’s not right—in part because the denialists aren’t such a hearty band and in part because the real barriers to policy are cost and strategy.
    He attached a speech he recently delivered at the Scripps Institution of Oceanography as part of a seminar series titled “Global Warming Denialism: What science has to say…

    Given that I’ve written about the limited value of facile labels of this sort (noting that I am a “recovering denialist” on another climate front), I thought it well worth posting (Victor granted permission).

    His talk is titled “Why Do Smart People Disagree About Facts? Some Perspectives on Climate Denialism.” …

    Here are two of his closing points:

    ***First, we in the scientific community need to acknowledge that the science is softer than we like to portray. The science is not “in” on climate change because we are dealing with a complex system whose full properties are, with current methods, unknowable…

    ***Second, under pressure from denialists we in the scientific community have spent too much time talking about consensus. That approach leads us down a path that, at the end, is fundamentally unscientific and might even make us more vulnerable to attack, including attack from our own…

    http://dotearth.blogs.nytimes.com/2014/02/19/a-look-at-the-shills-skeptics-and-hobbyists-lumped-together-in-climate-denialism/?_php=true&_type=blogs&_php=true&_type=blogs&_r=1

  58. john robertson says:

    @Rick K 3:41
    MIGO fits better, models in Gospel out.
    A capital A would truly acronym this nonsense as AMIGO.
    Pals publishing each others fantasies.
    I have difficulty finding an A without profanity.

  59. Gail Combs says:

    Adam says: @ February 19, 2014 at 3:12 pm

    On point number three: where are all of the Professors of Mathematics? Why are they so silent?….
    >>>>>>>>>>>>>>>>>>>
    You can repeat this complaint about ANY use of statistical packages for computers nowadays.

    I took a couple of college courses in statistics, so I cringed when some idiot was brought in to ‘teach’ factory workers and engineers “Six Sigma Green Belt Statistics”. He never covered variables vs. attributes, or plotting the data to determine whether it was close enough to a Gaussian distribution for the statistical treatment in the package he was selling to be appropriate.

    Now any fool can plug numbers into a computer and get out pretty numbers. Unfortunately most have zero idea of what those pretty numbers actually are telling them if anything.
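    The sanity check described above takes only a few lines. A rough sketch using sample skewness and excess kurtosis, both of which should sit near zero for roughly Gaussian data (the cutoffs one would use in practice depend on the application):

```python
# Quick Gaussianity check: compare sample skewness and excess kurtosis
# of normal data against heavily skewed (exponential) data.
import random
import statistics

def skew_kurt(xs):
    """Return (sample skewness, excess kurtosis) of a list of numbers."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    n = len(xs)
    skew = sum((x - m) ** 3 for x in xs) / (n * s ** 3)
    kurt = sum((x - m) ** 4 for x in xs) / (n * s ** 4) - 3.0
    return skew, kurt

random.seed(0)
normal_stats = skew_kurt([random.gauss(0, 1) for _ in range(10_000)])
expo_stats = skew_kurt([random.expovariate(1.0) for _ in range(10_000)])

print(normal_stats)  # both values near 0
print(expo_stats)    # exponential: skewness near 2, excess kurtosis near 6
```

    A package that silently assumes normality will happily process the second data set too; nothing in the output flags the violated assumption unless you check it yourself.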

  60. just what the wishcasters need: more data smearing

    heaven help science

  61. Mike Maguire says:

    The biggest problem with consensus in this particular field is that the ones that make it up are clearly biased. Most are tied financially and/or have their reputations on the line. Their previous work, based on assumptions made over a decade ago, causes their brain to interpret new data in a biased fashion.

    This would be like deciding who the NFL champs will be by consensus. Football fans are noted for their fierce loyalty to their special team and you’ll get plenty of biased results from tunnel visioned “cheerleaders” of their favorite team(s).

    Climate scientists have their favorite theory too. It has caused them to ignore powerful empirical data which contradicts parts of, or even most of, that theory.

    Just like we let all the NFL teams battle it out on the football field and measure the best by the score at the end of the games, we should do the same with climate science.

    Measure everything that relates to this field with non-manipulated data gathering. The refs should be scientific principles. Data that doesn’t fit a legit scientific principle goes to the bench. That way we can have an authentic competition, with the winner being the one with the most points scored, based on a compilation of comprehensive empirical data gathered during the period.

    But then, who gets to interpret the data? Now we’re back to the consensus bs as many biased scientists will throw out everything that doesn’t match up with their pet theory.

    Just too many biased and fraudulent climate scientists out there that have destroyed the credibility of this field, so that a consensus is worthless. It will take decades to repair.

  62. Louis Hooffstetter says:

    “Vast amounts of data related to climate change are being compiled by research groups all over the world.”

    Vast amounts of data related to climate change are being “fraudulently manufactured” by research groups all over the world.

    There, fixed.

  63. Luke Warmist says:

    Ok, I read thru it, and I think I’ve got a handle on it. They’re going to infill data into empty cells by kriging from cells that were empty but infilled from homogenized data, averaged from adjusted data, and baked at 325F for 45 minutes. (If somehow after all this a real signal does show up, it will be promptly discarded because it doesn’t resemble the rest of the creamy mixture.) Thank Heaven. We’re all going to be saved. /Do I really need the sarc tag?

  64. Bill_W says:

    I agree with many of the above posters. Combining models and data could be problematic if you start with the assumption that the models are correct over 30 year periods. That has not been demonstrated except for the past. And there you know the answers already.

  65. MarkW says:

    They have finally found a way to turn a sow’s ear into a silk purse.
    All you need is enough sow’s ears and then average them together.

  66. Gary Pearse says:

    Even the observations have been modeled! Check out the Watts et al paper on the subject. And the models presently used employ the fiddled numbers. The present work is a Dagwood sandwich.

    http://wattsupwiththat.com/2012/07/29/press-release-2/

    “U.S. Temperature trends show a spurious doubling due to NOAA station siting problems and post measurement adjustments.”

    One deliberate adjustment over time was to push the pesky 1937 US hottest year down a few tenths to make the 1998 El Niño year hotter; they knew there might be no better chance than this within several years to finally deep-six that embarrassing record. Canada, which doesn’t have dozens of Federal weather/climate agencies all working away like the US (why don’t you guys get this egregiously wasteful practice cleaned up in the next election?), has one budget-squeezed agency doing it all. Not having the resources to diddle data as much as the US, the 1930s temps still stand north of the border. And man, if it was 45C in 1937 in Saskatchewan, what was it in the US Great Plains and elsewhere?

    43 Hottest Temperatures Ever Recorded in Canada
    Date Recorded Location Temperature
    July 5, 1937 Midale, Saskatchewan 45.0 °C
    July 5, 1937 Yellow Grass, Saskatchewan 45.0 °C
    July 11, 1936 St. Albans, Manitoba 44.4 °C
    July 11, 1936 Emerson, Manitoba 44.4 °C
    July 5, 1937 Fort Qu’Appelle, Saskatchewan 44.4 °C
    July 16, 1941 Lillooet, British Columbia 44.4 °C
    July 16, 1941 Lytton, British Columbia 44.4 °C
    July 17, 1941 Lillooet, British Columbia 44.4 °C
    July 17, 1941 Lytton, British Columbia 44.4 °C
    July 17, 1941 Chinook Cove, British Columbia 44.4 °C
    July 29, 1934 Rock Creek, British Columbia 43.9 °C
    July 5, 1936 Midale, Saskatchewan 43.9 °C
    July 11, 1936 Emerson, Manitoba 43.9 °C
    July 11, 1936 Morden, Manitoba 43.9 °C
    July 4, 1937 Rosetown, Saskatchewan 43.9 °C
    July 5, 1937 Regina, Saskatchewan 43.9 °C
    July 16, 1941 Oliver, British Columbia 43.9 °C
    June 23, 1900 Cannington, Saskatchewan 43.3 °C
    June 25, 1919 Dauphin, Manitoba 43.3 °C
    July 31, 1926 Fort Qu’Appelle, Saskatchewan 43.3 °C
    July 24, 1927 Greenwood, British Columbia 43.3 °C
    July 25, 1931 Fort Qu’Appelle, Saskatchewan 43.3 °C
    July 5, 1936 Estevan, Saskatchewan 43.3 °C
    July 7, 1936 Emerson, Manitoba 43.3 °C
    July 11, 1936 Waskada, Manitoba 43.3 °C
    July 11, 1936 Virden, Manitoba 43.3 °C
    July 11, 1936 Brandon, Manitoba 43.3 °C
    July 11, 1936 Greenfell, Saskatchewan 43.3 °C
    July 5, 1937 Moose Jaw, Saskatchewan 43.3 °C
    July 5, 1937 Grenfell, Saskatchewan 43.3 °C
    July 5, 1937 Francis, Saskatchewan 43.3 °C
    July 5, 1937 Regina, Saskatchewan 43.3 °C
    July 5, 1937 Estevan, Saskatchewan 43.3 °C
    July 5, 1937 Carlyle, Saskatchewan 43.3 °C
    July 12, 1937 Regina, Saskatchewan 43.3 °C
    July 27, 1939 Oliver, British Columbia 43.3 °C
    July 17, 1941 Oliver, British Columbia 43.3 °C
    July 17, 1941 Skagit River, British Columbia 43.3 °C
    July 19, 1941 Elbow, Saskatchewan 43.3 °C
    July 19, 1941 Lumsden, Saskatchewan 43.3 °C
    August 6, 1949 Rosetown, Saskatchewan 43.3 °C
    July 19, 1960 Newgate, British Columbia 43.3 °C
    August 5, 1961 Maple Creek, Saskatchewan 43.3 °C

  67. John B. Lomax says:

    Garbage In — Garbage Out

  68. old engineer says:

    Col Mosby says:
    February 19, 2014 at 1:43 pm

    “The everlasting delusion of statisticians that they can mine gold from worthless ore
    simply by applying statistics to a giant boatload of crappy data and model estimates.”
    ==========================================================================
    Real statisticians don’t believe this. I had the good fortune to have three excellent statisticians keep my data analysis honest during my working career. The first one told me on the first day I worked with him:

    “Too many engineers think statistics is a black box into which you can pour bad data and crank out good answers. It is not.”

    As for Bayesian probabilities, I had to go to Google for that. Col Mosby is right on, they:
    “ …aren’t even statistics, simply people giving their own guesses and weighting their belief in same.”

    So while statisticians know better, apparently those who employ statisticians can get them to use enough jargon that the truth is covered up. I wonder who peer (pal?) reviewed the paper.

  69. Rick K says:

    @john robertson 3:46
    I hear ya, John.
    This is nothing more than “We made a model to tell us how all our models are doing and found out they’re doing quite well!”
    Sometimes profanity ain’t enough, John!

    They say, “Our research seeks to combine many different sources of climate data, in a statistically rigorous way, to determine a CONSENSUS on how much temperatures are changing.”

    I hacked up a kidney when I read, “Statistical modeling is a tool to not only get a CONSENSUS ESTIMATE of temperature change but also an ESTIMATE OF OUR UNCERTAINTY about this temperature change.”

    Churchill described Russia in 1939 as “a riddle, wrapped in a mystery, inside an enigma.”
    Climate science in 2014 is an uncertainty, wrapped in an estimate, inside a consensus.

    It should be inside… [something else]!
    I can’t finish the above sentence without profanity, John!

    Plus, now I’m down to only 1 kidney!

  70. Greg Cavanagh says:

    Londo says: “Science by committee”.

    That is so Terry Pratchett (and true).

  71. WillR says:

    Gamecock says:
    February 19, 2014 at 1:20 pm

    They can’t even predict the past.

    To repeat myself from a few days ago…

    With Climate Science only the past is uncertain.

  72. Typhoon says:

    Proposals such as these call for the creation of a new field of study:

    statistical wankology

  73. Mike Jonas says:

    “Our understanding of how much temperatures are changing is reflected in all the data available to us,” says Heaton. “For example, one data source might suggest that temperatures are increasing by 2 degrees Celsius while another source suggests temperatures are increasing by 4 degrees. So, do we believe a 2-degree increase or a 4-degree increase? The answer is probably ‘neither’ because combining data sources together suggests that increases would likely be somewhere between 2 and 4 degrees.”

    Leaving aside the fact that “data” appears actually to be model predictions and not data at all, surely that size of discrepancy indicates that we do not yet have a reliable figure. There might be some justification for supposing that the likely ‘correct’ figure is between say 0 deg and 6 deg, but I doubt it. Surely the only correct approach is to test the models until the range of uncertainty for at least one of them is reasonably well known and narrow enough to be useful.
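    For what it is worth, the simplest version of the idea Heaton describes is inverse-variance (precision) weighting. This is only a textbook fixed-effect combination, not the paper's full hierarchical Bayesian model, and the numbers below are made up for illustration:

```python
# Combine independent estimates by weighting each with the inverse of
# its variance (a fixed-effect, precision-weighted pooling).

def combine(estimates):
    """estimates: list of (value, standard_error) pairs.
    Returns (pooled estimate, pooled standard error)."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical sources: one says +2 C with a tight error bar,
# one says +4 C with a loose one.
sources = [(2.0, 0.5), (4.0, 1.0)]
est, se = combine(sources)
print(est, se)   # pooled estimate sits between 2 and 4, nearer the tighter source
```

    Note that the pooled standard error comes out smaller than either input's, which is exactly the step commenters here dispute: the reduction is only legitimate if the sources' errors are independent rather than sharing a common bias.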

    Versions of climate models are used for short-term weather forecasting. The forecasts a day or two out are often pretty good, but they are still far from perfect. Forecasts more than a week out are still very unreliable and are generally not used, except by the Met in the UK, who have made seasonal forecasts that are completely reliable. Completely reliable in being absolutely wrong, that is. The argument that because we can’t forecast the short-term weather we can’t forecast the long-term climate is IMHO logically incorrect (they are different things that may have different primary drivers), but the total failure of the Met’s seasonal forecasts does show that the models get those primary drivers seriously wrong. Since at least one of these drivers, CO2, is also a primary driver (in the models) for long-term climate, it does logically follow that the Met climate models are unfit for purpose. Since all climate models AFAIK use the same primary drivers and the same basic logic, differing only in some detail and some parameterisations, it does logically follow that all climate models are unfit for purpose.

    ie, we still cannot predict climate. Period.

  74. Ken Mitchell says:

    TheLastDemocrat says:
    February 19, 2014 at 1:49 pm
    “One of these models is the most accurate. … Averaging the models is a way to ensure your model is poor by straying away from the best model.”

    The problem is that they don’t yet know which model is the “best”, so they cannot eliminate the others.

  75. Ron House says:

    Curious George says:

    “only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”.

    Two basic errors here:
    1. Not all data sources are equally reliable. You need a good crystal ball to assign weights.
    2. Even in the best case, it will only tell us how the temperatures WERE changing.

    You’ve fallen for it even though you critique it. He said “how much we think temperatures are changing.” Their result is (at best) an estimate of human opinions, not of the facts those opinions are about. To reach either of the conclusions your two criticisms draw, one has to add the hypothesis: “Our opinions are always right.”

  76. pochas says:

    Jacques Derrida was a French philosopher best known for developing a form of semiotic analysis known as deconstruction. He is associated with postmodern philosophy, whereby a structure cannot be understood without understanding its genesis. That is, without understanding how the notion of Climate Change was formed, you cannot understand Climate Change. We have here an effort to implant en masse a “consensus” on what climate change is, projecting that this notion will eventually constitute the reality of climate change. Deconstruction exemplified.

  77. Gail Combs says:

    Gary Pearse says: @ February 19, 2014 at 4:18 pm

    … Not having the resources to diddle data as much as the US, the 1930s temps still stand north of the border….
    >>>>>>>>>>>>>>>>>>>>>>>..
    Actually you can thank your weather historian (no doubt close to retirement) for being a very honest and upright man. And yes, I do know him, but haven’t seen him since I moved south.

  78. Day By Day says:

    Jeff F says

    However, there seems to be a problem with how we are educating some students of the sciences

    I don’t think it is so much the students as the “self-serving bias.” My brother is a research psychologist and he taught me how to do research. He also taught me how to spot the weak points of research, and the difference between correlation and causation, which he said (besides fraud) was one of the biggest problems with the current research (albeit 20 years ago).

    Based on what he taught me, I immediately found the flaws in climate “science.” I started with the NSA website: they referenced charts in their articles and the charts did not say what they said they said (many read the articles and don’t study the links). I was SURE my brother would see what I saw. But he is a professor and part of the academic bubble, apparently. He scoffed at me when I shared my conclusions; he said “Tim Ball” and rolled his eyes, and had never heard of Dr. Spencer. Anyone I cited, he said, wasn’t a “real” scientist, and he cited the usual suspects. I went into shock.

    He wants to believe the meme and no amount of real science is going to influence him. I asked what it would take for him to reconsider and he answered, “If it doesn’t warm up for 5 more years.” That was 3 years ago. He asked what it would take for me to reconsider: if any of their models could ever get one thing right. I should have said (referring to a previous post), “If any of the models could at least predict the past!”

    HA HA–great line guys.

  79. Geoman says:

    I’m thinking of a number from 1 to 100. Their solution to determining the number is to average all the numbers between 1 and 100. Therefore the number I am thinking of MUST be 50. We must act as if it is 50. My denials notwithstanding, it IS 50.

    I was thinking of 6, but oh well.
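    Geoman's thought experiment can be made literal in a few lines (the numbers are his). Averaging minimizes the expected squared error over all possible truths, but it says nothing about the one truth actually realized:

```python
# If the true value is 6 and the "ensemble" is every number from 1 to
# 100, the ensemble mean is 50.5 no matter what the truth is.

truth = 6
ensemble = list(range(1, 101))

mean = sum(ensemble) / len(ensemble)
best = min(ensemble, key=lambda m: abs(m - truth))  # best single member

print(mean)                 # 50.5: the "consensus"
print(abs(mean - truth))    # 44.5: error of the consensus
print(abs(best - truth))    # 0: a correct member was in the ensemble all along
```

    The catch, of course, is that identifying the best member requires knowing the truth, which is the commenters' other point: without out-of-sample validation you cannot tell which model, if any, was the 6.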

  80. Mac the Knife says:

    The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).

    Hmmmm……
    Take one part real data and blend with 2 parts unvalidated climate models.
    What do you get?

    Science Fiction……

  81. Matthew R Marler says:

    I think that was an interesting paper, but for now what it accomplishes is to provide grist to everyone’s mill. Like, before incorporating GCM results into a Bayesian hierarchical framework, shouldn’t we wait until some of their outputs have been shown to be accurate for 20 years or so without ad hoc modifications? Should we ignore (as the authors do) all the interesting curve-fitting that has been performed to extract possibly quasiperiodic oscillations? Is almost everyone satisfied that the numerous data sets (and the reanalysis data set) have been properly curated? Have the effects of the diverse changes in insolation been properly accounted for?

    Here’s a passage from near the end: While many sources of uncertainty are accounted for in this analysis, there are also many sources which are ignored. Notably, uncertainty in initial conditions, imperfect scientific understanding of climate processes, uncertainty in parameterization of processes below grid level, etc. are all unaccounted for in this analysis and could contribute to wider credible intervals than those presented in Figures 4–6.

    In order to perform this analysis, certain assumptions had to be made about the biases of climate models in the future. As discussed above, such biases are not identifiable. Hence, comparing the validity of two different sets of assumptions is not possible. In regard to such assumptions, perhaps the important question is not which set of assumptions is “best” but whether different assumptions lead to drastically different results on climate change. That is, are climate change results sensitive to different assumptions about future biases in climate models? Answering this question is an interesting direction for future work.

    That’s a good short list of important reasons not to believe any putative “projections” from this method. I can see it being refined by generations of graduate students indefinitely into the future.

  82. jorgekafkazar says:

    “By combining information from multiple observation-based data sets, GCMs and RCMs, the model obtains an estimate and measure of uncertainty for the average temperature, temporal trend, as well as the variability of seasonal average temperatures.”

    By combining three different sorts of garbage, we achieve…Der Übergarbage!

    pochas says: “Jacques Derrida was a French philosopher best known for developing a form of semiotic analysis known as deconstruction….”

    Derrida, by the way, was a Marxist.

  83. Tiburon says:

    Anthony – I came across this .pdf posted at arxiv.org, a paper by Donald C. Morton at the Herzberg Astronomy and Astrophysics Programs, NRC Canada. To my layman’s eye it looks to be a pretty important overview of the potential of solar, interstellar and galactic forcings impacting global climate. Of interest to you for a post?
    Abstract
    This paper describes some of the astronomical effects that could be important for understanding the ice ages, historic climate changes and the recent global temperature increase. These include changes in the Sun’s luminosity, periodic changes in the Earth’s orbital parameters, the Sun’s orbit around our galaxy, the variability of solar activity, and the anticorrelation of cosmic-ray flux with that activity. Finally, recent trends in solar activity and global temperatures are compared with the predictions of climate models.

    http://arxiv.org/ftp/arxiv/papers/1401/1401.8235.pdf

    I got the reference via Ben Davidson’s Suspicious0bservers News, posted this morning (19th)

    As Ben D comments, it seems there’s a ‘sea-change’ in terms of scientists in a lot of disciplines, distancing themselves from the CAGW models.

    Thanks, sorry if you’ve seen it already – but I follow WUWT pretty closely and haven’t seen it referenced in postings or guest articles….- David S.

  84. Pamela Gray says:

    They are jumping onto the air-brushed band wagon a bit late. People want to see warts and moles these days. The days of an air-brushed-thin Oprah on the cover of her O magazine are over ladies and gentlemen.

    The sad part is that the original hand written logs are gone. Do we not learn? At least let’s keep all the disparate homogenized and sanitized data sets separate. Else we repeat the past: Three data sets disagree. Hey, I have an idea! Why don’t we average them together and let the dog eat the original data sets?

  85. Ossqss says:

    Perhaps OT, but y’all pay attention to this emerging initiative. Kinda fits consensus building, no?

    Just sayin’, these folks have a game plan and it is coming from all sides.

    http://www.foxnews.com/politics/2014/02/19/fcc-official-others-warn-agency-study-would-squash-news-media-1st-amendment/

  86. SAMURAI says:

    The whole PREMISE of the IPCC and the CAGW government grant process is to PROVE an Anthropogenic causal relationship exists for global warming.

    When a CO2/global warming correlation ceases to exist (as has been the case for almost 18 years) the secondary purpose of CAGW is to come up with excuses for why this mysterious “anomaly” exists… This isn’t science, it’s propaganda.

    The fundamental goal of science is to discover the truth; not to discover ploys to obfuscate the truth and call this hypocrisy science….

  87. norah4you says:

    A statistician with an academic title who seems to have forgotten what every student of mathematical statistics is supposed to have learnt, come rain or sunshine, at least from reading Huff’s How to Lie with Statistics…….

    Statistics analysed by computer are, btw, no better than the input data allows them to be. Not to mention that most factors with an impact on climate have never been considered in any of the so-called computer models…..

  88. Mike Tremblay says:

    Mike Jonas says:
    February 19, 2014 at 4:44 pm

    “The argument that we can’t forecast the short term weather so we can’t forecast the long term climate are IMHO logically incorrect (they are different things that may have different primary drivers)”
    ——————————————————————————————————————–
    In these cases the models are not forecasting the long term any more than they are recognizing the pattern of weather changes for a whole year, and a multitude of years. Any fool can forecast that cold weather will begin in the fall, peak in the winter, give way to warmer weather in the spring, and warmer still in the summer. There is no model made, or that can presently be made, that can forecast the exact temperature (within 0.1 degree C) in thirty years better than even a simple GCM can do for tomorrow. There are simply too many variables and random inputs the further you travel from your starting date.

  89. Cynical Scientst says:

    @Ossqss: Don’t jump to conclusions. There are two sides to this story. This isn’t a simple situation.

    There may be a need for new rules because some internet providers are effective monopolies in their local markets and a few of them seem to be trying very hard to find ways to abuse this monopoly position. As well as charging users for connection they are trying to manipulate things so that they can also charge website operators for allowing people to access their content.

    The excuse is that they want to sell the option of providing website owners with a faster connection. The excuse really doesn’t make much sense in terms of the technology, but sounds plausible to legislators. However that is just the foot in the door.

    If they get their way then what sites you can see will depend on where you are accessing from and there will be rent collectors at every step in the road from website to user. At the extreme Anthony might be asked to pay all these rent collectors just so that people can see his website.

  90. James Allison says:

    more soylent green! says:
    February 19, 2014 at 2:07 pm
    Spoiler alert — …….. You don’t get data from any model……….
    ++++++++++++++++++++++++++++++++++++++++++++++
    36″ 25″ 36″ pick yer own model data.

  91. Robert Austin says:

    Reminds one of the multi-proxy temperature reconstruction insanity. Who cares if bristlecone pines are not good thermometers? Who cares if the proxy is upside down? Who cares if the temperature response is not linear? Who cares if one rogue tree has inordinate influence? Throw them all in anyway. The magic of statistical prestidigitation converts the sow’s ear into a silk purse, or so we are told by consensus climate science.

  92. Adam says:

    @more soylent green!

    There is no difference between climate and weather models, just the scales. If you like, climate models are weather models with grid sizes of a few km which cover the entire globe, whereas weather models have smaller grid sizes and are not global. Due to computing constraints, climate models also have much larger time steps, typically a few hours.
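    One concrete reason grid size and time step scale together (a simplification; real models use a variety of numerical schemes) is the advective CFL stability condition: an explicit scheme stays stable only while u·dt/dx remains below roughly 1, so a coarser grid permits a longer stable time step. A sketch with an assumed jet-stream-like wind speed:

```python
# Largest stable time step for explicit advection at a given grid
# spacing, from the CFL condition u * dt / dx <= cfl.

def max_stable_dt(dx_m, wind_ms, cfl=1.0):
    """Return the largest time step (seconds) keeping the advective
    CFL number at or below `cfl` for grid spacing dx_m (meters) and
    wind speed wind_ms (m/s)."""
    return cfl * dx_m / wind_ms

wind = 50.0                     # m/s, an assumed fast upper-level wind
for dx_km in (1, 10, 100):      # weather-scale to climate-scale grids
    dt = max_stable_dt(dx_km * 1000.0, wind)
    print(f"{dx_km:>3} km grid -> max stable dt ~ {dt:,.0f} s")
```

    A 1 km grid forces steps of seconds to tens of seconds, while a 100 km grid tolerates steps of half an hour or more, which is one reason a global century-scale run cannot simply reuse a weather model's resolution.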

  93. Catcracking says:

    Ossqss,
    Your post is not off topic. The administration will use this power, via the illegal appointments to the FCC, to require the media to push its agenda, starting with the climate agenda. It will control the media and stop all criticism. It is one of many fronts in the effort to get control of carbon and your energy usage. Based on the Obamacare debacle, we know how well the government can manage our energy.

    “However, one agency commissioner, Ajit Pai, said in a Wall Street Journal op-ed piece Wednesday that the May 2013 proposal would allow researchers to “grill reporters, editors and station owners about how they decide which stories to run.”

    He also said he feared the study might stifle the freedom of the press.

    “The American people, for their part, disagree about what they want to watch,” wrote Pai, appointed to the FCC’s five-member commission in May 2012 by President Obama. “But everyone should agree on this: The government has no place pressuring media organizations into covering certain stories.”
    An Obama administration plan that would get researchers into newsrooms across the country is sparking concern among congressional Republicans and conservative groups.”

    “The purpose of the proposed Federal Communications Commission study is to “identify and understand the critical information needs of the American public, with special emphasis on vulnerable-disadvantaged populations,” according to the agency.”

    The media would be required to promote climate change and global warming, even though the facts do not substantiate the claims, by deciding what critical information must or cannot be disseminated!!

  94. Chad Wozniak says:

    Obviously the Klimate Khange Konsensus folks are at work here.

    @Katherine – shouldn’t we just throw out ALL the GCMs as useless for forecasting climate change/weather?

  95. Truthseeker says:

    Computer modelling is confirmation bias performed at speeds measured in teraflops.
    Science is finding out about things you do not know. You can only put into a computer model things you already know. Therefore computer models are not science.

    QED.

  96. Jeff F says:

    @Adam: You are right on the money. In my opinion, the reason that the long-term models fail is the key point. It is a subtle point that was made clear in a post on Dr. Curry’s website about spatio-temporal chaos that was discussed here at another time. The fact is that almost all climate scientists (modelers or otherwise) firmly believe that their system can be modeled as a stochastic system. It is pretty simple to see why they think this way. All of their science training tells them to treat every complex system as a stochastic one. Worse, those in climate science usually have less background in physics (as a meteorologist would have) and more background in large-scale systems theory. As a result, we have bad models and a complete disregard for how to model the behavior of the atmosphere over long time scales (which is, as you point out, currently impossible to do). In the end, it is best to let climate science fail in this endeavor – and that is the one and only thing in climate science that I will attach a high confidence to.

  97. David A says:

    Latitude says:
    February 19, 2014 at 1:56 pm
    Data from these many and varied sources results in different climate projections; hence, the need arises to combine information across data sets to arrive at a consensus regarding future climate estimates…..
    You’re going to take bad data….average it….and get what??
    ==========================================================
    LOL, well yes, they are climate scientists. They take several dozen models that are all wrong in the same direction (“over warm”), and then project the mean of those dozens of wrong models.

  98. Ossqss says:

    The point of my prior posts on this thread was quite simple and observational.

    The recent activity of the FCC seems to follow a pattern. Is it similar to what we have already seen from others? The DOJ, IRS, EPA, NSA, DOE, and the like.

    No matter your affiliation, do you feel more freedom from this type of stuff?

    Think about it

  99. Rhys Jaggar says:

    There is nothing rigorous about inputting results from models which fail. Any inputting of ‘model data’ which does not represent reality is unacceptable, anti-scientific nonsense.

    The only things which can be combined in a rigorous study are primary data sets which are accepted as accurate and consistent.

  100. tty says:

    “The everlasting delusion of statisticians that they can mine gold from worthless ore
    simply by applying statistics to a giant boatload of crappy data and model estimates.”

    Not true. The delusion of people who think they are statisticians. Or even know that they aren’t but pretend that they are.
    As several people have already said: there is so much wrong statistically with this “new approach” that it is meaningless to even start listing the faults.
    Just for the record I’m not a professional statistician myself, but I have read enough statistics and worked enough with statistics and together with statisticians to know what I don’t know.

  101. R. de Haan says:

    Consensus building = GOEBBELSIAN PROPAGANDA RIGHT OUT OF THE BOOK

  102. Stacey says:

    Confucius says Consensus is for fools

    Consensus says I agree.

  103. anticlimactic says:

    The climate was the inspiration for Chaos Theory. The point about chaotic systems like this is that even if you know everything about every molecule on the planet, and you have infinite computing power, you still cannot predict the future state of the system.

    So this is just a completely pointless waste of money.

  104. Henry Galt says:

    Without any models, this should end up looking much like the Stadium Wave, with all traces dropping fast and decades to pass before levels once more attain those at the end of the last century.

    Add in the models and you may obtain whatever your fevered mind can conjure.

  105. Robin Hewitt says:

    I can happily use Pythagoras’ theorem or Avogadro’s Hypothesis and they seem to work as close as I can measure. They are not proven but everyone agrees they do a pretty good job. Nothing wrong with consensus as part of your argument if it works, but therein lies the problem.

  106. Jimbo says:

    We will soon have a new and improved consensus, one based on reality. Have you noticed that after each IPCC report the graph showing temperature projections keeps getting lowered towards observations? It keeps getting lowered to the sceptics’ point of view. This is truly the biggest failure of their ‘science’: alarmism. Alarmism to force us to act on a non-problem while wasting billions on useless, improbable research deserving of the Ig Nobel Prize. / end rant.

  107. Bob Layson says:

    Science is the search for true theories, correct explanations. A true theory has no chance of being refuted. Many a ‘probably true’ theory was wrong from the start and would have been found to be so much sooner had it been tested. Looking for confirmation and data ‘consistent with’ a conjecture is unholy work best left to those Bayesian Priors of the Church of Latter-day Alarmists.

  108. Mervyn says:

    “A big advantage of GCMs over observed and reanalyzed data is that GCMs are able to simulate climate systems in the future.”

    Really? How can a computer model simulate the unknown? Nobody, least of all a modeller, can know what the climate will be years or decades ahead. The truth is that modellers cannot simulate future climate systems, because it is impossible. It’s gobbledegook!

  109. Mike M says:

    A little OT … I can’t recall where I read it, but someone finally came up with what I think is the perfect antithesis for their word “denier”: “CLIMAPHOBE”, climaphobic, climaphobism, etc.

    (Plus you can Homer-ize it too! “Climamaphobe”)

  110. Clovis Marcus says:

    A meta-study that accumulates underlying real data might be valuable.
    A meta-study based on models that do not match observations is of very dubious value except to attract funding.
    And it looks like an open-ended job, as it will need redoing every time someone revises the numbers.
    It has already been said here, but GIGO seems to apply.

  111. Mike M says:

    Mervyn says: “Really? How can a computer model simulate the unknown?”

    Well that won’t matter. Eventually no one will know what’s going on outside because everyone will be plugged into The Matrix (or some sort of 24/7 online 100% immersive experience) and then ‘they’ can make up whatever they want for ‘reality’.

  112. Mike M says:

    Bob Layson says: February 20, 2014 at 2:13 am ” Looking for confirmation and data ‘consistent with’ a conjecture is unholy work best left to those Bayesian Priors of the Church of Latter-day Alarmists.”

    .. and their tree in Yamal.

  113. RobRoy says:

    Having no consensus, Galileo was obviously wrong about that heliocentric orbits theory. The fool!

  114. David in Cal says:

    I asked my wife, a retired biostatistician with many publications, to read the paper. Her conclusion was that it was too complex for her to judge without more effort than she was willing to exert. IMHO the complexity itself is both a strength and a weakness. It’s extremely difficult to show that the authors are correct or to show that they’re wrong.

    I wonder how the Committee on Review of Papers dealt with this problem.

  115. GH05T says:

    How many years of math study does it take to learn that the average of wrong is still wrong?
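
    The arithmetic behind this quip is easy to check. Below is a minimal Python sketch with purely hypothetical numbers (the `true_value` and `bias` figures are illustrative, not from the paper): averaging many estimates that share a common bias cancels the independent noise but leaves the shared bias untouched.

```python
# Hypothetical sketch: averaging biased estimates removes noise, not bias.
import random

random.seed(0)
true_value = 1.0
bias = 0.5  # suppose every "model" runs warm by the same amount

# 1000 noisy estimates, all centered on true_value + bias
estimates = [true_value + bias + random.gauss(0, 0.2) for _ in range(1000)]
ensemble_mean = sum(estimates) / len(estimates)

# The ensemble mean converges to true_value + bias (1.5), not true_value (1.0)
print(ensemble_mean)  # close to 1.5, far from 1.0
```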

  116. Chuck Nolan says:

    markstoval says:
    February 19, 2014 at 1:37 pm

    “They can’t even predict the past.”

    Funniest comment in a while. I’ll be stealing that one! :-)
    ========================================
    RH says:February 19, 2014 at 1:39 pm

    And I thought they ran out of ways to fudge data.
    ——————————————————————
    Now that’s the funniest comment I’ve heard.
    cn

  117. Bonanzapilot says:

    Hello all: Long time watcher, first time commenter.

    I’m reminded of an old adage in marketing. “97% of money spent on advertising is wasted, but it’s very hard to know in advance which 97%.”

  118. Bonanzapilot says:

    Personally, I’d skew my sample as far as possible to the “denier” tail, increasing the chances of the “consensus” underestimating the “problem”. Alarmists got it wrong from the beginning, when they could just as easily have convinced an unwitting public that a very small dT would produce disastrous results. Now they’re stuck trying to deny temperature reality. Had they set up the rules differently at the start, the “cause” of all that’s bad in the world would be a given but uncontrollable, and the only way for the human race to survive would be to submit to totalitarian control of all global resources.

  119. catweazle666 says:

    There’s only one thing to say to that, and it would get moderated.

  120. Bonanzapilot says:

    Really? I want to see it happen for myself rather than take your word for it.

  121. anticlimactic says:

    It is interesting that the Farmers Almanac correctly predicted a bitterly cold winter in the States back in August. This is not the first time that the Almanac has predicted weather at odds with all the usual weather bureaus and been proved correct. But then they use sunspots, tidal action, lunar cycles, and planetary positions to make their predictions. Given their [claimed] 80% success rate, it is an area climate science should be investigating.

    By ignoring almost all possible influences on climate, climate ‘science’ can never hope to successfully predict the climate. I look forward to the day when climate science studies the climate and tries to understand it, rather than thinking up propaganda to support a false idea.

    http://www.climatedepot.com/2014/02/20/feds-failed-with-winter-forecast-but-farmers-almanac-predicted-a-bitterly-cold-winter/

    In Bayesian parameter estimation, a non-informative prior probability density function (PDF) serves as a premise to an argument. Because there are infinitely many possible non-informative prior PDFs, this method violates the law of non-contradiction.

    There is an exception to the rule that Bayesian methods violate non-contradiction. This exception occurs when the quantities being estimated are the relative frequencies of events. By their decisions, the designers of the study of global warming that is referenced by the IPCC in its periodic assessment reports have ensured that there are no observed events or relative frequencies. Thus, the Bayesian method cannot be used without violation of non-contradiction. It follows that conclusions cannot be logically drawn through the use of this method about Earth’s climate.
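
    Whatever one makes of the non-contradiction argument, the underlying observation that the choice of “non-informative” prior affects the answer is easy to illustrate. The sketch below is a minimal conjugate Beta-Binomial example with made-up counts (3 successes in 10 trials; nothing here comes from the paper under discussion): two standard non-informative priors yield different posterior estimates from the same data.

```python
# Hypothetical sketch: two "non-informative" priors, same data,
# different posterior means (conjugate Beta-Binomial model).

def posterior_mean(successes, trials, alpha, beta):
    """Posterior mean of a proportion p under a Beta(alpha, beta) prior."""
    return (successes + alpha) / (trials + alpha + beta)

k, n = 3, 10  # made-up data: 3 successes in 10 trials

flat = posterior_mean(k, n, 1.0, 1.0)      # uniform prior, Beta(1, 1)
jeffreys = posterior_mean(k, n, 0.5, 0.5)  # Jeffreys prior, Beta(0.5, 0.5)

print(round(flat, 4))      # 0.3333
print(round(jeffreys, 4))  # 0.3182
```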

Comments are closed.