Guest essay by Eric Worrall
h/t Dr. Willie Soon – NASA GISS head Gavin Schmidt has voiced his support for EPA director Scott Pruitt’s call for more climate research transparency, though Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.
Climate scientists call for more transparency in their field
Scott Waldman, E&E News reporter
Published: Thursday, May 10, 2018
…
Making data available is part of publishing in the modern era, and there needs to be better methods for verifying the results of a study are statistically valid, said Rich Loft, director of the technology development division at the National Center for Atmospheric Research.
“In the age of big data, journal publications which would have been suitable a hundred years ago [are] not suitable anymore because it’s not actually enough information to reproduce the thing, so somehow we have to extend our definition of peer-reviewed into these analyses,” he said.
One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt, director of the Goddard Institute for Space Studies at NASA. In addition to storing the data, researchers must make the coding used to synthesize it available, he said. In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, he said.
“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exporting something that you thought was interesting,” he said. “So there is a cost to the community, but the benefit is of course understanding how robust particular results are.”
Read more: https://www.eenews.net/climatewire/stories/1060081313/feed
Some critics have pointed out that Gavin Schmidt’s friend and colleague Michael Mann never disclosed full details of how he produced his iconic climate hockey stick.
Hey Dr. Hypocrite @ClimateofGavin, wake us when you press your fake Nobel prize-winning bud @MichaelEMann to be transparent with his hokey stick data. pic.twitter.com/x3T04EQYnn
— Steve Milloy (@JunkScience) May 10, 2018
The ridiculous defence of data obscurity we’ve seen since EPA director Scott Pruitt announced his open science initiative was never going to last, but it’s good to see how rapidly some members of the climate community are coming to accept that they have to start providing full methods and data to back their research results.
The following is a video of EPA Director Scott Pruitt announcing the end of “secret science”.

Is this the start of the ‘big climb down’? Low sensitivity papers published, and now Schmidt calling for something his ex-boss actively fought. Maybe the wheels have finally come off the wagon; let’s hope so.
Slightly off the main track, but still in the general area of scientific ethics and release of data. I have my own opinion on it which I will reserve for the moment, as I would like to see what others here think. (This subject was once a very lively discussion on Baen’s Bar, a science fiction forum.)
Are there cases where perhaps methods should be fully revealed – but the data gathered be ruthlessly suppressed, and not used due to the methods used to gather it?
I have in mind the nutrition experiments performed by “Doctor” Mengele in the concentration camps, and the similar ones performed by the Environmental “Protection” Agency for PM 2.5.
Please. Virtue signaling and damage control do not impress me, nor does saying that you “support” transparency while whining about how hard it is to post raw data (presumably because D-niers would naively analyze the data without first making the necessary “adjustments”?).
Schmidt is doing nothing but prolonging his own job. He’s as guilty as Mann and for the same reasons. If I wrote what I really think of him I’d be banned for life. *Swamp creature*
Uh-oh. Does this mean that Gavin will now be subjected to the wrath of the UCS (for disagreeing with them)?
https://www.ucsusa.org/news/press-release/scientists-oppose-new-pruitt-restrictions#.WvVzpK2B1E4
Science is much less useful if it cannot be relied upon. The complaint that making science useful comes at a cost strikes me as weak. I could use the word “embarrassing” but I think I’ll go with “beclowning.”
Science can never be settled, or secret. It has to be open for review, debate and acceptance of different interpretations. That way it can only get stronger and be more acceptable to people generally. The more open and frank it is the better for everybody, so long as we are all promoting the truth and not following some political diktat. One of the strongest reasons why I always fought against the “science” of climate change is because it was closed to scrutiny, and to me that smacks of the old freemasons: if it doesn’t stand up to examination then it is not trustworthy. You cannot vilify people just because they believe differently; that is draconian and not the least bit intelligent.
Such virtues were lost long ago among the advocacy armies involved. It is another country of nonprofits and money involved at this point with sizable payoffs for certain key players.
I am not sure where the ‘extra cost’ comes from. Let’s say a high-powered team of researchers each have a copy of the dataset on their own PC, and each researcher is running tests on the data. When they want a colleague to see a result they will not expect them to type in a series of commands with the possibility of error. They will send them a script, a file with a few lines of text that performs the data analysis and produces some results or a graph. This is most important when researchers are working in different universities or institutions.
So, come publication time, just upload your data files and script. No cost, no time, no problem. Just upload the copy you sent to the journal.
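To make that concrete, here is a minimal sketch of the kind of script being described. The file name and column names are invented for illustration; the point is that the whole analysis, from raw data file to published number and figure, fits in a handful of lines that anyone can rerun.

# reproduce_trend.py -- an illustrative, self-contained analysis script
# Assumes a hypothetical file "anomalies.csv" with columns "year" and "anomaly_c";
# both names are made up for this example.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("anomalies.csv")   # the data file archived alongside the paper
slope, intercept = np.polyfit(data["year"], data["anomaly_c"], 1)
print(f"Linear trend: {slope * 10:.3f} degrees per decade")

plt.plot(data["year"], data["anomaly_c"], label="anomaly")
plt.plot(data["year"], slope * data["year"] + intercept, label="linear fit")
plt.xlabel("Year")
plt.ylabel("Anomaly")
plt.legend()
plt.savefig("trend.png")              # the figure a colleague or reviewer can regenerate

Anyone who downloads the data file and this script can regenerate the number and the figure exactly; that is the entire ‘cost’ being debated.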
No wonder climate science is in a bad way if trivial acts are deliberately made difficult or opaque.
Steve, you are assuming honest, quality work. The things done to data and statistics to get them to give the results they want are messy. “Hiding declines” and the fudging require much trial and error. I predict a surge of climate scientist retirements and an order of magnitude or two reduction in the number of papers published.
I think this requirement should be retroactive, with the option to provide the data and code or withdraw the published paper. Pruitt should announce that work published without data and code will not be given any consideration. This might be the key to overturning the CO2 endangerment finding and other laws.
“Just upload the copy you sent to the journal.” Will the journals go out of business if this procedure could be implemented? Follow the money….
Yep. Follow the money. The journals will be one of the entities hurt the most. Not only in “climate”, but also in many studies of other fields that use “global warming” as a crutch to prove their hypothesis.
“Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.”
Simply not enough facepalms. Doubt Gavin even knows how that reads to an actual scientist.
What is the point of research if no one can trust the results?
BTW, the best definition of trust I ever heard was from Rodney Dangerfield:
“Trust is asking for a bj from a cannibal.”
It’s the typical bloviating by “experts” who are able to rely on the fact that 95% of the public can’t make a confident judgement about the validity of what they are claiming. Gavin knows that it stinks to high heaven that he wants to hide the data or the methods. He’s a political actor above all. So he “supports” transparency, but like Bill Clinton promising a tax cut that he never planned to deliver, we’ll soon find out that he tried as hard as he has ever tried anything in his life, but it was just TOO HARD. Sorry, he will probably also take FULL RESPONSIBILITY for the failure, and DEEPLY REGRET it. But chances are, the only story anybody hears will be that Gavin Schmidt is leading the drive for transparency. What a guy.
Let’s see, you could post a few gigs of raw data on Google Drive for free, along with your source code, which might reach a few hundred KB. Now, you had to complete the source code in order to analyze the data, and you had to collect the data in order to analyze it. So help me out here, people: where is this complicated? Click, click, done. Furthermore, if you can’t or won’t explain your logic for how you analyzed (or is the right term “tortured”?) the data, then how could anybody independently replicate the results?
They need a course in data management. Archiving the database and code as you go would even streamline research (if the study is a legitimate one). The real problem they are having is that it would impose a discipline discouraging the stats and data iterations and weighting done in search of support for preconceived results and significance – this is messy work.
The sudden steep drop in papers published will be a very direct measure of the abysmal quality of all the stuff cluttering a profusion of journals.
How overinflated is your sense of self importance if you think you should be the one to decide whether other scientists should spend their time reproducing results or doing new research.
FOLKS, THERE ARE NO “EXPERIMENTS” GOING ON HERE!!!
If you do not understand how there are no experiments going on here, go back to the beginning and start over. I went to “open house” at my kid’s middle school, and saw in his science classroom:
the scientific method, in a series of posters on the wall. Thankfully, the teacher got them in the right order.
In this, a fledgling scientist with a hypothesis, and some kind of beaker, tests whether the predictions of the hypothesis emerge, or not, when the fledgling scientist MANIPULATES SOME VARIABLE UNDER STANDARD CONDITIONS.
No one is manipulating any independent variables, or dependent variables. Not Gavin, not Pruitt, no one.
“We” have shown up late to the game. The phenomena have already happened. We are just trying to piece together what happened. This is a historical investigation, not an “experiment.”
Wikipedia hits it out of the park with their opening statement under their “experiment” entry:
“An experiment is a procedure carried out to support, refute, or validate a hypothesis. Experiments provide insight into cause-and-effect by demonstrating what outcome occurs when a particular factor is manipulated. Experiments vary greatly in goal and scale, but always rely on repeatable procedure and logical analysis of the results. There also exists natural experimental studies.”
Why can I be a decent climate-science critic even though I never took geology or meteorology? Because I know science.
https://en.wikipedia.org/wiki/Experiment
“will distract scientists from research” is truly lame. He is thinking of the hide and seek fortune types out there that he works with, not real scientists.
“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios…”
The benefit of reproducibility is being able to distinguish between what actually is science and what is purported to be science. If John Lennon were alive to write about the state of climate science research and policy impacts it might go something like…
Imagine there’s no pikas,
It isn’t hard to do…
No polar bears or penguins,
Except in a zoo…
Forests turned into wood pellets,
To limit CO2…
… etc.
@Nick Werner
Target destroyed. With sense of humour.
And my own two penn’orth…
…as others have observed, replication is the larger cost. To do so is a choice; it is not mandatory.
In fact it may be free: I can see (so-called) citizen scientists even treating it as “fun”.
Saving your data & code is a very modest overhead as is making it available.
The former becomes second nature to anyone outside academia. Indeed without it you or your employer may be liable should something go wrong. It helps to get it right next time.
The latter is usually already there, most educational establishments have web access for stuff such as course notes.
This comes as no surprise. It is becoming more and more evident that much science (medical, social?, climate, etc.) has broken down. One of the main reasons is government funding and acceptance of poor scientific methods. It is more important to get the next round of funding than it is to take the time to make sure your current science is correct. The plethora of statistical software, data manipulation, and computer models rather than physical results has made it easy to obtain made-up results and publish them regardless of whether they are real or not. Too many so-called scientists are really THEORETICAL climate scientists (just like theoretical physicists) who work on a blackboard but never do the physical experiments to prove their theories. Politicians don’t care as long as they have something in their hands to justify spending money on.
The focus on a global temperature rather than global heat is indicative of how perverted the science has become. Just for fun, play along and assume a reasonable global temperature is developed. Will this really tell you what the amount of global heat truly is? Will it tell you what the ‘climate’ in the Sahara, Andes, or Steppes will be? Climate science has jumped over a large number of basic steps in determining how the earth works, yet wants folks to think its hypotheses are all correct. Would any self-respecting physicist have accepted there truly being a Higgs boson based solely upon computer models?
He’s going to need quantum computers now to stay ahead of the model and data checking resulting from integrity disclosure rules. I predict much more complicated models consuming ever more of the computing hardware budget in this complexity-for-deception arms race. The IRS computing budget requests will pale in comparison.
Expecting Gavin Schmidt to protect the honesty of the historic temperature record is like expecting Bernie Madoff to protect your retirement savings.
Hah, Schmidt himself knows that ‘extra cost’ isn’t salving his conscience. Time to feel sorry for him? No, not yet, but soon, and for the rest of his life.
========================
“NASA Gavin Schmidt Joins Call for Climate Data Transparency”
I’d like to know what program(s) were/are used to archive the data and just what were/are the settings.
One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt,
Wow – that’s an admission – climate scientists can’t handle large amounts of data
As well as the ability to replicate an experiment, the other big quality assurance factor is measurement uncertainty.
The error envelopes that one sees so often on graphs are almost always a partial expression of uncertainty. In a lot of climate work, if the realistic, full uncertainty were estimated and shown, so much data would simply rattle around within those limits that typical exercises like trends of temperature over time would become meaningless. All within the error bounds.
There are formal structures for treatment of errors, a prominent one being from the Paris-based BIPM, the International Bureau of Weights and Measures. I have never seen a reference to the BIPM in a climate paper, though there must be some mentions that I have missed.
Proper error bounds can be embarrassing. You write your paper, play with your conclusions, then see that the bounds are so large that you have no paper at all.
However, if you are a climate researcher, you often set down what you want to prove, find some data to fit the preconception, then do rote service to a simple error estimation which you might or might not bother to show.
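To put rough numbers on that point (the figures below are invented purely for illustration, in the GUM style of splitting random and systematic components): a random per-measurement error averages down as more values go into a mean, but a systematic component that differs between two periods, say from changed instrumentation or adjustments, does not, and it can swallow the very difference being claimed.

import numpy as np

n = 120                   # monthly values in each of two 10-year periods (hypothetical)
sigma_random = 0.5        # assumed 1-sigma random error per monthly value, deg C
sigma_systematic = 0.15   # assumed 1-sigma systematic uncertainty of each period,
                          # taken as independent between the two periods
difference = 0.2          # warming claimed between the two period means, deg C

# Uncertainty of each period mean: the random part averages down, the systematic part does not.
u_mean = np.sqrt((sigma_random / np.sqrt(n)) ** 2 + sigma_systematic ** 2)

# Uncertainty of the difference between two such (independent) period means.
u_diff = np.sqrt(2.0) * u_mean

print(f"difference = {difference:.2f} +/- {2 * u_diff:.2f} deg C (2-sigma)")
# With these made-up numbers the +/-0.44 band swallows the 0.20 difference entirely.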
Consequently, a large number of climate papers would not pass peer review if the realistic error bounds were correctly derived and shown. There is a nice feature of proper error bounds: they are of great assistance in separating gold from dross. Hence, many modern authors try to evade them. Geoff.
Even replication has its flaws because it cannot detect when the process being replicated has a flaw leading to a false result.
For example: the hockey stick can be replicated. The problem is that red noise also gives the same result, which shows the method is not responding to proxy temperature. Rather, it is a fiction of the methodology.
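A rough sketch of that red-noise claim (this is not the original code or data; the AR(1) persistence, the number of ‘proxies’ and the period lengths are all invented for illustration): feed pure red noise into a short-centred principal components step, where each series is centred on the calibration period only rather than on its full length, and check how lopsided the leading PC comes out.

import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, calib = 581, 70, 79   # made-up dimensions
phi = 0.9                                  # assumed AR(1) persistence of the red noise

# Pure red-noise "proxies": no temperature signal in them at all.
shocks = rng.standard_normal((n_years, n_proxies))
proxies = np.zeros_like(shocks)
for t in range(1, n_years):
    proxies[t] = phi * proxies[t - 1] + shocks[t]

# The disputed step: centre each series on the calibration period only,
# instead of on its full length.
short_centred = proxies - proxies[-calib:].mean(axis=0)

# Leading principal component of the short-centred matrix.
u, s, vt = np.linalg.svd(short_centred, full_matrices=False)
pc1 = u[:, 0] * s[0]

# Crude "hockey stick index": how far the calibration-period mean of PC1
# sits from the earlier mean, in units of the earlier standard deviation.
hsi = (pc1[-calib:].mean() - pc1[:-calib].mean()) / pc1[:-calib].std()
print(f"hockey stick index: {hsi:+.2f}")

Rerun it with different seeds, and then with centring over the full series instead, to see how much the outcome depends on that one step.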
Nobody believed it when he said it, or did they?
Per:
https://www.forbes.com/sites/realspin/2016/03/02/so-much-for-obamas-pledge-to-transparency/#12e644816cf9
“President Barack Obama repeatedly pledged he would run the most transparent administration in the history of the United States during both of his presidential campaigns, but the evidence shows Obama’s administration has not only failed to meet that standard, it has actively worked to conceal important information from the public.”
================
Lots of things need to be opaque, just talk to us like adults.
What can they say, that we don’t agree with transparency? What they do say is that it’s extra work and it can be difficult to do; therefore I’m sure it will be difficult to do accurately, and inaccurate data is worse than no data.
“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exporting something that you thought was interesting,”
This translates as:
“So if your analysis is wrong, it’s better to move on to something else than to have your errors brought to light.”