NASA Gavin Schmidt Joins Call for Climate Data Transparency

Guest essay by Eric Worrall

h/t Dr. Willie Soon – NASA GISS head Gavin Schmidt has voiced his support for EPA director Scott Pruitt’s call for more climate research transparency, though Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.

Climate scientists call for more transparency in their field

Scott Waldman, E&E News reporter

Published: Thursday, May 10, 2018

Making data available is part of publishing in the modern era, and there need to be better methods for verifying that the results of a study are statistically valid, said Rich Loft, director of the technology development division at the National Center for Atmospheric Research.

“In the age of big data, journal publications which would have been suitable a hundred years ago [are] not suitable anymore because it’s not actually enough information to reproduce the thing, so somehow we have to extend our definition of peer-reviewed into these analyses,” he said.
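Loft’s point about verifying that results are statistically valid can be made concrete with a small sketch (purely illustrative; the toy data, fixed seed, and 95% interval are my own choices, not anything NCAR uses): a percentile bootstrap asking whether a fitted trend is distinguishable from noise.

```python
import random

def fit_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def bootstrap_slope_ci(xs, ys, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% interval for the slope,
    resampling (x, y) pairs with replacement."""
    rng = random.Random(seed)
    pairs = list(zip(xs, ys))
    slopes = []
    for _ in range(n_boot):
        sample = [rng.choice(pairs) for _ in pairs]
        sx, sy = zip(*sample)
        slopes.append(fit_slope(sx, sy))
    slopes.sort()
    return slopes[int(0.025 * n_boot)], slopes[int(0.975 * n_boot)]

# Invented toy series: a weak upward trend plus noise.
xs = list(range(30))
rng = random.Random(42)
ys = [0.02 * x + rng.gauss(0, 0.1) for x in xs]
lo, hi = bootstrap_slope_ci(xs, ys)
# If the interval excludes zero, the trend is distinguishable from noise.
```

The check is only possible because both the data (`xs`, `ys`) and the method (`fit_slope`, `bootstrap_slope_ci`) are on the table; with either one withheld, a reader has nothing to verify.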

One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt, director of the Goddard Institute for Space Studies at NASA. In addition to storing the data, researchers must make the coding used to synthesize it available, he said. In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, he said.

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exploring something that you thought was interesting,” he said. “So there is a cost to the community, but the benefit is of course understanding how robust particular results are.”

Read more:

Some critics have pointed out that Gavin Schmidt’s friend and colleague Michael Mann never disclosed full details of how he produced his iconic climate hockey stick.

The ridiculous defence of data obscurity we’ve seen since EPA director Scott Pruitt announced his open science initiative was never going to last, but it’s good to see how rapidly some members of the climate community are coming to accept that they have to start providing full method and data to back their research results.

The following is a video of EPA Director Scott Pruitt announcing the end of “secret science”.

May 10, 2018 8:09 pm

LOL, you mean this comes from the same Gavin that would not sit on the same set with Dr. Spencer at the same time? Judging from his past actions, he is obviously concerned with his longevity in his current position. Somebody cue the videos!

George Lawson
Reply to  ossqss
May 11, 2018 2:35 am

He obviously feels that if you don’t agree with the increasingly powerful Trump administration then you are out!

Reply to  George Lawson
May 11, 2018 4:24 am

He could possibly have heard that science that’s not reproducible is not really science. But what he says about the cost is somewhat incoherent.
“In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, [Dr Gavin Schmidt] said.”
Reproducibility means the results scientists claimed are actually valid and will reproduce when tested. We are not talking about benefit to the researcher; we are talking about whether the science holds.
To allow reproducibility, scientists must document their set-up and then publish that documentation. In order to know whether their work is ‘robust’, it is a must, not something that ‘comes with a cost’. Be aware that the administration that pays your salary wants to know what that ‘cost’ you’re talking about actually bought. If you’re making shit that doesn’t reproduce, it might come as a surprise to you that the administration doesn’t want to cover the costs of that. In fact, if a large-scale bunch of non-science in a NASA office is found non-reproducible, it rather suggests the people responsible were attempting fraud, and an investigation should be started. Not that it is easy, because many of those have good friends in the administration ready to stop any attempt to enforce transparency.

Reply to  George Lawson
May 11, 2018 4:38 am

And “Mosher says” I’m talking about being able to replicate, replicability. Fool me. But it requires transparency, and reproducibility.

Dr Deanster
Reply to  George Lawson
May 11, 2018 6:49 am

Hugs … what struck me in that sentence was “benefit to the researcher” …. and then he goes on to say the researcher spends all his time reproducing experiments.
That’s just warped. FIRST, the data and methods are provided so that OTHER researchers can reproduce and validate your work, not the original researcher … and SECOND, validation by other scientists certainly IS a benefit to the researcher, both original and replicating.
Sometimes I don’t know where they find these guys.

Reply to  George Lawson
May 11, 2018 7:10 am

” In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher,”
…so we will just generate a bunch of crap and take our word for it

Mark Luhman
Reply to  George Lawson
May 11, 2018 11:57 am

Oh, like what happened to Spencer and Christy. They did not follow the Global Warming gospel, so they were out. I wonder how much Schmidt and Hansen had to do with that. Oh, by the way, the last EPA administrator said openly that a Climate Change Denier had no place in the government. Funny how leftists always kill dissent and dissenters, both figuratively and literally.

May 10, 2018 8:11 pm

Trapped like a rat.

Reply to  markl
May 11, 2018 5:33 am

And looking for an escape route…

Tom Anderson
Reply to  MattS
May 11, 2018 10:03 am

Could he be sniffing a felony rap under 18 USC 2071 for falsifying the temp record?

Steven Fraser
May 10, 2018 8:17 pm

I wonder if we will get to the point where there will be separate peer reviews for the statistical procedures, or (gasp) simply an honest declaration of the uncertainty.

May 10, 2018 8:27 pm

There is a cost, time and money, in public data. Accessibility, security, etc.
The cost of not being able to reproduce an experiment, or at least the analysis, is much, much higher.

Tom Halla
Reply to  Writing Observer
May 10, 2018 8:36 pm

Definitely. If the government is funding a study, it should be available online, with all the supporting data also available. If the study is used as support for a policy, transparency requirements should be rigorous.

Crispin in Waterloo
Reply to  Tom Halla
May 11, 2018 5:22 am

Tom Halla
I think it will come as a shock to many of the younger researchers that public funding means they really, really work for the public, not their private selves for their private gain in career and fortune. If the public funds an experiment then we want to watch. If I am watching and understand how the experiment works, I am free to interpret the results as to meaning and import.
We don’t spend much time on the philosophy of science but there is a clear difference between the result of an experiment and the interpretation of that result. Interpretation involves the exercise of ‘authority’ in a great many cases.
When fundamental errors are included in the interpretation, such as the conceptual error in the feedback calculation pointed out by Monckton, or the comparison of the temperature of a planet with a GHG atmosphere to one with no atmosphere at all instead of to one with a non-GHG atmosphere, the correct answer to questions about the impact of CO2 enhancement will evade them.
Show us the inputs and data and the formulas. We understand what a first order approximation is. And a second order. Let us check the work and validate it.
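The point about orders of approximation is easy to make concrete (a generic illustration only, not any specific climate formula): the first- and second-order Taylor expansions of exp(x) about 0 have truncation errors of roughly O(x²) and O(x³), and that is exactly the kind of work a reader can check for themselves when the formulas are published.

```python
import math

def taylor_exp(x, order):
    """Taylor polynomial of exp(x) about 0, truncated at the given order."""
    return sum(x ** k / math.factorial(k) for k in range(order + 1))

x = 0.1
exact = math.exp(x)
first = taylor_exp(x, 1)   # 1 + x
second = taylor_exp(x, 2)  # 1 + x + x^2/2

# The first-order error is about x^2/2; the second-order error about x^3/6.
err1 = abs(exact - first)
err2 = abs(exact - second)
```

With the inputs and formulas in hand, anyone can rerun this and confirm the second-order error is far smaller than the first-order one; that is what “let us check the work” means in practice.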

John Harmsworth
Reply to  Tom Halla
May 11, 2018 6:17 am

I think that for government-funded studies, where politicians’ fingers are so close to the pie, a rule imposed by the director is completely insufficient. The next director could eliminate it or handicap it.
It should be enshrined in the foundational charter or legislation. Much more difficult to change. Our democracies need to begin to put some boundaries on the politicians to keep them from co-opting the apparatus of government for their own corrupt ends.

Justin McCarthy
Reply to  Tom Halla
May 12, 2018 9:23 pm

As a former public employee (intern to city manager), anything I touched (files, work product, emails, text messages, computer files) was, with few exceptions (personnel records, ongoing real estate negotiations, litigation), subject to Public Records Act requests by anyone, at any time, for any reason. Don’t work for the government or take its money if you do not want to be subject to transparency. Especially if you intend to promote trillions of dollars in policy that impacts hundreds of millions of citizens.

Reply to  Writing Observer
May 11, 2018 1:38 am

Agreed, the cost of government regulation based on fake science is terrifying, and often measured in lives rather than money.

Crispin in Waterloo
Reply to  Writing Observer
May 11, 2018 4:51 am

Writing observer: the cost of not showing the methods and data should be ‘no publication’.
When the cost of not following the norms of evidence is ‘no one believes you’, that cost will quickly be absorbed and the necessaries handed over.
At present there is quite a bit of ‘believe me, I am a scientist’. I hold that a scientist knows they will not be believed unless the evidence is physically produced in the court of public opinion.
Suppose a dinosaur expert produced a paper claiming to have found a fossilized humanoid body 86m years old overthrowing a number of hitherto accepted hypotheses about the evolution of hominids. And when asked to see the fossil, he replied, “Why should I show you? You’d just try to find something wrong with my interpretation. Only my friends can see it.”
Such people should be guided to the Story Corner at the local library.

May 10, 2018 8:30 pm

Oh Eric, you are being too kind to Gavin Schmidt and his ilk, whom you are kindly referring to as “the Climate Community”. The pathetic excuses he is giving for failing to provide data to allow reproducibility are revealing. The purpose of preventing reproducibility was to hide the fact that “climate scientists” have no evidence of human-caused warming and, in fact, don’t have even a remote idea of what causes the climate to change.

Joel O’Bryan
Reply to  hollybirtwistle
May 10, 2018 8:46 pm

My guess is Gavin Schmidt is close to ground-level on this data transparency issue within the broader science community. And resisting transparency probably smells pretty bad at ground-level.

Joel O’Bryan
May 10, 2018 8:38 pm

The responses from the AAAS, the AGU, and the 5 Journal editors (Nature, Science, PNAS, Cell, PLoS) coming out against the EPA’s adoption of data transparency rules were a behind-the-scenes, coordinated effort.
IOW, just a few deep-pocketed puppet masters tugging the strings of those puppet editors and association presidents to make them dance and sing a little jig, because the death of EPA secret science spells deep trouble for them.
Why does it spell deep trouble, one might ask?
It spells deep trouble because the Crown Jewel for the Watermelons is the EPA’s CO2 Endangerment Finding … and it was likely built on Secret Science.
And when it gets re-opened, that old Secret Science will not be allowed to be considered under those new transparency rules.
This link is the official Technical Support Document for the EPA’s Endangerment Finding.
Within this Technical Support Document, when you go to Pages 4-5, Box 1.1, one finds this statement regarding the U.S. Climate Change Science Program (CCSP):
CCSP integrated federal research on climate and global change, as sponsored by thirteen federal agencies and overseen by the Office of Science and Technology Policy, the Council on Environmental Quality, the National Economic Council and the Office of Management and Budget. It produced the 21 Synthesis and Assessment Products (SAPs) “that address the highest priorities for U.S. climate change research, observation, and decision support needs.”

“… Global Climate Change Impacts in the United States that incorporated all 21 SAPs from the CCSP, as well as the IPCC Fourth Assessment Report. As stated in that report, “This report meets all Federal requirements associated with the Information Quality Act, including those pertaining to public comment and transparency.”

It might have met the old “transparency rules” (i.e. lack of), but it likely won’t meet the new transparency rules if the Pruitt EPA can re-open an examination of the Endangerment Finding.
And in a certain court challenge to any EPA overturning of the CO2 Endangerment Finding, the first thing that will come up before the Court is “What science was used?” If the rules the EPA used/uses don’t allow “secret science”, then the battle is probably over if the old finding relied on Secret Science.
P.S. If you really want a good belly-laugh, read pages 2-3, “Section 1(a) Scope and Approach of This Document ” of the above linked Technical Support Document.

Steve O
Reply to  Joel O’Bryan
May 11, 2018 4:39 am

Transparency allows others to rely on the science. If other scientists can’t rely on a study, then what’s the point of even doing it? Of what use is it to pretend there is transparency?

Reply to  Joel O’Bryan
May 11, 2018 7:16 am

joelobryan: Actually, some of the Technical Support Document may not be all that secret; maybe it was snuck in by a real biologist. EPA did have them. The reference is to an IPCC report. Maybe they should require reading assignments.
Box 14.1 Ocean Acidification p134–
“The overall reaction of marine biological carbon cycling and ecosystems to a warm and high-CO2 world is not yet well understood. In addition, the response of marine biota to ocean acidification is not yet clear, both for the physiology of individual organisms and for ecosystem functioning as a whole (Denman et al., 2007). ”

Reply to  Joel O’Bryan
May 11, 2018 8:57 am

EPA’s 2.5 Particulate rule was also built on secret science that they refuse to release.

Reply to  Taphonomic
May 11, 2018 9:48 am

Technical Support Endangerment Document Again
p 22 “According to the IPCC (Fischlin et al., 2007) elevated CO2 concentrations are resulting in ocean acidification, which may affect marine ecosystems (medium confidence).”
p 7 “Medium confidence — About 5 out of 10 chance ”
I seem to remember something explained about this in a statistics course; it must have changed due to expert opinion. They apparently ignore their own use of the word “uncertainty.” Do different people write different sections?
Does the 2.5 particulate rule report have similar contradictions that would give it away? Also, does it reference the very large dust storm that blew across Texas during the height of the 1950s drought? I saw it. It was followed later by a very large flood.

May 10, 2018 8:39 pm

Title should be “Guess who joined climate transparency, 20 years late.”

P Walker
May 10, 2018 8:43 pm

Gavin is trying to keep his job, and what little credibility he has left. That said, I’m under the impression that the EPA’s position on transparency applies to research going forward and not retrospectively, which is far more important, imho. Am I missing something?

Dave Fair
May 10, 2018 8:58 pm

Schmidt actually said: “In addition to storing the data, researchers must make the coding used to synthesize it available.”
Then he said: “Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments …”
Two different things! He is a snake.

Joel O’Bryan
Reply to  Dave Fair
May 10, 2018 9:09 pm

He is acknowledging an obvious fact that gaining confidence in important results comes at a cost.
If one wants to increase confidence, one must attempt to independently reproduce the reported result. If you want to have little confidence (or don’t care about it) in a reported result, then don’t expend the resources needed to reproduce.

Jeff Mitchell
Reply to  Dave Fair
May 10, 2018 11:27 pm

Yes, it has a cost to the community, but how fast do you think he’d flip if we came up with studies that show warming isn’t a problem? He’d want lots of reproduction of results. But results don’t mean anything without reproduction. If he’s suggesting that we skip that step, he is anti-science.
Mike’s nature trick exposed the trickery going into some studies. I strongly suspect that the secret science of the past has a bunch of that too. I would like them to review all that secret stuff and see if it really meets real scientific standards.

Kristi Silber
Reply to  Jeff Mitchell
May 11, 2018 1:20 pm

Jeff Mitchell,
What “secret stuff” do you want reviewed? How do you know it hasn’t been done already?
If you can’t accept that “Mike’s trick” was perfectly legitimate, how are you going to know what is acceptable science, anyway? How much evidence, and what kind of evidence, do you need in order to trust scientists to do their job?

Rich Davis
Reply to  Kristi Silber
May 11, 2018 1:47 pm

Nice try Kristi. Another of your patented strawMann arguments. Unless Jeff Mitchell happens to be a climatologist, he or I for that matter would not be the ones advancing science by challenging the methodology. But the assumption that either of us would be hopelessly unable to follow the science is equally wrong.
In a healthy scientific debate, proving an alternative theory or enhancing an existing theory is what earns scientists their credentials. There is a competition of ideas, and there is no such thing as deciding who is right by a consensus vote. It shouldn’t be: go to college, do your postdoc work, and then you get a license to “do your job”, meaning everybody has to just bow to your superior knowledge and expertise.
With apologies to John Lennon, “All we are asking is Give the scientific method a chance”

Dave Fair
Reply to  Kristi Silber
May 11, 2018 2:01 pm

“Mike’s Trick” was to truncate proxy data in 1960 and replace it with measured data on their graph without explaining it to reviewers, Kristi. That is called scientific fraud. Spin that baby.

Reply to  Jeff Mitchell
May 11, 2018 2:52 pm

If the review has to be done in secret, then for all practical purposes, it didn’t happen.

Reply to  Jeff Mitchell
May 11, 2018 2:53 pm

Rich, a couple of days ago, Kristi told us that it was OK to hide data, because skeptics might use it to sow confusion.

Reply to  Jeff Mitchell
May 12, 2018 4:36 am

And with that response, Kristi, you show your ignorance.

Hokey Schtick
May 10, 2018 9:12 pm

Funny how there are only two directions temperature can go (up or down), yet a seemingly infinite number of ways to argue about it.

Reply to  Hokey Schtick
May 11, 2018 6:36 am

Argument is an emergent property of data.

Mark T
May 10, 2018 9:32 pm

He’s trying to save his job.

Reply to  Mark T
May 11, 2018 5:35 am

I expect to see the same thing from many scientists as the wheels start coming off. They will be looking for exit routes, ways to maintain their credibility, at least those with any sense of shame (excluding the likes of Mann and Hansen for example).
I think it is why the IPCC left in some low climate sensitivity estimates, it gives them a ‘well, we always thought this might happen anyway’ excuse.

David Chappell
May 10, 2018 9:37 pm

“…providing enough data and method to ensure reproducibility will distract scientists from research.”
Isn’t ensuring reproducibility the be-all-and-end-all of research, rather than a distraction?

Reply to  David Chappell
May 10, 2018 10:12 pm

There is a difference between REPRODUCABILITY and REPLICATION.
REPRODUCABILITY: Scientist delivers the DATA AS USED, and CODE AS RUN, to enable others to REPRODUCE the CLAIMED RESULT. The results are typically tables, graphs and charts. You supply your data, your code, and I should be able to get the SAME RESULT using the SAME DATA and the SAME METHOD.
REPLICATION: I get a sufficiently similar result to you using the same data and different methods, different data and the same method, or different data and different methods. For example: you test substance A on one set of test subjects. I choose different subjects, the same substance and the same methods. I’ve replicated your result, not reproduced it exactly.
The aim of reproduceability is QA and building foundational tools and data for other scientists. For example, when Willis gives me his code I can run it to quickly check that he published the result of the code, and more importantly I can use his code to do my own EXTENSIONS of his work.
The aim of replication is to test the robustness of the conclusion to changes in data and methods.
Broadly speaking, of course.

Reply to  Steven Mosher
May 11, 2018 12:30 am

Wow, thanks. And the caps are a real help. But we know this already. Why not send it to your climate friends, with a little note explaining how not doing these things means your work can be, and should be, ignored?

Reply to  Steven Mosher
May 11, 2018 3:24 am

reproducability (uncountable)
Misspelling of reproducibility.

Not a typo.

Latimer Alder
May 10, 2018 9:46 pm

‘We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it’
The phrase that originally told me there was something fundamentally wrong with ‘climate science’
Uttered by one of Schmidt’s fellow Climategateers, Phil Jones.

Latimer Alder
May 10, 2018 9:52 pm

To editor: please look again at the video of Mr Pruitt. 45 minutes of a blank stage, and then his actual words are very jerky… some missed out?

Reply to  Latimer Alder
May 11, 2018 5:08 am

At 43.44 and 46.21 someone thanks him.

J Mac
May 10, 2018 10:01 pm

Gavin’s skin is so thin, you can see right through it. Transparency…. indeed.

May 10, 2018 10:04 pm

“Some critics have pointed out that Gavin Schmidt’s friend and colleague Michael Mann never disclosed full details of how he produced his iconic climate hockey stick.”
huh. what is the point of that?
I might point out that Scafetta, a friend of Anthony’s, refused to give his data to McIntyre.
I might point out that Monckton, a friend of Anthony’s, refused to share his code.
To what end?
In fact the only people to successfully deny me code or data are skeptics.
Heck, even McIntyre refused to share the data he used for Watts 2012 when they published that on the web.
Does anyone want to criticize Willis or Anthony because their ‘friends’ did something wrong?
FFS, y’all ever hear of the concept of individual responsibility?
Mann is responsible for Mann.
Gavin is responsible for Gavin.
There is no point in holding one guilty for the sins of another.
Encourage open data and open code. Demand it of everyone. Praise those who come to the right side of the issue and move forward.
Win gracefully.

Warren Blair
Reply to  Steven Mosher
May 10, 2018 11:10 pm

“Encourage open data and open code. Demand it of everyone. Praise those who come to the right side of the issue and move forward. Win gracefully.”
Don’t agree with all Mosher pronouncements; however, the above gem is likely the best advice ever written on WUWT, IMHO.

Bill Treuren
Reply to  Steven Mosher
May 10, 2018 11:18 pm

Totally agree.
Everybody who works in the area of research knows the huge force of confirmation bias.
If Gavin and co see an issue with their past, or hold the view that this could make more people agree with their view, then all power to them.
Maybe the old saying that science changes one funeral at a time omitted the corollary: one Pruitt at a time.
Be gracious, as a grandmother would say to me.

Kristi Silber
Reply to  Bill Treuren
May 11, 2018 1:41 pm

“If Gavin and co see an issue with their past or hold the view that this could make more people agree with their view then all power to them.”
Why not just take it at face value and accept that scientists have integrity? This is in the best interest of scientific advancement, it’s not some political ploy. Schmidt has suggested this before, encouraging others to make their models available.
Pruitt, on the other hand, is not acting in the nation’s best interest. There are data that cannot be shared due to confidentiality agreements, and not being able to use these in policy decisions is counterproductive. The scientific community is already handling its problems and doesn’t need the government to step in. When Pruitt is cutting important research programs that are in place and gathering data, he has no business adding bureaucratic red tape that will cost millions to oversee.
I support transparency, but not this way.

Rich Davis
Reply to  Bill Treuren
May 11, 2018 2:04 pm

Why not just take it at face value and accept that scientists have integrity?

Seriously? In Kristi’s world, Scientists are some kind of race of angels apparently. They do not respond to incentives such as the need to get a grant from political organizations that already know what answer they want. They have never been known to, I don’t know, try to “hide the decline”. They do not care a whit about disincentives like losing their grant money if they give the wrong answer.

Reply to  Bill Treuren
May 11, 2018 2:56 pm

Kristi, unlike you, we don’t accept things at face value just because somebody told us to.
That’s why we are called skeptics. As opposed to true believers.
We want you to prove it. Anyone who has to hide their data is proving that they can’t be trusted. We have 40 years of experience with the same people that proves that.
Confidentiality agreements. The second oldest excuse.
If a study can’t be replicated, then it must be regarded as disproven. It doesn’t matter why it can’t be replicated.

Reply to  Bill Treuren
May 11, 2018 2:57 pm

Rich, it’s worse than that. Only the scientists that Kristi agrees with are of the angels.
Anyone who produces a study that she disagrees with is just a pawn of the big oil companies.

dodgy geezer
Reply to  Steven Mosher
May 10, 2018 11:44 pm

…heck even McIntyre refused to share the data he used for Watts 2012 when they published that on the web….
Given McIntyre’s major commitment to the cause of open data sharing, I find that assertion surprising. Do you have any reference for it?

Reply to  dodgy geezer
May 11, 2018 6:24 am

This July marks 6 years.
I would even settle for 100 stations of data.

John Endicott
Reply to  dodgy geezer
May 11, 2018 7:56 am

You mean the post to which Anthony replied:
“Mr. Mosher knows my email, and has my telephone number, and mailing address, and so far he hasn’t been able to bring himself to communicate his concerns to me directly, but instead chooses these potshots everywhere.”
So, did you ever communicate directly with him Mosh? you have his email, telephone, and mailing address so surely 6 years is plenty of time to dash off an email or make a phone call!

Reply to  dodgy geezer
May 11, 2018 5:13 pm

McIntyre’s collaborator, McKitrick, has refused to share data, citing protection under Canadian law.

Reply to  dodgy geezer
May 12, 2018 2:08 am

““Mr. Mosher knows my email, and has my telephone number, and mailing address, and so far he hasn’t been able to bring himself to communicate his concerns to me directly, but instead chooses these potshots everywhere.”
So, did you ever communicate directly with him Mosh? you have his email, telephone, and mailing address so surely 6 years is plenty of time to dash off an email or make a phone call!”
Yep, sent a mail. Even met with him after this.
It’s a simple request. For 6 years he has held on to the reclassification of the surface stations.
1. All I need are the Class 1 and Class 2. Not all the stations.
2. I could even work with a sample (30-60) of those.
3. I am willing to sign an NDA and never publish any results using that data.
Still no.
So, as I predicted 6 years ago, the paper was published on the web. People still take the results at face value. The key data will never be shared, yet people will still refer to it.
For a similar case see the Gergis paper, also retracted at the same time and subsequently published.

Reply to  dodgy geezer
May 12, 2018 2:10 am

“McIntyre’s collaborator, Mckitrick, has refused to share data citing protection under canadian law.”
Yes, I found a large error in McKitrick’s data and he refused to acknowledge it or re-run his results.

Reply to  Steven Mosher
May 11, 2018 12:13 am

Steven Mosher
“win gracefully”
I wasn’t aware it’s a competition.

Brett Keane
Reply to  Steven Mosher
May 11, 2018 12:28 am

Steven Mosher
May 10, 2018 at 10:04 pm: Mosh, you have zero credibility, simple as that, and your word cannot be trusted. Just a shill for the conmen…..

Reply to  Steven Mosher
May 11, 2018 12:32 am

Right, because the Climategate emails showed that the individuals never, ever, never discussed withholding data?
FFS, you never actually read this stuff?

Reply to  Steven Mosher
May 11, 2018 11:16 am

Please describe the public funds that McIntyre & Monckton (and anyone else you want to reference to support your point) have utilized to acquire the data that “they wouldn’t share”.
Then please go to the produce section (that’s where they sell fruit and veggies) of your local store, pick up an apple, carry it to the next aisle and plop it down in the middle of the oranges. See the difference?

Kristi Silber
Reply to  DonM
May 11, 2018 1:58 pm

No. There is absolutely no difference. This isn’t a matter of who “owns” the data, it’s a matter of the advancement of science. Scientists are part of a community that works together in the search for truth. Easy access to data and code is a fairly new issue primarily because of the internet, although the whole climategate fiasco put it into the public spotlight.
The idea that anyone should have access to someone’s data just because they pay taxes makes little sense. That’s like saying that, since I pay taxes, I should be able to sleep at the White House.

Reply to  DonM
May 11, 2018 2:58 pm

Fascinating: just a few minutes ago Kristi was telling us how it was legitimate for scientists to withhold data.
I guess it’s only legitimate if you like what the “scientist” is producing.

Reply to  DonM
May 11, 2018 5:08 pm

It is obvious that you did not put in the effort to go to the produce department to actually compare the apples and oranges.
But maybe I give you too much credit … maybe you really can’t tell the difference … no matter how hard you try.

Mr Bliss
Reply to  Steven Mosher
May 11, 2018 3:36 pm

Then please ask Mann for his data, code and methods

Reply to  Steven Mosher
May 12, 2018 4:41 am

Have to agree strongly with Steven Mosher on this point. There should be no get-out-of-jail-free cards for anyone on any side of the debate. A level playing field for all, and the science will win out in the end.

May 10, 2018 10:34 pm

“…..reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, he said….”.
Well, it does have a clear benefit to those that wish to use the information.
Nothing like knowing something has a sound base, Gavin, and is not just opinion. Or have you forgotten the lecture on Scientific Method 101??

John Hardy
May 10, 2018 10:48 pm

There seem to be many personal attacks on Mr Schmidt in these comments. That isn’t right, fair or even sensible. As reported here, his action demonstrates a respect for correct procedures and may have been done at the cost of vilification by the “never mind the truth, feel the propaganda value” brigade. Credit where credit is due, even from someone with whom you disagree radically.

Reply to  John Hardy
May 11, 2018 5:44 am

NB: Gavin Schmidt draws ire because he has already demonstrated himself to be scurrilous.

Reply to  John Hardy
May 11, 2018 12:01 pm

Gavin is a shill and a coward. He uses the term denier to attack skeptics and would not share a TV set with a “denier”. Gavin is being paid with my tax dollars. I’ll call him what I want until he behaves like a professional and a gentleman.

Kristi Silber
Reply to  John Hardy
May 11, 2018 2:06 pm

John Hardy,
I like the fact that you’ve pointed out that it is not right to attack Dr. Schmidt. At the same time, I think it’s highly unlikely that his actions “may have been done at the cost of vilification by the ‘never mind the truth, feel the propaganda value’ brigade.” I’m not even sure what brigade you mean, or that there is one that says, “never mind the truth” unless it’s the skeptics who deny there is evidence of AGW, and that it’s a problem.

Reply to  John Hardy
May 12, 2018 4:46 am

john, i think steve has a fair point. gavin works for the american taxpayer. i always laugh when i hear the term government employee when the government are employees themselves. it is an easy mistake to make given the attitudes of those in government the world over.

May 10, 2018 10:49 pm

Oh, goody… Government “transparency”… a contradiction in terms…
Perhaps Gavin could start his “transparency” initiative by ordering NOAA to reinstate/update their graph entitled “DIFFERENCE BETWEEN RAW AND FINAL USHCN DATA”, which NOAA deleted from their website in June 2017 (here is the Wayback Machine archive capture).
Moreover, this deleted graph only showed NOAA/NASA’s raw data tampering up to 1999, and I’d love to see how much heat they’ve added since 2000, especially following the major KARL 2015 “Pause-Buster Fix”…
Put up, or shut up, Gavin….

Reply to  SAMURAI
May 11, 2018 6:17 am

1. that’s a version of the data no one uses.
2. all the raw data is posted.
3. same for Karl.

Reply to  Steven Mosher
May 11, 2018 7:34 pm

1) Of course CAGW grant grabbers don’t use this raw-data tampering format anymore because it’s far too “transparent”, and exposes the HUGE amount of raw data tampering that is required to keep the CAGW grant train chugging along….
The fundamental problem with the CAGW sc@m is that it doesn’t follow the rules of the scientific method, which demand that scientists adjust the HYPOTHESIS when empirical data doesn’t support hypothetical projections… Conversely, CAGW “scientists” adjust the empirical evidence to support their hypothetical projections, which is the definition of junk science.

Crispin in Waterloo
Reply to  SAMURAI
May 11, 2018 6:37 am

Good choice. I support your call. Adjustments are a highly controversial action when so much rides on the adjusted result. Mosher says all the adjustments are justified. Fine, then there is absolutely nothing wrong with showing them, with an explanation for each so they can be peer reviewed. Let’s get everyone on the same page with facts and evidence.
Personally I am waiting for evidence that the water vapour feedback factor exceeds 1.05.

Jim Gorman
Reply to  Crispin in Waterloo
May 11, 2018 10:22 am

Yes. I’ve got a real problem with adjusting past data because of current data, especially past data that were manually recorded. And any station’s past data that has been adjusted multiple times is especially suspicious. Once adjusted, there should be little reason to do it again.

Kristi Silber
Reply to  Crispin in Waterloo
May 11, 2018 3:08 pm

How about we make every profession equally transparent? Then we can all pick apart the way lawyers and bankers and concierges and plumbers do their jobs. We can say, I own stock in this business, so they have a duty to show me every transaction they make. A totally transparent society, so we can tell if people are doing their jobs correctly. Heck, being a defense lawyer in a criminal trial can’t be that tough; maybe we could show them where they go wrong.
Do you see what I’m getting at? We are not qualified to determine whether scientists are doing their job. Data adjustment is ESSENTIAL in cases where there are systematic biases in the data. It is not simply a matter of making a graph and looking for oddities; there are algorithms used to pick up peculiarities in the data, and adjustments are validated afterwards.
People have a very peculiar habit of assuming something has not been examined or reported in climate science, although they don’t ever look for it. That’s fine that they don’t comb through the scientific literature – that’s what scientists are for.
If you skeptics want to question and critique science and play scientist yourselves, fine. But remember that this is why you are known as deniers. You deny the science. You don’t trust scientists. You think you know who they are, as if you can see into their hearts and minds, which allows you to make all kinds of assumptions about them – that they suffer from “groupthink” or socialist ideology or whatever. You think you would do their job better than they would. So you side with the few but vocal scientists who don’t agree about the extent of or reason for global warming, but are united in their support of the fossil fuel industry. Many are associated with conservative think tanks that were paid by FF. Some scientists were directly paid by FF, some through front organizations. So, um, why is their science better and why are they more trustworthy?

Rich Davis
Reply to  Kristi Silber
May 11, 2018 4:44 pm

Already said this, but the point has nothing to do with whether any of us are scientists. It is that scientists can check each other’s work, even if they don’t agree with each other or maybe don’t even like each other – not just fat, balding, middle-aged guys with weird circular goatees checking each other’s secret work.

Paul Courtney
Reply to  Crispin in Waterloo
May 11, 2018 6:27 pm

Kristi: No, none of us see what you’re getting at. Please explain it, and don’t leave any thought unexpressed. We’ll soak it all in. Make it really epic, just let it rip, and it’s bound to sway us all.
Better still- don’t.

Warren Blair
May 10, 2018 11:04 pm

Adolf would have liked Gavin on his team . . .
Impaired empathy with bold, disinhibited and egotistical traits.
Smooth seamless delivery.

Barry Sheridan
May 10, 2018 11:39 pm

I find it rather ridiculous that those working in the scientific field have so corrupted their own sphere by misuse of data and methodology that it takes a political figure to bring about a change in attitude. Kudos to Mr Pruitt, who has introduced sanity to the EPA. Let us hope the data and analysis used to determine public policy will in the future be free of any such concerns.

May 11, 2018 12:28 am

… Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.

He’s absolutely right. That will probably happen … one time in ten thousand.
The computer programming equivalent is writing brilliant highly efficient code with no documentation or comments. Worse than useless.
The brilliant cowboy paradigm for computer programming usually just hides hacking. It might work but the programmer can’t explain exactly why it does. The resulting programs break and can’t be maintained, even by the cowboys who originally wrote them. That approach is actually quite bad for productivity.
Similarly, science has a replication crisis. Much of the time, scientists can’t even reproduce their own experimental results.
Science will be much better off when scientists start doing the job right. We also have to get rid of the perverse incentives that reward bad science. link Productivity will improve if for no other reason than that garbage will be easier to detect and remove.
On the other hand, one of the biggest time wasters in science is the grant writing process. link Fixing that would provide the time necessary to do the science right.

Reply to  commieBob
May 11, 2018 2:03 am

From a layman’s perspective, surely the cost of the computer/data work ought to be included in the initial costings of the project. That in turn ought to be scrutinised and assessed relative to the project objectives.
Great link to the perverse incentives by the way. Almost from para. 1 it explains why scrutiny isn’t undertaken. Very bad news for the scientific community.

Reply to  HotScot
May 11, 2018 4:46 am

The other thing is work habits. If you’re organized you avoid a lot of wasted effort. If you do things properly in the first place you won’t have to scramble to make your data and code presentable after the fact.

May 11, 2018 12:49 am

Pigs demand pilot licences.

May 11, 2018 2:28 am

I’m not sure that climate science has evolved to the point where a lack of reproducible results is its biggest flaw. Its biggest flaw is the sheer amount of subjectivity that infects the field, which is why the IPCC has to rely on “opinion” and “judgment” to “interpret” the research in a way that gets to the critical issues of how much warming is due to man and what the quantifiable impacts will be. Reproducibility presumes objective results that need to be reproduced via experimentation. Almost all of the climate “research” I’ve read forms subjective conclusions based on a whole series of subjective assumptions about the data collected, or the best way to analyze and/or “correct” data.
In fact, the very definition of “climate” is nothing but a set of abstract, easily manipulable statistics, with no standardized metrics for quantifying those statistics. Do a Google search and try to lock down the period of time that temperatures, for example, have to be averaged before you get “climate” instead of “weather.” You’ll find a lot of weasel language like “between decades to sometimes centuries” or “typically thirty years.” It’s almost as if the scientists want the flexibility to find the results they want. But in any case, when no one has bothered to define the set of essential statistics that make up “climate” and how to standardize the units by which “climate” is quantified – surely a necessary step to accurately determine how climate is changing – how are you going to “reproduce” climate research?
Real scientists try their best to insulate their opinions from their research when trying to quantify how much influence something has on a system. That’s why we have double blind studies. Climate scientists, however, wallow in their unscientific opinions, illogically bootstrapping their own fictional expertise in the Earth’s climate system to compensate for their inability to produce any kind of objective performance.
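The point about undefined averaging windows can be made concrete with a toy calculation. This is a sketch with entirely synthetic numbers (no real station record): the same series reports a different “climate normal” depending on which averaging period you pick, which is exactly the flexibility the comment complains about.

```python
import random

# Synthetic annual "temperatures": a flat baseline plus noise plus a step
# change in 1970 -- purely illustrative, not real data.
random.seed(42)
years = list(range(1900, 2000))
temps = [14.0 + random.gauss(0, 0.3) + (0.5 if y >= 1970 else 0.0) for y in years]

def climate_normal(values, start, length):
    """Average over a chosen window -- the reported 'climate' depends on it."""
    window = values[start:start + length]
    return sum(window) / len(window)

# Same record, same start year, different "climates":
decade = climate_normal(temps, 60, 10)   # 1960-1969, misses the step
thirty = climate_normal(temps, 60, 30)   # 1960-1989, spans the step
print(round(decade, 2), round(thirty, 2))
```

With no agreed definition of the window, either number can honestly be called “the climate” of the 1960s onward.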

Reply to  Kurt
May 11, 2018 3:31 am

Well stated Kurt. Back in the day we produced data dictionaries. When I see the word climate change my head starts spinning.

Reply to  Kurt
May 11, 2018 9:21 am

To go even further and to try and not teach Grannie to suck eggs:
The fundamental issue with current climate science is that the idea (not even a proper hypothesis, as other elements of the system and their effects are unknown) requires a level of certainty not present in the data. So immediately you have to massage the data, and hence you are performing a priori a hypothetical exercise, i.e. mathematical masturbation.
The chestnut I hear is that that’s because it’s an observational science. Okay, then get good observations, or tell everyone up front that you can’t, so that we don’t have to listen to you rant about the End of the World.
It isn’t a scientific issue anyway; it’s an advocacy issue with shades of Marxism. People don’t like to hear that either but if the shoe fits.

May 11, 2018 2:35 am

Yes, reproducibility has the cost of slowing the pursuit of new experiments. However, it’s a necessary cost to prevent research going down ‘blind alleys’. The way to speed up scientific research isn’t to limit reproducibility, but to make it quicker and easier to do. That’s best accomplished by making all the data and methods used in all research readily available.

May 11, 2018 2:43 am

Is this the start of the ‘big climb down’? Low-sensitivity papers published, and now Schmidt calling for something his ex-boss actively fought. Maybe the wheels have finally come off the wagon; let’s hope so.

May 11, 2018 3:04 am

Slightly off the main track, but still in the general area of scientific ethics and release of data. I have my own opinion on it which I will reserve for the moment, as I would like to see what others here think. (This subject was once a very lively discussion on Baen’s Bar, a science fiction forum.)
Are there cases where perhaps methods should be fully revealed – but the data gathered be ruthlessly suppressed, and not used due to the methods used to gather it?
I have in mind the nutrition experiments performed by “Doctor” Mengele in the concentration camps, and the similar ones performed by the Environmental “Protection” Agency for PM 2.5.

Rich Davis
May 11, 2018 3:19 am

…it’s good to see how rapidly some members of the climate community are coming to accept that they have to start providing full method and data to back their research results.

Please. Virtue signaling and damage control do not impress me. Neither does saying that you “support” transparency while whining about how hard it is to post raw data (presumably because D-niers would naively analyze the data without first making the necessary “adjustments”?).

Greg Cavanagh
May 11, 2018 3:44 am

Schmidt is doing nothing but prolonging his own job. He’s as guilty as Mann and for the same reasons. If I wrote what I really think of him I’d be banned for life. *Swamp creature*

Kurt in Switzerland
May 11, 2018 3:48 am

Uh-oh. Does this mean that Gavin will now be subjected to the wrath of the UCS (for disagreeing with them)?

Steve O
May 11, 2018 4:31 am

Science is much less useful if it cannot be relied upon. The complaint that making science useful comes at a cost strikes me as weak. I could use the word “embarrassing” but I think I’ll go with “beclowning.”

Sue Thornton
May 11, 2018 4:35 am

Science can never be settled, or secret. It has to be open to review, debate and acceptance of different interpretations. That way it can only get stronger and be more acceptable to people generally. The more open and frank it is the better for everybody, so long as we are all promoting the truth and not following some political diktat. One of the strongest reasons why I always fought against the “science” of climate change is that it was closed to scrutiny, and to me that smacks of the old freemasons; if it doesn’t stand up to examination then it is not trustworthy. You cannot vilify people just because they believe differently; that is draconian and not the least bit intelligent.

Reply to  Sue Thornton
May 11, 2018 1:31 pm

Such virtues were lost long ago among the advocacy armies involved. It is another country of nonprofits and money involved at this point with sizable payoffs for certain key players.

May 11, 2018 4:46 am

I am not sure where the ‘extra cost’ comes from. Let’s say a high-powered team of researchers each have a copy of the dataset on their own PC, and each researcher is running tests on the data. When they want a colleague to see a result, they will not expect them to type in a series of commands with a possibility of error. They will send them a script – a file with a few lines of text that performs the data analysis and produces some results or a graph. This is most important when researchers are working in different universities or institutions.
So, at publication time, just upload your data files and script. No cost, no time, no problem. Just upload the copy you sent to the journal.
No wonder climate science is in a bad way if trivial acts are deliberately made difficult or opaque.
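The script-plus-data package described above really can be this small. A minimal sketch (file layout, column names, and values are all hypothetical): the archived data and the few lines that regenerate the published numbers travel together, so anyone can rerun the analysis exactly.

```python
import csv
import io

# Stand-in for the data file archived alongside the paper (invented values).
# In practice this would be the CSV uploaded with the manuscript.
archived_csv = io.StringIO(
    "year,temp_c\n1995,14.1\n1996,14.3\n1997,14.6\n1998,14.9\n"
)

def published_result(fh):
    """Everything needed to regenerate the reported numbers lives here:
    read the archived file, compute anomalies relative to the first year."""
    rows = list(csv.DictReader(fh))
    temps = [float(r["temp_c"]) for r in rows]
    baseline = temps[0]
    return [round(t - baseline, 2) for t in temps]

print(published_result(archived_csv))  # → [0.0, 0.2, 0.5, 0.8]
```

Uploading this script and its CSV is the entire reproducibility overhead for an analysis of this shape.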

Gary Pearse
Reply to  Steve Richards
May 11, 2018 5:09 am

Steve, you are assuming honest, quality work. The things done to data and statistics to get them to give the results they want are messy. “Hiding declines” and the fudging require much trial and error. I predict a surge of climate-scientist retirements and an order of magnitude or two reduction in the number of papers published.
I think this requirement should be retroactive with the option to provide the data and code or withdraw the paper published. Pruitt should announce that work that has been published without data and code will not be given any consideration. This might be the key to overturning the CO2 endangerment finding and other laws.

CC Reader
Reply to  Steve Richards
May 11, 2018 9:07 am

“Just upload the copy you sent to the journal.” Will the journals go out of business if this procedure could be implemented? Follow the money….

Jim Gorman
Reply to  CC Reader
May 11, 2018 10:32 am

Yep. Follow the money. The journals will be one of the entities hurt the most. Not only in “climate”, but also in many studies of other fields that use “global warming” as a crutch to prove their hypothesis.

May 11, 2018 4:48 am

“Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.”
Simply not enough facepalms. I doubt Gavin even knows how that reads to an actual scientist.

tom in lorida
Reply to  cephus0
May 11, 2018 5:44 am

What is the point of research if no one can trust the results?
BTW, the best definition of trust I ever heard was from Rodney Dangerfield:
“Trust is asking for a bj from a cannibal.”

Rich Davis
Reply to  cephus0
May 11, 2018 8:17 am

It’s the typical bloviating by “experts” who are able to rely on the fact that 95% of the public can’t make a confident judgement about the validity of what they are claiming. Gavin knows that it stinks to high heaven that he wants to hide the data or the methods. He’s a political actor above all. So he “supports” transparency, but like Bill Clinton promising a tax cut that he never planned to deliver, we’ll soon find out that he tried as hard as he has ever tried anything in his life, but it was just TOO HARD. Sorry, he will probably also take FULL RESPONSIBILITY for the failure, and DEEPLY REGRET it. But chances are, the only story anybody hears will be that Gavin Schmidt is leading the drive for transparency. What a guy.
Let’s see, you could post a few gigs of raw data on Google Drive for free, along with your source code that might reach a few hundred kilobytes. Now, you had to complete the source code in order to analyze the data, and you had to collect the data in order to analyze it. So help me out here, people. Where is this complicated? Click, click, done. Furthermore, if you can’t or won’t explain your logic for how you analyzed (or is the right term “tortured”?) the data, then how could anybody independently replicate the results?

Gary Pearse
May 11, 2018 4:53 am

They need a course in data management. It would even streamline research (if the study is a legitimate one) to have the database and code archived as you go. The real problem they are having is that it would be a discipline that would discourage the stats and data iterations and weighting done in search of support for preconceived results and significance – this is messy work.
The sudden steep drop in papers published will be a very direct measure of the abysmal quality of all the stuff cluttering a profusion of journals.

May 11, 2018 5:25 am

How overinflated is your sense of self-importance if you think you should be the one to decide whether other scientists should spend their time reproducing results or doing new research?

May 11, 2018 5:53 am

If you do not understand that there are no experiments going on here, go back to the beginning and start over. I went to “open house” at my kid’s middle school, and saw in his science classroom:
the scientific method, in a series of posters on the wall. Thankfully, the teacher got them in the right order.
In this, a fledgling scientist with a hypothesis, and some kind of beaker, tests whether the predictions of the hypothesis emerge, or not, when the fledgling scientist MANIPULATES SOME VARIABLE UNDER STANDARD CONDITIONS.
No one is manipulating any independent variables, or dependent variables. Not Gavin, not Pruitt, no one.
“We” have shown up late to the game. The phenomena have already happened. We are just trying to piece together what happened. This is a historical investigation, not an “experiment.”
Wikipedia hits it out of the park with their opening statement under their “experiment” entry:
“An experiment is a procedure carried out to support, refute, or validate a hypothesis. Experiments provide insight into cause-and-effect by demonstrating what outcome occurs when a particular factor is manipulated. Experiments vary greatly in goal and scale, but always rely on repeatable procedure and logical analysis of the results. There also exists natural experimental studies.”
Why can I be a decent climate-science critic even though I never took geology or meteorology? Because I know science.

May 11, 2018 7:12 am

“will distract scientists from research” is truly lame. He is thinking of the hide and seek fortune types out there that he works with, not real scientists.

Nick Werner
May 11, 2018 8:04 am

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios…”
The benefit of reproducibility is being able to distinguish between what actually is science and what is purported to be science. If John Lennon were alive to write about the state of climate science research and policy impacts it might go something like…
Imagine there’s no pikas,
It isn’t hard to do…
No polar bears or penguins,
Except in a zoo…
Forests turned into wood pellets,
To limit CO2…
… etc.

colin smith
Reply to  Nick Werner
May 11, 2018 9:44 am

Target destroyed. With sense of humour.
And my own two penn’orth…
…as others have observed, replication is the larger cost. To do so is a choice; it is not mandatory.
In fact it may be free – I can see (so-called) citizen scientists even treating it as “fun”.
Saving your data & code is a very modest overhead as is making it available.
The former becomes second nature to anyone outside academia. Indeed without it you or your employer may be liable should something go wrong. It helps to get it right next time.
The latter is usually already there, most educational establishments have web access for stuff such as course notes.

Jim Gorman
May 11, 2018 11:14 am

This comes as no surprise. It is becoming more and more evident that much science (medical, social?, climate, etc.) has broken down. One of the main reasons is government funding and the acceptance of poor scientific methods. It is more important to get the next round of funding than it is to take the time to make sure your current science is correct. The plethora of statistical software, data manipulation, and computer models rather than physical results has made it easy to obtain made-up results and publish them regardless of whether they are real or not. Too many so-called scientists are really THEORETICAL climate scientists (just like theoretical physicists) who work on a blackboard but never do the physical experiments to prove their theories. Politicians don’t care as long as they have something in their hands to justify spending money on.
The focus on a global temperature rather than global heat is indicative of how perverted the science has become. Just for fun, play along and assume a reasonable global temperature is developed. Will this really tell you what the amount of global heat truly is? Will it tell you what the ‘climate’ in the Sahara, Andes, or Steppes will be? Climate science has jumped over a large number of basic steps in determining how the earth works, yet its practitioners want folks to think their hypotheses are all correct. Would any self-respecting physicist have accepted there truly being a Higgs boson based solely upon computer models?
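The temperature-versus-heat distinction above has a simple physical illustration. The sketch below uses the standard moist-air approximation (sensible plus latent heat, with round-number constants; the two parcels are invented examples): two air masses with identical thermometer readings can carry very different amounts of energy.

```python
# Standard round-number constants for the moist-enthalpy approximation.
CP_D = 1005.0   # J/(kg K), specific heat of dry air at constant pressure
LV = 2.5e6      # J/kg, latent heat of vaporization of water

def moist_enthalpy(temp_k, q):
    """Approximate specific heat content of moist air:
    sensible heat (cp * T) plus latent heat carried by vapor (Lv * q),
    where q is the specific humidity in kg water per kg air."""
    return CP_D * temp_k + LV * q

# Same thermometer reading (30 C), very different heat content:
desert = moist_enthalpy(303.15, 0.005)   # dry air
tropics = moist_enthalpy(303.15, 0.020)  # humid air
print(round(tropics - desert))  # → 37500 J/kg, from humidity alone
```

A global average of the thermometer readings alone would call these two parcels identical, which is the commenter’s point.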

May 11, 2018 11:25 am

He’s going to need quantum computers now to stay ahead of the model and data checking resulting from integrity disclosure rules. I predict much more complicated models consuming ever larger computing hardware budgets in this complexity-for-deception arms race. The IRS computing budget requests will pale in comparison.

The Original Mike M
May 11, 2018 12:46 pm

Expecting Gavin Schmidt to protect the honesty of the historic temperature record is like expecting Bernie Madoff to protect your retirement savings.

May 11, 2018 12:54 pm

Hah, Schmidt himself knows that ‘extra cost’ isn’t salving his conscience. Time to feel sorry for him? No, not yet, but soon, and for the rest of his life.

Gunga Din
May 11, 2018 3:01 pm

“NASA Gavin Schmidt Joins Call for Climate Data Transparency”
I’d like to know what program(s) were/are used to archive the data and just what were/are the settings.

Mr Bliss
May 11, 2018 3:14 pm

One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt,
Wow – that’s an admission – climate scientists can’t handle large amounts of data

Geoff Sherrington
May 11, 2018 5:14 pm

As well as the ability to replicate an experiment, the other big quality assurance factor is measurement uncertainty.
The error envelopes that one sees so often on graphs are almost always a partial expression of uncertainty. In a lot of climate work, if the realistic, full uncertainty were estimated and shown, so much data would simply rattle around within those limits that typical exercises like trends of temperature over time would become meaningless – all within the error bounds.
There are formal structures for treatment of errors, a prominent one being from the Paris-based BIPM, the International Bureau of Weights and Measures. I have never seen a reference to the BIPM in a climate paper, though there must be some mentions that I have missed.
Proper error bounds can be embarrassing. You write your paper, play with your conclusions, then see that the bounds are so large that you have no paper at all.
However, if you are a climate researcher, you often set down what you want to prove, find some data to fit the preconception, then do rote service to a simple error estimation which you might or might not bother to show.
Consequently, a large number of climate papers would not pass peer review if the realistic error bounds were correctly derived and shown. There is a nice feature of proper error bounds: they are of great assistance in separating gold from dross. Hence, many modern authors try to evade them. Geoff.
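Geoff’s “rattles around within the bounds” point can be checked with a toy calculation (synthetic numbers, not any real series): fit a least-squares trend to trendless data with a realistic per-reading uncertainty, propagate that uncertainty into a standard error on the slope, and see whether the “trend” clears it.

```python
import math
import random

random.seed(1)

# Toy series: zero true trend, measurement noise of 0.5 C (one sigma).
# Purely illustrative -- the values are synthetic.
n = 30
x = list(range(n))
y = [14.0 + random.gauss(0, 0.5) for _ in x]

def trend_with_uncertainty(x, y):
    """Ordinary least-squares slope and its standard error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    resid = [yi - (my + slope * (xi - mx)) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)   # residual variance
    return slope, math.sqrt(s2 / sxx)          # slope, std. error of slope

slope, se = trend_with_uncertainty(x, y)
# If |slope| < 2 * se, the "trend" sits inside the error bounds.
print(abs(slope) < 2 * se)
```

When the honest standard error is carried through, a noise-only series usually fails the test, which is the situation Geoff describes for under-reported uncertainties.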

May 11, 2018 5:27 pm

Even replication has its flaws because it cannot detect when the process being replicated has a flaw leading to a false result.

Reply to  ferdberple
May 11, 2018 5:32 pm

For example: the hockey stick can be replicated. The problem is that red noise also gives the same result. This shows the method is not responding to proxy temperature; rather, the shape is an artifact of the methodology.

May 11, 2018 5:49 pm

Nobody believed it when he said it, or did they?
“President Barack Obama repeatedly pledged he would run the most transparent administration in the history of the United States during both of his presidential campaigns, but the evidence shows Obama’s administration has not only failed to meet that standard, it has actively worked to conceal important information from the public.”
Lots of things need to be opaque, just talk to us like adults.

May 11, 2018 7:07 pm

What can they say – “we don’t agree with transparency”? What they do say is that it’s extra work and can be difficult to do. Therefore I’m sure it will be difficult to do accurately, and inaccurate data is worse than no data.

Pat Lane
May 11, 2018 9:23 pm

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exporting something that you thought was interesting,”
This translates as:
“So if your analysis is wrong, it’s better to move on to something else than to have your errors brought to light.”
