Berkeley Earth finally makes peer review – in a never before seen journal

After almost two years and some false starts, BEST now has one paper that has finally passed peer review. The text below is from the email release sent late Saturday. The paper was previously submitted to JGR Atmospheres, according to their July 8th draft last year, but appears to have been rejected, as they now indicate it has been published in Geoinformatics and Geostatistics, a journal I had not heard of until now.

(Added note: commenter Michael D. Smith points out that it is Volume 1, Issue 1, so this appears to be a brand-new journal. Also troubling: on their GIGS journal home page, the link to the PDF of their Journal Flier gives only a single page, the cover art. Download Journal Flier. With such a lack of description front and center, one wonders how good this journal is.)

Also notable, Dr. Judith Curry’s name is not on this paper, though she gets a mention in the acknowledgements (along with Mosher and Zeke). I have not yet done any detailed analysis of this paper, as this is simply an announcement of its existence. – Anthony

===============================================================

Berkeley Earth has today released a new set of materials, including gridded and more recent data, new analysis in the form of a series of short “memos”, and new and updated video animations of global warming. We are also pleased that the Berkeley Earth Results paper, “A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011”, has now been published by GIGS and is publicly available here: http://berkeleyearth.org/papers/.

The data update includes more recent data (through August 2012), gridded data, and data for States and Provinces.  You can access the data here: http://berkeleyearth.org/data/.

The set of memos include:

  • Two analyses of Hansen’s recent paper “Perception of Climate Change”
  • A comparison of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques on ideal synthetic data
  • Visualizations of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques

and are available here: http://berkeleyearth.org/available-resources/

==============================================================

A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011

Abstract

We report an estimate of the Earth’s average land surface temperature for the period 1753 to 2011. To address issues of potential station selection bias, we used a larger sampling of stations than had prior studies. For the period post 1880, our estimate is similar to those previously reported by other groups, although we report smaller uncertainties. The land temperature rise from the 1950s decade to the 2000s decade is 0.90 ± 0.05°C (95% confidence). Both maximum and minimum daily temperatures have increased during the last century. Diurnal variations decreased from 1900 to 1987, and then increased; this increase is significant but not understood. The period of 1753 to 1850 is marked by sudden drops in land surface temperature that are coincident with known volcanism; the response function is approximately 1.5 ± 0.5°C per 100 Tg of atmospheric sulfate. This volcanism, combined with a simple proxy for anthropogenic effects (logarithm of the CO2 concentration), reproduces much of the variation in the land surface temperature record; the fit is not improved by the addition of a solar forcing term. Thus, for this very simple model, solar forcing does not appear to contribute to the observed global warming of the past 250 years; the entire change can be modeled by a sum of volcanism and a single anthropogenic proxy. The residual variations include interannual and multi-decadal variability very similar to that of the Atlantic Multidecadal Oscillation (AMO).

Full paper here: http://www.scitechnol.com/GIGS/GIGS-1-101.pdf
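As a hypothetical illustration only (not the paper's actual code or data), the abstract's very simple model — land temperature reproduced by the sum of a volcanic sulfate term and a single anthropogenic proxy, the logarithm of CO2 concentration — amounts to an ordinary least-squares fit with two predictors. All series below are synthetic placeholders:

```python
import numpy as np

# Hypothetical sketch of the abstract's two-component fit:
# temperature anomaly ~ a + b*log(CO2) + c*(volcanic sulfate loading).
# Every number here is a toy value, not a Berkeley Earth result.
rng = np.random.default_rng(0)
years = np.arange(1850, 2011)
co2 = 285.0 * np.exp(0.0006 * (years - 1850) ** 1.3)   # toy CO2 curve (ppm)

sulfate = np.zeros_like(years, dtype=float)            # toy eruptions (Tg)
sulfate[years == 1883] = 50.0
sulfate[years == 1991] = 30.0

# Synthetic "truth" plus measurement noise
true_temp = -0.3 + 2.0 * np.log(co2 / 285.0) - 0.015 * sulfate
temp = true_temp + rng.normal(0.0, 0.05, size=years.size)

# Ordinary least squares on intercept + log(CO2) + sulfate
X = np.column_stack([np.ones(years.size), np.log(co2), sulfate])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
intercept, co2_coef, volc_coef = coef
print(co2_coef, volc_coef)  # recovers roughly the planted 2.0 and -0.015
```

The abstract's claim that adding a solar term does not improve the fit would correspond, in this sketch, to appending a fourth column to `X` and finding its coefficient statistically indistinguishable from zero.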

Latitude
January 20, 2013 2:53 pm

Steven Mosher says:
January 20, 2013 at 11:47 am
Muller didn’t even know about the journal until it was presented as an option.
==========================
ROTFLMAO……you mean he didn’t know of something that didn’t even exist
..then ditto everything Willis said much better than I would have

Rhoda R
January 20, 2013 2:57 pm

Thank you, Latitude.

John West
January 20, 2013 3:11 pm

Steven Mosher says:
it is not presented as THE answer……..this data is consistent with the theory. Nothing more.”
That’s not how I (or probably anyone who speaks English) interpret(s): (paraphrasing) ”There’s no reason to be skeptical anymore”. That’s saying THE answer has been found and (if you look at the context) it is anthropogenic. Do I really need to go find the quote? The gist of it is burned into my memory.
Also, the paper states that there’s no reason to assume there’s any other cause but fails to mention there’s no reason not to either, and actually finds a glaring bit of data that is inconsistent with the AGW supposition.
On top of that I’m about up to 3 feet over my head with “consistent with”. A half eaten cookie on Christmas morning is consistent with a visit from Santa Claus, but that doesn’t make it even reasonably so.

Jimbo
January 20, 2013 3:23 pm

Mosher says:…….
“Muller didn’t even know about the journal until it was presented as an option.”

Mosher, who presented it as an option? Was it the journal or its group which has been accused of spamming scientists? Please do not skip my questions.

Kev-in-Uk
January 20, 2013 3:37 pm

Mosher
and
@Willis
Now, now, guys – let’s stay calm!
I accept Steve’s point that we have to accept the original document as ‘validated’ in itself, just as we have to accept eyewitness information or any other form of historical recording from a single source (the only one available) as being ‘it’.
My problem arises from the subsequent treatment of that data. If somebody, CRU/GISS/whoever, takes a shitload of paper documents/logs and enters the data into a computer ‘as read’ – this would essentially be the raw data – with of course a few typos/number transpositions, etc! (which is why – at the very very first quality control stage – ANY and ALL queries must be cross-checked against the original ‘paper’ record, yes?)
Anyway, let us presume that the first QA check passes off as ‘ok’.
We then have the likes of Mr Hansen, sitting in his office, looking at the data and thinking, ‘hmm, this data looks a bit off – let’s adjust it because I think this is wrong with it, etc, etc’
But this doesn’t happen just once, but many times – with each adjustment to the dataset being recorded as a new ‘version’.
I have no problem with the adjustment, so long as it is valid, and MORE IMPORTANTLY – it is recorded and both pre and post adjusted data are preserved in toto!
Now, on the basis, that we know full well that Jones et al (or the CRU/Met office, as you prefer) have ‘lost’ original data – what does any proper real scientists think about this? This is like, man, the worst ever FAIL possible in science!
So, roll forward a few decades, or whatever, and some chap decides he wants to use the data (without traceability) and it passes peer review, etc – then somebody uses his data, etc, etc, etfeckingcetera!
I just want someone, somewhere to tell me, nay, ‘PROVE’ to me – that the data CURRENTLY being used is VALID, and not validated by the likes of Hansen, Jones or Mann, but by some process of traceability and independent data storage/verifiable sources and documented methods for changes – when, why, where and how, etc!
I’m afraid, I just don’t see this as being possible (but I’m happy if someone shows my gut feeling to be wrong!)
Ergo, as I said earlier, the BEST findings are, at best, a rehash of the old data, with all its inherent faults, yes? I know it sounds crazy, but until someone proves otherwise, that is what I think! And FFS don’t bother quoting the fact that BEST (or any of its input datasets) passed peer review at me! Show me the data, show me the workings ON ALL that data, and show me it is goddamned correct!
Otherwise, shut the feck up about global temperature datasets completely – because if you ain’t got traceability and scientific standards – you ain’t got Sh1t. Period. (and yeah, Steve, I know it maybe all we have to work with – but I don’t see the warmista advertising the fact that their data isn’t perfect along with all the headlines!)

michael hart
January 20, 2013 4:15 pm

Hell, if the authors wish to pay me, then I’ll set up a new journal myself and publish it again.
For an additional fee I’ll even name the journal “Proceedings of The Masters of The Universe” or some similar catchy title.

Andrejs Vanags
January 20, 2013 4:17 pm

My problem is this statement: “Our analysis does not rule out long-term trends due to natural causes; however, since all of the long-term (century scale) trend in temperature can be explained by a simple response to greenhouse gas changes, there is no need to assume other sources of long-term variation are present.”
We are pretty sure that all of the long-term century trend in temperature CANNOT be explained by a simple response to green house gas changes.
As far as I understand it, climate scientists have no clue as to what caused the Little Ice Age, no clue as to what caused the Medieval Warm Period, no clue as to what caused the cold and famine of the middle ages, no clue what caused the hot Roman optimum, no clue as to why the whole of northern Africa suffered drought, creating deserts and toppling the Egyptian civilization, and in general no clue why temperatures shoot way up at the beginning of an interglacial and slowly decrease until suddenly dropping into the next ice age. No explanation as to why the very long-term temperatures have been declining since 10,000 years ago.
So there is EVERY reason to assume that “other sources of long-term variation are present”; it’s unscientific not to do so.

Don Monfort
January 20, 2013 4:19 pm

I wonder if Mosher, Muller and the BEST team expected their work to gain credibility by being published in the Premier/Grand Opening! issue of G&G (aka ‘journal of last resort’).
Willis’ gloating is justified, Steven.

jim2
January 20, 2013 4:21 pm

WRT RAW data:
Berkeley Data Sets:
Colonial (from the read me)
The original data files are not publicly available.
GHCN Data
GHCN-Daily is comprised of daily climate records from numerous sources that have been integrated and subjected to a common suite of quality assurance reviews.
Looks like there might not be a lot of RAW data to be found, although I didn’t take time to look at all the read-me files.

Catcracking
January 20, 2013 4:23 pm

How do they get 95% confidence in figures to 5/100ths of a degree from measurements that are nowhere near this accuracy?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
They didn’t and they cannot. In statistics, if you had thirty thermometers measuring the same area at the same time, you could do statistics on the readings to get a better estimate of the true value, with better precision than you had from one reading.
Thank you Gail et al.
While I am not versed in statistics, the +/- 0.05 degree claim does not seem credible to me. Why would they make such a “stretched” claim? It only seems to detract from the credibility of the entire paper.
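[For readers not versed in statistics, the averaging effect the comment above describes is the standard error of the mean: n independent readings tighten the estimate by roughly 1/√n, but only if the errors are random and independent. A minimal sketch with made-up numbers:]

```python
import numpy as np

# n independent thermometers reading the same true temperature:
# the spread of their *average* shrinks by roughly 1/sqrt(n).
# sigma (the single-thermometer error) is an assumed toy value.
rng = np.random.default_rng(42)
true_temp = 15.0
sigma = 0.5        # deg C error of one thermometer (assumption)
n = 30
trials = 2000

# Simulate many 30-thermometer averages and measure their spread
means = rng.normal(true_temp, sigma, size=(trials, n)).mean(axis=1)
mean_spread = means.std()
print(mean_spread)  # close to sigma / sqrt(30), about 0.09 deg C
```

Note the catch, which supports the skepticism above: this 1/√n improvement applies only to independent random errors. A shared systematic bias (siting, instrument drift, UHI) does not average away, no matter how many stations are combined.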

Editor
January 20, 2013 5:21 pm

Mosher answer questions about the choice of journal etc in the comment string at Climate Etc. starting at:
http://judithcurry.com/2013/01/20/berkeley-earth-update/#comment-287652

Editor
January 20, 2013 5:28 pm

Oops: Mosher answers questions,….

Eugene WR Gallun
January 20, 2013 5:58 pm

Following the link David Davidovics (Jan. 19 — 6:45pm) offers leads me to this conclusion: it seems that vanity publishing has found a new literary area to exploit. Now, like poets and novelists obsessed with publication, “scientists” can also pay to see their work in print.
BEST published in a “vanity journal”. This is like something out of Monty Python.
Eugene WR Gallun

January 20, 2013 7:26 pm

Mosher answer questions about the choice of journal etc in the comment string at Climate Etc.

Let’s see. Sentences and Paragraphs are capitalized. Punctuation nearly perfect. Adequate linefeeds. Are we sure it’s Mosher?
😉
NB: For the terminally humorless this is just a friendly joke, possibly an inside one for those unaware of Steve Mosher’s sleuthing out Peter ‘Principle’ Gleick in the Heartland phishing affair. Having to explain it does sort of ruin it, but I’ve been misunderstood once already in this thread.

DirkH
January 20, 2013 7:59 pm

Andrejs Vanags says:
January 20, 2013 at 4:17 pm

“My problem is this statement: “Our analysis does not rule out long-term trends due to natural causes; however, since all of the long-term (century scale) trend in temperature can be explained by a simple response to greenhouse gas changes, there is no need to assume other sources of long-term variation are present.””

What Mosher and Muller are doing here is they are using the Schneider trap; insinuating that CO2AGW is now the Null hypothesis, not some arbitrary theory with GCM predictions as its hypothesis.
Switching the null hypothesis. An Orwellian, revisionist strategy. Stephen Schneider suggested this Null hypothesis switch first.

January 20, 2013 8:22 pm

I’m prepared to accept whatever result they produce, even if it proves my premise wrong.
REPLY: and I did, read here: http://wattsupwiththat.com/2012/07/29/press-release-2/ – A

Glenn
January 20, 2013 8:32 pm

michael hart says:
January 20, 2013 at 4:15 pm
“Hell, if the authors wish to pay me, then I’ll set up a new journal myself and publish it again.
For an additional fee I’ll even name the journal “Proceedings of The Masters of The Universe” or some similar catchy title.”
Better check with OMICS first, they likely already have that title in use.

mpainter
January 20, 2013 8:52 pm

DirkH says: January 20, 2013 at 7:59 pm
“Our analysis does not rule out long-term trends due to natural causes; however, since all of the long-term (century scale) trend in temperature can be explained by a simple response to greenhouse gas changes, there is no need to assume other sources of long-term variation are present.”
What Mosher and Muller are doing here is they are using the Schneider trap; insinuating that CO2AGW is now the Null hypothesis, not some arbitrary theory with GCM predictions as its hypothesis.
Switching the null hypothesis. An Orwellian, revisionist strategy. Stephen Schneider suggested this Null hypothesis switch first.
=============================
And that does not sell in skeptic land.

Manfred
January 20, 2013 10:35 pm

DirkH says: January 20, 2013 at 7:59 pm
What Mosher and Muller are doing here is they are using the Schneider trap; insinuating that CO2AGW is now the Null hypothesis, not some arbitrary theory with GCM predictions as its hypothesis.
——————————
Mosher does not seem to be too convinced about that bit either.
http://judithcurry.com/2013/01/20/berkeley-earth-update/#comment-287810
Nevertheless he blames previous peer review for not waving through and praises this new review process. Weird.

Kev-in-Uk
January 20, 2013 11:43 pm

jim2 says:
January 20, 2013 at 4:21 pm
That is my view also. Various corrections/adjustments are mentioned, usually as quality control checks, but AFAIK the majority of temperature adjustments are likely to have been done by computer algorithms. Fine, so long as it is a valid adjustment – but it is not strictly valid to assume that station X, in the middle (spatially) of half a dozen other stations but showing a significantly different temperature on a given day, must be wrong. Automatically weighting ‘against’ that station (without reason) just because it is different is potentially wrong. But my beef is that once any valid adjustment has been made, that should really be ‘it’ – so how come we have ‘had’ to have different versions of datasets, with ‘continual’ adjustments of the old past data? And the biggest query of all is: are these later adjustments simply cheeky little adjustments on top of the original adjustments? Because to my mind you can easily lose sight of the ‘real’ or ‘raw’ data!
Put it this way, does anyone here think that a current dataset exists where they can printout a list of adjustments for any given station, in chronological order, and with the reason for each adjustment? I dunno, something like:
Station X – May 1885, to March 1989; temps adjusted -0.5C due to thermometer error
Station X – April 1901; new thermometer installed +0.1C added to all previous data
…continuing up to more recent times..
Station X – June 1988; whole dataset shifted +0.2C due to estimated UHI effect
Station X – Jan 1995; data from 1930-1940 adjusted by -0.1C due to loss of UHI because of financial depression
etc, etc.
These kind of things are exactly what needs to be recorded to keep the dataset ‘intact’ – but I don’t believe they have done it in such a fashion – if at all!
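[The traceability the comment above asks for could, hypothetically, be an append-only adjustment log per station, so that the raw reading is always recoverable by subtracting the recorded offsets. A minimal sketch — the station names, dates, and reasons are the comment's own illustrative examples, not real dataset entries:]

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Adjustment:
    station: str
    start: str       # first affected month, e.g. "1885-05"
    end: str         # last affected month, e.g. "1989-03"
    offset_c: float  # degrees C added over that span
    reason: str

@dataclass
class StationLog:
    station: str
    entries: list = field(default_factory=list)

    def record(self, start, end, offset_c, reason):
        # Append-only: adjustments are never edited or deleted
        self.entries.append(
            Adjustment(self.station, start, end, offset_c, reason))

    def net_offset(self, month):
        # Total adjustment applied to one month;
        # subtracting it from the published value recovers the raw reading.
        return sum(a.offset_c for a in self.entries
                   if a.start <= month <= a.end)

log = StationLog("Station X")
log.record("1885-05", "1989-03", -0.5, "thermometer error")
log.record("1988-06", "9999-12", +0.2, "estimated UHI effect")
print(log.net_offset("1988-07"))  # -0.5 + 0.2 = -0.3
```

With such a ledger, every published monthly value carries a chronological, reasoned audit trail, which is exactly the "when, why, where and how" record the commenter says the existing datasets lack.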

Scarface
January 21, 2013 12:24 am

OMICS Publishing Group…
Now that they are into CAGW, also known as COMICS

Jimbo
January 21, 2013 2:58 am

I read one commenter who complained about their paper being published without permission while getting repeated requests to pay OMICS Group their fee. The founder of the group, Dr Srinu Babu Gedela, replies offering a discount. These are the people BEST have published with. Sad.

Hello.
I had a serious problem with one of the journals of the OMICS Group. After receiving a lot of emails offering to publish my work in their journals, I asked them about the possibility of publishing a research paper. They asked to read the paper, and I made the mistake of sending it. I did not hear anything from this publisher until, three months later, they told me that they had accepted the paper and would publish it if I paid them $2,700. At that point the manuscript had not yet been published, and I told them I was not interested in publishing in their journal. I never received any review of the manuscript, and I saw that the data on the journal’s website about its impact index were false. I only asked for information, and I never authorized the publication of my work. Two months later, they published it without my permission. The published paper is full of errors. Since then I have sent a dozen emails urging the withdrawal of my work from their site. However, they have not withdrawn it and still demand payment of $2,700. What do you recommend I do? No doubt this is a fraud, and I do not know how to get them to withdraw the work and stop sending payment demands.
http://poynder.blogspot.co.uk/2011/12/open-access-interviews-omics-publishing.html

Prof. Natarajan Muthusamy, Associate Professor of Internal Medicine and the Ohio State University Medical Center has been named as the Editor-in-Chief of a journal from the OMICS Publishing Group, Journal of Postgenomics: Drug & Biomarker Development. “I am not aware that I am Editor-in-Chief [of this journal]. I do not recall having committed to this job,” he told The Hindu in an email.
http://www.thehindu.com/sci-tech/technology/on-the-net-a-scam-of-a-most-scholarly-kind/article3939161.ece

January 21, 2013 3:03 am

On page 2

This empirical approach implicitly assumes that the spatial relationships between different climate regions remain largely unchanged, …….. in the period 1750 to 1850 when our evidence shows a strong influence from large volcanic eruptions, a phenomenon for which there are only weak analogs in the 20th century ……. our results are accurate only to the extent that the spatial structure of temperature does not change significantly with time.

So they modeled on the assumption that the spatial structures were the same, and stated that volcanoes had a strong influence then but don’t now. Presumably, volcanoes would alter the spatial temperature structures, yet they modeled them as unchanged.
About volcanoes……. one of the premises Mosh states BEST operates under.
“2. Given: Volcanos cause cooling”.
No they don’t. At least, not in the last century for any significant period of time. Looking at all the volcano eruptions in the last century with a VEI of 5 or 6, I don’t see much that would indicate substantial cooling caused by volcanoes. Temps after Mount Agung did drop about 0.5 C, that one was the most noticeable, but, we saw the temps increase by about 0.6C after Bezymianny. 1991 saw Mount Pinatubo and Mount Hudson erupt! What was the temp response? Well, it cooled by about 0.2 C, but that’s just what the temps did, there’s no real attribution to the volcanoes and well within the normal temp swings we see from year to year. Just like nearly all of the 12 VEI 5-6 volcano eruptions last century. ENSO seems to have a much stronger impact.
I think the paper would be easier to take seriously if they’d left the 10- and 20-thermometer periods out of the work, and didn’t worry about attribution. The early history period is nothing but circular logic and tautology (as Dirk points out).
Volcanoes made the earth cool in the 1750-1850 time period. We don’t have good analogs for it in the 20th century, but we used 20th-century temperature spacing for the earlier time period that volcanoes wreaked havoc on. How stupid is that? I can’t get to discussing scalpels and jackknifes when this madness is offending my eyes! They even stated this approach was “empirical”!!
As to the other premise about CO2 causing warming……. well, things are modeled that way, aren’t they?

Eliza
January 21, 2013 5:38 am

Wow, maybe OT, but arctic extent is now the highest since records began 5 years ago LOL
http://ocean.dmi.dk/arctic/icecover.uk.php
In fact I dare predict that NH ice may enter the average range quite soon and maybe stay there for the whole year; that really would terminate the farce forever, we can hope!