After almost two years and some false starts, BEST now has one paper that has finally passed peer review. The text below is from the email release sent late Saturday. According to their July 8th draft last year, the paper was previously submitted to JGR Atmospheres, but it appears to have been rejected, as they now indicate it has been published in Geoinformatics and Geostatistics, a journal I had not heard of until now.
(Added note: commenter Michael D. Smith points out that it is Volume 1, Issue 1, so this appears to be a brand-new journal. Also troubling: on their GIGS journal home page, the “Download Journal Flier” link to the PDF of their journal flier gives only a single page, the cover art. With so little description front and center, one wonders how good this journal is.)
Also notable, Dr. Judith Curry’s name is not on this paper, though she gets a mention in the acknowledgements (along with Mosher and Zeke). I have not done any detailed analysis yet of this paper, as this is simply an announcement of its existence. – Anthony
===============================================================
Berkeley Earth has today released a new set of materials, including gridded and more recent data, new analysis in the form of a series of short “memos”, and new and updated video animations of global warming. We are also pleased that the Berkeley Earth Results paper, “A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011”, has now been published by GIGS and is publicly available here: http://berkeleyearth.org/papers/.
The data update includes more recent data (through August 2012), gridded data, and data for States and Provinces. You can access the data here: http://berkeleyearth.org/data/.
The set of memos includes:
- Two analyses of Hansen’s recent paper “Perception of Climate Change”
- A comparison of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques on ideal synthetic data
- Visualizations of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques
and are available here: http://berkeleyearth.org/available-resources/
==============================================================
A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011
Abstract
We report an estimate of the Earth’s average land surface temperature for the period 1753 to 2011. To address issues of potential station selection bias, we used a larger sampling of stations than had prior studies. For the period post 1880, our estimate is similar to those previously reported by other groups, although we report smaller uncertainties. The land temperature rise from the 1950s decade to the 2000s decade is 0.90 ± 0.05°C (95% confidence). Both maximum and minimum daily temperatures have increased during the last century. Diurnal variations decreased from 1900 to 1987, and then increased; this increase is significant but not understood. The period of 1753 to 1850 is marked by sudden drops in land surface temperature that are coincident with known volcanism; the response function is approximately 1.5 ± 0.5°C per 100 Tg of atmospheric sulfate. This volcanism, combined with a simple proxy for anthropogenic effects (logarithm of the CO2 concentration), reproduces much of the variation in the land surface temperature record; the fit is not improved by the addition of a solar forcing term. Thus, for this very simple model, solar forcing does not appear to contribute to the observed global warming of the past 250 years; the entire change can be modeled by a sum of volcanism and a single anthropogenic proxy. The residual variations include interannual and multi-decadal variability very similar to that of the Atlantic Multidecadal Oscillation (AMO).
Full paper here: http://www.scitechnol.com/GIGS/GIGS-1-101.pdf
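For readers curious what the abstract’s “very simple model” looks like in practice – a land-temperature series regressed on the logarithm of the CO2 concentration plus a volcanic sulfate term – here is a minimal sketch in Python. Everything in it (the CO2 curve, the sulfate spike, the noise level) is a made-up placeholder for illustration, not the paper’s actual data or code.

```python
# Minimal sketch of the abstract's two-component fit:
#   T(t) ~ a*log(CO2) + b*sulfate + c
# All inputs are synthetic placeholders, NOT the paper's data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1753, 2012)

# Hypothetical forcing series (illustrative only).
co2 = 280.0 * np.exp(0.0008 * (years - 1753))  # smooth CO2 rise, ppm
sulfate = np.zeros(years.size)
sulfate[years == 1815] = 100.0                 # a Tambora-sized injection, Tg

# Synthetic "observed" temperatures built from the same model plus noise,
# using a sulfate response of -1.5 degC per 100 Tg (the abstract's value).
temps = 0.9 * np.log(co2 / 280.0) - 0.015 * sulfate + rng.normal(0, 0.1, years.size)

# Ordinary least squares on [log(CO2), sulfate, constant].
X = np.column_stack([np.log(co2), sulfate, np.ones(years.size)])
coef, *_ = np.linalg.lstsq(X, temps, rcond=None)
print(f"log(CO2) coefficient: {coef[0]:+.2f} degC per log-unit")
print(f"sulfate response:     {coef[1] * 100:+.2f} degC per 100 Tg")
```

The recovered sulfate response comes out near −1.5°C per 100 Tg only because the synthetic data were built that way; the sketch shows the mechanics of the fit, not a result.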
Wait till Laden sees you’ve endorsed this paper.
Peer review or beer review?
When I couldn’t find a publisher, I also created my own blog. Of course, that doesn’t mean anyone will read it.
BEST of OMICS
http://www.vukcevic.talktalk.net/Bestomics.htm
WOW, Mosher has outdone himself. Nice work thar. 😆
Now we are geoinformatics denialists.
http://poynder.blogspot.co.uk/2011/12/open-access-interviews-omics-publishing.html
SciTechnol is registered to OMICS, which appears to be a vanity press for “peer” review.
No Steve, those cranks are on your side, they are your allies. They gave us the grassy knoll nonsense, tried to debunk Apollo, and tell us Jews didn’t show up for work on 9/11 to avoid the controlled demolition of the WTC. Naturally they would line up to participate in the AGW hoax. Now what is your excuse?
Mosher thinks he is being clever there. But it is actually an insult to everyone here at WUWT. Wallowing in the sewer with leftist liberal psychopaths has harmed him greatly.
[Reply: Can we cut back on insults? I know a lot of nice “leftist liberals” and hold some such attitudes myself. Calling them ‘psychopaths’ is, er, problematic. -ModE ]
Past, present and pending pre-science.
Parallels.
Berkeley Earth, NASA GISS, and Hadley CRU.
The presumptive parallels found in them all in the blogs bind them to mere perceptions of AGW’s alluring pall.
The back story tells the new story, to repeat the same inconclusive old story, again.
. . . and some things are as they were before as we see that Mosher’s gone on about the moon, again.
It’s dialog on time, again.
John
Steven Mosher says: January 19, 2013 at 10:22 pm
… Do you think we landed on the moon?
And we still have the sound stage here in Houston if we ever wanna go back, too. Freshen up the grey dust, touch up the paint on the LEM, and we’re in business.
SteveB says: January 19, 2013 at 8:26 pm
Hmm. The OMICS Publishing Group has been accused, in the past, of being “a predatory Open Access publisher” and “of tacitly saying it will publish anything”.
Anything, like, maybe a science fiction novel from a budding author?
Not that we really need another take on all the adjusted, homogenized, and value-added raw data, but anyone doing so much PR and puffery deserves to get something out of it. Like tenure, or a reserved parking space.
I would sure like to see the before and after of how the “scalpel” method, which created 179,000 new stations, changed the overall trend over time.
What is the temporal distribution of the changes brought on by this scalpel method?
If it works out to be a simple 0.0°C for the most part, starting in 1753 and going into 1763, and in 1944, and in 2010, it would be more believable. But I suspect it is not, since it is not shown or even described in the paper.
One would think this should be more fully outlined. For all I know it could be −0.5°C at the beginning of the record and +0.5°C towards the end.
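For context, the “scalpel” in question is Berkeley Earth’s approach of cutting a station record at a suspected discontinuity (a station move, an instrument change) and treating the pieces as separate stations, rather than adjusting the readings. A minimal sketch of that splitting step – the breakpoint year and the data are hypothetical, and this is not BEST’s actual code:

```python
# Minimal sketch of a "scalpel": split a station record at breakpoints
# instead of adjusting it. Breakpoints and data are hypothetical.
from typing import List, Tuple

def scalpel(record: List[Tuple[int, float]],
            breakpoints: List[int]) -> List[List[Tuple[int, float]]]:
    """Split a list of (year, temp) pairs into segments at the given break years."""
    segments, current = [], []
    cut = set(breakpoints)
    for year, temp in record:
        if year in cut and current:
            segments.append(current)   # close the segment before the break
            current = []
        current.append((year, temp))
    if current:
        segments.append(current)
    return segments

# A hypothetical station move in 1944 splits one record into two "stations".
record = [(y, 10.0 + 0.01 * (y - 1900)) for y in range(1900, 1960)]
print([len(seg) for seg in scalpel(record, [1944])])   # -> [44, 16]
```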
Steven Mosher says:
January 19, 2013 at 10:07 pm
“Correct me if I’m out to lunch, but did Phil Jones not lose the CRU raw data, and the Met Office was still promising to reconstruct that record? Or did that get done?”
#################
not lost. never been lost. It still exists at NWS. the good news is you can take every station in CRU, delete it, and you still have 32,000 stations. And of course the answer doesn’t change.
facts. hard to deal with. but thems the facts.
————————–
Then what was Harry (of the “read me file” fame) working on?
I thought that was the original CRU data.
Am I wrong?
cn
Here in the UK there is a series of journals with the prefix ‘PRACTICAL’; they are known by us oldies as ‘Camm’s Comics’.
This series of electronic journals may just come to be known as ‘Sham Omics’
In order to be science, the results of the Berkeley Earth Surface Temperature paper must be reproducible. This implies that the data used be available to anyone wanting to recreate the results. The question arises as to where the source data reside.
Assuming they used Dr. James Hansen’s GISS data, then the current published database, which has been recently adjusted, will not match the data used by BEST, invalidating an accurate reproduction. On the other hand, if BEST archived the GISS data, one could compare the archive to the current database and derive the adjustments.
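Mechanically, that comparison is simple; here is a sketch of differencing an archived snapshot against a current release to recover the adjustments. The file names and column names are hypothetical – they only illustrate the idea:

```python
# Sketch: derive adjustments by differencing an archived dataset snapshot
# against the current release. File and column names are hypothetical.
import pandas as pd

archive = pd.read_csv("giss_archive_2011.csv")   # columns: station_id, year, temp
current = pd.read_csv("giss_current.csv")

merged = archive.merge(current, on=["station_id", "year"], suffixes=("_old", "_new"))
merged["adjustment"] = merged["temp_new"] - merged["temp_old"]

# Station-years whose values changed between the snapshot and today.
print(merged.loc[merged["adjustment"] != 0, ["station_id", "year", "adjustment"]])
```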
oldfossil, do you understand what something has to be to be considered, in scientific terms, ‘bona fide’?
Although some have claimed so, Watts did not say that BEST could produce ANYTHING and he would accept it as VALID.
FerdinandAkin, it still strikes me as amazing that in what is described by its advocates as the most important thing ever, the data has been handled so very badly, with raw data gone altogether and adjustments guessed at. Once again the ‘professionals’ in climate science operate at a level unacceptable in an undergraduate student handing in an essay. And the worst part is the wall of silence from their fellows over this, which means that when AGW theory falls it will take much more with it.
oldfossil says:
January 20, 2013 at 2:12 am
That seems a crazy statement, Sir!
The supposed method as originally outlined, as Anthony surmised, did indeed have promise – but on a cursory glance at the paper this morning I would strongly suspect that the actual applied method is somewhat suspicious – station weighting, averaging, then averaging of averages of averages, etc. (which makes Mosher’s statement about removing stations seem a bit silly!). I’m busy today, but I will be thoroughly reading this paper in due course, very slowly, as it seems a bit smelly on first preview and not very detailed about certain aspects!
The supplementary PDF isn’t all that helpful either. I think they will need to produce very detailed methodology, e.g. how they have dealt with station dropouts and outliers – with actual demonstrated ‘outlier’ procedures, code, etc. – along with the datasets, and if necessary all the little ‘notes’ describing how/why stuff was adjusted – UHI, anyone? As it stands, I am suspicious, and I am sure Anthony will remain so until proven otherwise.
Your calling Anthony out seems crass in the extreme, as you seem to be suggesting that Anthony should automatically accept this paper’s findings and chat nicely to the warmista. Just because he is being cautious and ‘wondering’ how the peer review process took place (and took so bloody long!) is hardly ‘thumbing his nose’ at the other side – it is, indeed, a perfectly valid stance!
Bill Illis says:
January 20, 2013 at 2:51 am
absofeckinglutely Bill!
How do they get 95% confidence in figures to 5/100ths of a degree from measurements that are nowhere near this accuracy?
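For what it’s worth, the standard statistical answer is that the quoted figure is the uncertainty of an average over many stations, which shrinks roughly as 1/√N when the errors are independent and random; whether they actually are (systematic biases do not average away) is exactly the point in dispute. A toy illustration with invented numbers:

```python
# Toy illustration: the standard error of a mean shrinks as 1/sqrt(N),
# which is how small quoted uncertainties can arise from coarse readings.
# This assumes independent random errors -- systematic biases do NOT
# average away, which is the substance of the objection above.
import numpy as np

rng = np.random.default_rng(1)
true_anomaly, read_error, n_stations = 0.90, 0.5, 10000

readings = true_anomaly + rng.normal(0, read_error, n_stations)
print(f"single-reading sigma:   {read_error:.2f} degC")
print(f"standard error of mean: {readings.std(ddof=1) / np.sqrt(n_stations):.3f} degC")
```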
“And of course the answer doesnt change.”
yes it does!
raw Sydney data
http://users.tpg.com.au/johnsay1/Stuff/Sydney.png
nothing like BEST!
Mosher, USHCN is considered best in class, and it has demonstrably abominable quality issues. Remove that and you’re using stations that are worse.
From the first public announcement of the BEST project, I think its public face has been managed in an unprofessional manner. Now we can look at what the actual project product looks like and judge its scientific professionalism.
The dialog is in.
John
The publisher is called SciTechnol
http://www.scitechnol.com/aboutus.php
On-line, right-on, but not well written.
Its Facebook page has a link to the Union of Concerned Scientists.
On the paper itself, CO2 concentration is not a proxy for anthropogenic effects. It is a proxy for solar effects.
ATTENTION: MODS
Ironically, my last comment got sucked into the black hole. Can you save it?
*** PLEASE DELETE THIS ONE ***
Thanks!
[Reply: I’d rather delete the black hole one as it is largely a personal communication to me (and your prior comment is ‘up’ already). I have no ‘ax’ here. Just wish to request a more ‘polite and professional’ tone. -ModE ]
Steven Mosher says:
January 19, 2013 at 10:12 pm
Steve, there is no doubt that some of the data is ‘bad’. I don’t have a problem with that, it is perfectly normal in science to ignore bad data if it can be shown to be bad/suspect, etc.
What is NOT normal is to include/exclude data, kind of, at will – to suit one’s agenda or anticipated findings. To avoid accusations of this, you must provide the full monty of data – used/unused/adjusted, etc. – do you not agree?
Similarly, when someone does an experiment, takes readings, etc – those readings are the ‘holy grail’ of data – and should be carefully kept in PRISTINE condition – in effect, untouched by human hand. Ok, ok – we all know that data today, is held on computer, but even so, the raw data should be ‘visible’ and carefully held in its pristine condition.
In the case of all the various temperature datasets, the raw data is ‘entered’, saved, adjusted, averaged, readjusted, homogenised, etc., etc. No problemo – but when I want to know WHY something was adjusted, where do I find this out? Where, in the records, is the reason/method of adjustment recorded?
And just to keep it simple, in respect of wondering how good the ‘current’ dataset record is: I would like to know of ONE – yes, only ONE – temperature dataset that has been maintained and recorded for a decent period of time, with each and every RAW recorded reading still in its ‘pristine’ condition, and with a description of each and every recorded adjustment and the reason for it from day one, such that the TRACEABILITY of the currently used ‘value’ can be worked all the way back, without break, to the raw data. In effect, after the folks crowing on about the 1780s thermometer readings for Sydney not being traceable to known/validated calibrations, etc. – can you, or anyone else, demonstrate an adequate (read: any, IMHO) level of traceability for the current longer-term temperature datasets?
Now, I know you have been asked this before – but I am asking once again, DO YOU (or anyone else) KNOW OF SUCH A DATASET? Where is it – and is it publicly available? If not, why not?
Welcome Paul Dennis,
are you the same Paul Dennis who works at the University of East Anglia?
http://www.uea.ac.uk/environmental-sciences/people/facstaff/dennisp
I look forward to reading your further input.
James