UK Met Office says 2014 was NOT the hottest year ever due to 'uncertainty ranges' of the data

Met Office Confirms 2014 Continues Global Warming ‘Pause’

Guest essay by Dr David Whitehouse, via The GWPF

With the release of the 2014 HadCRUT4 data by the UK Met Office, and the previous release of global temperature data by Berkeley Earth, NASA and NOAA, the main conclusion to be drawn from the data is that 2014 was a warm year, but not statistically distinguishable from most of the years of the past decade or so, meaning that the “pause” in global annual average surface temperatures continues.

The Met Office said:

“The HadCRUT4 dataset (compiled by the Met Office and the University of East Anglia’s Climatic Research Unit) shows last year was 0.56C (±0.1C) above the long-term (1961-1990) average. Nominally this ranks 2014 as the joint warmest year in the record, tied with 2010, but the uncertainty ranges mean it’s not possible to definitively say which of several recent years was the warmest.”

[Figure: HadCRUT4 global temperature data plot]
Quoting the temperature to one hundredth of a degree while quoting the error on that measurement to a tenth of a degree is not normal scientific practice: it is against convention to state a measurement to a finer precision than its error. Most scientists would therefore have rounded the data to 0.6 +/- 0.1 °C. If this is done to the HadCRUT4 dataset it is even more obvious that there has been a warming “pause” for the past 18 years.
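
In code, the rounding convention invoked here looks like the following minimal sketch, using the press-release figures (the helper function is our own illustration, not anything from the Met Office):

```python
# Minimal sketch: quote a value only to the decimal place of the first
# significant digit of its uncertainty.
import math

def round_to_uncertainty(value, uncertainty):
    # Decimal place of the uncertainty's first significant digit
    place = -int(math.floor(math.log10(abs(uncertainty))))
    return f"{round(value, place)} +/- {round(uncertainty, place)} C"

print(round_to_uncertainty(0.56, 0.1))  # -> 0.6 +/- 0.1 C
```

Warm Pacific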

Looking in detail at why 2014 was a warm year shows that it was down to unusually warm temperatures for a few months in the northeast Pacific. It is also obvious that had December not been such a warm month, 2014 would have been much cooler. The Met Office says in its press release:

“Phil Jones, of the University of East Anglia, said: 2014 was an exceptionally warm year which saw warm tropical pacific temperatures, despite not being officially regarded as an El Niño.”

Unusually warm Pacific temperatures in the region where they were observed indicate that what made 2014 interesting was not down to any predicted manifestation of “global warming.”

However, the Met Office considers that the temperature attained in 2014, and therefore in all of the years of the past decade or so, would not have been achieved without human influence. In a press release put out in December (when HadCRUT4 data was available only to October), when it was still possible that 2014 would set a “record” and could be treated as a separate event, they said that new research techniques they had developed allow rapid assessment of how human influence might have affected the chances of breaking temperature records. They said:

“This technique, known as an attribution study, uses climate models and observations to see how likely an event would be in the real world and in a world without human greenhouse gas emissions – enabling assessment of how human influence has altered the chances of an event.”

Peter Stott, Head of Climate Attribution at the Met Office, said: “Our research shows current global average temperatures are highly unlikely in a world without human influence on the climate.” Such attribution research is highly speculative and should have been flagged as such in a press release whose aim was to get the media to print a story suggesting that 2014 would be a ‘record’ year, and to give them an explanation for it. As it turned out, November’s and December’s HadCRUT4 data whittled away the chances of 2014 being a “record.”

In general the Met Office, and before them the Berkeley Earth project, were reasonable about the data in pointing out that a new record was not established unequivocally because of the large error bars that encompass 2014 and many other recent years. This is in contrast to the stance taken by NASA, which proclaimed without doubt, and without even quoting the temperature or any error information, that 2014 was the warmest year ever.

2014 fits in perfectly with the suggestion that for the past 18 years HadCRUT4 is best represented by a constant temperature.

Feedback: david.whitehouse@thegwpf.com

See more at: http://www.thegwpf.com/met-office-confirms-2014-continues-global-pause/

[Cartoon by Josh: “Warmest Year Evah”]

82 Comments
GeeJam
January 27, 2015 11:10 am

Words like ‘Face’ & ‘Egg’ spring to mind . . . .

emsnews
Reply to  GeeJam
January 27, 2015 1:44 pm

The media in the EU and US continue to ignore ALL of this interesting information. It is in a very black box. Locked by the UN. Totally air tight. Few outside of us angry people will be told this information; they would have to dig it out of the Boston snow banks today.

Tonyb
Reply to  emsnews
January 27, 2015 2:09 pm

The met office said it is one of the ten warmest years on record.
http://www.metoffice.gov.uk/news/release/archive/2015/2014-global-temperature
That seems to me to be using the right amount of circumspection bearing in mind the tiny margins between the recent warm years and the margins of error.
BEST were also circumspect with their claims.
Tonyb

Alan the Brit
Reply to  emsnews
January 28, 2015 3:05 am

I am trying to enjoy the Inter-Glacial guys, give me a break!

asybot
Reply to  GeeJam
January 27, 2015 8:09 pm

Tar and Feathers more likely.

icouldnthelpit
Reply to  GeeJam
January 28, 2015 1:30 am

(A wasted posting effort by a banned sockpuppet. Comment DELETED. -mod)

philincalifornia
Reply to  GeeJam
January 28, 2015 8:53 pm

They had to read it on here first though !

January 27, 2015 11:15 am

Josh, you’ve outdone yourself. So many layers of satire.
That last bit: “a 1 in 27E6 chance that a Climate Alarmist will tell the truth.” is wickedly funny.

Aaron Smith
January 27, 2015 11:19 am

NASA will need to work on rebuilding its credibility if this keeps up.

Mick
Reply to  Aaron Smith
January 27, 2015 12:41 pm

NASA should just focus on space exploration.

RWturner
Reply to  Mick
January 27, 2015 1:27 pm

It’s programs like NASA GISS that make American citizens shake their heads when Congress seemingly can’t find ways to reduce the deficit. Cutting GISS would save over $1 billion each year.

george e. smith
Reply to  Mick
January 27, 2015 4:48 pm

Some of them actually do. I had lunch today in the NASA Ames Research cafeteria on Moffett Field in Mountain View / Sunnyvale, CA, with one of their engineers.
And we talked about his modeling and hypersonic wind tunnel validation of CFD models of hypersonic flight problems.
And yes, he told me that nothing flies anywhere if the wind tunnel doesn’t agree with the floppy disc. [Their] CFD model does predict what they can check out in their giant vacuum cleaner. I can hear that thing at my desk when they are “adjusting their data.”
Well they don’t do that like Hansen does. He’s also an expert on the insulating heat shield technology, so we gabbed a bit about the good old days of ablative heat shield cooling systems.
No I dunno who he is, but you never know who’s going to sit beside you when you eat at the (CA State sales tax free) NASA Ames Research Cafeteria. It’s fed property so Nyet on CA State taxes on base.
Oh he did tell me who is running the HS wind tunnel, and it is nice to know that someone is paying attention to the store.
G

Alan the Brit
Reply to  Mick
January 28, 2015 3:06 am

I really do not think they are capable of such activity, may God truly help the next men on the Moon if they try!

Harry Passfield
January 27, 2015 11:23 am

Hope you don’t mind but I’ve forwarded this to the Daily Telegraph (UK).

Werner Brozek
January 27, 2015 11:24 am

Strictly going by the numbers, without regards to error bars, 2014 was first on Hadcrut4.
But if we assume that any anomaly that is 0.03 above or below the 2014 value is in a statistical tie with 2014, then there is a statistical 4-way tie from ranks 1 to 4.
However, if we assume that any anomaly that is 0.1 above or below the 2014 value is in a statistical tie with 2014, then there is a statistical 11-way tie from ranks 1 to 11.
Almost the identical thing can be said for GISS. The only difference is that the 0.03 tie is 3 ways and the 0.1 tie is 10 ways.
But satellites are way different!
If we assume that any anomaly that is 0.1 above or below the 2014 value is in a statistical tie with 2014, then both UAH version 5.5 and RSS show a statistical 12-way tie from ranks 3 to 14.
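
The tie-counting rule above is easy to reproduce. A quick sketch, with illustrative anomaly values rather than the actual dataset figures:

```python
# Sketch of the tie-counting rule: a year "ties" 2014 if its anomaly is
# within a tolerance of 2014's. Values below are placeholders, not the
# actual HadCRUT4/GISS/satellite numbers.

anomalies = {2014: 0.56, 2010: 0.56, 2005: 0.54, 1998: 0.54,
             2013: 0.50, 2009: 0.49, 2006: 0.50, 2003: 0.50}

def ties_with(year, tol):
    ref = anomalies[year]
    return sorted(y for y, a in anomalies.items() if abs(a - ref) <= tol)

print(ties_with(2014, 0.03))  # tight tolerance: only the closest years tie
print(ties_with(2014, 0.10))  # +/-0.1: nearly every recent year ties
```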

george e. smith
Reply to  Werner Brozek
January 27, 2015 12:14 pm

No matter how “unusual” an event is, or how low the probability of it occurring, so long as it is NOT zero, and not in violation of any of the known laws of physics, it can occur again tomorrow, and then again on the next day.
But after thousands or millions of opportunities, or occasions for such an event to occur, you can expect the frequency of such occurrences to coalesce around the rates calculated from the statistics.
But you can NEVER rule it out on any one occasion, no matter how low its non-zero probability is.
Statistics is all about results you have already observed, and has no place in contemplation of events not yet observed. Well, it might suggest how surprised you are going to be when you get the next data, but it doesn’t affect what that data will be.
G
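
The frequency point above is the law of large numbers; a toy simulation, with an arbitrary event probability:

```python
# A rare event can occur on any single trial, but over many trials its
# observed frequency settles toward the calculated rate. The probability
# p is an arbitrary choice for illustration.
import random

random.seed(42)
p = 0.001
for n in (1_000, 100_000, 1_000_000):
    hits = sum(random.random() < p for _ in range(n))
    print(f"{n:>9} trials: observed rate {hits / n:.5f} vs true rate {p}")
```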

Reply to  george e. smith
January 30, 2015 8:53 am

Well said! The next 10 years may be hotter than each of the preceding ones, but if the next 10 are all cooler than the preceding ones, then overall warming will be zero +/-.
Lady Luck has no memory!

Reply to  Werner Brozek
January 28, 2015 5:41 pm

My guess is that the uncertainty of the extrapolated ground-based measurement estimate of global temperature anomaly for recent years could easily be as much as 0.5C or more for each year. There are many sources of uncertainty, but the greatest are probably the simple lack of spatial coverage and uncertainties associated with representativeness of the measurements. The uncertainties are likely to be even larger for older estimates that are based on even more sparse data subjected to “homogenization” that may actually introduce further uncertainty. The satellite derived estimates of global lower atmospheric temperature anomalies have better spatial coverage but suffer from poor temporal coverage and may have uncertainties as large as for our current ground-based measurement estimates.

Patrick Bols
January 27, 2015 11:25 am

then things are said. perfect.

The Ghost Of Big Jim Cooley
January 27, 2015 11:31 am

Don’t trust the Met Office anyway, they have trotted out too much garbage in the past, like this…
http://youtu.be/WyDmdcPw7Uw
Did this ridiculous woman lose her job? No.

John M
Reply to  The Ghost Of Big Jim Cooley
January 27, 2015 11:37 am

Wait for it. Phil-dot or Nick Stokes will be along any minute to “prove” she wasn’t “very far off”.

Werner Brozek
Reply to  The Ghost Of Big Jim Cooley
January 27, 2015 12:06 pm

For those who may be interested, this is how Hadcrut3 would have done if it were still around. Assuming that Hadcrut3 would have gone up as much from 2013 to 2014 as Hadcrut4 did, then Hadcrut3 would have had a 2014 anomaly of 0.529. This would have placed it in 2nd place. Prior to this year, 1998 was at 0.548 and 2005 was at 0.482.
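
The extrapolation described above is one line of arithmetic; a sketch, with the 2013 anomalies as placeholders chosen to reproduce the quoted 0.529 (only the 0.529, 0.548 and 0.482 figures appear in the comment):

```python
# Assume HadCRUT3 would have risen from 2013 to 2014 by the same amount
# HadCRUT4 did. The 2013 anomalies below are illustrative placeholders.

h4_2013, h4_2014 = 0.487, 0.563   # placeholder 2013 value, 2014 value
h3_2013 = 0.453                   # placeholder

h3_2014_est = h3_2013 + (h4_2014 - h4_2013)
print(f"estimated HadCRUT3 anomaly for 2014: {h3_2014_est:.3f}")
# -> 0.529, i.e. 2nd place behind 1998 (0.548), ahead of 2005 (0.482)
```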

The Ghost Of Big Jim Cooley
Reply to  Werner Brozek
January 28, 2015 12:57 am

The HadCRUT3v anomaly in 2004 was 0.444. Had it gone up by 0.3 degrees, it would be 0.744. The last full year recorded under this dataset was May 2013 to May 2014, for which the anomaly was 0.472. Phil-dot or Nick Stokes cannot say this ridiculous woman was anywhere even remotely close. It was an absurd prediction from an absurd woman. The Met Office should be ashamed of their conduct. As a climate prediction centre, it is completely useless.

Sal Minella
Reply to  Werner Brozek
January 28, 2015 12:12 pm

The anomaly cannot be measured to three decimal places and thus cannot be calculated to three decimal places.

Stephen Richards
Reply to  The Ghost Of Big Jim Cooley
January 28, 2015 1:28 am

Don’t forget the perennial Betts, Wilson et al.

January 27, 2015 11:35 am

“Quoting the temperature to one hundredth of a degree and the error on that measurement to a tenth of a degree is not normal scientific practice.”
There was an asterisk on the original which has gone missing. And it pointed to a footnote:
“*0.1° C is the 95% uncertainty range.”
So first, the normal practice referred to would use the standard deviation, which is less. But more importantly, it isn’t the measurement error. It is the CIs from an ensemble of simulations, which principally express the variation you might expect if you had been able to measure in other places.
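
Numerically, the distinction matters: if ±0.1°C is a 95% range and the errors are roughly normal, the implied standard deviation is about half of it. A sketch:

```python
# If +/-0.1 C is a 95% uncertainty range (per the footnote) and the
# errors are roughly normal, the 95% range spans about +/-1.96 standard
# deviations, so the implied standard deviation is about half the range.

ci95_half_width = 0.1
sd = ci95_half_width / 1.96
print(f"implied standard deviation: ~{sd:.3f} C")  # ~0.051 C
```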

kenw
Reply to  Nick Stokes
January 27, 2015 12:02 pm

Doesn’t “*0.1° C is the 95% uncertainty range.” sound so much more certain than
“*0.1° C is the 5% certainty range.” ??

John Eggert
Reply to  Nick Stokes
January 27, 2015 1:07 pm

Nick: “the normal practice referred to would use the standard deviation” has nothing to do with measurement error. Measurement error is the sum of all of those errors that go into a particular measurement. If some of those are statistically derived, then that component of the error may be estimated and have an associated standard deviation. Not all measurement error is estimated. The OP is correct. If you quote a value with a higher precision than the precision of the measuring device, you will get into trouble, particularly if what you say influences monetary decisions. Of course, there are a lot of things that climate scientists get away with that securities regulators would jail others for.

Nick Stokes
Reply to  John Eggert
January 27, 2015 1:32 pm

“Not all measurement error is estimated.”
Then where does it come from?
“If you quote a value with a higher precision than the precision of the measuring device”
Again, the 0.1°C is not related to the precision of the measuring device.

John Eggert
Reply to  John Eggert
January 27, 2015 1:56 pm

Nick: You said:
“Not all measurement error is estimated.”
Then where does it come from?
You got me there. I should have left that bit out as it detracts from my original point. Regarding unestimated error, I am referring to the standard measurement accuracy of instruments, or ‘test equipment variation’, one of at least five sources of measurement error. You are correct: it is estimated and has an associated standard deviation. Lab folk, like myself, take this error as an absolute, but it is indeed a standard deviation. That being said, it is highly dangerous to quote results with a precision greater than the measurement error. Securities regulators refer to it as “fraudulent”.

John Eggert
Reply to  John Eggert
January 27, 2015 2:17 pm

Second (and last) point, Nick. You said “Again, the 0.1°C is not related to the precision of the measuring device.” First of all, given that we are picking nits, we are talking about accuracy, not precision. Moderate-accuracy RTDs are usually, at best, ±0.15°C (yes, yes, the platinum ones are 0.003°C, but they are not used outside of very expensive labs). Pretty sure most weather station devices don’t have that accuracy. Possibly better precision, but we are talking accuracy, not precision. ±0.1°C is pretty close to the accuracy of most weather monitoring devices. I would be all ears if you could point to a weather monitoring device that has an accuracy better than ±0.1°C and is in wide use. Argos don’t count. They are not weather monitoring devices.

Nick Stokes
Reply to  John Eggert
January 27, 2015 3:20 pm

John,
The uncertainty of 0.1°C is based on the paper of Brohan et al, 2006. It includes instrument error, but that is a very small component. An annual average is formed from about a million daily readings. Conventionally, the error of the mean of instrument error is down by about sqrt(N), or 1000, on the individual. Even if you allow for lack of independence, it is still pretty small. The big factor is the uncertainty of temperatures elsewhere from where measured.
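
The sqrt(N) scaling above is easy to check by simulation; a sketch assuming independent, roughly normal reading errors (the caveat the comment itself notes):

```python
# The error of the mean of N independent readings shrinks by about
# sqrt(N). Independence is assumed here, which is optimistic for real
# station data; the per-reading error is an arbitrary illustrative value.
import random
import statistics

random.seed(1)
sd = 0.5       # assumed per-reading instrument error, in degrees C
N = 10_000     # readings per average (scaled down from "a million")

means = [statistics.fmean(random.gauss(0, sd) for _ in range(N))
         for _ in range(200)]
print(f"observed spread of the mean: {statistics.stdev(means):.4f}")
print(f"sd / sqrt(N):                {sd / N ** 0.5:.4f}")  # 0.0050
```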

D.J. Hawkins
Reply to  John Eggert
January 28, 2015 3:58 pm

Stokes
The pass on inaccuracy you get from large numbers applies only to measurements of the same sample. Since the temperatures being measured vary in both time and space, generally you have a sample size, “N”, of exactly “1”. Just like you never step into the same river twice, neither do you measure the same temperature twice. Or a million times, for that matter.

Robert B
Reply to  Nick Stokes
January 27, 2015 1:28 pm

Normal practice is to use 2 SD (the 95% CI) when making a judgement about whether two values are equal or significantly different.
Are you implying an error when you quote the article? I’m sure that it’s correct to round your results so that the error has only one significant figure (or two if the first digit is 1) and the value is to the same decimal place as the error.

Alx
Reply to  Nick Stokes
January 28, 2015 4:56 am

The measurements in question are not measurements; they are calculated values, so the precision can go out to 3, 10, 15 places or however far one wants to go. The question is whether it makes sense to calculate values out to hundredths when measuring devices are precise to tenths, and when collecting, collating, re-factoring, homogenizing, and in-filling introduce many opportunities for error as well as increased muddiness in the data. In my opinion, numbers published out to hundredths of a degree based on those factors defy common sense.
The fact that this data manipulation consistently makes the past cooler and the future warmer may be an indicator of a warming trend, but is just as likely to be an indicator of bias; an assertion is made and then the assertion becomes the proof driving assumptions and adjustments.

Sal Minella
Reply to  Alx
January 28, 2015 12:18 pm

You cannot calculate a value to a greater precision than that of the original measurement. The error figure represents accuracy in measurement rather than its precision. If the accuracy is +/- 0.1, then the precision cannot exceed one decimal place, regardless of how the accuracy is determined.

January 27, 2015 11:47 am

Health warning. Nick Stokes defending the indefensible ….. again. Is he for real?

Reply to  phillipbratby
January 27, 2015 12:00 pm

Sadly yes, although he does add some merriment to the blog every now and then.

John Eggert
Reply to  phillipbratby
January 27, 2015 1:03 pm

Nick comes here and speaks his mind. He disagrees with many of the things here. When he posts, it is usually a clear statement that can be argued. He is civil and intelligent. I disagree with much of what he says, particularly his defense of homogenization techniques, but I always read his posts. As Willis says, if you disagree with something he says, quote it and prove him wrong. Otherwise you are just a drive-by sniper.

Rogueelement451
Reply to  John Eggert
January 28, 2015 1:27 am

And what’s your problem with snipers? Are you even American?
/sarc

DD More
January 27, 2015 12:19 pm

What would it have been if they were still using the semi-adjusted HADCRUT3 version instead of the fully-adjusted HADCRUT4?
12-month average anomaly    HADCRUT3    HADCRUT4
Dec 1998                    0.55        0.52
Dec 2011                    0.34        0.40
Change                      -0.21       -0.12
The new version increases warming (or rather decreases cooling) since 1998 by 0.09C, a significant amount for a 13 year time span. Whilst the changes should not affect the trend in future years, they will affect the debate as to whether temperatures have increased in the last decade or so.
https://notalotofpeopleknowthat.wordpress.com/2012/10/10/hadcrut4-v-hadcrut3/
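
The table’s arithmetic checks out; a quick verification:

```python
# Reproducing the arithmetic in the table above.
h3 = {"Dec 1998": 0.55, "Dec 2011": 0.34}
h4 = {"Dec 1998": 0.52, "Dec 2011": 0.40}

d3 = h3["Dec 2011"] - h3["Dec 1998"]
d4 = h4["Dec 2011"] - h4["Dec 1998"]
print(f"HADCRUT3 change: {d3:+.2f} C, HADCRUT4 change: {d4:+.2f} C")
print(f"shift between versions: {d4 - d3:+.2f} C")  # +0.09 C since 1998
```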

KTM
January 27, 2015 12:52 pm

“This technique, known as an attribution study, uses climate models and observations to see how likely an event would be in the real world and in a world without human greenhouse gas emissions – enabling assessment of how human influence has altered the chances of an event.”
IF they could predict global temperatures to exquisite precision with a “baseline” climate model, I might buy the idea that they could then compare new temperatures against that baseline and try to attribute any differences as “other than baseline”.
BUT they can’t predict global temperatures to exquisite precision, they can’t even accurately HINDCAST global temperatures with their model. And they certainly can’t say that any perceived difference in current temps is caused specifically by CO2 or even by human activity in general, only that something has changed from the baseline conditions of their model. They don’t even know what number to set the Climate Sensitivity knob to on their CO2 model.
The whole premise is laughable and fundamentally unscientific.

RBG
January 27, 2015 1:05 pm

I would presume that a warmist might say that even though 2014 temps might be tied with 2005 & 2010, it was still close to being a record. And that simply joining those 3 extreme temperatures with a line is evidence of nothing…
R

Rex
January 27, 2015 1:20 pm

My rule of thumb is that if you double (at least) the statistical
error you may, just, get somewhere close to the overall survey error.

January 27, 2015 1:29 pm

I guess the missing heat wasn’t hiding in 2014.
Back to The Time Tunnel!

herkimer
January 27, 2015 2:28 pm

The year 2014 was supposed to have been the hottest year on record globally, according to NOAA/NASA. Yet a review of the numbers shows otherwise. It was not a record for half the globe, namely Southern Hemisphere land and oceans, nor a record for total global land areas, nor a record for North and South America. Even the global temperatures measured by satellites showed it was not a record.
The 2014 record global annual temperature was mostly a NORTH PACIFIC record SST event.
If one tracks the global ocean temperatures, it was the Northern Hemisphere ocean temperature, and more particularly the North Pacific Ocean from October to December, that pushed 2014 into the record temperature range. In September the Northern Hemisphere ocean temperature anomaly was still tied with 1998, but by October 2014 it became the record highest all the way to the year end. So the prime reason for the warm 2014 record was the extra warming of the North Pacific during the last three months of the year. It was not a global yearlong event at all.

knr
January 27, 2015 2:45 pm

“Quoting the temperature to one hundredth of a degree and the error on that measurement to a tenth of a degree is not normal scientific practice.”
It’s also BS, as they have no way of actually achieving that degree of accuracy, except via ‘models’.
But I will happily give you 50-1 that this year will also be claimed as the ‘warmest ever’ no matter what actually happens. Life is just so much easier when it’s you controlling the ‘data’ used to judge whether you’re right or wrong.

DDP
January 27, 2015 2:47 pm

I’m sure the BBC will be reporting the MetO update to the masses any moment now just like they always do being honest journalists. Any moment now. Any moment. Any moment, right about…now. Now?
How about now?

highflight56433
January 27, 2015 2:51 pm

How ridiculous to start with 1961, then use the words “hottest year EVER”… The rest of the conversation is a waste of time. 🙂

Mick
January 27, 2015 2:52 pm

So Herkimer, excuse me if this question has been answered here. I cannot possibly read every thread or comment.
Where did the extra warming of the North Pacific originate? And why was this crucial information not included in media press releases?
Is there more than one theory on the origin of the warming in the North Pacific?

xyzzy11
Reply to  Mick
January 27, 2015 3:25 pm

The vast majority of heat in the oceans comes from the sun. An almost trivial amount comes from geothermal activity, and a (really) minuscule amount from anthropogenic activity. Virtually none comes from LWIR emissions from carbon dioxide.

mebbe
Reply to  Mick
January 27, 2015 9:51 pm

Mick,
If I’ve understood Bob’s explanations, warmed surface water is piled up by steady winds, exposing cooler water that is heated and driven. Relaxed winds firstly allow that mounded water to smear back across the surface and then be warmed but not blown away. The result is a larger surface area for the same volume of warm water.

Rud Istvan
January 27, 2015 2:55 pm

They may have learned from the GISS fiasco, since UKMet circumspection has not previously been evident. Hopeful sign?

herkimer
January 27, 2015 3:30 pm

January 27, 2015 3:39 pm

Thanks, Dr. Whitehouse, for bringing some scientific common sense to the discussion.
Since the very strong El Niño in 1998, global temperatures stopped rising while CO2 kept on increasing. Atmospheric CO2 cannot be the controlling factor for global temperatures.
2014 was a weak El Niño year (MEI around 0.8).

luysii
January 27, 2015 3:46 pm

Good to hear. I essentially said the same thing in a post the day after the NYT story came out. https://luysii.wordpress.com/2015/01/18/the-new-york-times-and-noaa-flunk-chem-101/
The New York Times and NOAA flunk Chem 101
As soon as budding freshman chemists get into their first lab they are taught about significant figures. Thus 3/7 = .4 (not .428571, which is true numerically but not experimentally). Data should never be reported with more significant figures than given by the actual measurement.
This brings us to yesterday’s front page story (with the map colored in red), “2014 Breaks Heat Record, Challenging Global Warming Skeptics”. Well, it did if you believe that a .02 degree centigrade difference in global mean temperature is significant. The inconvenient fact that the change was this small was not mentioned until the 10th paragraph. It was also noted there that .02 C is within experimental error. Do you have a thermometer that measures temperatures that exactly? Most don’t, and I doubt that NOAA does either. Amusingly, the eastern USA was the one area which didn’t show the rise. Do you think that measurements here are less accurate than in Africa, South America, or Eurasia? Could it be the other way around?
It is far more correct to say that global warming has essentially stopped for the past 14 years, as mean global temperature has been basically the same during that time. This is not to say that we aren’t in a warm spell. Global warming skeptics (myself included) are not saying that CO2 isn’t a greenhouse gas, and they are not denying that it has been warm. However, I am extremely skeptical of models predicting a steady rise in temperature that have failed to predict the current decade-and-a-half stasis in global mean temperature. Why should such models be trusted to predict the future when they haven’t successfully predicted the present?
It reminds me of the central dogma of molecular biology years ago “DNA makes RNA makes Protein”, and the statements that man and chimpanzee would be regarded as the same species given the similarity of their proteins. We were far from knowing all the players in the cell and the organism back then, and we may be equally far from knowing all the climate players and how they interact now.

Mark
Reply to  luysii
January 27, 2015 4:34 pm

I think you are wrong. Division always increases the number of significant digits. So 3/7=0.43
Undoing the division shows why, for your example: 0.43 x 7 = 3.0 (< 1% error); 0.4 x 7 = 2.8 (10% error).

Mark
Reply to  luysii
January 27, 2015 5:04 pm

I think you are wrong. Division can increase the number of significant digits. So 3/7 = 0.43.
Undoing the division shows why, for your example: 0.43 x 7 = 3.0 (< 1% error); 0.4 x 7 = 2.8 (10% error). Hidden behind both the 3 and the 7 is an extra digit of uncertainty (+/- 0.5) that does not disappear. If your rule were correct, 3 x 7 would become 20, whereas it’s actually 21, which has 2 digits of accuracy just as 20 does. Does that make sense?

Will Nelson
Reply to  Mark
January 27, 2015 6:59 pm

Assuming the numerator and denominator are measured, 3/7 has a range of 0.34 to 0.52. 3 x 7 is 16 to 25. So 0.4 (which has a possible true value of 0.35 to 0.44) and 20 (possible true value 15 to 24) are the correct answers for round-off and everything has 1 significant digit. In short, division cannot increase the number of sigfigs.
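
The 3/7 dispute can be settled mechanically with interval arithmetic; a sketch taking ±0.5 on each measured integer (the exact bounds come out slightly different from the figures quoted above):

```python
# If 3 and 7 are measurements good to the nearest unit, each hides
# +/-0.5, and the quotient/product bounds follow directly.

lo_q = 2.5 / 7.5   # smallest numerator over largest denominator
hi_q = 3.5 / 6.5   # largest numerator over smallest denominator
print(f"3/7 could lie anywhere in [{lo_q:.3f}, {hi_q:.3f}]")  # [0.333, 0.538]

lo_p = 2.5 * 6.5   # smallest product
hi_p = 3.5 * 7.5   # largest product
print(f"3*7 could lie anywhere in [{lo_p}, {hi_p}]")          # [16.25, 26.25]
```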

old construction worker
January 27, 2015 4:31 pm

I still think the alarmist crowd had this “2014, warmest year on record” scenario planned many months ago. Remember, the alarmist crowd was hoping for a strong El Niño signal.

herkimer
January 27, 2015 4:32 pm

Mick
You asked “Where did the extra warming of the North Pacific originate? And why was this crucial information not included in media press releases?”
One can only speculate why the original press release was poorly conceived, with inadequate scientific explanations, prior to its release. The fact that this information was released just before the State of the Union speech and prior to the upcoming 2015 Paris climate conference has not gone unnoticed by the public. In my opinion, the best thing that NOAA/NASA could do now is to withdraw the release or modify their comments. Left in its present form, it further confuses the public, especially when considering the comments from other scientists, the contrary Met Office comments as noted above, and the different satellite data.

Reply to  herkimer
January 27, 2015 9:35 pm

The problem is that retracting it now does little good, since Obama used the “hottest year on record” in the SOTU speech. CAGW continues to be a purely political effort. Another issue that has sprung to mind is that, for whatever reason, the temps have remained stationary during the last 18 years. That could also suggest that instrument calibrations might have been set differently. The temps could have been steady worldwide, disregarding the run-up in 1998. Laws of diminishing returns as they tweak the data to show the results they want.

Richard
January 27, 2015 9:33 pm

“US snow: National Weather Service admits forecast error”
“Rapidly deepening winter storms are very challenging to predict,”
But what is not at all challenging to predict and totally free from error is the climate 50 to 100 years hence

knr
Reply to  Richard
January 28, 2015 2:59 am

Using a length of time which means the person making the claim will no longer be around to be reminded of their BS is one of the few ‘smart things’ climate science actually does.

Richard
Reply to  knr
January 28, 2015 11:36 am

Smart? Cunning, deceitful and fraudulent are better words

Dr. Strangelove
January 27, 2015 10:14 pm

High school students should teach these scientists how to read statistical data. From the HadCRUT4 dataset, for 2001-2014: warmest year = 0.563, coolest year = 0.394, difference = 0.563 – 0.394 = 0.169
Error in data = +/- 0.1
All the years from 2001-2014 are statistically equal. They are all within the error range.
Peter Stott, Head of Climate Attribution at the Met Office, said: “Our research shows current global average temperatures are highly unlikely in a world without human influence on the climate.”
Then how come my random walk function can replicate the observed warming trend from 1951-2014? Without human influence, random number generators can do the job.
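
The random-walk claim is easy to illustrate; a toy version, not fitted to any actual data:

```python
# Cumulative sums of zero-mean noise often wander in ways that resemble
# a "trend". This only illustrates the commenter's argument; the step
# size and seed are arbitrary choices.
import random

random.seed(2014)
x, walk = 0.0, []
for _ in range(64):               # one step per year, 1951-2014
    x += random.gauss(0.0, 0.1)   # zero-mean step
    walk.append(x)
print(f"net drift after 64 steps: {walk[-1]:+.2f}")
```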

masInt branch 4 C3I in is
January 27, 2015 10:33 pm

Wow. This is really Gay News.
A slap-fight between Hansen, Jones (whose the mummy and whose the daddy) and the children Mann and Schmidt (first son and bastard son).
Love-spat quadrangles between gays is a battleground.
I choose the parachute and bail out of this 747 dreadnaught to hell before it crashes and burns.
Ha ha.

GregK
Reply to  masInt branch 4 C3I in is
January 28, 2015 5:01 pm

Who’s the mummy and who’s the daddy….

David
January 27, 2015 11:06 pm

“current global average temperatures are highly unlikely in a world without human influence on the climate.”
So during the many times in the past when the global average temperature was exactly what it is today (it has been both hotter and colder than today, so logically it must have passed through the current temperature), even thousands of years ago, before humans came on the scene, was it humans?
FAIL.

Stephen Richards
January 28, 2015 1:21 am

Don’t forget, when assessing, validating and verifying UKMET data: IT IS ALL MANIPULATED, i.e. 0.1°C UHI.

jaffa68
January 28, 2015 1:51 am

Another post discussing nothing; the alarmists probably can’t sleep for laughing when they trot out distractions about “warmest ever” and every sceptic is instantly diverted like a dog chasing a stick.
Even if the likes of Phil Jones can accurately determine the average temperature of the entire globe (which I seriously doubt), it doesn’t matter whether it’s warmer or not; all that matters is whether it is unnatural.
Please stop letting the alarmists lead you by the nose into discussing their talking points and focus on the CO2 link; that’s the justification for everything they’re demanding and it is also the hole in their theory, so dig there.

Carbon500
Reply to  jaffa68
January 28, 2015 7:33 am

Agreed entirely, jaffa68.
Pre-industrial (pre-1750) CO2 levels were 280ppm, so we’re told.
Now the figure’s 400, an increase of almost 43%. A cynic could say that 43% of not a lot doesn’t come to much anyway!
Yet the corresponding fraction of a degree changes over the centuries are being mulled over and treated as harbingers of doom – ‘hottest yet’ etc.
How did the CO2 nonsense ever get such a grip worldwide?

herkimer
January 28, 2015 7:56 am

Anyone who has not read the NOAA GLOBAL ANALYSIS – ANNUAL 2014 report, I urge you to read it. It was very cleverly written. It claims 2014 to be the hottest year on record, but only presented evidence of a record year for 8 northern European countries out of the 19 that they claim had record years. Europe represents only 6.6% of all global land, and northern Europe perhaps only a half or a third of this. The cold 2014 temperatures of North America, which represents about 16% of global land, received only brief coverage. They state that most areas of the world experienced “above average annual temperatures”. “Above average” temperatures do not constitute “record” temperatures. Total global land temperatures were not at record level at all, but only 4th. There was no record warm temperature for Northern Hemisphere land areas, and neither Southern Hemisphere land nor ocean areas were at record temperatures.
http://www.ncdc.noaa.gov/sotc/global/2014/13
So, excluding the statistical considerations, from a regional perspective the record temperatures were due mostly to North Pacific SSTs and, to a very minor degree, to record warming of northern Europe (less than 6.6% of global land).

herkimer
January 28, 2015 8:48 am

The North Pacific Ocean represents about 21% of global oceans, or about 77 million sq km, the equivalent of about half of all global land areas. So when this area is extra warm it will have a global impact. Yet a detailed analysis of how this area got extra warm was not presented. I wonder why? Because it warmed due to natural causes, and this would undermine the AGW alarmism prior to the Paris conference.

Nick
January 28, 2015 8:56 am

Temperature anomalies do not exist in a vacuum. If the annual global surface temperature anomaly for 2014 were +0.6°C relative to the 1961 to 1990 average, then, if this anomaly tells us anything, it tells us that the mean global surface temperature of the earth during 2014 was 0.6°C above the mean global surface temperature of the earth during the thirty years from 1961 to 1990. Temperature anomalies only have meaning in relation to the temperature of the base period from which they are departures.
In January 1998 NOAA claimed that (a) the mean global surface temperature for 1907 is 0.5°C below the 1961 to 1990 global mean temperature of 16.5°C, and (b) 1907 has the lowest mean global surface temperature of all the years from 1900 to 1997.
http://www.ncdc.noaa.gov/oa/climate/research/1997/climate97.html
In plain English, 1907 has a mean global surface temperature of 16°C, and this is the lowest annual mean global surface temperature of all the years from 1900 to 1997.
In December 2014 NOAA claimed that (a) the mean global surface temperature for 2014 is 0.69°C above the 20th century mean global surface temperature of 13.9°C, and (b) 2014 has the highest mean global surface temperature of all the years from 1880 to 2014.
http://www.ncdc.noaa.gov/sotc/global/2014/13
In plain English, the mean global surface temperature for 2014 is 14.59°C, and this is the highest annual mean global surface temperature of all the years from 1880 to 2014.
Nobody has ever measured the surface temperature of the earth with sufficient rigour to rule out the possibilities that (a) 16°C is the mean global surface temperature for 1907, and (b) 14.59°C is the mean global surface temperature for 2014.
http://data.giss.nasa.gov/gistemp/abs_temp.html
Therefore, for all we know the mean global surface temperature for 1907 could be 16°C, and the mean global surface temperature for 2014 could be 14.59°C. If it is even possible for the year with {the lowest annual mean global surface temperature of all the years from 1900 to 1997} to have a higher annual mean global surface temperature than the year with {the highest annual mean global surface temperature of all the years from 1880 to 2014}, then nobody has the foggiest idea of which really are the “hottest” and “coldest” years on record since 1900, or before.
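
The comparison in this comment reduces to two additions; in code, with the NOAA baseline figures quoted above:

```python
# Anomalies from different base periods only become comparable once
# converted to absolute temperatures. Both baselines are the NOAA
# figures quoted in the comment.

base_1961_1990 = 16.5     # NOAA (1998) mean for 1961-1990, in C
base_20th_century = 13.9  # NOAA (2014) mean for the 20th century, in C

t_1907 = base_1961_1990 - 0.5      # "coldest" year 1900-1997
t_2014 = base_20th_century + 0.69  # "hottest" year 1880-2014

print(f"1907: {t_1907:.2f} C, 2014: {t_2014:.2f} C")
# -> 1907: 16.00 C, 2014: 14.59 C. On these baselines the "coldest"
# year's implied absolute temperature exceeds the "hottest" year's,
# which is the commenter's point about baseline uncertainty.
```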

Reply to  Nick
January 29, 2015 1:11 pm

Right, so when the Met Office says that nominally it was the warmest year on record, they don’t even know if that is true.
But the average global temperature anomaly isn’t really of importance to any living thing, or even a drop of melting ice. The only thing of importance is local temperature, and then really only the extremes, not the average.

rooter
January 28, 2015 9:02 am

Was 1998 the warmest year?
No. Not according to this post. It never was. Not significantly warmer than 1997. And when 2001 came along, that year too was a statistical tie with 1998. And so were all the years after 2001.
1998 was never the warmest year. Neither was 1997, nor 1995, nor 1991. Etc.
Guess that means that HadCRUT4 is best represented as a constant temperature since the start of the series.

Don
January 28, 2015 12:06 pm

From the article:
“It is also obvious that had December not been such a warm month 2014 would have been much cooler.”
And thus December temperatures should be suspect. Worked like a charm for them though.

Kitefreak
January 28, 2015 12:40 pm

Oh what a wicked web they weave…

Kasuha
January 29, 2015 5:26 am

“Quoting the temperature to one hundredth of a degree and the error on that measurement to a tenth of a degree is not normal scientific practice. It is against normal scientific practice to have an error of the measurement larger than the precision of that measurement. ”
I don’t see that as true. In particle physics, for instance, it is common to report the central value with much greater precision than the uncertainty. Quick example: “1.14 +0.26/-0.23”, which can be found here:
http://arxiv.org/abs/1407.0558