Frank Lansner writes via email:
I just want to make sure that people are aware of this possible oddity in the argument that the ERSSTv4 data changes are required by the shift from ship measurements to buoys.
We are all focusing on the data changes from ERSSTv3b to ERSSTv4 after 2003, as these affect the pause.
And so the argument goes that it is the shift to buoys that makes warm adjustments necessary in ERSSTv4.
But if so, what about the period 1986 to 2003? In this period the use of buoys increased from around 8% to 50%.
So why is it not necessary to warm-adjust 1986-2003, just like 2003-2016?
I’m aware that this may have been addressed many times, but I just have not noticed.
The buoy fractions in % are estimates from this article:
https://judithcurry.com/2015/11/22/a-buoy-only-sea-surface-temperature-record/
Until 2006 the percentage values are read from the graphic in the article. In addition, they write that the fraction of buoys after 2006 is above 70%, so the 2007-2016 numbers simply reflect that we are above 70% in 2015.
But the point remains: the buoy fraction went up to 50% in 2003, yet this was accompanied only by still colder temperature adjustments.
So there "should" be a very important reason to cold-adjust around 1986-2000 that dominates over the "necessary" warm adjustment due to buoys.
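To make the arithmetic behind that objection concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes a fixed ship-buoy offset of about 0.12 degrees C (ships reading warmer than buoys); that figure and the simple treatment of the blend are illustrative assumptions, not values taken from the ERSST processing itself. The buoy fractions are the approximate numbers quoted above. If the warm adjustment exists to undo the cold pull of the buoys, it should grow in step with the buoy fraction, so most of it should already be present by 2003.

```python
# Back-of-envelope sketch only -- illustrative numbers, not official dataset values.
# Assumption: ship engine-intake readings average ~0.12 degC warmer than buoys.
SHIP_BUOY_OFFSET_C = 0.12

# Approximate buoy fractions quoted in this post (read from the linked article).
buoy_fraction = {1986: 0.08, 2003: 0.50, 2015: 0.70}

for year, frac in sorted(buoy_fraction.items()):
    # If the composite record is referenced to ships, the buoys pull it cold by
    # roughly (buoy fraction) x (ship-buoy offset); the warm adjustment needed
    # to undo that pull therefore grows with the buoy fraction.
    implied_warm_adjustment = frac * SHIP_BUOY_OFFSET_C
    print(f"{year}: buoy fraction {frac:.0%} -> implied warm adjustment "
          f"~{implied_warm_adjustment:.3f} degC")
```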

Change this data, Karl, at your peril: https://www.whaleoil.co.nz/2017/02/guest-post-men-science-visited-shores/
h/t Slater@whaleoil
That’s interesting.
In that link:
“Towards the end of 2016 modern scientists from the New Zealand government released a publication. Somewhat pessimistic in nature it forewarned of the possibility of boiling oceans, caused by a thing called ‘global warming’. The scientists conferred upon us the knowledge that they had been assiduously monitoring ocean temperatures around New Zealand for over thirty years and that in that time the measured temperatures had changed; not one jot, by zero, there was no trend, no change; at all.”
Here's how the New Zealand study he cites explains the absence of any warming:
http://www.mfe.govt.nz/sites/default/files/media/Environmental%20reporting/our-marine-environment.pdf
“Data on sea-surface temperature and filling critical knowledge gaps

Data on sea-surface temperature provide a good example of a tension that can arise between reporting using the highest-quality data and providing a comprehensive report. Our reporting programme has very reliable satellite data on sea-surface temperature available from 1993, available from our data service. These data showed no determinable trend over the past two decades. This is not surprising, as on such short time scales – from year to year and decade to decade – natural variability can mask long-term trends.

Long-term data on sea-surface temperature do show a statistically significant trend. These data come from a range of measurement instruments and sites. While the data do not have the consistent and broad spatial coverage of satellite-only data across the entire time series, scientists still have high confidence in the findings.

Given the importance of long-term data to understanding how the state of our environment is changing, we report on both the satellite-only data and the long-term data in our key findings and the body of the report. However, we did not acquire the long-term data as part of our reporting programme and these are not available on our data service.”
Please excuse my ponderous and extensive inputs, given my severe professional frustration and dismay on this matter!
I speak just as a retired Chartered Professional Engineer and Project Manager and not as a Scientist, but even so I cannot begin to accept current energy policies, with their unnecessary, massive subsidies and other additional costs, supposedly justified by the supposedly "settled" CAGW/Climate Change science that the warmists have imposed! Over a period of 20-25 years, and still ongoing, we have wasted and are wasting unaffordable billions of dollars of taxpayers' and consumers' money worldwide!
Yet even now, due to ongoing temperatures inconveniently never matching past forecasts:
1. Warmists have continually needed to perform feats of intellectual and scientific acrobatics and contortions to reconcile their forecasts with records of ongoing events. Ongoing "adjustments" of the record data by the warmists only further discredit such a theory.
2. Warmists have had to massively reduce their forecast future temperature rises based on a reduced sensitivity, i.e. a fundamental consideration of the original theory that started this religious calling. This now involves a reduced temperature rise per doubling of CO2 levels from roughly 4 degrees C to "only" 2 degrees C. Even the latter, they now say, will occur by 2100. This means very roughly only a 0.15 degree C per decade temperature rise over 85 years or so.
3. Obtaining past and ongoing records of such minute temperature rises, with the reliability, accuracy, consistency and frequency of measurement methods and measuring-station environments needed to properly, adequately and scientifically "substantiate" any theory behind such forecasts, requires almost superhuman effort, even in a laboratory, let alone in the open sea and air around us. Clearly such necessary actions were not taken in the past and are not being taken now, and as such no adequate data are available to "substantiate" and prove the warmist theory!
4. Now even the CO2 level data are similarly being disputed, although the accuracy needed there is nowhere near that required for temperature readings.
5. Even now no one appears to have adequately and accurately separated out the causes of the many naturally induced and varying temperature changes, many of which are cyclical, or of human-induced temperature rises other than those from CO2 emissions. This is clearly required to identify the temperature rise actually induced by man-made CO2 emissions and compare it with CO2 increases. As such there is no basis for any correlation, let alone a cause-and-effect linkage, to "substantiate" the warmist theory.
6. As repeatedly reported by many, the vague and disputed estimates of future costs to be incurred should current warmist policies not be followed totally disregard the benefits CO2 provides, including increased crop yields, greening and reduced plant-watering needs, which have been recorded and substantiated over the last few years. The latter are particularly beneficial for non-developed nations that suffer from famines, droughts and floods. As such the warmists' "investment analysis" is fundamentally flawed and unscientifically biased, and needs to be ignored.
7. Even as a non-scientist, I know that the onus is on the warmists to substantiate their theory, and not that “deniers” or even “luke warmists” need to disprove it!
If this were my project, and someone came to me with such a proposal based on the above, wanting the monies now being spent as an investment to achieve the above-mentioned project objective, then I would simply pick up the telephone, ring the local asylum and have the person immediately certified and incarcerated.
Or, in my humble ignorance, am I missing something?
David Long: 'Not dumb at all! It's become my pet peeve from years of reading climate science.'
Look at the way proxy data is spliced to ice core data to instrument data without a clear acknowledgement of where one stops and the other begins; think time gaps.
Didn’t they throw out the data from about 50% of the buoys anyway, because those buoys showed a cooling trend?
See here
http://m.earthobservatory.nasa.gov/Features/OceanCooling/
Also here, because part of the NASA page that described what they did seems to have “disappeared”:
http://jennifermarohasy.com/2008/11/correcting-ocean-cooling-nasa-changes-data-to-fit-the-models/
AP, the document you referenced says just that. The researcher did not recover any of the low-reading buoys to test whether their data were erroneous, nor any of the high-reading buoys to test them for error. He just threw out the low data. This "scientific" approach is described nicely by Ferd Berple, above. Perhaps we could call the method the Berple Burp Adjustment.
This is rich.
Ho Hum, Paul Vaughan has been highlighting this problem for a long time.
His recent quote: "ERSSTv4 should have been retracted (and v3b2 reinstated) instantly upon release. WE HAVE KNOWN THIS FOR YEARS."
Scientific Fraud used to be a big deal?
The FDA would never allow any of this data.
There is also a false implication of an embedded precision that never existed when data from a ship's intake is included in averages that are then reported to 1/100th of a degree. Sure, when you take an arithmetical average you end up with .xx, but that's an artifact of the math. I bet intake data was never logged or recorded in any fraction of a degree; who would have cared? Even the gauges used are calibrated to +/- 1 degree, at least pre-1990. And this number from a ship can be conflated and used in the determination to claim the "hottest year ever"?
Mairon, it may seem that this precision is not achievable given the tolerances of the system itself and the reading process. But that's exactly the reason why everybody gives anomalies instead of temperatures. The reasoning behind it is: once you have a really big number of readings with high tolerances, their average will be nearly the same as with systems and reading processes with much smaller tolerances. Pluses and minuses balance each other out over time. Once you have the average temperature of, say, all Januaries at all places in the world over the last 100 years, the delta to the current January reading at all places in the world should have a much smaller tolerance and higher precision.
I don't know whether the precision is as high as claimed, but I believe that the precision of the anomalies is much higher than that of an absolute temperature. Look at the other side: satellite data, such as UAH, also give anomalies and claim an uncertainty of +/- 0.1 deg C, whereas the system itself measures with a tolerance of +/- 1.0 deg C.
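The statistical claim in that comment, that the average of many readings with large random errors is far more precise than any single reading, is easy to check with a quick simulation. This is only a sketch under idealized assumptions (purely random, independent errors and a made-up "true" temperature), which is precisely the condition the following comments go on to question.

```python
import random

random.seed(0)

TRUE_TEMP = 15.0   # hypothetical "true" value being measured
READ_TOL = 1.0     # each individual reading is only good to about +/- 1 degC
N = 10_000         # number of independent readings

# Each reading = truth + a purely random error somewhere in [-1, +1] degC.
readings = [TRUE_TEMP + random.uniform(-READ_TOL, READ_TOL) for _ in range(N)]
mean = sum(readings) / N

print(f"single-reading tolerance: +/- {READ_TOL} degC")
print(f"mean of {N} readings: {mean:.3f} degC (error {mean - TRUE_TEMP:+.3f} degC)")
# The error of the mean shrinks roughly as 1/sqrt(N) -- but only because the
# individual errors are random and independent; a shared bias would not shrink.
```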
Anomalies are a propaganda tool to make graphs look worse than they are. If actual temperatures were plotted, who would pay attention to 0.01 C changes? Granted, anomalies have a purpose, but don't use them as a substitute for accuracy or precision.
You need to go read about accuracy and precision. If you can only measure to within a degree, your range will probably be close to that. Range is what determines precision. Averaging anomalies simply won’t change that. Precision is also only useful when applied to one measuring device. Averaging anomalies to obtain a precise number just doesn’t work. Here is an example. Think of 10 people shooting at a target with 10 different pistols. Does an average precision number really tell you anything? If you were going to purchase 1 million pistols would you want your range officer to provide you with a group average precision number to make a decision with?
Accuracy is no different. The numerator of an accuracy figure is the difference between a given device and a known standard. This doesn’t vary by averaging other readings from other devices into a combined figure. A 1% device averaged with readings from a 10% device doesn’t give you a <1% measurement. Think of the targets again. Again would you want to know an average accuracy of all 10 pistols when deciding which one to buy? Does it even have any meaning?
This doesn't even address significant digits.
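Jim's point, that averaging does nothing for accuracy when the error is a shared offset rather than random scatter, can be illustrated by extending the same kind of sketch: give every reading a fixed calibration bias (an assumed +0.5 degC here, purely for illustration) and the mean inherits that bias no matter how many readings go in.

```python
import random

random.seed(1)

TRUE_TEMP = 15.0
RANDOM_TOL = 1.0    # random part of the error, +/- 1 degC per reading
SHARED_BIAS = 0.5   # assumed calibration offset common to every reading

N = 10_000
readings = [TRUE_TEMP + SHARED_BIAS + random.uniform(-RANDOM_TOL, RANDOM_TOL)
            for _ in range(N)]
mean = sum(readings) / N

# The random scatter averages away, but the shared bias survives intact.
print(f"mean of {N} biased readings: {mean:.3f} degC "
      f"(true value {TRUE_TEMP}, residual bias {mean - TRUE_TEMP:+.3f} degC)")
```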
Jim,
I agree with you in terms of accuracy and precision. The advantage of anomalies is that the uncertainty, or the range of the results, changes. Under the condition that there are no systematic errors (which of course there are in reality) you will have only random results in a range given by the accuracy and precision of your measurement system and process. Let's call that range tolerance. By switching to averages and anomalies you statistically reduce that tolerance to a small uncertainty.
Example: You measure something that has a sine behaviour, with theoretical values between the extrema 90 and 110 following an ideal sine curve around 100, which would also be the average. Let's say your measurement system and process has a tolerance of plus/minus 5. The range of the measured values will then be between 85 and 115 at the extrema. Supposing that (because there are no systematic errors) there are as many and as large outliers in the plus direction as in the minus direction, the average of all values will still be 100. The uncertainty of that average is not the tolerance of the measured values but the probability that the average tends in one direction (+ or -). It is only the number of readings that determines this uncertainty. With a very high number of readings and a perfect statistical distribution of the values, the uncertainty of the average value will be basically zero. But let's say the average would be 100 with an uncertainty of +/- 0.2.
Now imagine the sine measurement above is a series over time and space, and every single point is an average of many spatial readings at that point in time. The original tolerance band of the spatial measurements is likewise reduced to a smaller uncertainty of their average, let's say 0.3, because the number of readings is smaller than that of the total series. If you now calculate the uncertainty of the difference between both averages (the anomaly), you could probably do it with a simple tolerance-chain calculation, t = (a^2 + b^2)^0.5 = (0.2^2 + 0.3^2)^0.5, which results in a total uncertainty of about 0.36.
That means that from measured values of, say, 105 plus/minus 5 you get anomalies of 5 +/- 0.36. That does not mean that the single values of 105 suddenly have a tolerance of +/- 0.36. It just means that the global average at a certain point in time deviates from the total average over time with an uncertainty of +/- 0.36.
While I admit there is much propaganda in the whole AGW theater, the anomaly issue is not part of it, at least as far as I can tell. The more serious issue in the temperature data sets is the systematic errors in the measurements and how they are dealt with by the adjustments. There are huge opportunities for interpretation there.
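The tolerance-chain step in the example above, t = (a^2 + b^2)^0.5 with a = 0.2 and b = 0.3, can be reproduced numerically. This is just an illustration of the root-sum-square rule using the same assumed uncertainties; it says nothing about whether 0.2 and 0.3 are realistic for any actual data set.

```python
import math
import random

random.seed(2)

# Uncertainties assumed in the example above (illustrative values only).
u_time_mean = 0.2     # uncertainty of the long-term (time) average
u_spatial_mean = 0.3  # uncertainty of one month's spatial average

# Tolerance chain (root-sum-square) for the anomaly, i.e. the difference:
u_anomaly = math.sqrt(u_time_mean**2 + u_spatial_mean**2)
print(f"combined uncertainty: {u_anomaly:.2f}")   # ~0.36

# The same result by brute force: perturb both averages with independent
# Gaussian errors and look at the spread of their difference.
diffs = [random.gauss(0, u_spatial_mean) - random.gauss(0, u_time_mean)
         for _ in range(100_000)]
spread = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
print(f"simulated spread:     {spread:.2f}")
```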
Yes and a bit No.
GISS, yes, 2015
HadCRUT, similar but still different adjustments, 2015
NOAA, yes, 2015
I am not a scientist, although long ago I did get an MA in geology as my second major. It seems to me that the correct adjustment should be to lower ship-measured temperatures to match the more accurate buoys, as the group at Berkeley did. If you extend their method back to about 1980, when there were proportionately many more ship measurements, it would seem to necessarily eliminate much of the reported sea-surface warming of the 1980s and 1990s. Wasn't the reported warming of the '80s and '90s the whole reason for this global warming panic to begin with? Think anyone could get funding for such a study?
Nechit, you have a master's degree in geology, but you say "I am not a scientist"?
You are too modest. You might not be doing scientific research, but you are certainly a scientist.
Bill Nye has only a bachelors degree in mechanical engineering, and works as a children’s television entertainer. But he nevertheless has a profile on famousscientists.org, and is known far and wide as “the science guy.” If he’s a scientist, then you certainly are, too.
Sorry, I have a BA — and evidently I can’t type very well.
One can also illustrate the point of this article in another way:
The adjustments to the NOAA SST data are around the same in 1988 as in 2009.
But:
in 1988, SST readings were around 12% from buoys;
in 2009, SST readings were around 62% from buoys.
So the adjustments are not at all primarily due to the fraction of buoys.
Kind Regards, Frank
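Plugging those two years into the same back-of-the-envelope arithmetic sketched earlier (again assuming an illustrative fixed ship-buoy offset of about 0.12 degC) shows why roughly equal adjustments in 1988 and 2009 sit oddly with a buoy-driven correction:

```python
# Same illustrative assumption as before: ships read ~0.12 degC warmer than buoys.
SHIP_BUOY_OFFSET_C = 0.12

for year, frac in [(1988, 0.12), (2009, 0.62)]:
    print(f"{year}: buoy fraction {frac:.0%} -> implied warm adjustment "
          f"~{frac * SHIP_BUOY_OFFSET_C:.3f} degC")
# If the correction were driven mainly by the buoy fraction, 2009 would need
# roughly five times the warm adjustment of 1988, not "around the same" value.
```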
Zeke ran away once the questions got interesting and some explanations were requested.
This is the same guy who warms data as it moves south down the latitudes when he should be cooling it. They really need to get rid of these people.