Guest Post by Bob Tisdale
Just in case you missed the mention of this in the text of the most recent sea surface temperature update…
In a June 2015 post, we discussed and illustrated how Weak El Niños and La Niñas Come and Go from NOAA’s Oceanic NINO Index (ONI) with Each SST Dataset Revision. NOAA has once again revised their Oceanic NINO Index. Now they’re using the base years of 1986-2015 for the NINO3.4 sea surface temperature data starting in 2001, instead of 1981-2010. See the NOAA webpage here for the basis for using shifting base years for their Oceanic NINO Index. Based on the date of that webpage it appears the recent changes took place in June 2016.
The recent changes have shifted the start and end months of some of the “official” NOAA El Niño and La Niña events. But the most noticeable change is the resurrection of the 2014/15 El Niño. See Table 1, where I’ve highlighted the relevant time period for the 3 most recent versions of the Oceanic NINO Index.
Table 1 (Click for full-sized table)
The 2014/15 El Niño registered on ONI when NOAA used their ERSST.v3b sea surface temperature data and the shifting base years, where the base years of 1981-2010 were used for the most recent years (link here). Then in June 2015 NOAA switched to their “pause-buster” ERSST.v4 sea surface temperature data for the Oceanic NINO Index (link here), again using 1981-2010 as base years for the most recent data–the 2014/15 El Niño disappeared with the “pause-buster” data. And now, with the use of the base years of 1986-2015 for the ONI data starting in 2001 (link here), the 2014/15 El Niño has reappeared as an “official” event.
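For readers unfamiliar with the mechanics: ONI is the 3-month running mean of NINO3.4 sea surface temperature anomalies, and NOAA declares an “official” event when that value holds at or beyond ±0.5°C for at least 5 consecutive overlapping seasons. A minimal Python sketch, using invented seasonal values rather than real ONI data, shows why a weak event sits at the mercy of the base period:

```python
# Hypothetical illustration of NOAA's ONI event rule: an El Nino (La Nina)
# is declared when the 3-month-mean NINO3.4 anomaly stays at or beyond
# +0.5 C (-0.5 C) for at least 5 consecutive overlapping seasons.
# The seasonal values below are invented, not real ONI data.

def classify(oni, threshold=0.5, run=5):
    """Label each season 'El Nino', 'La Nina', or None."""
    labels = [None] * len(oni)
    for sign, name in ((1, "El Nino"), (-1, "La Nina")):
        count = 0
        for i, v in enumerate(oni):
            count = count + 1 if sign * v >= threshold else 0
            if count >= run:
                for j in range(i - run + 1, i + 1):
                    labels[j] = name
    return labels

# A borderline warm episode: five seasons just clear the +0.5 C bar.
oni = [0.2, 0.6, 0.65, 0.65, 0.6, 0.6, 0.4, 0.1]
print(classify(oni))  # seasons 2-6 register as a weak El Nino

# Re-baselining to a warmer period lowers every anomaly by 0.2 C...
oni_rebased = [round(v - 0.2, 2) for v in oni]
print(classify(oni_rebased))  # ...and the "official" event disappears
```

A stronger event (anomalies of +1°C or more) survives a 0.2°C baseline shift untouched; it is only the weak, borderline episodes like 2014/15 that blink in and out of the record.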
Once again NOAA has shown that weak El Niños and La Niñas can come and go from their Oceanic NINO Index (ONI) with each revision. And once again, by constantly changing the index, NOAA is showing, very obviously, that their listing of “official” ENSO events has little value.
I think a new phrase needs to be added to the popular lexicon: “Data In, Garbage Out”
Another good four-word phrase:
Welcome to the Adjustocene
“Adjustocene”
Very nice
The value of using anomalies as the basis for inferring what has happened with past data is frequently cited, in WUWT amongst other publications. What is now happening is that, on a whim, some influential organisation can manipulate what the unsuspecting public is told, simply by changing the base period.
Despite known or perceived problems with using raw data as the basis for research into the behaviour of climate, is there not still a case for using unmanipulated data in climate studies?
One method would be to refer all series to the overall mean of the series, which would help overcome any problems due to changes in methods or siting, or at least be consistent and reproducible by anyone.
The overall mean changes with each additional data point.
Did you see this chart?
http://www.cpc.noaa.gov/products/analysis_monitoring/ensostuff/30yrbaseperiods_Nino34._v4.png
30-year periods can give you a misleading direction when there are over 60 years of data.
The El Niño index is a non-physical index. There are no scientific units. It’s defined by CONVENTION, not by nature. It exists only as a useful heuristic tool, a simple metric. Not even as useful as, say, a “consumer price index.” INDEX… don’t forget that. Non-physical.
It’s why El Niño cannot cause warming. It’s merely a conventional, man-made, arbitrary index that has some limited uses.
not really important
Simplistic drivel, lad. ‘It’s merely a conventional, man-made, arbitrary index that has some limited uses.’ Yes, like Crimate scam media headlines.
When we finally come to understand how Earth’s climate works, we might see that the oceans drive the climate.
Presently, we do not know what drives ENSO, nor do we know whether it has any long-term bearing upon climate (obviously it has short-lived regional impacts). But it does appear that during the only warming period for which man can be held responsible, i.e., the circa 1975 to 1998 warming, ENSO was in a predominantly warm phase. I am not attaching attribution, but merely pointing out this fact, which may be the reason behind (in whole or in part) any real warming that took place during that 20-or-so-year period.
See:
http://media.breitbart.com/media/2016/10/ts-gif_thumb-1.png
They simply moved the base period up to the latest 30 years. But there are some values in the middle of the event that are not as warm. I see nothing wrong with this. Simply going to the latest 30-year mean.
Peace
Why is 30 years used?
I’m hearing crickets, soooooo Why is 30 years used ?
“Simply going to latest 30 year mean” Why ?
My understanding is that certain soothsayers and wizard-like acolytes consider a 30 year period to represent “climate” as opposed to plain, old weather. Of course, I don’t understand why not a 50 year period, or better, a hundred year period, as a better representative of “climate”?
“Why is 30 years used?”
“F… you”, they explained.
Additionally, why the most recent 30 years?
Why is it changed every 5 years instead of every year, or every 10 years?
Yes, why 30 years when a 2318-year period is much more relevant???
“The periodic movement of the planets of the solar system generates a set of stable resonances:
1a. Periodic changes in solar activity, solar wind and solar luminosity
1b. Periodic changes in the electromagnetic field of the solar system
1c. Periodic changes in the gravitational field of the solar system
Hence
2a. Periodic changes in the dust amount entering the Earth’s atmosphere
2b. Periodic changes in the cosmic ray amount entering the Earth’s atmosphere, hence periodic changes in radionuclide (C-14 and Be-10) production
Hence
3. Periodic changes in cloudiness inducing albedo changes
4. Periodic changes in the total solar irradiance reaching the Earth’s surface
5. Periodic changes in the Earth’s climate”
Ref. https://tallbloke.files.wordpress.com/2016/09/scfetta-halstatt-2016.pdf
It’s an arbitrary time frame standard used by the World Meteorological Organization, probably based on the also somewhat arbitrary n = 30 sample-size statistical rule of thumb.
I was under the impression that 30-year periods were being used because of the then-30-year satellite record, and that this has since become a standard.
The WMO uses 30 years as the standard period for climatology ‘normals’. These are defined as “Period averages computed for a uniform and relatively long period comprising at least three consecutive ten-year periods.”
What’s wrong with a 40 or 50 year mean?
Thanks for the info SC, appreciated
The most reliable satellite records don’t go back to 1965, so a 50-year mean requires some proxies.
Why can’t they use ALL of the years instead?
At a minimum they should use the length of the longest known climate cycle. That would make 60 years much more logical.
Starting in 2039, the satellite record will be that long, so they might, unless it skews the meme too far.
They should, no question.
Doesn’t matter if the mean would change with each additional data point – the laws of sampling dictate that such changes will be smaller and smaller with each additional data point.
You’re basically talking about grand mean centering. This has the effect of making data sets more comparable because the mean of each resultant data set is zero or very close to zero. It removes scaling issues across data sets while leaving differences in variance (if such differences exist) intact.
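A toy sketch in Python (invented numbers, not real temperature data) of what grand-mean centering does: each series is expressed relative to its own overall mean, so constant offsets between datasets drop out while the variance is untouched.

```python
# Grand-mean centering: subtract each series' own overall mean, so every
# centered series averages to (near) zero. Constant offsets between
# datasets vanish while differences in variance are preserved.
# Toy numbers for illustration only.

def center(series):
    mean = sum(series) / len(series)
    return [x - mean for x in series]

a = [14.1, 14.3, 14.2, 14.6, 14.8]   # e.g. one dataset in absolute deg C
b = [13.6, 13.8, 13.7, 14.1, 14.3]   # same shape, offset 0.5 C lower

ca, cb = center(a), center(b)
print([round(x, 2) for x in ca])  # [-0.3, -0.1, -0.2, 0.2, 0.4]
print([round(x, 2) for x in cb])  # identical: the 0.5 C offset is gone
```

Note the caveat raised above still applies: the overall mean itself shifts slightly as each new data point arrives, so the centered values are only stable in hindsight.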
I have surmised from the comments regarding the settled science that the 30-year period is arbitrary and not based in science: no hypothesis as to why, no theory expressing why, no null hypothesis as to why not, no experiment to prove why or why not, and nobody has replicated the experiment that hasn’t been defined.
Well it’s all ‘smoke and mirrors’ and cherry picking.
Useful info here:
http://www.wmo.int/pages/prog/wcp/ccl/mg/documents/mg2011/CCl-MG-2011-Doc_10_climatenormals1.pdf
Now you see it, now you don’t;
You may believe, but then you won’t.
Catch as catch can,
Took the temperature and ran;
Where’s the truth, where is it, we moan’t.
=========================
This is simply a shift in the baseline for a comparative. It is done periodically to reference the current climate background and is a regular part of the methodology for calculating ONI. There is no “whim,” nor is this intended to deceive. As the saying goes, RTFM.
I thought NOAA regularly re-baselined all their data to the most recent 30-year period every few years? Why they would do it in a year not ending in 5 or 0, I don’t know.
WMO members are required to update their 30-year normals every ten years.
Votes are treated as climate data…..
Votes counted as fractions instead of as whole numbers
https://jonrappoport.wordpress.com/2016/10/10/high-alert-the-election-can-still-be-rigged/
I urge you to dive into her multi-part series, Fraction Magic (Part-1 here). Here are key Harris quotes. They’re all shockers:
“Our testing [of GEMS] shows that one vote can be counted 25 times, another only one one-thousandth of a time, effectively converting some votes to zero.”
“This report summarizes the results of our review of the GEMS election management system, which counts approximately 25 percent of all votes in the United States. The results of this study demonstrate that a fractional vote feature is embedded in each GEMS application which can be used to invisibly, yet radically, alter election outcomes by pre-setting desired vote percentages to redistribute votes. This tampering is not visible to election observers, even if they are standing in the room and watching the computer. Use of the decimalized vote feature is unlikely to be detected by auditing or canvass procedures, and can be applied across large jurisdictions in less than 60 seconds.”
“GEMS vote-counting systems are and have been operated under five trade names: Global Election Systems, Diebold Election Systems, Premier Election Systems, Dominion Voting Systems, and Election Systems & Software, in addition to a number of private regional subcontractors. At the time of this writing, this system is used statewide in Alaska, Connecticut, Georgia, Mississippi, New Hampshire, Utah and Vermont, and for counties in Arizona, California, Colorado, Florida, Illinois, Indiana, Iowa, Kansas, Kentucky, Massachusetts, Michigan, Missouri, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, Wisconsin and Wyoming. It is also used in Canada.”
“Instead of ‘1’ the vote is allowed to be 1/2, or 1+7/8, or any other value that is not a whole number.”
How does it help the CAGW narrative if a certain period is or is not classified as El Nino?
Climatists have regularly posited that CAGW would produce (lead to) more El Niño events, and eventually, with enough warming, a persistent El Niño-like state for the tropical Pacific.
Still the best video of Michael Crichton and assorted scientists explaining the corruption of “Climate Science”:
https://youtu.be/MDCCvOv3qZY
These types of bureaucratic manipulations always prove Orwell right.
“Who controls the past controls the future: who controls the present controls the past.”
― George Orwell, 1984
Climate Change faithful, with their climate prophesies in hand (as in CMIP5), are all about controlling the future, and will adjust the past as needed to keep their fingers wrapped firmly around the neck of the present.
The future is fixed, it’s the past that keeps changing.
The above video also includes a great oration by the great William M. Gray..
Thanks Bob, this illustrates one of the flaws of anomaly data presentation.
Somebody always gets to make a decision about the base period. Just look at NASA GISS surface temperatures: they stick with the 1951-1980 base period, some of the coolest years of the record. If they used the same base period as UAH (1981-2010), their charts wouldn’t look nearly as scary in amplitude. The slope would be the same, but the amplitude is affected by the base period, which is what you are seeing in the ENSO data changes.
Base period is an arbitrary choice, and when a human decision affects the presentation of data, that’s not very scientific. Mike Mann and his “Nature Trick” showed us that clearly:
https://wattsupwiththat.com/2009/11/20/mikes-nature-trick/
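The slope-versus-amplitude point is easy to verify numerically. A minimal sketch with synthetic data (a made-up linear warming series, not GISS or UAH values): re-baselining subtracts a different constant, so every anomaly shifts vertically while the trend is untouched.

```python
# Changing the anomaly base period shifts every value by a constant
# (the difference between the two base-period means), so the slope of
# any trend fit is unchanged; only the apparent amplitude moves.
# Synthetic data for illustration only.

temps = [14.0 + 0.01 * year for year in range(60)]  # steady 0.01 C/yr rise

def anomalies(series, base_start, base_end):
    base_mean = sum(series[base_start:base_end]) / (base_end - base_start)
    return [x - base_mean for x in series]

early = anomalies(temps, 0, 30)   # baselined on the cool early years
late = anomalies(temps, 30, 60)   # baselined on the warm recent years

# The two anomaly curves differ by a constant offset everywhere.
offset = early[0] - late[0]
assert all(abs((e - l) - offset) < 1e-9 for e, l in zip(early, late))
print(round(offset, 3))  # 0.3: 30 years * 0.01 C/yr between base means
```

So a cool base period inflates the apparent size of recent anomalies (and vice versa), exactly as the comment above describes, without changing the slope at all.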
…Why do they not use ALL the data available for the mean? Would that not make more sense?
If by “more sense” you mean it would make for far more objective scrutiny, then yes, it absolutely would.
Dr. Roy Spencer has a good explanation of what causes El Nino. If you’re not clear on the concept, it’s an excellent place to start.
You could average climate data for 1000 years, it would still mean nothing. There is no mean climate on any timescale.
DNF63.
Maybe if you try 2318 years?
Ref. https://tallbloke.files.wordpress.com/2016/09/scfetta-halstatt-2016.pdf
I am not sure I see the importance of this. Classifying a weather pattern as an El Niño event is an arbitrary decision based on a set of human-defined criteria. Thus there will always be patterns that are extremely close to being classified as an El Niño and which will move up or down with any change in baselines or improved measurements. This would appear not to be about changing the data but about changing the classification criteria, which are arbitrary to begin with.
Interesting,
The linked chart with the explanation has a flaw… no December.
When the facts don’t fit the theory, change the facts. “Only the future is certain … the past is always changing.”
The above video is a brilliant common-sense rebuttal of the Hi-Jackers of the Agenda & the principles by which they tendentiously advance their Alarmist GW cause. Absolutely must-watch!
In a sea of misinformation and disinformation and outright lies by the mannipulators of the script, designed to panic and rush voters and tax-payers into more and more useless funding for their benefit, this video is a rock of principled thinking … to which all of us rational, far-sighted, sage observers should return when the tsunami of un-truths looks overwhelming.
NOAA is in desperation mode to lower the bar to show that 2000 through 2016 had an El Nino Every Year!
99 Bottles of Beer On The Wall.
https://www.timeanddate.com/countdown/generic?p0=263&iso=20170120T00&msg=Time%20left%20until%20Obama%20leaves%20office%22
Constantly updating the 30 year ‘normal’ would be the last thing anyone would want to do if they were trying to make the present look much warmer than the past.
The reason for the continuous updating is to take account of the underlying rise in Pacific sea surface temperatures. If they just left everything at the 1981-2010 anomaly base, for instance, then the proportion of ‘El Niño’ years would be higher towards the latter end of the data.
They are trying to identify a change from ‘normal’ in conditions where ‘normal’ is continuously changing in a warming direction.
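That reasoning can be checked with a toy model in Python: synthetic numbers only, a steady warming trend plus a fixed ENSO-like oscillation, with nothing drawn from the actual ONI data. Under a fixed baseline the later, warmer half of the record clears the +0.5°C threshold far more often even though the oscillation itself never changes; a sliding 30-year baseline keeps the count comparable.

```python
# With a warming background, a fixed base period makes later years more
# likely to clear the +0.5 C anomaly threshold even with no change in
# ENSO itself. A sliding 30-year baseline keeps the threshold relative
# to the current background. Synthetic data for illustration only.
import math

# 60 years of SSTs: a steady warming trend plus a fixed 4-year cycle.
sst = [26.0 + 0.02 * y + 0.6 * math.sin(2 * math.pi * y / 4)
       for y in range(60)]

def count_warm_events(series, baseline):
    return sum(1 for x, b in zip(series, baseline) if x - b >= 0.5)

# Fixed baseline: the mean of the first 30 years, applied everywhere.
fixed = [sum(sst[:30]) / 30] * 60

# Sliding baseline: mean of the trailing 30 years (first half uses fixed).
sliding = fixed[:30] + [sum(sst[y - 30:y]) / 30 for y in range(30, 60)]

first_half = count_warm_events(sst[:30], fixed[:30])
second_half_fixed = count_warm_events(sst[30:], fixed[30:])
second_half_sliding = count_warm_events(sst[30:], sliding[30:])

# The fixed baseline flags far more warm "events" in the warmer half,
# purely from the background trend; the sliding baseline does not.
print(first_half, second_half_fixed, second_half_sliding)
```

Whether that re-centering is sound practice or, as the post argues, a way of quietly rewriting the “official” event list, is exactly what is in dispute here.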
Well, here in Aus, La Niña seems to be here and hopefully staying around for a while ;-)
This stuff depresses me so much I won’t bother to highlight my predictions on this over the past week in a few other threads. All that remains is redefining “landfall” for hurricanes, maybe tweaking the Saffir-Simpson scale, jiggering polar ice, etc. If you want an insight into my method of predicting this stuff (incidentally, I coined the term Karlization), it is this: if something has bugged the CAGW team for more than 10 years by standing in contrast to activist climate science, it will be snuffed out in some way. Monckton got on their nerves over the Pause and it was getting a lot of traction: poof, Karlization. They are getting ever more bold. We need a much expanded surfacestations.org for every climate metric before everything is shifted to make the climate models correct.