Discrepancy in NSIDC press release vs. data puts turning point for end of Arctic ice melt 3 days earlier

Yesterday, as covered by WUWT, NSIDC announced that Arctic sea ice melt had turned the corner on September 10th with a value of 4.14 million square kilometers:


Source: http://nsidc.org/arcticseaicenews/2016/09/2016-ties-with-2007-for-second-lowest-arctic-sea-ice-minimum/

XMETMAN writes of his discovery of a discrepancy between what NSIDC announced yesterday, and what their data actually says. I’ve confirmed his findings by downloading the data myself and it sure seems that the minimum was on September 7th, and not the 10th:


Source: ftp://sidads.colorado.edu/DATASETS/NOAA/G02135/north/daily/data/NH_seaice_extent_nrt_v2.csv

He says on his blog:

The Arctic sea ice looks to have reached its minimum on the 7th September, which is four days earlier than average. The sea ice extent bottomed out at 4.083 million square kilometres making it the second lowest since records started in 1978 – well that’s according to the data file that I’ve just downloaded!

Strangely, according to the data that I download from the National Snow and Ice Data Center [NSIDC] the minimum occurred three days later on September 10th. As I said in my introduction, on the 7th the value was 4.083, but according to the news item that I’ve included below, the value on the 10th was 4.14 million square kilometres and tied it with the year 2007, which according to the data file is third.

All these daily values translate into the following chart with the minimum occurring on the seventh and not the tenth of September.


More here: http://xmetman.com/wp/2016/09/16/10th-or-7th/

It is a puzzle. Perhaps whoever wrote the NSIDC press release looked at the 5 day average value in their Charctic interactive graph instead of the raw data? When using that tool, the data rounds to 4.14 as you can see:


NSIDC is closed for the weekend, so perhaps we will get an answer to this puzzle on Monday.
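For readers who want to check this at home, the whole discrepancy can be reproduced in a few lines. The sketch below is in Python with made-up daily values chosen only to mimic the shape of the 2016 curve (only the 4.083 on the 7th matches a reported figure); the real numbers live in the NH_seaice_extent CSV linked above:

```python
# Illustrative daily extent values (million km^2). These are invented numbers,
# not the actual NSIDC file; only the 4.083 on the 7th matches a quoted value.
daily = [
    ("2016-09-03", 4.260),
    ("2016-09-04", 4.220),
    ("2016-09-05", 4.190),
    ("2016-09-06", 4.140),
    ("2016-09-07", 4.083),   # lowest single-day value
    ("2016-09-08", 4.100),
    ("2016-09-09", 4.110),
    ("2016-09-10", 4.120),
    ("2016-09-11", 4.150),
    ("2016-09-12", 4.180),
]

def trailing_mean(series, window=5):
    """5-day trailing mean: each value averages that day and the 4 preceding days."""
    out = []
    for i in range(window - 1, len(series)):
        vals = [v for _, v in series[i - window + 1 : i + 1]]
        out.append((series[i][0], sum(vals) / window))
    return out

daily_min = min(daily, key=lambda dv: dv[1])
smoothed_min = min(trailing_mean(daily), key=lambda dv: dv[1])

print("daily minimum:   ", daily_min)      # falls on 2016-09-07
print("smoothed minimum:", smoothed_min)   # falls on 2016-09-10
```

With the real file you would parse the extent column instead of hard-coding values; the point is simply that a trailing mean can report a later, higher minimum than the daily data.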



138 thoughts on “Discrepancy in NSIDC press release vs. data puts turning point for end of Arctic ice melt 3 days earlier”

    • Honestly, who cares? It’s just ice. It comes, it goes. Northampton are playing Man Utd in the League Cup on the 21st.

    • Actually the opposite. If the Arctic sea ice average started to grow, that would be a sign of cooling. The risks are tremendously asymmetric towards the side of cooling. Cold is a huge killer in itself, but when it affects food production it can depopulate countries. In 1696 Finland lost half of its population to a cold-related famine. Can you imagine that today? Half of the population of a country wiped out in a single year?

      • Or maybe the definition has always been taken as minimum on a 5-day average? Might sound stupid, but maybe not.

      • Meh, definitions are flexible. They mean whatever we say they mean… and are subject to change… without notice.

      • Well Craig, it doesn’t just sound stupid; it IS stupid.

        Even the dumbest statistician knows that the average can never be less than the minimum.

        Well the average can equal the minimum, but only if it is also the maximum.


      • As I understand it, a 9 day mean was used prior to August 18th, 2012. How this changes the comparison of the minimum in 2007 with a 9 day mean to 2016 with a 5 day mean, I don’t know.

  1. I read somewhere today that NSIDC uses a rolling 5-day average, the current day actual and the preceding 4 days actual, for this determination. The numbers work. I’d be interested to know why. My guess is to smooth the chart, and to average out measurement uncertainty.

    • From the NSIDC website “About the data” information page:

      “NSIDC produces the daily extent image and graph using a five-day trailing mean. Please note, the values provided are the individual daily values, not the five-day average values that NSIDC displays for its daily extent image and graph.” http://nsidc.org/arcticseaicenews/about-the-data/

      As you say, applying this 5-day trailing mean makes 10 September the lowest ‘daily’ value (4.137). As mentioned above, the lowest individual daily value was actually 4.083, which occurred on 7th September.

      • So September 10 was not the lowest daily value, but the lowest 5-day period was between September 6 and September 10.

        The lowest daily value is not that precise, nor do daily changes have any significance.

      • What is it about the LOWEST daily value that makes it not “that precise”, and why would we expect any of the other daily values to then be more precise than that lowest one??

        The chances are that the five day average number was NEVER ever observed at any time.


      • Did I say any other of the values are more precise? We were talking about the lowest value, so that’s why I mentioned that.

        But who cares, there are always error bars, even if they didn’t draw them.

    • Well if you average five numbers, you do not reduce the uncertainty.

      In fact you increase the likelihood that the number you get is not even one of the numbers that were measured.

      You can’t ever do better than the numbers that were actually measured. That’s why they call it DATA; something real.

      Average is something you conjure up from the real numbers.


      • Thank you George, thank you. I am butting heads daily with people who think that a calculated value has more ‘validity’ (or something) than a measured value. In cases where measurement is impossible they settle for making things up. They claim to be clever enough to have their made-up numbers mean something. And I should believe that.

        An average is a model. I just saw a system efficiency test result. The numbers were
        High power 61.6
        Medium 66.0
        Low 66.1
        High (a mystery number)
        Time-weighted Average 55.2


        So what was the mystery number, high power again? How can the average be lower than all the data?

        Well the mystery number was 6. Yup, and that was not an ‘entered value’. No typos. It turned out to be a raw data copy/pasting error. The real calculated value was 61.9, very close to the first measurement.

        So the average is useful as a data quality check, sometimes.
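That sanity check can be mechanised. A tiny sketch (Python; the helper name is hypothetical): any average, however weighted, must lie between the smallest and largest of its inputs, so a reported average outside that range proves something is wrong:

```python
def average_in_range(values, reported_avg):
    """Any average (weighted or not) of a set of readings must lie between
    the smallest and largest reading; a report outside that range is a red flag."""
    return min(values) <= reported_avg <= max(values)

# The three legible readings from the efficiency test described above:
readings = [61.6, 66.0, 66.1]

# A time-weighted average of 55.2 is impossible unless the fourth (illegible)
# reading was far below the others -- which is how the 6-vs-61.9 typo was caught.
print(average_in_range(readings, 55.2))   # False: flags the entry error
print(average_in_range(readings, 63.0))   # True: a plausible report
```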

  2. The five day moving average makes sense; it filters out a lot of noise. They’ve been doing this for years. It’s consistent and reasonable – nothing fishy.

    • It may not be fishy, but it’s not accurate. It shifts the actual day of the event. It would be better to average the data with the two previous days, and the two future days. This wouldn’t shift the day of the event, though the calculation would arrive two days late.

      • “It would be better to average the data with the two previous days, and the two future days. This wouldn’t shift the day of the event, though the calculation would arrive two days late.”

        I agree that centring the average would be better, but even doing this can still shift the day of the event. For instance, using a centred 5 day average this year would have shifted the published minimum from the 10th to the 8th. This is closer to the ‘true’ best estimate individual daily minimum, which occurred on 7th, but it’s still not quite on the money. It makes little difference in the end.

      • DWR54 said below: “It [a 2 day error] makes little difference in the end.”

        I would disagree. In an article that headlines a “three day” error, a two day error in the analysis method is not trivial. Filtering always can introduce errors, and a phase shift makes them even worse. Since there are better methods, it’s better to use them. A “boxcar” average using only historic data is about the worst you can get. A zero phase shift filter, and even adding a couple of weighting coefficients to the factors to get predictable filter characteristics is simply better science. One could even advocate for a Kalman filter, though that could bring up endless debates about the model.

      • Tom – Even if your understanding is exactly correct, the fix is to shift EVERY data point back 2 days.


      • Frederic Michael said: “the fix is to shift EVERY data point back 2 days. Right?”

        No. The fix is to apply a proper averaging technique that includes an equal number of data points ahead of and behind the current day. Of course, you have to wait for the data from the forward days before you can make the calculation, so you’re always a couple of days late with the results.
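A toy comparison makes the phase shift concrete. Assuming a clean, made-up V-shaped series with its true minimum on day 5, a 5-point trailing mean reports the minimum two days late, while a centred mean of the same width lands on the right day:

```python
# Toy series (made-up): a clean V with its true minimum on day 5.
x = [10, 9, 8, 7, 6, 5, 6, 7, 8, 9, 10]

def argmin_trailing(series, window=5):
    """Day of minimum of the trailing mean (each day averaged with the 4 before it)."""
    means = {i: sum(series[i - window + 1 : i + 1]) / window
             for i in range(window - 1, len(series))}
    return min(means, key=means.get)

def argmin_centred(series, window=5):
    """Day of minimum of the centred mean (2 days either side of each day)."""
    r = window // 2
    means = {i: sum(series[i - r : i + r + 1]) / window
             for i in range(r, len(series) - r)}
    return min(means, key=means.get)

print(argmin_trailing(x))  # 7: the trailing mean reports the minimum 2 days late
print(argmin_centred(x))   # 5: the centred mean lands on the true day
```

The centred version must of course wait two days for the forward data before it can report, which is the trade-off described above.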

    • For calling a bottom, or a top, it’s BS. For charting/reporting, the legend of the chart should be explicit about the averaging.

      • Well the fix with trying to see what data is saying, is to use the data that you have measured. That contains the most information you can ever have about what happened.


    • The five day moving average makes sense;

      The only sense it makes is the fact that the Arctic sea ice extent calculated via satellite imagery is utterly FUBAR …… and thus the calculated one (1) day or five (5) day average(s) should only be used as reference info/data …… and/or for plotting pretty “graphics”.

    • When I was running a database of Arctic sea ice extent, I used a 3 day centered moving average. That did away with the variability and the noise and kept the dates at the proper time.

      The average minimum day is actually September 12th, or day 255 of the year. Use a trailing average and you get that date wrong.

      And it is important enough to know the actual day of the minimum because all kinds of seasonality trends are touted in climate science, so this needs to be done right.

      • 12th Sept is day 255. The NSIDC 1981-2000 climatology has its min 2.5d later as expected due to their trailing average.

        256, 6.271, 0.843
        257, 6.283, 0.851
        258, 6.284, 0.859
        259, 6.287, 0.873
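For what it’s worth, the day-of-year bookkeeping itself is easy to get wrong, because it shifts by one in leap years: September 12 is day 255 in a common year but day 256 in a leap year such as 2016. A two-line check with the Python standard library:

```python
from datetime import date

# Day-of-year for 12 September in a leap year and in a common year.
print(date(2016, 9, 12).timetuple().tm_yday)  # 256 (2016 is a leap year)
print(date(2015, 9, 12).timetuple().tm_yday)  # 255
```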

      • And today is September 18, 2016, which is day 261, and the atmospheric CO2 ppm is still decreasing just like it has been doing all summer long ……. and the Arctic sea ice has begun to refreeze instead of melting like it had been doing all summer long ….. simply because the (NH) ocean water has begun its wintertime cooling cycle.

        And likewise, the Antarctica sea ice will surely have begun its melting instead of freezing like it had been doing all (SH) winter long ….. simply because the (SH) ocean water has begun its summertime warming cycle.

        And four (4) days from now, September 22, 2016, the Autumnal equinox will occur, with the Sun directly overhead at the Equator, ….. and the atmospheric CO2 ppm will still be decreasing just like it has been doing all summer long.

        But then, at some point in time between September 23rd and October 4th the atmospheric CO2 ppm will stop decreasing and begin increasing ….. and will continue to increase until mid-May 2017, which will “mark” the high-point or maximum atmospheric CO2 ppm for the 2016-2017 cycle.

        The seasonal change in temperature of the ocean water in the Southern Hemisphere is the “driver” of the yearly cycle of atmospheric CO2 ppm quantities as measured at the Mauna Loa observatory and/or as denoted via the Keeling Curve graph.

    • The “noise” you mention is usually called “data”. That is what you measured.

      If you filter out ALL of the NOISE from the global temperature, say with a 600 million year running average, you get 17 deg. C as the noise-free temperature we have had for 600 million years.

      I’m sure we can live with that.


      • What counts as noise and what counts as signal depends on what you are looking for.

        The timing of short term cycles and weather driven stochastic variability is noise in the context of trying to see long term changes in the annual cycle.

        You could try to reduce the erratic dates gained from unfiltered ice extent, but with only 37 data points the record is very short already. Low-pass filtering the data before searching for the minima means you still have 37 years and the result is less noisy.


      • Well the best way to find out about long term changes, is to make measurements over a long term.

        Noise is something that is unrelated to the phenomenon being monitored, and is also not predictable.

        So noise changes the apparent value of a variable you are trying to observe in an unpredictable way.

        Since the noise is unpredictable, it also cannot be removed by any process, without the possibility of altering the real value in some other manner.

        You can reduce the mean noise power by suitable filtering, but it doesn’t make the instantaneous values of the remaining noise any more predictable.
        The more you know about the real signal you are trying to measure, the better you can separate it from the noise. Also the more complex the real signal is and is known to be, the better able you are to design the best filter for extracting the signal from the noise.

        That’s precisely how the two gravitational wave signals reported so far were detected. The raw data was collected in digital form so it could be played back indefinitely; and then it was run through many matched filters, each specifically designed for just one specific postulated event.

        And only one of those filters would actually give the correct response to the stored signal, and that was the response that filter was designed for. All the other filters would give a response that did not match the event they were designed for.
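That playback-through-matched-filters idea can be sketched in miniature. Everything below is invented for illustration (an arbitrary damped-oscillation template, bounded uniform noise, a hidden offset of 40); it is not the LIGO pipeline, just the core trick of sliding a known template along a noisy record and taking the best-correlating offset:

```python
import math
import random

random.seed(0)

# A made-up "chirp" template: a damped oscillation 11 samples long.
template = [math.exp(-((t - 5) ** 2) / 4.0) * math.sin(2.0 * t) for t in range(11)]

# A noisy stream with the template (doubled in amplitude) buried at offset 40.
n, hidden = 100, 40
signal = [random.uniform(-0.2, 0.2) for _ in range(n)]
for i, s in enumerate(template):
    signal[hidden + i] += 2.0 * s

def matched_filter(x, h):
    """Slide the template along the stream; score = correlation at each offset."""
    return [sum(x[i + k] * h[k] for k in range(len(h)))
            for i in range(len(x) - len(h) + 1)]

scores = matched_filter(signal, template)
best = max(range(len(scores)), key=scores.__getitem__)
print("best matching offset:", best)  # 40: the filter recovers the hidden event
```

Only the offset where the template actually sits produces a large correlation; everywhere else the score is bounded by the template’s autocorrelation sidelobes plus the noise.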


  3. The 10 day forecast shows the Arctic cooling considerably.

    The west U.S. has already had early freezes and up to a meter of snow has fallen in parts of the Mid Rockies. I get the feeling that it’s going to be an early and hard winter with bitter cold in early February (okay that’s what the Farmers Alm is predicting ;) Hope the natural gas pipelines can handle it.

    • (okay that’s what the Farmers Alm is predicting). Years ago, about 30, myself and 4 other wx men (as we were known then), just for a lark, did a study on all the Farmer’s Almanacs we could find, and compared their outlooks to ours. Generally, when predicting long term stuff, like winter temps/snow, spring temps/break-ups, etc., we hit a not shabby 58%. They hit 86%!!! No wonder so many of us still use them.

    • My very elderly horse started shedding his summer coat in June, and growing his winter coat then. Last time he did this, a couple years ago, we did have an early and hard winter. How he knows, or his body knows, I have no idea, but I have been saying to friends for two months now that this is what I expect.

      I am told that horse owners and vets all over the midwest have noticed unusual shedding patterns in their horses, so perhaps he is not too unusual.

      • Rat nests are another indicator according to old timers: the more substantial the nest found in the late summer / early fall, the colder the fall / winter they predicted.

    • What temperature does natural gas freeze at ??

      Never heard of natural gas pipes breaking due to cold. I’ve actually had water pipes break when the Temperature got down to 14 deg. F one day.


      • Well heads freeze at the orifice choke depending on pressure drop and moisture in the natural gas. Wells that make heavier salt water are a different story. Many wells have heater treaters and dehydrators to help remove the moisture and prevent freeze-up at other pressure drops down the pipeline. When unseasonably cold winters occur it is much more difficult to produce the gas because of freeze-ups.

      • Uh, …. uh, …. George E, me thinks Robert W was referring to the volume or quantity of NG that the gas pipes can “handle” iffen there is a tremendous surge in demand for the NG.

        And yes, it’s always a good idea to have one of these at the well-head or somewhere in the NG pipe line, to wit:

        Gets the H2O out of the NG before it gets a chance to “collect” in a low spot of the gas line and block the flow of NG …. and worst case it freezes solid.

  4. On today’s earlier WUWT thread “NSIDC: 2016 ties with 2007 for second lowest Arctic sea ice minimum”, Ufasuperstorm posts

    “…Additionally, NSIDC always fails to mention sea ice extent minimum values before 2012 were based on a nine day trailing mean.


    Since then we have a 5 day trailing mean, which makes the minimum extent value appear lower than it would be had there been a 9 day trailing mean…”

    The issue here seems two-fold:

    1) The discrepancy in the NSIDC press release.
    2) Failure to disclose a material methodological change. (Using data provided by Ufasuperstorm, the 5-day calculation resulted in as much as a 2.5-3.5% LOWER ice coverage number than the 9-day calculation; this difference varied day to day.)
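The direction of that effect is easy to demonstrate on a toy series. With a made-up V-shaped “melt season”, a wider trailing window both raises the reported minimum and pushes its date later:

```python
# A symmetric, V-shaped toy "melt season" (made-up numbers): extent falls to a
# single sharp minimum of 0 at day 10 and then recovers.
extent = [abs(n - 10) for n in range(21)]   # 10, 9, ..., 1, 0, 1, ..., 10

def trailing_mean_min(series, window):
    """Return (day, value) of the minimum of a trailing mean of the given width."""
    means = [(i, sum(series[i - window + 1 : i + 1]) / window)
             for i in range(window - 1, len(series))]
    return min(means, key=lambda dv: dv[1])

day5, min5 = trailing_mean_min(extent, 5)
day9, min9 = trailing_mean_min(extent, 9)

print("5-day:", day5, min5)   # later than the true minimum, and above 0
print("9-day:", day9, min9)   # later still, and higher still
```

So, all else being equal, switching from a 9-day to a 5-day trailing mean lowers the reported minimum, which is the commenter’s point.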

  5. Mark Serreze and his Vietnam Agent Orange Swigging minions have been fudging the numbers for years; remember the 2007 AGU Fall Meeting Abstract of Serreze!

    The more deserving just do not exist!

    Ha ha

  6. Personally I think it is a simple mistake, occasionally I make them myself. It would be disappointing to see this post turn into a beat up session of the NSIDC. They do a great job, give them some slack.

    • Yes, this seems like a bit of a non-issue to me. If they have been doing this kind of economist’s trailing average on all the data, it makes no difference to those who are worrying about what day the annual min happens. It inserts a phase shift of 2.5 days into the data but does so consistently.

      The only problem I see here is that they could do a much better job of removing noise and measurement error if they did not use such a CRAP filter.

      Running averages are notorious for leaving a lot of high frequency noise which you intended to remove and often inverting peaks and troughs in the data. So if you are getting excited about exactly what day of the year the min. occurs you could invert the true trough and incorrectly get an earlier or later date ( which itself could be a local peak).

      Running means must die
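The peak-inversion complaint is not rhetoric: a boxcar’s frequency response goes negative for some periods. For a 5-point running mean the gain at a period of 3 samples is sin(5*pi/3)/(5*sin(pi/3)) = -0.2, so a pure period-3 wiggle comes out upside down. A minimal sketch (the period-3 sinusoid is just an illustrative input):

```python
import math

# A pure sinusoid with a period of 3 samples (made-up test input).
x = [math.sin(2 * math.pi * n / 3) for n in range(30)]

def centred_running_mean(series, window=5):
    """Plain centred boxcar average; only full windows are returned,
    so output index i corresponds to input index i + window // 2."""
    r = window // 2
    return [sum(series[i - r : i + r + 1]) / window
            for i in range(r, len(series) - r)]

y = centred_running_mean(x)

# x[4] is a peak of the input (sin(8*pi/3) ~ +0.87); the smoothed value
# aligned with it is y[2], which comes out negative: the peak became a trough.
print(x[4], y[2])
```

A peak of the raw series can therefore appear as a trough of the smoothed one, which is exactly the failure mode described above for locating the day of the minimum.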

      • Well Greg, ANY filter will change the result. That is the purpose; to create the illusion that something which never ever happened was actually real.

        You are talking about doing statistical manipulations on a set of numbers for which there is NO EXPECTATION that they should be the same number.

        The result of doing that is quite meaningless.

        If you do statistical manipulations on a set of numbers for which there is an expectation that they represent some actual fixed value, then you can claim that such a process can reduce noise.

        In the case under consideration; you are talking about a measured observation that will NEVER HAPPEN AGAIN.

        Anything that might occur at some later time, can not tell you anything about something that already happened and won’t happen ever again.


      • I like smooth curves. Data wiggles just look like cellulite.

        Seriously, though, I’m more interested in climate at WUWT. Not so much weather.

        I leave pounding on data to good ol’e “Wandering in the Weeds” himself, Mr. Mosher. I’m surprised that the Best team hasn’t ginned up a satellite data set! Maybe he is consulting with Mr. Mears?

      • So sayeth George E,

        In the case under consideration; you are talking about a measured observation that will NEVER HAPPEN AGAIN.

        And that includes all of the monthly and/or yearly Global/Regional Average Temperature measurements, estimates, guesstimate, extropolatedments, calculations, etc..

  7. I am honestly confused.

    It would seem simple to adjust prior data to be consistent with current methods of analysis. Has this not been done? None of what I have read here gives a clue.

    Maybe it is too late in the day. I can’t make heads or tails of what I’ve read.

    • I don’t see anything which needs adjusting unless someone can show that they have changed their methods, and I don’t think they have. I did this plot a few days ago based on their online graph cursor readouts. That is presumably the 5d trailing average, so we could subtract 2.5 days but the relationship would stay the same.

      Now I have the raw data I will try a different filter and see how the results compare. I think a longer filter would be better, but since there is much (illogical) excitement about guessing the exact day this occurs, they do not want to be the last ones out of the door by releasing more appropriately filtered data a week or two later.

      • In my humble opinion, it hardly matters what smoothing approach is used, when the major purveyors of sea ice extent can’t, or won’t, agree on the number to within 10%. Take a look at DMI, JAXA, MASIE and the “standard” from NSIDC to see what I mean. If Millikan had gotten the electron charge to within 10% 100 years ago, would he have published, or simply tried again (he got within 1%)? That these guys think this is real science when after over 35 years and countless millions they can’t agree on an answer (but are convinced they are internally consistent), is beyond me. I say come back with a plan to converge to an agreed number, or quit pretending to know the answer.

      • While I agree some of the “excitement” unnecessarily comes from wanting to scoop others with the news of minimum extent timing, I think the fact that it’s early or late can be significant. For example, this year, per DMI, the drop below freezing above 80 degrees latitude was right on the long term average, but the refreeze of ice/minimum extent was early. That could imply that the water is colder than normal, which could further imply that all the open water during the spring/summer caused a loss of ocean heat, not an increase as the albedo conjecture would have us believe. Caleb over at Sunrise Swansong makes this point better than I, but the notion of the earliness or lateness of the minimum can be significant, IMHO.

      • What I was saying is that it is self-consistent. If you are going to start comparing different data (like air temp and ice area) then you’d better take a good look at what you are using and the processing, and ask yourself whether they are compatible before attempting to draw conclusions.

        As for different bodies getting different answers, that is TOTALLY legit and reflects the uncertainty of the satellite extraction processes. The last thing we need is these groups “getting together” to produce ‘homogenised’ data that all look the same and give a totally false idea of the accuracy and the uncertainty.

        This is exactly what has happened with the surface temp record. There are international conferences on “homogenisation” attended by folks like Phil Jones where the object is to massage all datasets to give the same message. This is not science, it is post hoc data massaging.

      • “This is exactly what has happened with the surface temp record. There are international conferences on “homogenisation” attended by folks like Phil Jones where the object is to massage all datasets to give the same message. This is not science, it is post hoc data massaging.”

        more conspiracy crap

        Jones does not homogenize data. He takes data from NWSs. Example: Canada. Env Canada has
        about 7000 stations. From these 7000 they select about 200 and homogenize them.
        Jones uses this, WITH NO CHANGES. Env Canada has also submitted around 3200 raw stations
        to GHCN-D, and a subset of these make it into GHCN-M, again raw.
        NOAA takes this raw data and creates adjusted data using code that was published years ago
        and hasn’t changed. GISS uses this adjusted data. They do not homogenize data; the only change
        GISS makes is to adjust for UHI. Nobody else uses their approach.
        Berkeley uses the 3200 daily stations, raw, and we do our own completely independent adjustment.

        So in the case of Canada, let’s recap.

        Env Canada has about 7K stations. Approx 3000 of these go to GHCN-D, and a smaller number
        end up in GHCN-M (around 1200 active stations today). In addition, they publish an official series
        for Canada made up of 200 stations, adjusted by local experts.

        1. Jones uses official Canada data. He does not adjust it.
        2. NOAA and GISS use GHCN-M:
        A) NOAA adjusts with PHA.
        B) NASA inputs adjusted NOAA data and applies a UHI adjustment.
        3. Berkeley uses GHCN-D daily raw, and does a completely different approach.

        This is not an isolated example. CRU has about 5K stations. They use the official versions
        from each country’s NWS. Different countries all use different methods.

        NOAA uses raw data from the NWSs, and adjusts it their own way, using PHA.
        GISS uses NOAA adjustments and does a UHI adjustment.
        BE uses daily data and we do our own thing. In fact, you can go find the papers
        comparing our versions of various countries against the official versions adjusted by the local
        experts. They differ.

      • Thank you kindly for this graph, Greg. I look forward to finding out what happens with your different filter. I would also be interested in seeing a similar graph of the day of maximum extent, in order to gauge the length of the melt season — alas, visiting the charctic site myself, I get only a blank where a chart should be.

      • I assume that this is an actual replica of a graph drawn by somebody at NSIDC.

        One can make the following observations about this graph.

        The “function” represented by this graph is NOT a band limited function. It clearly contains infinite frequency components, clearly observable by regions of infinite curvature and instantaneous slope changes. It appears that the graph contains 37 actual sampled data values, maybe over a 36 or 37 year interval of time.

        The Nyquist criterion requires that there be at least 2Bt samples taken in a time interval (t) to properly represent a band limited (B) signal.

        Well I have already said the shown function is NOT band limited anyway so that graph is as phony as a three dollar bill, and NO real information is contained in it. The 37 marked data points, may in fact be real, but the rest of the graph is total bullshit.


      • Steve M and All

        “Canada has about 7000 stations. from this 7000 they select about 200 and homogenize them”

        The pre-cooking is done by Env Canada, which is in this up to their gunwales, so there is no need to re-cook them. Those ‘200’ stations selected have not been consistent (check), and Env Canada is one of those national services in a league populated by a few others: New Zealand for example.

        The 200 have been chosen to ‘raise the temperature of Canada’ which is saying something because for the most part it hasn’t budged in decades. Waterloo hasn’t warmed in 100 years. That is according to a data set I found on line for Waterloo Ontario. Last winter we hit -34.5 C in March(!), a new record. This winter is probably going to be terrible according to all forecasts.

        Env Canada is one of the most untrustworthy data set generators anywhere. They are in the CAGW scam up to their eyes, even crushing dissent within their department when experts tried to expose their shenanigans, particularly by their not-at-all-missed former cheerleader. What a sorry bunch of things they have been up to since joining hands with Hansen all those years ago.

        Borehole measurements made decades ago showed that the Arctic had already warmed about 6 degree C in the previous century. Since then, not so much, as they say. The BBC, which is about as shrill an alarmist channel as exists, yesterday said that the Arctic ice was melting because of warm water that entered the region. Well they got that right. They of course said it in a manner that gave the clear impression the warmth was caused by CO2. Ahh…back to ‘message’.

        They went on to say that an 1100 passenger liner had passed through the NW passage in a month. OMG it must be warming up in the Arctic! They failed to mention that the liner was led by the nose through the NW Passage by a huge ice breaker. Details like that are ‘off message’ according to Phil Jones. Well, that’s all right then.

        Env Canada, BBC, peas, pod.

      • @ Steven Mosher. You write “NOAA takes this raw data and creates Adjusted data”.

        No Steven. Adjusted data is an oxymoron. There is only data. Everything else is at best a calculation and at worst the result of torturing the data.

      • Forrest Gardener on September 17, 2016 at 11:42 am

        There is only data. Everything else is at best a calculation and at worst the result of torturing the data.

        Sorry: you write a lot about temperature data processing but do not seem to know very much about it.

        Here, Mr Gardener, is a chart showing
        – GHCN unadjusted data: raw (eliminated data excepted) with absolute temperatures
        – GISS adjusted data (land only; land + ocean).

        Looking at the raw data (in white) and comparing it with GISS land only (in red) perfectly explains what must be done to transform the raw data into something valuable.

        Raw data contains metadata about the stations; it is not the GHCN unadjusted record’s task to evaluate that metadata. For example, to evaluate the rurality and nightlight levels of all stations is the job of all GHCN record users.

        You see the results of the data homogenisation (here performed by GISS): noise reduction and outlier elimination reduce both the standard error level and the temperature trend.

        Trends 1880-2016, in °C / decade:
        – GHCN unadjusted 0.229 ± 0.006
        – GISS land only 0.097 ± 0.001
        – GISS land + ocean 0.070 ± 0.001

        Nobody tortures the data, Mr Gardener. What about doing some data processing of your own, instead of guessing, supposing, pretending and claiming?

        Datasets used for the chart:

        Feel free to download and evaluate…

      • Oh dear.

        Bindidon, we do all realise that the graphs you present are the result of calculations to combine figures from around the world into a single figure for each time period, don’t we? Maybe not.

        Try again when you’ve thought about data means.

    • That is a good question. This is nothing new.

      Another way to see it: it depends on what happens between 2016 and, say, 2040. Total summer melt as predicted many many times, or some recovery as predicted many many times. I’m not sure. But I find total summer melt a little bit disturbing, should it happen. It means the warming is larger than I expected.

      • From now to 2020 will give a much better indication of how much of this is circa 60y periodicity and how much is a true long term loss of ice. So far they are playing games looking at the hot half of the 60y cycle and pretending it is all OMG “global warming” caused by man made CO2.

        That is horseshit, and they know it.

        If the accelerating melting from 1997 to 2007 had continued (which was a reasonable concern in 2007), then we probably would be near the “zero = 10^6 km sq” level by now.

        The fact is that it hasn’t, and if they had half a brain and a grain of integrity they would be heaving a great sigh of relief that it’s NOT as bad as we thought, instead of pretending no change since 2007 has “reinforced” the downward trend.

      • Greg – What you say is correct in a way, but it stops short of reality. To my mind, the fact that accelerated warming did not continue is a bad thing, because warming is clearly beneficial even in much larger measure than we have thus far experienced. If they had “half a brain and a grain of integrity” then they would acknowledge that the warming stopped and that this is nothing to be pleased about.

    • Another answer. Maybe it is interesting that NSIDC can’t communicate their methodology clearly enough, and claim September 10 was the minimum when it is the end of a five-day minimum period.

      • Is there not a simpler explanation? A glance at the first differences suggests that the figure for 7.9.16 is a typo. Or perhaps someone needs new glasses?

    • “Earliest minimum”? Nobody knows, but if you mean in the satellite record, that still depends on who you ask. That’s why I think NSIDC’s prediction contest is a joke. You’re not trying to figure out what the minimum is, you’re trying to guess what NSIDC’s flawed methodology thinks it is. Often there are estimates that are way high by NSIDC numbers, but very close to DMI – who’s right? Again, nobody knows. Sorry, but it’s not quite science yet. Give me particle physics any day. Lots of unknowns, but plenty of nice boundaries.

    • Agreed. This is why the whole jamboree about ice min is an alarmist farce. Unless you have a single and unique change of direction in the data you are looking at weather noise, not the annual variation.

      If you want to make an illogical alarmist fuss about one data point out of 365 then you choose single day data to get the lowest value possible.

      Five days and a crap filter will not give a proper indication of changes in the annual cycle nor the correct timing or value of the minimum.

      I found I needed at least a 13d gaussian ( roughly similar to a 30d running mean ) to get a single change of direction each year. This is not surprising since there are greater and lesser trends of about a week’s duration clearly visible even in the 5d plotted data.
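      If it helps make that concrete, here is a rough sketch in Python. The data is synthetic and the kernel widths are assumptions for illustration – this is not NSIDC’s series or method:

```python
import numpy as np

# Toy illustration: a synthetic "extent" series with a seasonal minimum near
# day 255 plus short-term noise. A gaussian wide enough to kill week-scale
# noise leaves a near-unique turning point where a short running mean
# leaves many spurious ones.
rng = np.random.default_rng(1)
t = np.arange(365)
annual = 10 - 4 * np.cos(2 * np.pi * (t - 255) / 365)  # smooth cycle, min at day 255
noisy = annual + 0.2 * rng.standard_normal(t.size)     # add 'weather' noise

def smooth(x, kernel):
    k = np.asarray(kernel, dtype=float)
    return np.convolve(x, k / k.sum(), mode="same")

run5 = smooth(noisy, np.ones(5))                                       # 5-day running mean
gauss = smooth(noisy, np.exp(-0.5 * (np.arange(-19, 20) / 6.0) ** 2))  # ~13d gaussian

def count_minima(x):
    # number of strict local minima: a proxy for 'changes of direction'
    return int(np.sum((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:])))

# compare away from the edges, where "same"-mode convolution is distorted
print(count_minima(run5[30:-30]), count_minima(gauss[30:-30]))
```

      The wider kernel should leave far fewer turning points than the short running mean, which is the whole point of the exercise.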

      • If these guys were being serious they would be doing stuff like looking at changes in the timing and length of the melting season and ice min/max after suitable filtering. But they prefer playing non scientific games and making stupid unfounded claims for the media to reproduce as “expert scientific opinion”.

        At the same time as announcing that this year’s ice min is technically identical to that of 2007 ( ie NO CHANGE IN NINE YEARS ) Mark Serreze told the Guardian that “we have reinforced the overall downward trend.”

        Right! So now we know that zero = 10^6 and flat means reinforcing the downward trend.

  8. For information, frequency analysis of the NSIDC data shows strong peaks at 9.9, 14.6, 15.9 and 28.1 days. At least some of that is probably lunar influence.
    That is why the date of minimum is jumping back and forth so much.

    Unless you filter out all of that ( which you don’t with a crappy 5 day RM ) then all you are getting excited about is how many of those cycles’ troughs end up falling in the middle of September in any given year.
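    For what it’s worth, that kind of frequency analysis can be sketched in a few lines. The series below is synthetic – the 14.6-day sinusoid is invented to stand in for one of the quoted peaks, not derived from NSIDC data:

```python
import numpy as np

# Periodogram sketch: recover a dominant short-term period from a noisy
# daily series. The 14.6 d signal is a synthetic stand-in.
rng = np.random.default_rng(0)
days = np.arange(3650)                        # ten years of daily samples
series = np.sin(2 * np.pi * days / 14.6) + 0.3 * rng.standard_normal(days.size)

power = np.abs(np.fft.rfft(series - series.mean())) ** 2
freqs = np.fft.rfftfreq(days.size, d=1.0)     # cycles per day

peak = int(np.argmax(power[1:])) + 1          # skip the zero-frequency bin
dominant_period = 1.0 / freqs[peak]           # in days

print(round(float(dominant_period), 1))       # recovers ~14.6
```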

  9. Please excuse my naivety, but is this a massive peak anomaly on Anthony’s Northern Hemisphere sea ice anomaly page? I mean blimey, it goes off the page !!

    [Some of the satellite sensors failed this spring, and have been generating a false signal of 4.688 million sq kilometer sea ice. The Cryosphere Today ground computers/programs have NOT been corrected since May, and are merely repeating garbage-based anomaly plots since that error. And nobody at Cryosphere Today (Univ of Illinois) has bothered to correct the problem. .mod]

    • There was a satellite instrument failure earlier this year. NSIDC cross-calibrated to substitute another instrument, Cryosphere Today have not done anything about it and are producing garbage.

      Maybe since the ‘run-away’ stopped working in 2007 and the data has clearly levelled off, they have lost interest.

      • One would hope that continuous calibration of sensors would be taking place, however… When sensors are known to have failed, surely there is some extra scrutiny of previous data to determine if the failure was sudden and catastrophic or a progressive deterioration? Where are the retrospective corrections?

      • NSIDC did cross-calibrate to substitute data from a different satellite. Cryo Today could not be bothered.

        They had a pretty alarmist web site, so I guess they are feeling pretty depressed now this year’s data is no worse than it was nine years ago.

        It’s a real bummer that the poles are not melting out the way they hoped and the climate is not locked into a death spiral. Now we don’t need them to save the planet any more. They must be feeling very diminished.

  10. It’s because of the Olympics! Seriously – the Olympics occur every four years in a Leap Year – so Feb 29 means that it will happen at least one day early.

  11. If I understand sea ice extent, it is the total area of 25×25 km grid cells that have at least 15% sea ice coverage. To provide such a measurement, the data on total actual sea ice coverage must be available. Surely the extent measurement is a composite measure of ice and the effect of winds. Does a graph of actual total ice exist? That would more closely reflect the amount of actual melting.
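    The distinction can be made concrete with a toy grid (concentrations made up; the 15% cutoff and nominal 25 km cells are as described above). “Extent” counts the whole cell once it passes the threshold, while “area” – the closer proxy for actual ice – weights each cell by its concentration:

```python
import numpy as np

# Toy version of the extent/area definitions. The concentration grid is
# invented purely for illustration.
cell_area_km2 = 25 * 25
concentration = np.array([[0.90, 0.40, 0.10],
                          [0.16, 0.05, 0.75],
                          [0.00, 0.50, 0.14]])

covered = concentration >= 0.15                           # 15% cutoff
extent_km2 = cell_area_km2 * int(np.count_nonzero(covered))
area_km2 = cell_area_km2 * float(concentration[covered].sum())

print(extent_km2, round(area_km2, 2))   # 3125 1693.75
```

    Note how a cell drifting from 15% to 14.99% drops its full 625 km² out of extent while barely changing area – which is exactly why extent is sensitive to wind-driven spreading.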

    • Sea ice area was calculated by Cryosphere Today team at U. Illinois but that has not been maintained since earlier this year.

      If you want “amount of ice” look at Cryosat-2 for ice volume. Area is probably more relevant for studying feedbacks and volume is more relevant to the quantity of latent heat energy being exchanged.

      Both have their value.

  12. Here is the NSIDC extent annual minimum after applying a 12 day gaussian filter to remove the weather ‘noise’.

    As I suggested it shows a much more coherent pattern once the changes in the annual cycle are isolated from the short term variability.

    IIRC, the pre-1988 data is from a different satellite which only gave one datum every three days. So this should probably be coloured differently, as less reliable.

    There is a clear change of direction in 2007 and the whole thing is starting to look very cyclic, not linear.

    Here are the 5d running average results taken from the NSIDC interactive graph:

    • Greg, interesting graphs you show. The SH and NH Sea Ice Extent graphs I posted farther above also suggest cycles which run contrary to each other in the NH and the SH.

      One of the things that could play a role in an eventual NH – SH cycle is the Brewer-Dobson circulation: a circulation high in the stratosphere that seems to redirect stratospheric air masses between the NH and the SH, and can influence polar temperatures from above.

      From below, the inflow of water masses plays a role. I read an interesting 2011 article about a recent warming event in the Arctic Ocean (from the nineties) whose influence extended into the 2000’s. It was about a massive subsurface water flow (called the Atlantic Water, AW) which was warmer than normal, entered by the Fram Strait and followed its way along the Siberian coast to finally end up, less warm, in the Canadian Basin.

      They write “The AW is believed to be effectively insulated from the pack ice by a ~30–50-m cap of fresh, cold surface water (….)”. But also: “The decrease of AW temperature with distance from Fram Strait implies that AW heat must be lost as the AW spreads.”

      After the inflow of the nineties, a second warm pulse from 2004 to 2007 is mentioned. I don’t know about the inflow into the Arctic Sea after this date. It would be interesting to hear more about the [mass and temperature of] the inflow of Atlantic Water into the Arctic Basin.

      The article: “Fate of Early 2000s Arctic Warm Water Pulse”: http://journals.ametsoc.org/doi/pdf/10.1175/2010BAMS2921.1

    • Greg, your filter is not working properly. All those dates are off by quite a bit.

      Daily extent from 1978 to 2015 can be found here.


      2016 here.


      Average climatology can be found here. The average minimum date is Day 255, or September 12th in a non-leap year.


      And for a little bit more fun, one can go back all the way to 1972 using this dataset (1972 to 2002). One needs to apply an adjustment to match these up numerically to the current methodology. This data uses the same methodology as that used by Jaxa.


      • Thanks Bill, could you say what you mean by ‘off by quite a bit’ and why you consider the result incorrect?

        Clearly the dates will not be the same as what is gained from applying a crappy running mean filter which is too short to filter out the weather noise and provide a unique turning point. The point of the exercise is to get more representative dates and they will be expected to be different ( though similar ).

        The links you provide are the source of the data I have been using.

        You’re a smart chap and you may well have a point, but I cannot assess whether this is a problem on the basis of you simply saying the filter is not working. Are you saying something more specific than that my dates are not the same as the minima in the 5d trailing averages ( which are themselves wrong because they are not centred )?

        I would expect my dates to be on average 2.5 days earlier since I don’t phase shift the data. They will also be different in either direction since I have eliminated some short term variability.

    • If you give me the exact values of those 37 plotted points, I’m sure I can calculate for you the average value for 37 years, shorn of ALL of the noise.

      That would seem to be the only number that could be of any interest, because any other possible result has no validity whatsoever.


  13. The NSIDC actually uses two different datasets – one called the Sea Ice Index, which most data from the NSIDC is quoted from, and one called MASIE, which they describe as more accurate but which doesn’t have the long-term historical perspective calculated so that trends can be compared.

    The Sea Ice Index minimum was on September 7th, day 251 and 4 days earlier than normal.



    The Masie index minimum – 1 km resolution – was on September 8th, day 252 and 3 days earlier than normal.


    The FAQs say,

    “Use the Sea Ice Index when comparing trends in sea ice over time or when consistency is important. Even then, the monthly, not the daily, Sea Ice Index views should be used to look at trends in sea ice. The Sea Ice Index documentation explains how linear regression is used to say something about trends in ice extent, and what the limitations of that method are. Use MASIE when you want the most accurate view possible of Arctic-wide ice on a given day or through the week. More accurate pictures of ice extent on any given day might be possible on a regional basis and from other international centers.”

    “The Sea Ice Index (SII) relies on satellite passive microwave data as its only data source. These data are automatically processed using an algorithm and have known biases and limitations; these are covered in the SII documentation. MASIE relies on data from the Interactive Multisensor Snow and Ice Mapping System (IMS) that runs at the National Ice Center (NIC). The IMS product uses several satellite data sources including passive microwave, but it is also based on visual analysis and other data sources and undergoes a form of manual data fusion. Another difference is in the resolution of the products. The MASIE product has a nominal 4-km resolution which is higher than the nominal 25-km resolution of the SII.”

    • “The Masie index minimum – 1 km resolution – was on September 8th, day 252 and 3 days earlier than normal.”

      What is “normal” supposed to mean? Does that render anything earlier or later “abnormal”? If you are referring to the average over a certain, arbitrary period, it would be better to say so, rather than use terms which incorrectly suggest some things are normal and others not.

      The individual day on which the weather driven noise introduced a minimum is not really informative unless you are navigating the ice.

  14. imho: using a 5 day running mean for the arctic makes a lot of sense when it comes to measuring melting the best way they can: at least it partially rules out stacking and spreading by wave/wind action, when you use a 15% coverage margin.

    they do not use the full ice extent: the question to the NSIDC would be “how much ice was below the 15% margin?” i would not be surprised to see in that data an upward spike on September 7 that is comparable to the loss in their chart.

    i’m sure that ice didn’t melt, it got spread out, and contracted again to 15% values 2-3 days later.
    120,000 square kilometers of ice doesn’t melt in just one day at the reversal of the melt season. It’s clearly “impossible” (unless you got 120,000 square kilometers that drop from 15% to 14.99% and thus vanish from the chart, which is possible, but wouldn’t give a real picture of the real melt)

    that’s what the 5 day running mean solves, so i don’t get the fuss about using it

    • Some low-pass filtering makes sense. Better to use a decent filter not a runny mean. In fact a much longer filter is needed to remove the short term noise and see any changes to the annual cycle which is what is of interest climatologically.

      • Until a year or so ago, DMI used to supply two extent numbers – a 15% and a 30%. They had different masks, so not directly comparable, but useful as a tool to see what was potentially driving extent overall. If 15% was increasing faster than 30%, I assumed that winds, etc. were spreading out the ice, rather than real ice growth. Pity they removed that product. It might be a flawed approach, but was interesting to track as a predictor of what might happen

      • Thank you very much, Greg. I agree; daily/weekly squiggles mean nothing.

        I’ve been waiting the whole thread to say this: It’s the weather, stupid!

        Climate Skeptic aka Dave Fair

      • Thanks Charlie, the squiggles are weather but there does seem to be a relevant climate signal once they are removed.

        The fact that freezing has been commencing earlier and earlier consistently since 2007 is clearly an indicator of a fundamental change.

        It looks to be dominated by a multi-decadal periodic variation, the most obvious candidate being the AMO.

        What we are seeing is most certainly NOT a linear “trend” of any kind, nor a parabolic downward curve that would accompany a run-away melting process.

      • Thanks, Greg. I had not noticed the earlier freezing dates.

        It’s a shame satellite data only extends back to 1979. If memory serves, there are data indicating higher Arctic temperatures, including amplification, and reduced sea ice extent earlier in the 20th Century. I assume the 1960’s and ’70’s global cooling scare had to do with falling Arctic temperatures, including amplification, and increased sea ice extent. Starting with elevated sea ice extent in 1979 would tend to fool the unwary when looking at 1990’s and early 21st Century extents.

        Charlie Skeptic

      • Well as to the Temperature that is associated with that ice area, day of melt, etc I can tell you that Temperature is exactly 17 deg. C and it has remained at that filtered value for the last 600 million years.


        PS And that filtered value was obtained from a “matched” filter, and it is not possible to get a higher peak signal to noise ratio than you get from the matched filter.

      • Greg i definitely agree that a more accurate filter would be better.

        however the biggest message that I get from this, and where I find this article very interesting, is the following: the “summer minimum” did occur 4 days before average. That contradicts all the alarmist claims that the summer minimum is shifting later than average.

        What strikes me more is how the ice extent sits at what I would call, in my own words, its “AMO max mean value”:
        in 2007 the AMO was near peak, with the top in 2012; now it’s on its way down again, and the ice charts are slowly following this (with a lag of course), but it is striking that the “mean” has been neatly around the -2 standard deviation value (or, till Cryosphere Today got its issues, around the -1 million square km value).

        i think the 2012 minimum was an AMO-driven/weather-driven isolated event, which also fell “in synch” with a roughly 150 year cyclic large one-year melt of the Greenland ice cap (see Anthony’s post here). The window of opportunity for this to happen again is getting smaller. I think past 2023 we won’t see this happen again. but who knows what weather will throw at us?
        Note that a 150 year cycle corresponds roughly to 2 and a half AMO cycles: that means the event in 1889 happened during a negative peak of the AMO and 2012 corresponded to a positive AMO peak.

        and finally
        Nobody knows what exactly is driving the AMO cycle. this is a guess, but maybe we are actually seeing the driver at work? Exposed polar sea water does cool more than ice covered water. So maybe we see the primary driver of the AMO cycle: more ice extent warms the waters over decadal time, exposed open arctic water cools them over decadal time?

        i know if i were a scientist i would investigate this plausible possibility….

  15. It is well known that an open sea will gain more heat and thus make ice more rare as ice cover shrinks.
    The death spiral.
    A claim to be proven or not.
    Not this year!

    • ..Hmmm, my understanding is that an “open sea” in the North will give off more heat to the cold atmosphere because the heat is no longer trapped by the ice ??

      eh no. it is well known that open water in the arctic loses more “heat” to the atmosphere for around 10 months of the year than it gains. at this time of year low sea ice area means more “heat” lost to the atmosphere.
      as the distance from the surface to space in the arctic is a lot less than at the equator, the arctic is rather efficient at losing “heat” at this time of year.

      • An errant thought that just popped into my head: With a large sea ice extent in the Antarctic over the last number of years, would that imply the Southern Ocean is retaining more heat? Could that be driving higher SO temperatures?

      • charlieskeptic, it would seem to make some sense to me, particularly as the antarctic ice reaches higher latitudes.

      • Chilly, my question was pure speculation/wondering about ARGO data indicating warming of the Southern Ocean, while the remaining ocean basins cool. What could possibly cause that difference, if true?

        What bothers me about the use of anomalies and global averages is that they hide things that need further analysis. Bob Tisdale’s work involving SSTs subdivided by ocean basin reveals the role played by the North Atlantic in influencing rising average global ocean temperatures. Averages also miss the lack of warming in the Eastern Pacific Ocean over the satellite era.

    • It is well known that an open sea will gain more heat and thus make ice more rare as ice cover shrinks.

      That is not known, it is arbitrarily assumed ( as are many key factors in climatology ).

      What is known is that water absorbs incoming solar better than ice, though it is far from clear whether they are correctly accounting for surface reflection under grazing incidence.

      What is usually ignored in such arguments is that open water emits more infra-red than ice or snow and loses far more energy through evaporation. It is the net result of these opposing effects which will determine whether open water leads to more heat input or more heat output, and hence whether it leads to accelerated melting or opposes melting and acts as a negative feedback.

      The naive assumption that only the absorbed solar need be considered is what leads to all the hair-pulling and bed-wetting about run-away melting, tipping points etc.

      We have an initial ground-truth check on all this: what happened after the OMG Arctic ice lows of 2007 and 2012.

      As I showed above, 2007 was the turning point: the ice minimum switched from happening later and later to getting earlier and earlier. This earlier “trend” has been going on for the 9 years since. See above.

      Following the OM-OMG low of 2012, Cryosat-2 recorded an almost 50% increase in Arctic sea ice volume in 2013, in a single year !! Hardly evidence of a positive feedback from the open water absorbing more sun as suggested.

      The decadal rate of melting has also reduced since 2007 as measured by ice area and extent.

      So there are two ways to interpret the observational data:

      1) the net feedback from open water is negative, not positive, and run-away melting was a mistake. It is not happening.

      2) the feedbacks are not the key driver of Arctic sea ice melting; there is another external force, such as N. Atlantic SST, which is dominant, and run-away melting was a mistake. It is not happening.

      We also see that this year’s min extent was indistinguishable from that of 2007. That is NINE YEARS with no net change. Again incompatible with the suggestion of a dominant positive feedback.

      The death spiral is dead.

    • Actually that is not correct.

      The open ocean must be at a Temperature higher than the freezing point of sea water, which may be -2 deg C or thereabouts (giggle it for yourself).

      But once that ocean becomes ice covered, it can drop to any Temperature even much lower than that freezing point, and since ocean water is quite close to a black body for thermal radiation in the 1 to 500 micron wavelength range, the open water will radiate much more profusely than does the ice covered sea.
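      A back-of-envelope Stefan-Boltzmann comparison illustrates the point. The temperatures and emissivity below are illustrative assumptions, not measurements:

```python
# Open water is pinned near the seawater freezing point, while an ice
# surface can cool far below it, so the open water radiates much more.
SIGMA = 5.670e-8                     # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_w_m2(temp_k, emissivity=0.97):
    return emissivity * SIGMA * temp_k ** 4

open_water = emitted_w_m2(271.15)    # open sea near its ~ -2 C freezing point
cold_ice = emitted_w_m2(243.15)      # ice surface cooled to -30 C

# open water radiates roughly 100 W/m^2 more than the colder ice surface
print(round(open_water), round(cold_ice))
```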


      • Why does being covered or not change the freezing point? That is a property of the water, not of whether it is covered.

        Water can remain liquid below that temperature at depth because of the pressure. This is not because it is covered.

    • oppti
      It is well known that an open sea will gain more heat and thus make ice more rare as ice cover shrinks.
      The death spiral.
      A knowledge to be proven or not.
      Not this year!

      The long-projected “arctic death spiral” is a valid (theoretical) fiction, a classroom or paper exercise in average values readily demonstrated in the (lower latitude!) classrooms. Thus it is entirely valid for icebergs floating offshore of Miami or Key West. Valid even up north towards 48 or 60 degrees latitude.

        But in today’s arctic sea ice? It is not valid that far north, not valid 7 of the 12 months of the year. Add in other losses (extra evaporation cooling, extra long wave radiation to space from a “warmer” ocean surface, extra convection losses from the open ocean and increased conduction losses across the sea ice from the ocean waters to the atmosphere) and 9 months of the year there are greater heat losses from the open ocean than from an ice-covered Arctic Ocean.

      Less sea ice = more cooling, 9 months of the year.

      • Except if Arctic ice decline from its high of the century in 1979 is due to a warming planet, why has Antarctic sea ice grown so much during that period?

      • Gabro
        Except if Arctic ice decline from its high of the century in 1979 is due to a warming planet, why has Antarctic sea ice grown so much during that period?

        Nobody can explain the problem. The “easy answers” fail, and the complex ones (offshore winds blowing sea ice away from the continent so more can freeze, land ice melting and diluting the Antarctic salt water near shore so more can freeze, and “just ignore it”) all also fail.

        But the loss of arctic sea ice does not contribute to greater warmth in the arctic year-to-year, else the rapid gains and losses of 1, 2, and 3 million sq kilometers of sea ice from fall to spring, from spring to the next fall, could not occur. Either gaining a sea ice area the size of half of Greenland matters, or it does not. If you gain that much sea ice one year, 6 months after the record lowest Sept sea ice minimum, then the theory of arctic feedback is blown away. If you have the highest sea ice extent in May, then a near-record low sea ice extent in Sept the same year, obviously reflecting solar energy in May from sea ice has nothing to do with the energy content of the water four months later.

        So, is the simplified theory of arctic death spirals dead wrong? Or simply so incomplete that it cannot predict the actual sea ice area only 6 months later? What else are they missing?

        Well, to begin with, recognize that Judith Curry witnessed the melt ponds re-freezing on Aug 12 during her months on the ice in the SHEBA long-term study, up north one entire year. So, ALL of the solar energy absorbed over the 24 hour period of 12 August at 78 north was lost, PLUS enough extra heat energy was lost to freeze a 1 mm thick layer overnight on the melt ponds. Losses come to more than 2500-2600 watts/meter^2 at least, depending on the depth of the ice layer the next morning.

      • That year was at or near the high. It was probably the high since 1916, although can’t be sure about 1917-19. But by the early 1920s, the loss of Arctic ice was already being noticed.

        It stayed low until the ’40s, then climbed back up during the ’50s-’70s. From satellites in the ’60s and ’70s, we can make direct comparison with 1979, which was higher than any other year observed from space.

      • then the continual trend in the loss of sea ice in the Arctic must mean that …

        This is a fundamental error in your thinking and comes from the banal way this is usually presented by alarmist scientists. There is no “continual trend”, there is a variable trend. Drawing a straight line through the data to bias the eye of the reader and spuriously attributing this to the steady rise in CO2 is to deliberately mislead.

        Very few variables in the climate are flat. So you can always make a spurious claim that it is driven by CO2 whichever way the “trend” goes. That is woefully inadequate as scientific attribution, but has been the mainstay of the alarmist state of mind for the last 30 years.

        The very fact that the rate of ice melting did not increase after the 2007 minimum pretty much kills the idea of a positive feedback and destroys the claimed correlation with the ever rising level of atmospheric CO2.

        The change of direction in the drift of the date of ice minimum also defies the idea of constant CO2-driven change.

  16. O T – Asbestos

    In the comments to a recent article (I can’t remember which) someone mentioned a column by Christopher Booker in the Daily Mail about the relative safety of white asbestos vs blue (there is also brown; 6 in all) used in building products, and the “racket” of asbestos removal firms. I have some transite bench tops and asbestos siding on a house, so I made a note to check out the article. It appears to have been taken down, 404. However I did find this:


    Perhaps more research is in order if you are in a similar situation.

  17. In another month, 80 deg. N. latitude will be getting about 4 hours and decreasing of sunlight per day. I don’t understand the relevance of the September ice minimum to insolation, when for most of the next 6 months there isn’t going to be any sunshine and what there is will be low to the horizon.

  18. Bill Illis:

    Greg, your filter is not working properly. All those dates are off by quite a bit.

    Thanks Bill. No reason to suspect there’s a problem with the filter, but I have checked over the whole process and it seems to be doing what is expected.

    Here I have plotted my results against the 5d NSIDC points that I lifted from their interactive graph. I subtracted 2.5 to remove the phase shift of the trailing average.

    Now my results for the last few years are about a day earlier. For the pre-88 data they are very similar. I think this is because Nimbus-7 had a 6 day repeat cycle so was actually averaging or blurring the data anyway.

    For the middle section there is a notable discrepancy: two spike years disappear. This is the kind of thing I was expecting, having removed short term weather disruption. This really was the aim of what I was doing.

    If we are to believe the data, what this shows is that weather disruption tends to lead to an extra-late breakup of ice in some years but does not tend to produce early minima. That is, there is an asymmetric bias introduced by taking short term averages which do not remove weather ‘noise’ from the data.

    • Having looked at this again, I realised that the drift to earlier dates as the filter length is increased is simply a reflection of the asymmetry of the annual cycle. Refreezing tends to be rapid once it starts, so smoothing the data leads to a progressive shift to earlier dates.

      Bill Illis’ comment that there was ‘something wrong with the filter’ was incorrect, it is doing exactly what it is supposed to, but the issue he raised was enlightening and needed explaining.

      Thanks for pointing this out Bill.
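      That asymmetry is easy to check with a toy series (all numbers invented): smooth a cycle whose refreeze is steeper than its melt, and the smoothed minimum lands earlier than the raw one.

```python
import numpy as np

# Asymmetric toy cycle: slow decline ('melt') then a fast rise ('refreeze').
t = np.arange(100)
cycle = np.where(t <= 60, 10 - 0.1 * t, 4 + 0.4 * (t - 60))  # raw minimum at t = 60

kernel = np.ones(15) / 15                       # symmetric 15-point smoother
smoothed = np.convolve(cycle, kernel, mode="valid")
smoothed_min = 7 + int(np.argmin(smoothed))     # re-centre the valid window

print(int(np.argmin(cycle)), smoothed_min)      # smoothed minimum lands earlier
```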

  19. Turning point? Hmmmh…

    For those who don’t automatically think « Wow! That data sure is flawed from top to bottom! »

    Of course: Antarctica isn’t warm! It’s August, i.e. “February” there, often Europe’s coolest winter month.

    But it is warmer. What is really unusual is that the dark red anomalies aren’t where I expected to find them (around the Peninsula). They are located in the central area. Hmmmh…

    “Shouldn’t happen”, some Lisp machines once said just before experiencing a hard shutdown :-)

  20. Data doesn’t adjust itself. So who are the actual people at NOAA, NASA and Environment Canada that cherry-pick weather station data and “adjust” numbers to claim “records”?

    More likely NOAA announces the newest fudged numbers, since fudged numbers started 10 years ago.
    Why did the CRU in the UK purposely destroy data records?
    Right there red flags should be flying.

  21. so – nobody checked how NSIDC calculates extent data and minimums?

    As set out here?

    “In April 2012, NSIDC updated its method of calculating daily values for the Arctic sea ice extent minimum from a 5-day centered average to a 5-day trailing average. The new calculations show, for example, that the record minimum occurred on September 18, 2007, which was two days later than we originally reported (September 16). In addition, NSIDC updates extent values, calculated initially with near-real-time data, when final processed data becomes available. These final data, processed at NASA Goddard, use higher quality input source data and include additional quality control measures. The recalculations show a 2007 record low extent of 4.17 million square kilometers (1.61 million square miles). Our originally published value was 4.13 million square kilometers. In the final data, the date of the minimum may also change for some years.

    For more information on calculating daily sea ice extent values, see the Sea Ice Index documentation.”
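    The effect of that switch from a centered to a trailing average is easy to demonstrate with made-up numbers: a 5-day trailing average is dated at the last day of its window, so its minimum can land a couple of days after the raw daily minimum, consistent with the 2007 date moving from September 16 to September 18:

```python
import numpy as np

# Invented daily extent values (millions of km^2), raw minimum on day 255.
days = np.arange(250, 262)
extent = np.array([4.30, 4.20, 4.10, 4.05, 4.02, 4.00,
                   4.03, 4.08, 4.15, 4.22, 4.30, 4.40])

trailing = np.convolve(extent, np.ones(5) / 5, mode="valid")  # dated at window end
raw_min_day = int(days[np.argmin(extent)])
trailing_min_day = int(days[4:][np.argmin(trailing)])

print(raw_min_day, trailing_min_day)   # 255 257 -- the dated minimum slips later
```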

  22. In the final data, the date of the minimum may also change for some years. All their so-called ‘data’ is always changing. The past is never what it used to be. Winston Smith is hard at work, but he is your friend, Griff.

  23. Well, it doesn’t strike me as a big deal in regard to what the low was or is. The only time it would become a concern is if someone tried to use their averaging method and then pick a day as a low that wasn’t a low. This could result in losing track of trends relating to when lows are occurring and when a recovery starts. If they would say “5 day running…” when they make the announcement, I don’t see the big deal.

Comments are closed.