Playing around with my hometown data, I was horrified when I found what NASA had done to it. Even producing GISTEMP Ver 2 was counterfactual.
Guest essay by Philip Lloyd
The raw data that is fed to NASA in order to develop the global temperature series is subjected to “homogenization” to ensure that it does not suffer from such things as changes in the method of measuring the mean temperature, or changes in readings because of changes in location. The process is supposed to be supported by metadata – that is, the homogenizers are supposed to document the basis for any modification of the raw data.
For example, the raw data for my home city, Cape Town, goes back to 1880:
http://data.giss.nasa.gov/tmp/gistemp/STATIONS/tmp_141688160000_0_0/station.txt
The warmest years were in the 1930’s, as they were in many other parts of the globe. There was then a fairly steep decline into the 1970’s before the temperature recovered to today’s levels, close to the hottest years of the 1930’s.
In NASA’s hands, the data pre-1909 was discarded; the 1910 to 1939 data was adjusted downwards by 1.1 °C; the 1940 to 1959 data was adjusted downwards by about 0.8 °C on average; and the 1969 to 1995 data was adjusted upwards by about 0.2 °C, with the end result that GISS Ver 2 became:
Being curious, I asked for the metadata. Eventually I got a single line, most of which was obvious – latitude, longitude, height above mean sea level – followed by four or five alphanumerics. This was no basis for the “adjustments” to the raw data.
Which should I believe? The raw data showed a marked drop from the 1940’s to the 1970’s, which echoed similar drops elsewhere. Time magazine covers showed the 1970’s were indeed cold.
The raw data is probably accurate. The homogenized data is certainly not. It is difficult to avoid the conclusion that “homogenization” means “revise the story line” and “anthropogenic global warming” really means “humans changed the figures”.
Prof Philip Lloyd, Energy Institute, CPUT, SARETEC, Sacks Circle, Bellville
More of the same. Let’s face it these achingly clever NASA/GISS guys NEVER thought they’d be busted adjusting temperatures. Not even if they pulled these stunts for 130 years. They probably thought it the perfect, uncatchable ‘crime’, but here we are with example after example from all around the world. Busted. It should be the most brutal public sector takedown in history.
Mr Trump, get medieval on their arssses with a blow torch and some pliers.
“They probably thought it the perfect, uncatchable ‘crime’…”
I doubt that they thought of it as criminal at all. I think it more likely that they thought that was the minimum change needed to create the narrative they wanted. They’re “saving the world”, after all.
Holy heavens! I didn’t realize that the data was manipulated to that degree! I am floored.
So…..what is anyone going to do about this?
What CAN anyone do about it…?
“Follow the money” is another way of saying “Follow the Power”.
So far, Trump is doing well in doing something about it.
Trump isn’t god, but so far, in the context of “GAGW”, he’s doing very well.
Keep an eye on those, Democrat or Republican, who are opposing or hinting at opposing his nominations.
They are the ones who want Big Government, even if the goals of what they would do with it differ.
DG,
It would be helpful to have a global study that produced, say, 5 sets of data: starting with the least homogenised (raw if you have it), going to a 2nd step like removal of outliers and wrong transcriptions, then to a 3rd set with deletions of data where there has been a recorded station shift (instead of trying to correct for the shift), a 4th step where all but any final GISS-type post-fact adjustments are made, then the 5th set being the polished turd.
Let the fun begin. By doing a blink comparator, one might see that the overall direction of change is one way only (cooler past, greater warming with time), whereas the change expected by experienced observers of earth science data would be neutral, but with wide confidence limits.
Oh and to bang a current drum, let us insist on formal, proper, approved calculation of error bounds (including bias as well as stats type precision) in place of the optional extra ways that errors are currently treated.
Geoff
Geoff, nice post. But one never removes outliers unless there is a sound, physical reason, such as a known miscalibration or something. If that’s what was measured and recorded, that is what should be used. Sometimes data is noisier than we want.
DG,
Obvious outliers like 10 times too big, displaced decimals, etc.
It is obvious why the pre-1909 data had to be deleted. If they had kept it, they would have to at least apply the same corrections to the data pre-1909 as they did to the 1910-1940 data. If they had, that would have resulted in about a 2.5 C temperature rise since 1890 to present. NO WAY did they want to have to explain that.
In my mind, when I see a temp record with the obvious AMO or PDO signature become adjusted and the ocean signal removed, I can’t imagine a more clear-cut case for either fraud or absolute incompetence. Science has become political.
The Cape Town raw data has a large temperature drop within only a few years in the early 1960s. I doubt such a large downward step was in the actual temperature at any given location.
Indeed. 1960 was the year the airport opened. Inland, 48m altitude. The station would have moved there from seaside Cape Town. As the NOAA sheet shows, that was one of the adjustments made.
There is no need to make any adjustment. It is a new data point. Like here in Sydney where the airport is reported to show the highest temperature evah since records began. The BoM took over record keeping in 1901. Sydney had an airport then too! /sarc
Pathetic defense of shonky “fake science”.
Inland made it cooler? At an airport? And 48m is supposed to be that much higher so cooler?
I believe we have the same issue here in Oz. IIRC, the raw data show no warming, and slight (statistically irrelevant) cooling.
If you were to remove 90% of the rural data, and homogenise the rest with urban data, then reduce old data by 0.5C and increase later data by 0.5C, you would get some significant warming.
But I can’t imagine that responsible governments employees would do anything as obviously dishonest as that, would they?
Have you been looking at the “homogenized” data?
The only dishonesty apparent here is that a fake magazine cover is being used as some sort of evidence of something – god knows what since TIME is hardly a scientific resource. One of the obligations of a researcher is to consult source material whenever possible. Obviously Philip Lloyd did not. Or, alternatively, he *knew* the TIME magazine cover was a photo-shopped fake and used it anyways.
The image was actually used on a TIME cover on April 9, 2007 with the caption: The Global Warming Survival Guide
http://img.timeinc.net/time/magazine/archive/covers/2007/1101070409_400.jpg
The cover on the left is a con too. It isn’t about climate, or weather. It is from Dec 3 1973 and is about the oil crisis.
Nick Stokes… the master of the con !!
So the magazine covers are fakes – so what? How does using a fake cover mean the temperature changes made are correct?
There is abundant evidence that climate scientists were warning of cooling in the 1970s.
Your “point” makes no sense whatsoever.
“Your “point” makes no sense whatsoever.”
So what was the point of the covers at all, if being fake doesn’t matter?
There’s a lot of effort being made by warmists to assert that the 1970s concern about global cooling was either non-existent or a trivial part of public discourse.
Well, I do remember it, and yes, there were a few articles about cooling and what it might mean, but it never came to dominate public opinion in the way that global warming has for the last 20-plus years. Two reasons for the difference, based on my armchair analysis:
1. 1970s cooling was not attributed to human activity, so it wasn’t “our fault” and so there was nothing we could do about it. Hence the powerful guilt component wasn’t there to exploit. Plus there were other much more threatening things going on like the prospect of large scale nuclear war.
2. It’s not that the 1970’s cooling was under-reported, but that post-1970s warming is over-reported to the point of insanity (IMHO). It’s been taken over by a loosely coordinated movement with well-defined political goals. Well, everyone knows about the IPCC and governments being subverted, and funding for research about “climate”, and the mainstream media being dominated by warmists, so there’s no point in rehashing all that stuff (Tim Ball has several really good posts about the underlying management of the AGW theme). But it seems to me that, if the IPCC and all its spin-offs weren’t hammering the message about global warming/climate change and impending disaster, day in and day out, it would just be another non-event that just possibly might inconvenience our grandchildren, but fades in significance when compared with real and current issues like Islamic Terrorism.
One possible reason why warmists might want to downplay the 1970s cooling scare is the way it was impressed on us at the time and how that would affect subsequent thought patterns, like this:
a) it was getting colder
b) that (if it had continued) would have been a Bad Thing
c) if cooling is a Bad Thing, then it follows warming must be a Good Thing
d) hence the 1979 to 1998 warming (and a few blips since 1998) has been beneficial
Well, we can’t have thoughts like that circulating, can we? Never mind record crop yields and desert greening….. Warming is a Bad Thing, it’s all Our Fault and so we must pay carbon taxes to try and stop it, and build windmills and buy Teslas and blah blah blah
Adding /sarc just in case anyone thinks the last paragraph is actually my opinion.
The covers are fakes — Time Magazine has all its covers available as (searchable) images at
Time Cover Search.
The author had the responsibility to verify the images before use – and should have done so.
The post should be CORRECTED — clearly stating that the images have been found to be fakes.
MODERATOR take note.
Yes, try a search and instead of the covers it takes you to a “latest news” (“The Brief”) page, nothing to do with your cover search.
AGW ==> The Time Mag cover search page is broken in all three popular browsers (in different ways). If you know the date of the cover you are looking for, you can use the Birthday Covers search on the right of the page; it will give you the cover closest to the date you enter.
A few topics back Dr. Roy Spencer suggested that we should be studying physical geography in relation to climate change. I agree. We can take this a step further by studying an even more responsive proxy: commercial agriculture – in particular fruit production.
Profit margins of products from the land have always been marginal. As a consequence we find enterprises that survive are almost always clustered into environments that best suit the crop being grown. This comes down to soil type, shelter, sun hours, rainfall, and temperature – in some cases both hot and cold. The difference between economic regions and uneconomic regions often comes down to a few degrees temperature or days of winter frost. In many cases winter chilling is essential. Should climate – in particular temperature – have changed, then specific crops would have migrated.
My own country, New Zealand, is slightly larger than the UK and 63% the size of California. It extends approximately 1600 km (990 mi) north to south. The far north is subtropical and the deep south, temperate. Along with an obvious north-south temperature gradient there is a notable climate variation east-west. The prevailing W-SW winds bring more rain and cooler air temperatures to the west coast. Central ranges make the westerlies warmer and drier in the east.
This results in a great variety of growing environments and localised production centres e.g:
Northland (warm with few frosts): Sweet potato, citrus
Bay of Plenty (warm summer, good sun hours, some frosts): Kiwi fruit
Hawkes Bay: (hot, dry summers, strong frosts in winter) Wine viticulture, stone and pip fruit
Nelson: (warm summer, good sun hours, some frosts) pip and stone fruit, hops, tobacco
Marlborough: (Hot dry summer, severe frosts in winter) Wine viticulture
Otago: (Very hot dry summer, severe winters) Apricots, cherries
For all of these crops a slight change of temperature, rainfall, or wind, can mean economic make-or-break e.g. too warm in winter and Otago could not produce the cherries and apricots it is famous for.
I was born where I now farm 66 years later. I have visited most of New Zealand and have a good understanding of its geography. Furthermore, growing up we picked up things from parents going back further into history. Guess what? NOTHING HAS CHANGED. There has been no migration of crop locations and production has not dropped.
I remember our family or neighbours planting oranges, figs, Kiwifruit, grapes. The fruit set but never got big and sweet. We are just a fraction shy on sun hours and temperature. This has not changed!
There is another example involving a perennial grass, Kikuyu (origin E Africa). It is very frost sensitive but invasive of most other pasture species that have higher food value. Dairy farming in almost an entire province in NZ (Northland) is now dependent on it as the prime pasture species. It is cheaper to work with it than against it. Production/hectare is probably 30 % less than on top pasture.
Aside from coastal strips of km-scale width, Kikuyu runs out south of Auckland City, where frost is more common. In my living memory the Kikuyu boundary has remained remarkably static. A coastal strip within 20 minutes from where I write has not migrated by any more than ½ km.
And only 7 thermometers used by NIWA to make up an average for such variance.
Actually I noticed that we know bollocks about the global surface temperature record…
Just an example: June 2016 was at our RMI the wettest June ever recorded – this because of an unusual stalling thunderstorm system in a low-wind environment. I live some 80 km away near the sea; we had a sea breeze that protected us, and I was sitting outside in the sun while inland everything was flooding.
Then the last two weeks the weather was interesting: inversion with no wind.
This is where the coast always has warmer temperatures than inland (SST-induced heat convection keeps the coast from freezing).
Well, both urban stations recorded warmer temperatures than even the rural station on a pier at the coast, surrounded by an SST of 7-8°C!
That’s odd, IMHO. Well, not odd when you look at the study made by the University of Ghent: they concluded that in our weather patterns UHI can sometimes give an anomaly of +8°C compared to a nearby rural station.
I get a monthly update from NIWA. The language used relates to “average” (“above”, “below”, “about”) – no numbers, and no indication of what the average baseline is. This is how National Radio reports too. I asked NIWA for the adjusted data for December 2016 two weeks ago. I got no answer. We pay their salaries, FGS.
The entire idea of using a computed temperature average for the entire globe and then using that to compute a climate change has me laughing at the sheer incompetence of academia and government.
One simply cannot average temperature readings from a bone-dry Arctic/Antarctic region with the high-humidity regions over the tropical oceans (remember that 70%?). What is the actual energy change if 10,000 square kilometers of polar regions goes up by 1°C and 10,000 square kilometers of oceanic tropics declines by 1°C? Dear James Hansen, it is not zero.
If one wants to detect a global trend, then compute the trend from each temperature recording site on its own. There is no need for any homogenizing or “infilling” of data to do so. One can compute a trend at any one site even with lapses in the record. Now perhaps one can compute a global climate change by averaging trends over the globe. Even this leaves too much room for mischief in the averaging method.
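The per-station approach described above is easy to sketch. The following is only an illustration with invented numbers, not anyone’s actual processing code: fit an ordinary least-squares slope to each station’s own record (gaps in the record simply drop out), then average the slopes.

```python
import numpy as np

def station_trend(years, temps):
    """Least-squares slope (deg C per year) for one station.

    Missing years are simply absent from the inputs, so gaps in the
    record require no infilling or homogenization."""
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    keep = ~np.isnan(temps)                      # drop flagged/missing readings
    slope, _intercept = np.polyfit(years[keep], temps[keep], deg=1)
    return slope

# Toy records; the second station has a gap and a slight cooling trend.
station_a = station_trend([1950, 1960, 1970, 1980], [14.0, 14.1, 14.3, 14.4])
station_b = station_trend([1950, 1970, 1980], [16.2, 16.1, 16.0])

# A global figure would average many such per-station trends.
global_trend = np.mean([station_a, station_b])
```

The averaging step is where Palmgren notes mischief can still creep in (weighting choices, station selection), but no step here rewrites any station’s raw readings.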
However, everyone is aware of the data that tells us any “corrections” that are real must reduce the computed trend. The first thing everybody knows is that the worldwide population has increased greatly since 1880. The second item we all know is that downtown temperatures are always higher than out in the sticks (the Urban Heat Island effect). If the corrections do not reduce the apparent temperature increase from the raw data, the “corrections” are incompetent.
Gary Palmgren
“One simply cannot average temperature readings from a bone dry Arctic/Antarctic region with the high humidity regions over the tropical oceans…”
_____________________
Nor does anyone do that. They average area-weighted temperature ‘anomalies’.
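For readers wondering what area weighting amounts to in practice, here is a minimal sketch with made-up anomaly values: for a latitude band, the weight is proportional to the cosine of the latitude, since grid cells shrink toward the poles.

```python
import numpy as np

# Hypothetical grid-cell anomalies (deg C) at three latitudes.
latitudes = np.array([0.0, 45.0, 80.0])     # degrees north
anomalies = np.array([0.2, 0.5, 1.5])       # departures from a baseline

# Weight each cell by cos(latitude): polar cells cover less area.
weights = np.cos(np.radians(latitudes))
global_anomaly = np.sum(weights * anomalies) / np.sum(weights)
```

With these numbers the large polar anomaly is down-weighted, so the area-weighted mean comes out below the naive unweighted average.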
Except when they don’t use anomalies. 1997 – 62.45ºF; 2016 – 58.69ºF ( 0.94°C (1.69°F) above the 20th century average of 13.9°C (57.0°F))
No, they do use anomalies. Sometimes there are parts of NOAA dumb enough to add an uncertainly estimated average global temperature. Here they explain why you shouldn’t.
Nick, “Sometimes there are parts of NOAA dumb enough to add an uncertainly estimated average global temperature.”
The climate science part.
The Berkeley Earth team also analysed Cape Town temperatures and came to pretty much the same conclusions as NASA: http://berkeleyearth.lbl.gov/auto/Local/TAVG/Figures/32.95S-18.19E-TAVG-Trend.pdf
That’s a pretty good graph, DWR54. Now if they would plot the actual data, not averages, and then overlay the standard deviation of the data, it would be totally honest. Although the 12-month moving average and 95% uncertainty range would tell about the same story. Comparing (it has to be a comparison on a graph) the standard deviation in a 10-year moving average to a 12-month moving average is a deceitful practice. They aren’t comparable at all.
Substitute “Climate Warming Doomsayers” for “Party”, and the following quote of Winston Smith in “1984” is eerily accurate…
I think it is impressive how people can say stuff like
“the past is falsified, but it would never be possible for me to prove it”
and then show plots (using current data) of how it once was to prove it.
I showed above how NOAA publishes for each station plots of data before and after adjustment.
Let’s hope that Trump has read “1984”.
I’d suggest he start with Animal Farm.
Actually, maybe you should read both to understand what they are about, what “CAGW” and UN Agenda 21 mean if implemented, and what that means for everyone.
Are you in it , or something, McClod?
Looking for yet another selfie. !
NASA-GISS = fr@ud. Drain that swamp.
I would be glad if the author would tell us about his knowledge of changes to the siting of thermometers. BEST shows two station moves at Capetown coinciding with shifts in temperature. Do the alphanumerics mean something to anyone else? I’ve commented on these graphs in good faith on my own website, and found a better GISS one for raw temperatures at Capetown, but I need some reassurance about the rules regarding homogenisation in South Africa.
It’s just too easy to see problems that may not be there.
The adjustment (shown by NOAA here) came in two stages, one in about 1887 and one in about 1960. The first I don’t know, though it sounds like the time for change to Stevenson screen. The second coincides with the opening of the new Cape Town airport in 1960. That is the current site.
A site at an airport doesn’t just suffer from the Urban Heat Island Effect, but from the Jet Engine Effect, as was discovered by the UK MET office last year.
“Capetown South Africa”
As distinct from all the other Capetowns.
Well, Cape Town is.
But no-one needs to be told where Cape Town is. Capetown, on the other hand, …
Not just Capetown. GISS has been altering its temperatures for many years. It occurs on a monthly basis. Around 2008 I started saving some of the monthly text files of a few Antarctic stations from their website.
Then, in 2012 I extended that by saving the monthly data files of about 48 stations. It was a random sample of all stations with long continuous temperature records that remained active. Then watching on a monthly basis in 2012, I noticed that about 10 percent of the stations had obvious changes of old data. Most station data from the past remained identical from month to month. Over time, random stations did show significant alterations. In December 2012 a much more radical and extensive alteration of data occurred when compared to the January 2013 data. Some major error occurred to many stations that caused losses of data for all years after 2007, but that was corrected a few months later.
Now with December 2016 saved, all but a couple of those 48 stations have changed old data by various amounts since 2012. Some changes to pre-2012 data show much cooling of data before 1980, resulting in increased warming trends over the decades. There is also a substantial amount of monthly data that have been replaced with “999.9” to indicate data loss. In many cases, old years with missing data have been resurrected with apparently good temperatures. Much of the change looks suspicious in that almost every month has been changed by an exact amount for every record, such as 0.5 degrees lower. Then in later years, that constant change vanishes in one month. Some stations show much larger changes that seem impossible, such as 2 or more degrees colder in the past.
Although most stations show some amount of cooling in the early years, some changes in some cases do show the opposite trend, at various stations, with no pattern that is obvious to me. And changes can occur at any time for a few months, and then a few months later those changes are reversed. It’s very suspicious, and especially obvious when placing a December 2012 station record next to a December 2016 record.
The stations for which December 2012 data is saved on my hard drive are:
Akureyri, Amundsen-Scott, Anthony, Bartow, Beaver City, Bridgehampton, Byrd, Calvinia, Concordia, Crete, Davis, Ellsworth, Franklin, Geneva, Gothenburg, Halley, Hanford, Hilo, Honolulu, Jan Mayen, Kodia, Kwajalein, La Serena, Lakin, Lamar, Lebanon, MO, Loup City, Marysville, Mina, Minden, Murteshwar, Nantes, Nome, Norfolk Island, Nuuk, Orland, Red Cloud, Scott Base, St. Helena, St. Paul, Steffenville, Talkeetna, Thiruvanantha, Truk, Valladolid, Vostok, Wakeeny, Yakutat and Yamba.
bw on January 28, 2017 at 11:27 pm
bw, I understand your point.
But I saw lots of comments published at WUWT threads complaining about GHCN adjusted station data being “far higher” than the unadjusted variant.
Within many of them you see graphs comparing the two variants for carefully selected GHCN stations, e.g. Reykjavik, Santiago de Chile, Darwin etc.
That sort of repeated insisting became so boring to me that I computed, for all 7,280 stations, their linear trend in both records, built a list of the trend differences, and plotted the sorted data after having eliminated nonsense (about 0.5%) due to stations with e.g. exceedingly short lifetimes (for example, Tucumen with over 14 °C/century of trend difference, or Elliott with -12, etc).
Here is a plot of the remaining about 7,170 trend differences:
http://fs5.directupload.net/images/170129/u4cx6rim.jpg
(over 4,200 of them are less than ± 0.1 °C /century, and thus not so very significant I guess).
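For anyone wanting to reproduce this kind of comparison, the mechanics look roughly like the sketch below. The synthetic normal distribution here merely stands in for the real adjusted-minus-raw trend fits, which would come from the GHCN files station by station.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stations = 7170

# Stand-in for each station's trend difference (adjusted minus raw,
# in deg C per century); real values come from fitting both records.
diffs = rng.normal(loc=0.05, scale=0.3, size=n_stations)

sorted_diffs = np.sort(diffs)                 # the curve one would plot
n_small = int((np.abs(diffs) < 0.1).sum())    # barely-significant band
n_warming = int((diffs > 0).sum())            # adjustment increases trend
n_cooling = int((diffs < 0).sum())            # the "blue part": trend reduced
```

Sorting the differences and counting each side of zero makes the shape of the adjustment distribution visible at a glance, rather than cherry-picking individual stations.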
Nobody speaks about the blue part of the trend distribution line.
So your data certainly is accurate, bw, but IMHO you should consider all stations instead of such a small group of them.
Unfortunately, while the complete GHCN station data is available in text form, the corresponding GISTEMP records are in NetCDF, if I remember correctly, and extending a hobby line to that format is too much work. Otherwise, I would have made the same comparison for GISTEMP.
Maybe Nick Stokes has such data…
“Maybe Nick Stokes has such data…”
I do have GHCN data in various forms. That is where the adjustment happens. There are histograms of trends here. Or here you can see the trend effects of adjustment laid out on a Google map with colored markers that you can play with.
Thanks Nick… but I explicitly meant GISTEMP data showing, for all GHCN stations in one continuous dataset, the difference between NOAA adjustment and GISTEMP homogenisation.
Without extracting the data out of a NetCDF database, we can access the data only station by station using the web link
https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=station id&ds=7&dt=1
Nice tool, but.
Nick, could you comment on this? Why should we be repeatedly changing the temperature readings for a period a hundred years or so ago? It makes no sense. If we have new data which casts doubt on a particular reading, change it, and explain why. But repeatedly changing the readings with no explanation cannot be science, surely? You do not treat observations like this if you are doing science. The right way to do it is to plot the observations as they are, and then, in the argument of the paper or assessment, explain why there may be biases in the record. But you cannot just change observations all the time with neither reason nor notification.
The alternative conclusion would be that we simply don’t know what past temperatures were from the instrumental record, which may be reasonable, but in that case we have to stop plotting them as if we did know.
The Capetown case as graphed, assuming the author has done his work correctly, is really weird isn’t it? There can be no justification for altering observations from the 1900s.
To add a bit. There is a well known thing in perfectly respectable science, when observations don’t fit a theory. The observations will always be to some extent theory laden, so it may be reasonable to question whether they are right if they are different from what a well confirmed theory says they should be.
This is quite reasonable. Consider, for instance, the French paradox on heart disease and saturated fat in the diet: are we sure they are recording all the heart disease that there is? Maybe they are calling it mal au foie (liver trouble)?
But in this case there is no theory that gives any indication that the observations are out of line. There’s no reason to think that the temperatures really were a bit higher or lower in Capetown on some day in 1908 than those which were written down.
It just makes no sense to me. If this is the foundation of the view that modern warming is unprecedented, one is inclined to conclude the whole thing is rubbish.
There is no French “paradox” on heart disease and saturated fat. The studies used to attempt to demonize fats did not separate saturated fats from trans-fats, and it is trans-fats that are extremely unhealthy. Saturated fat is good for you, and is a staple of the human diet. The “fat is bad” science conflated the effects of certain fats as being applicable to all fats, and was junk science – just like AGW is.
” There can be no justification for altering observations from the 1900s.”
On the contrary – if there is evidence of a change at a station, you have to take it into account. And I suspect people here would be all over them if they didn’t.
As I showed in my comment here, there are really only two adjustments made to Cape Town data, one in about 1878 and one in 1960. I am not sure about the earlier one, but it is not very significant anyway. The one in 1960 removed the massive dive you see in the unadjusted data that year. So they look around. Here is Berkeley’s plot of temperatures nearby relative to Cape Town:
http://berkeleyearth.lbl.gov/auto/Local/TAVG/Figures/32.95S-18.19E-TAVG-Counts.png
As you see, looking at stations within any of a variety of radii, there is a massive dip. That is, the change at Cape Town did not appear in the neighboring sites. And you can measure the difference.
Then a bit more research shows that the new Cape Town airport opened in 1960. It was then quite a bit out of town. That’s a substantial move. Adjustment is required.
There is no indication that Cape Town is “repeatedly changed”. There seem to be just a few major ones, probably done only once. It’s true that the global average changes frequently, but that is because there are many thousands of stations, and changing any one will change the average.
NO.
A change of siting is a new station, not a continuation. Hence,
You have one set of data up to the change.
Another set of data post the change.
No splicing of the two together. They are simply different stations and should be handled as such.
The same with equipment change. As soon as there is an equipment change, you have a new data set. One should not splice the two together, not unless there has been at least 10 years of overlap with both types of equipment installed, wherein a proper assessment of the bias introduced by the equipment change can be made.
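The overlap-based bias assessment suggested above could be done along these lines; the readings are invented purely for illustration.

```python
import numpy as np

def overlap_bias(old_readings, new_readings):
    """Estimate the systematic offset between two instruments from a
    period in which both ran side by side (paired readings, same dates)."""
    old_readings = np.asarray(old_readings, dtype=float)
    new_readings = np.asarray(new_readings, dtype=float)
    return float(np.mean(new_readings - old_readings))

# Toy overlap: the new sensor reads about 0.3 deg C warmer than the old.
old = [15.0, 15.2, 14.9, 15.1]
new = [15.3, 15.5, 15.2, 15.4]
bias = overlap_bias(old, new)
```

With a long enough overlap, the mean difference (and its spread) tells you whether the two series may be treated as one record or must be kept separate.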
michel on January 29, 2017 at 12:40 am
There can be no justification for altering observations from the 1900s.
As Nick wrote: “If there is evidence of a change at a station, you have to take it into account”.
Allow me to add that anomaly-based temperature records (without which we see what is biggest but don’t detect what differs the most) have a fundamental drawback: the fact that any data change in the reference period, called the climatology (here: 1951-1980), automatically results in a modification of all the record’s anomalies, as these are all constructed month by month (or even day by day) by computing the difference between an absolute value and the monthly (daily) mean of the reference period.
“Allow me to add that anomaly-based temperature records (without which we see what is biggest but don’t detect what differs the most) have a fundamental drawback: the fact that any data change in the reference period, called the climatology (here: 1951-1980), automatically results in a modification of all the record’s anomalies,”
So a change because of a legitimate reason in one part of the record may cause an illegitimate change in other parts of the record when using anomalies.
Exactly. Better is to look at the average rate of change per annum.
TA on January 29, 2017 at 9:45 am
So a change because of a legitimate reason in one part of the record may cause an illegitimate change in other parts of the record when using anomalies.
Illegitimate? No, TA. Because the absolute data the anomalies originate from was not modified.
If you can’t live with that, then please use the absolute data instead. We are all laypersons here, with no careers or superiors’ opinions binding us.
For me there is no way back since I learned that these “anomalies” are no simple deltas with respect to some overall mean: if you have a monthly record, your baseline is a 12-month vector; the same holds for daily records, the baseline then having 366 entries.
And that’s the difference you best see when looking e.g. at sea ice extent.
If you sort ice extent in the Arctic, for example, by increasing surface using absolute values published by colorado.edu, you will see at the top, as expected, lots of Septembers, then a mix of August/September/October. Far below, at position 78 (of 456), you detect a timid July 2012. The first July (2016) appears at position 167, the first May (2016) at position 216, etc. The first winter month is at position 289.
But when you now sort the stuff by anomalies, the list gets quite different. The first July (2011) appears at position 14, May (2016) at position 31, and the first… February (2016) at position 42. A winter month!
The same holds of course for the Antarctic.
Thus I’ll keep to this: anomalies, i.e. positive or negative departures from an average, aren’t a tool fabricated by warmistas to scare people. Their use is in removing annual cycles in time series. That’s all I see in them.
ftp://sidads.colorado.edu/DATASETS/NOAA/G02135/north/daily/data/
ftp://sidads.colorado.edu/DATASETS/NOAA/G02135/south/daily/data/
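The ranking exercise above can be sketched with synthetic numbers (Python; the extents are invented stand-ins for the colorado.edu figures, so the positions will not match the ones quoted): sorting by absolute extent is dominated by the annual cycle, while sorting by anomaly lets a month from any season rank among the lowest.

```python
import numpy as np

# Invented Arctic-like extents (million km^2), 1979-2016: an annual cycle,
# a modest downward trend, and noise.
rng = np.random.default_rng(1)
years = np.arange(1979, 2017)
months = np.arange(1, 13)
cycle = 11 + 4 * np.cos((months - 3) * np.pi / 6)   # max ~March, min ~September
extent = (cycle
          - 0.03 * (years - 1979)[:, None]
          + rng.normal(0, 0.3, size=(years.size, 12)))

# 12-month baseline and departures from it.
climatology = extent.mean(axis=0)
anomaly = extent - climatology

# Rank every (year, month) cell by increasing absolute extent vs by anomaly.
flat_month = np.tile(months, years.size)            # month number of each cell
by_abs = flat_month[np.argsort(extent.ravel())]
by_anom = flat_month[np.argsort(anomaly.ravel())]

print("lowest 10 months by absolute extent:", by_abs[:10])
print("lowest 10 months by anomaly:        ", by_anom[:10])
```

On data like this the absolute ranking starts with late-summer months only, whereas the anomaly ranking is free to mix seasons, which is the effect the comment describes for the real series.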
Forrest Gardener
To understand the movement of global temperature and weather/rainfall you must understand that there are two sources of energy here, namely the energy coming from outside in, and the energy from inside the Earth going out. The strength of the latter becomes more evident when you go down 1 km into a gold mine here… It seems that everyone has assumed that the energy from inside to out is more or less constant, and I think that is true when measured over a short period [e.g. the Holocene], but, like I said, my finding here for the past 40 years is that most of the SH is cooling whilst the NH is warming. I could not figure that one out. A simple theory [that now makes sense to me] is that Earth’s inner core is aligning itself with the magnetic force from the Sun, very much like a magnetic stirrer, if you like [if you know what that is?]. Indeed, the evidence clearly shows that Earth’s magnetic North Pole has been moving northwards, and that movement is causing the said melting of ice at the North Pole and relatively more warming in the NH.
As far as rainfall goes, I think it is largely influenced by the solar cycles, i.e. the energy coming from outside to Earth. My finding is that William Arnold’s 1985 report on the solar cycles, written before they started with the CO2 nonsense, is largely correct, and I have subsequently identified that one complete solar cycle (Hale-Nicholson) consists of two successive Schwabe solar cycles. Four Hale cycles make up one Gleissberg cycle of 87 years, of which the first 43.5 years are the mirror of the next 43.5 years.
To try to explain, I give you another example.
http://oi66.tinypic.com/einoz6.jpg
Clearly you can see that the result from the first few decades of the 20th century falls off the curve. That is because the Gleissberg cycle drives the pendulum down for 43.5 years and then up for 43.5 years.
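Taking the comment’s figures at face value (the 87-year Gleissberg length, the 43.5-year mirror point, and the 2-Schwabe/4-Hale structure are the comment’s claims, not established values), the arithmetic is simply:

```python
# Cycle hierarchy exactly as claimed in the comment above.
gleissberg = 87.0        # years (claimed)
hale = gleissberg / 4    # four Hale cycles per Gleissberg cycle
schwabe = hale / 2       # two Schwabe cycles per Hale cycle
half = gleissberg / 2    # the claimed mirror point

print(hale)     # 21.75 (years)
print(schwabe)  # 10.875 (years), close to the usual ~11-year Schwabe average
print(half)     # 43.5 (years)
```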
I have to take a break now, but feel free to ask me more questions.
The top link to Cape Town Safr station 141688160000 is wrong (maybe other commenters have seen it):
… and this link doesn’t work either, even right now in my own browser. Probably the NASA web engine produces only short-lived links, as many others do.
I hope this one lives a bit longer:
https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=141688160000&ds=7&dt=1
Cape Town’s measurements are based at the airport, which, like all airports, has grown in leaps and bounds over the years.
Any adjustments should be the other way round, increasing historical temps
I suppose the only response is: God damn, wat ’n vrag van kak! [“what a load of crap!”]
So let’s review the changes and make some obvious assumptions:
1) the data pre-1909 was discarded;
– On Jan 1 1910 the station data became good. Obviously it got upgraded.
2) the 1910 to 1939 data was adjusted downwards by 1.1deg C;
– On Jan 1 1940 they replaced the old station for some reason. We know this because the data was changed differently on Dec 31 than on Jan 1st.
3) the 1940 to 1959 data was adjusted downwards by about 0.8 deg C on average.
– On Jan 1 1960, exactly 20 years after they changed the station, it was replaced once again and we know this because on Dec 31st the results were different.
4) the 1969 to 1995 data was adjusted upwards by about 0.2 deg C.
– Then once again on Jan 1st, but this time in 1970, they changed the station.
My powers of observation are quite clear. To see what is happening to the data, we only need to watch the station from Dec 31st to Jan 1st of each decade to see who is going in to change it.
I am a little confused as to how a computer process was able to know these things happened when the people who programmed it do not. But computers are really smart, right?
Never underestimate a computer. They are wondrous things used by really smart guys and dolls.
For those confused by the lack of information about ‘The Garden Spot of the Arctic’, it is Eureka, Nunavut.
Retroactive homogenization seems to be one of the most dubious things one can do if objective decision making is the goal. But if selling a new program is the goal, then retroactive homogenization is the tool of choice.
Upthread there was a fascinating idea: recreate the *devices used at the time*, make side-by-side measurements with modern instruments, and see what the difference might be in the current climate. Perhaps that would clarify whether or not the steady adjustments of the past (oddly, preferentially downward) are justified.