I remember vividly the panic leading up to the year 2000. People were racing to Y2K-proof their computers and systems. TV news crews had reporters stationed at bank machines, at train traffic centers in NYC, and at airports, all waiting to see if the machines and the computers that ran them would stop working when the clock went from 1999 23:59:59 to 2000 00:00:00. In the early days of programming, to save memory, programmers used two-digit years instead of four, and the fear was that computers would reset themselves to the year 1900 rather than 2000 and stop functioning.
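(A minimal sketch of that failure mode, in illustrative C rather than any particular system's code: with only two digits stored, arithmetic across the rollover goes wrong.)

    #include <stdio.h>

    int main(void) {
        /* Two-digit years, as many legacy systems stored them */
        int opened = 97;              /* account opened in 1997 */
        int now = 0;                  /* the year 2000, stored as "00" */

        /* The account's age should be 3 years; instead it goes negative */
        printf("account age: %d years\n", now - opened);   /* prints -97 */

        /* Naively prefixing "19" produces the infamous year 19100 */
        printf("Happy New Year 19%d!\n", 99 + 1);          /* prints 19100 */
        return 0;
    }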
I remember being in the TV newsroom (it was mandatory for all staff to be there that night) as the millennium crept up in each time zone on our satellite feeds. We waited, scanning, looking, wondering… and nothing happened. The bug of the millennium became the bust of the millennium. That story was repeated in every news bureau worldwide. After all the worry and hype, nothing happened; not even a price scanner in Kmart failed (a testament to the engineers and programmers who solved the issue in advance). We grumbled about it spoiling our own plans and went home. With “nothing happening” other than tearful wailing from Bill McKibben, subsidized anger from Joe Romm, self-immolation for the cause by Gleick, pronouncements of certainty by the sabbaticalized Michael Mann, and failed predictions from scientist-turned-rap-sheet-holder Jim Hansen, CAGW seems to be a lot like Y2K.
Simon Carr of the Independent, after hearing a lecture by MIT professor Dr. Richard Lindzen, thinks maybe global warming and Y2K have something in common. He writes:
At a public meeting in the Commons, the climate scientist Professor Richard Lindzen of MIT made a number of declarations that unsettle the claim that global warming is backed by “settled science”. They’re not new, but some of them were new to me.
Over the last 150 years CO2 (or its equivalents) has doubled. This has been accompanied by a rise in temperature of seven or eight tenths of a degree centigrade.
The Intergovernmental Panel on Climate Change attributes half this increase to human activity.
Lindzen says: “Claims that the earth has been warming, that there is a Greenhouse Effect, and that man’s activities have contributed to warming are trivially true but essentially meaningless.”
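(For scale: reading “CO2 or its equivalents” as total greenhouse forcing expressed in CO2-equivalent terms, the standard logarithmic approximation of Myhre et al. 1998 puts a doubling at roughly 3.7 W/m². A sketch, not taken from the talk itself:)

    \Delta F = 5.35 \,\ln(C/C_0)\ \mathrm{W\,m^{-2}}
    \qquad\Rightarrow\qquad
    \Delta F_{2\times} = 5.35 \ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}

The argument is thus that a full doubling-equivalent of forcing has so far been accompanied by only about 0.7–0.8 °C of warming.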
Full story here
h/t to WUWT reader Ian Forrest
Bishop Hill has a copy of Dr. Lindzen’s slide show for his talk here
(Update: some people are having trouble with the link to Bishop Hill’s, so I’ve made a local copy of Lindzen’s talk here: http://wattsupwiththat.files.wordpress.com/2012/02/rsl-houseofcommons-2012.pdf )
Josh Livetooned the talk – have a look at his work here
Ugh… I am SO sick of people who know nothing about Y2K making these stupid comparisons. As someone who personally had to scramble to resolve Y2K problems, I can say with confidence:
Repeat after me:
1. Potential date problems with the crossover, some of them pretty serious, were VERY real.
2. Because the serious application software world is probably a helluva lot more stringent and competent than many others, the problems were largely addressed successfully, some of it just in time.
3. The fact that hardly anything noticeable happened was a tribute to the effort put in, and I am sure many programmers were holding their breath in case they missed something.
4. OK, so the media were pretty ignorant too. (So what’s new?)
So all you outsiders who noticed nothing… give it a rest and be thankful. There is no significant common ground with the CAGW nonsense. PLEASE!
(BTW, the next date handling problem is in 2038 for Unix-like systems still running 32-bit code, if any remain: a signed 32-bit integer counting seconds from 1970, the Unix epoch, overflows on 19 January 2038. You can work it out. Dunno if it has been addressed everywhere yet. Doubt I shall still be programming at that time…)
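(A minimal sketch of that 2038 rollover, in illustrative C; int32_t stands in for a legacy 32-bit time_t:)

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The largest second count a signed 32-bit time_t can hold */
        int32_t t32 = INT32_MAX;                 /* 2147483647 */

        time_t t = (time_t)t32;                  /* widen for display */
        printf("last representable moment: %s", ctime(&t));
        /* -> 19 January 2038, 03:14:07 UTC (ctime shows local time) */

        /* One more second wraps around (two's-complement behaviour) */
        t32 = (int32_t)((uint32_t)t32 + 1u);     /* now INT32_MIN */
        t = (time_t)t32;
        printf("one second later:          %s", ctime(&t));
        /* -> back to 13 December 1901 */
        return 0;
    }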
CO2 atmospheric content has not doubled over the past 150 years! Measurements in the late 1800s show a CO2 content of around 490 ppmv using the same methods as used today, so in fact it has fallen.
Of course it is a mistake. The mistake lies in assuming that absorptivity measurements made with visible light will still apply for much lower frequency radiation from the atmosphere. They don’t, and in fact such absorptivity factors, when the source is cooler than the target, must be zero.
Maybe this solar cooker funnel experiment will help you and others understand better…
You simply cannot explain what happens to the extra radiation in my funnel experiment from the much larger (though slightly cooler) plate to the smaller one. Clearly far more radiation gets concentrated onto the small plate, yet heat flow must be from hot to cold, i.e. opposite to the net flow.
Net radiative flow has no corresponding physical entity and is a meaningless concept. You cannot add different beams of radiation as you can add forces. Yes, there is two-way radiation, but heat only flows one way, because only the radiation from hot to cold has any effect. So how does this happen? What physical mechanism is involved? My point is that, in the funnel experiment, you cannot just calculate net radiative flux and assume heat goes in the same direction, because it doesn’t.
So “net radiation” is not a physical mechanism which tells us which direction heat will transfer, or how much will be transferred.
Study this diagram: http://climate-change-theory.com/freqdist.jpg and note that the distribution for a cooler temperature is always fully contained within that for a warmer temperature.
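(An aside: the nesting claim, at least, is standard physics — Planck’s law increases monotonically with temperature at every frequency, so the cooler curve lies entirely under the warmer one. A quick numerical check in illustrative C, with arbitrary sample frequencies and temperatures:)

    #include <stdio.h>
    #include <math.h>

    #define H_PLANCK 6.626e-34   /* Planck constant, J s */
    #define C_LIGHT  2.998e8     /* speed of light, m/s */
    #define K_BOLTZ  1.381e-23   /* Boltzmann constant, J/K */

    /* Planck spectral radiance B(nu, T), W sr^-1 m^-2 Hz^-1 */
    static double planck(double nu, double temp) {
        return 2.0 * H_PLANCK * nu * nu * nu / (C_LIGHT * C_LIGHT)
             / (exp(H_PLANCK * nu / (K_BOLTZ * temp)) - 1.0);
    }

    int main(void) {
        /* Across the thermal infrared, B(nu, 300 K) > B(nu, 250 K) at
           every sampled frequency: the 250 K curve nests inside */
        for (double nu = 1e13; nu <= 1e14; nu += 1e13)
            printf("nu = %.0e Hz: B(300K) = %.3e, B(250K) = %.3e\n",
                   nu, planck(nu, 300.0), planck(nu, 250.0));
        return 0;
    }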
The effect that radiation from the hotter source has is that the excess over what can resonate with the cooler one is converted to thermal energy. When radiation resonates like this, no thermal energy is left behind, and the radiation immediately exits just as if it had undergone diffuse (not specular) reflection.
Radiation which fits under the curve of the cooler one can resonate either way (hot to cold or cold to hot) because those frequencies are common to hot and cold bodies. Such radiation is scattered and the effect is the same as diffuse reflection. The warmer body can scatter any amount of such energy without its own outward radiation being affected and without receiving any thermal energy from the cooler one.
What happens when the Sun is warming the surface in the morning? The net flow of radiation is into the surface, right? So how could the IPCC models be right in saying extra thermal energy (also from radiation in the same direction) flows from the cool atmosphere to the warm surface, against the Second Law?
The Second Law must apply to every individual “transaction” or radiated beam between any two points. You cannot just say all will be fixed up that evening when net flow is finally outwards. Besides, the energy might come back out by diffusion or evaporation rather than radiation.
There are no two ways about it. Only radiation from hot to cold can transfer thermal energy. That from cold to hot does nothing. The amount transferred is represented by the extra frequencies / extra radiation in the area between the curves because these frequencies (coming from the hot body) are only in its distribution and thus cannot resonate with the cooler body. (The result is the same as SBL calculations in normal situations, but SBL does not give the right answer for a funnel.) In contrast, all the frequencies in the cooler body’s radiation can resonate with the warmer one.
For more detail see the ‘Radiation’ page of my website http://climate-change-theory.com; a paper (which I have completed) will be available in due course – to be advised.
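(For reference, the “SBL calculation” mentioned above is the textbook Stefan–Boltzmann net-exchange formula. A minimal sketch in illustrative C, taking both surfaces as black and the temperatures as arbitrary; the comment above disputes the physical interpretation of this arithmetic, not the arithmetic itself:)

    #include <stdio.h>

    #define SIGMA 5.67e-8   /* Stefan-Boltzmann constant, W m^-2 K^-4 */

    /* Textbook net radiative exchange between two black surfaces:
       q = sigma * (Th^4 - Tc^4), taken as positive from hot to cold */
    static double net_exchange(double t_hot, double t_cold) {
        double h2 = t_hot * t_hot, c2 = t_cold * t_cold;
        return SIGMA * (h2 * h2 - c2 * c2);
    }

    int main(void) {
        /* e.g. a 288 K surface facing a 255 K radiating sky */
        printf("net flux: %.1f W/m^2\n", net_exchange(288.0, 255.0));
        /* prints roughly 150 W/m^2 */
        return 0;
    }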
WAC, exp, and others seem not to comprehend the meaning of “trivially true”. It means that there is almost by definition an effect, but that this is not a useful or important fact. Lindzen is quite explicit that he further means only trivially true. There are no particular consequences to be derived from the “fact”, much less drastic ones.
Berndt;
“old scripts to produce a 2-gidit year” I was programming around that time, too, but I never had to deal with “gidits”! Are they a German thing?
😉
As many people have pointed out, the Y2K bug was demonstrable and real, and the reason it didn’t cause a huge problem was that huge amounts of money had been spent to fix it.
A far better comparison would be with all the prophecies of doom dating from the 1970s, from people like Paul Ehrlich. According to those idiots, civilisation would have crashed and burned by 2000 (precisely when the Y2K bug didn’t cause a problem). In fact the last decades of the 20th century saw unparalleled prosperity and also the end of the Cold War. It’s strange how prosperity and warming periods seem to have such a strong correlation. There’s been no global warming for at least ten years – and look at the state we’re in now!
This is an excellent piece, particularly as the Independent is not noted for its climate scepticism. Clearly the writer isn’t a climate expert (he stated that CO2 had doubled – that probably won’t happen, if ever, until roughly the end of this century). It was also remarkable to see the amount of global warming (0.7 degrees C) actually stated. It’s such an embarrassingly small amount that it never gets mentioned by the likes of the BBC.
Clearly the writer was impressed by Richard Lindzen, who is a true scientist. If only some of our politicians such as David Cameron could spend some time with him….
Chris
Silver Ralph says:
February 24, 2012 at 9:28 am
Just to put this article in its proper perspective, the Independent is the Greenest of Green publications – with every newspaper produced on recycled paper and offset with a newly planted tree, and three herrings given to a seal.**
Thus this is a little bit like Al Gore declaring he has doubts about AGW.
** Fish are not cuddly, so it’s OK to kill them.
=====================
waaah waaaah you wanna kill a “sea Kitten” whaaa whaaa
PETA will be looking for you:-)
We have been saying this for 15 years and it’s still the appropriate analogy.
Berndt;
“old scripts to produce a 2-gidit year” I was programming around that time, too, but I never had to deal with “gidits”! Are they a German thing?
I’ve dealt with a Gidget or two, is that close enough?
In 2003, I responded to an IT services tender in the UK for a public sector organisation (which shall remain nameless). I had to fill in a section about Y2K policies…
OMG, we have to expand the size of a data field! Call for consultants, let’s get some cover here if things go wrong! Yes, Y2K was a farce. The grunts loved the work: change the code, reset the date to midnight, and test the result, over and over and over again. Every IT body in existence was on call when the clock struck 12:00. It was the worst New Year ever.
Andrew Krause says:
February 25, 2012 at 7:43 am
You quite obviously know nothing about how software is written or maintained.
Depending on system limitations, expanding a data field can sometimes result in the entire program having to be rewritten. It’s not as simple as changing a define and recompiling. Beyond that, there is all the testing to make sure that your change did not affect anything unexpected.
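(A sketch of why, in illustrative C with a hypothetical record layout: when data lives in fixed-width records, widening the year field shifts every offset, so the data files and every program sharing the format must be converted and retested together.)

    #include <stdio.h>

    /* Legacy fixed-width record: all-char fields, exactly 20 bytes on
       disk, and every reader and writer depends on those offsets */
    struct record_v1 {
        char account[10];
        char year[2];      /* "99" -- two-digit year */
        char amount[8];
    };

    /* Widening the year to four digits grows the record to 22 bytes.
       Existing files can no longer be read with the new layout, so
       every dependent program must change in step */
    struct record_v2 {
        char account[10];
        char year[4];      /* "1999" */
        char amount[8];
    };

    int main(void) {
        printf("v1 record: %zu bytes, v2 record: %zu bytes\n",
               sizeof(struct record_v1), sizeof(struct record_v2));
        return 0;
    }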
Many people are commenting that Y2K was real and therefore not like CAGW, but in one sense they are very similar – the way they have both been jumped on by people with very cynical motives.
Y2K was blown up to sell lots and lots of new computers and software, the vast majority of which were not needed. Yes, many programs did need modifying, but – as has been noted by quite a few posters here – these were identified quite a long time in advance and the modifications made. The key evidence here is the timing – posters here talk of starting work on Y2K “bugs” 5, 6, or even 7 years prior to 31 December 1999 – yet the hype came only in the last 12-18 months, as did the massive IT infrastructure spending.
In terms of damage, there is also an analogy in that mis-allocation of funds causes unsustainable bubbles in certain sectors. For example, in Ottawa (and I suspect many other IT clusters), the 1999 boom in IT spending was followed by a slump lasting over 4 years as companies simply ceased to make any upgrades in the wake of being “burned” over Y2K. A large part of the 2000/2001 downturn (now relatively forgotten after 2008) was attributed to the bursting of the “internet” bubble – linked closely to the reaction to the non-Y2K disaster.
On the whole, there are lessons to be learned from Y2K which can be applied to climate change even if they are not directly comparable and the biggest one – in my mind – is that human beings have a massive residual store of ingenuity to overcome real problems. Creating disaster scenarios helps no-one.
You would think that persons who profess the greatest concern for the environment would be mightily relieved to learn that earth is apparently not careering towards hell in a handcart.
Yet when this is pointed out, some seem to get even more shrill in their denunciations of those who are sceptical about their claims. Just no pleasing some people.
My brother (a facilities engineer) was paid TRIPLE TIME on the “night of the turnover” to STAY OVERNIGHT at the headquarters of a large electronics firm (180,000 employees at the time).
He did a lot of internet surfing (SURF’S UP! NETWORK WAS UP!), ate a lot of doughnuts and coffee, chatted with the night guards, and laughed all the way to the bank with the $1,700 he got for 20 hours of “work”….
I have a LOT of programmer friends, MANY of whom “bit the bullet” and did 60-hour weeks doing NONSENSE WORK (they knew it), but at $90 an hour with O.T. they pulled down close to $250K for the year.
Now, would THEY tell anyone willing to pay them that, “THIS IS NONSENSE, SILLY, and STUPID?”
OK, now let’s get to the “TEAM”….
Who proudly trot out their graduate student or post-doc SLAVES and say, “See, they don’t earn that much…” when in actuality many of them make $120K to $160K per year due to their time, seniority and position. NOT BAD CA-CHING compared to NORMAL people.
It’s about the MONEY!
Secondly the POWER…
Max
I must say that this is a major change. The Independent was one of the true hellfire-and-brimstone warming papers in the UK for many years. Perhaps the change of ownership to Alexander Lebedev, a Russian oligarch, is partly responsible?
Whatever the reasons, I welcome a shift to open-mindedness, to skeptical evidence-based journalism.
Perhaps the threat to press freedom on the back of the hacking/bugging/buying-off-public-officials scandal, still ongoing with the Leveson Inquiry, is also contributing to a culture of ‘what will it take to restore ethics, probity and respect to UK journalism?’
Whatever the reasons, it is an important step along the road, one I hope that continues until 2020.
I agree with the comments on the last page of Richard Lindzen’s presentation that it’s time to stop using the word “skeptic” and replace it with something like “realist.” “Skeptic” makes it sound like we don’t even agree with the most basic accepted facts given at the beginning of his presentation, when it’s the unfounded alarmism, wild speculation, and imperative to act that we don’t agree with.
The In-Dope-Pendant’s headline would have been fine if the phrase “…like the Millennium Bug” had been omitted.
I have to agree with the people who have pointed out that the Y2K bug was in fact a real problem. The difference is that the IT specialists were working behind the scenes trying to find and fix those bugs in a lot of systems.
Microsoft came out with Windows ME around that time to replace Windows 98 (ME was actually released in September 2000, just after the rollover). I would imagine that those who updated their OS had no problem because of the effort by Microsoft to head off the issue.
That is only one software company, and it is only one side of the problem. I worked in accounts at the time and the system in use was old. The company had to move from something called Distrib, which was extremely outdated, to a new system. I think this is actually the crux of the issue, because a lot of small firms were using very outdated software that was about to fall over when the Y2K problem hit.
In this case mitigation worked, and at the stroke of midnight nothing extraordinary happened. Our computer systems did not break down, and the press brouhaha was an absolute joke… just like their brouhaha over 1984… and now look what is happening! Cameras are on the streets everywhere…
BTW, EXP is an Aussie troll. It would be a good idea not to feed this particular troll, and to ignore him.
[ESP? Vice EXP. Robt]
Well, let me suggest an effective approach that worked. We had some really large systems. We were pretty sure that little or no detail maintenance would be done; not sexy enough.
We specified what we called a Federal Date: yyyymmdd. Despite derision, we used it from 1975 on.
The approach? Plan ahead. Avoid problems.
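(The virtue of yyyymmdd is that the century is explicit and dates compare chronologically as plain strings or integers. A minimal sketch in illustrative C:)

    #include <stdio.h>

    int main(void) {
        /* yyyymmdd dates sort chronologically as ordinary integers,
           and 2000 follows 1999 with no two-digit ambiguity */
        int dates[] = { 19991231, 20000101, 19750630 };
        int earliest = dates[0];

        for (int i = 1; i < 3; i++)
            if (dates[i] < earliest)
                earliest = dates[i];

        printf("earliest: %d\n", earliest);   /* prints 19750630 */
        return 0;
    }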
“There is no parallel between Y2K and cAGW – because the former was a real and potentially serious problem and the latter almost certainly is not.”
Agreed. I was working for a mutual fund company; we spent about two years fixing mainframe code to keep the system from shutting down, plus all of the ancillary systems. My dad and my sisters, also in IT, worked on many different projects, including a wastewater treatment plant. If that one hadn’t been fixed, the wrong amounts of chemicals would have been released at the wrong times, poisoning the water supply.
We were all on alert that night, and the only thing we saw that failed was the time/date on some other website.
This is typical of the type of mistake made when trying to explain the (radiative) greenhouse conjecture. On his website Dr Roy Spencer wrote this very strange comment when trying to “prove” that the Second Law is not violated.
But the same objections could be made against many systems which create very high temperatures. You can pump energy into a system at a certain rate, and insulate the system so that it cannot lose heat easily and thus increase temperatures to very high levels.
And he seemed to imply this “heat furnace” concept applies in the atmosphere. Let me quote Wikipedia (Second Law…): “The second law declares the impossibility of machines that generate usable energy from the abundant internal energy of nature by processes called perpetual motion of the second kind.”
Any “pumping” up of temperatures in the atmosphere would have to raise the temperature up there to more than the surface temperature at the time before any spontaneous radiation from the atmosphere would warm the surface.
It cannot happen, Roy, and it doesn’t.
And before you come back at me with discussion of “net” radiation, tell me what physical entity you think net radiation actually corresponds to. Are two rays on opposite sides of the world (day and night) going to have a combined effect? Hardly! Nor would they even if they were only a metre apart and parallel with each other.
The only way any effect of radiation in one direction can be altered by radiation in the other direction is via thermal energy addition. This means the energy in each ray has to be converted to thermal energy first. After all, the energy might exit the surface by evaporation or some other non-radiative process.
Each ray has to be considered as a separate process. So any conversion to thermal energy involving radiation from a cooler atmosphere to a warmer surface violates the Second Law. Other rays in the opposite direction cannot justify the violation.
The reason it does not happen is because the absorptivity of the surface does in fact reduce to zero for radiation coming from a cooler source. Such radiation merely resonates or is “rejected” in some way, just as if it underwent diffuse reflection. Only radiation from hot to cold counts when it comes to anything to do with temperatures of the target.
My funnel experiment proves that this must be the case, and that the warmer surface can in fact handle any amount of such radiation without it affecting its own rate of emission or its temperature.
(I give notice that I have submitted a paper on this and do not wish to reveal the explanatory mechanism detailed in that paper at this point for obvious reasons.)
The parallel between Y2K and AGW was made before 2000 by people panicked over both, such as Lawrence Lessig – http://code-is-law.org/conclusion_excerpt.html
As to the “reality” of Y2K, it was probably about 20% real: those systems which simply could not be allowed to go wrong had to be (expensively) tested and fixed in advance, and I do not doubt those above who say they did vital work on electrical systems and the like. My own experience was that it would have been far cheaper to deal with problems as they arose, and I believe that is more typical of ordinary business IT, which is the majority of software development.
On 1st January 2000 I was making money at a rate of £30,000/hour 🙂
Sadly, not for very long 🙁
I am an IT consultant and I have seen more Y2K problems in the past five years than I did at the end of 1999.
dak