Is It Time To Stop The Insanity Of Wasting Time and Money On More Climate Models?

Guest Opinion: Dr. Tim Ball

Nearly every climate model prediction, projection, or whatever else they want to call it has been wrong. Weather forecasts beyond 72 hours typically deteriorate into their error bands. The UK Met Office summer forecast was wrong again; I have lost track of the number of times they have been wrong. Apparently, the British Broadcasting Corporation had had enough, because it stopped using their services. They are not just marginally wrong. Invariably, the weather is the inverse of their forecast.

Short-, medium-, and long-term climate forecasts are wrong more than 50 percent of the time, so a correct one is no better than a random event. Global and regional forecasts are often equally incorrect. If there were a climate model that made even 60 percent accurate forecasts, everybody would use it. Since there is no single accurate climate model forecast, the IPCC resorts to averaging its model forecasts as if, somehow, the errors would cancel each other out and the average of the forecasts would be representative. Climate models and their forecasts have been unmitigated failures that would cause an automatic cessation in any other enterprise, unless, of course, it was another government-funded fiasco. Daily weather forecasts have improved since modern forecasting began in World War I. However, even short-term climate forecasts appear no better than those of the Old Farmer's Almanac, which appeared in 1792, using moon, sun, and other astronomical and terrestrial indicators.
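The averaging point deserves emphasis. A short sketch (all numbers hypothetical, chosen only for illustration) shows why averaging an ensemble cannot rescue models that share a systematic error: the independent noise cancels, but the common bias survives untouched.

```python
import random

random.seed(42)

TRUE_TREND = 0.10    # hypothetical "true" value (arbitrary units)
SHARED_BIAS = 0.15   # assumed systematic error common to every model
N_MODELS = 30

# Each model reports truth + the shared bias + its own random error
forecasts = [TRUE_TREND + SHARED_BIAS + random.gauss(0, 0.05)
             for _ in range(N_MODELS)]

ensemble_mean = sum(forecasts) / N_MODELS

# The mean lands near truth-plus-bias, not near the truth
print(f"ensemble mean: {ensemble_mean:.3f}  vs. truth: {TRUE_TREND}")
```

Averaging only helps when the models' errors are independent; if they share assumptions, the "cancellation" argument fails.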

I have written and often spoken about the key role of the models in creating and perpetuating the catastrophic AGW mythology. People were shocked by the leaked emails from the Climatic Research Unit (CRU), but most don’t know that the actual instructions to “hide the decline” in the tree-ring portion of the hockey stick graph were in the computer code. It is one reason that people translate the Garbage In, Garbage Out (GIGO) acronym as Gospel In, Gospel Out when speaking of climate models.

I am tired of the continued pretense that climate models can produce accurate forecasts in a chaotic system. Sadly, the pretense occurs on both sides of the scientific debate. The reality is the models don’t work and can’t work for many reasons, including the most fundamental: lack of data, lack of knowledge of major mechanisms, lack of knowledge of basic physical processes, lack of ability to represent physical mechanisms like turbulence in mathematical form, and lack of computer capacity. Bob Tisdale summarized the problems in his 2013 book Climate Models Fail. It is time to stop wasting time and money and put people and computers to more important uses.

The only thing that keeps people working on the models is government funding, either at weather offices or in academia. Without this funding, computer modelers would not dominate the study of climate. Without the funding, the Intergovernmental Panel on Climate Change could not exist. Many of the people involved in climate modeling were unfamiliar with, or had no training in, climatology or climate science. They were graduates of computer modeling programs looking for a challenging opportunity with large amounts of funding available and access to large computers. The atmosphere, and later the oceans, fit the bill. Now they put the two together to continue the fiasco, all at massive expense to society. Those expenses include the computers and the modeling time but, worse, the cost of applying the failed results to global energy and environmental issues.

Let’s stop pretending and wasting money and time. Remove that funding and nobody would spend private money to work on climate forecast models.

I used to argue that there was some small value in playing with climate models in a laboratory, with only a scientific responsibility for their accuracy, feasibility, and applicability. It is clear they do not fulfill those responsibilities, and I now realize that position was wrong. When model results are used as the sole basis for government policy, there is no value; there is only massive cost and detriment to society, which is what the Intergovernmental Panel on Climate Change (IPCC) was specifically designed to produce.

The IPCC has one small value. It illustrates all the problems identified in the previous comments. Laboratory-generated climate models are manipulated outside of even basic scientific rigor in government weather offices or academia, and then become the basis of public policy through the Summary for Policymakers (SPM).

Another value of the IPCC Physical Science Basis Reports is that they provide a detailed listing of why models can’t and don’t work. Too bad few read or understand them. If they did, they would realize the limitations are such that they preclude any chance of success. Even a partial examination illustrates the point.

Data

The IPCC people knew of the data limitations from the start, but that didn’t stop them from building models.

In 1993, Stephen Schneider, a primary player in the anthropogenic global warming hypothesis and in the use of models, went beyond doubt to certainty when he said,

“Uncertainty about important feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.”

A February 3, 1999, US National Research Council Report said,

Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.

To which Kevin Trenberth responded,

It’s very clear we do not have a climate observing system….This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.

Two Directors of the CRU, Tom Wigley, and Phil Jones said,

Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.

Oceans cover 70 percent of the world, yet there are virtually no stations there. The Poles are critical in the dynamics of driving the atmosphere and creating climate, yet there are virtually no stations in the 15 million km2 of the Arctic Ocean or the 14 million km2 of Antarctica. Approximately 85 percent of the surface has no weather data. The IPCC acknowledge the limitations by claiming that a single station’s data are representative of conditions within a 1200 km radius. Is that a valid assumption? I don’t think it is.
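A quick back-of-envelope check of that 1200 km claim (flat-disc approximation, round figures; the point is scale, not precision):

```python
import math

EARTH_SURFACE_KM2 = 510e6   # Earth's surface area, ~510 million km^2
RADIUS_KM = 1200            # claimed radius of representativeness

# Area a single station supposedly represents
area_per_station = math.pi * RADIUS_KM**2

# Stations needed for full coverage, ignoring overlap and geometry
stations_needed = EARTH_SURFACE_KM2 / area_per_station

print(f"one station 'covers' {area_per_station / 1e6:.1f} million km^2")
print(f"~{stations_needed:.0f} ideally spaced stations blanket the planet")
```

On that assumption, little more than a hundred perfectly placed stations would "represent" the entire Earth, which is one way to see how strong the assumption is.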

But it isn’t just a lack of data at the surface. In fact, the measurements are not of the surface at all, but of a range of altitudes between 1.25 and 2 m above it, and as researchers from Geiger (The Climate Near the Ground) onward have shown, these differ markedly from actual surface temperatures as measured at the few microclimate stations that exist. Arguably, US surface stations are the best, but Anthony Watts’ diligent study shows that only 7.9 percent of them are accurate to better than 1°C (Figure 1). To put that in perspective, in the 2001 IPCC Report Jones claimed a 0.6°C increase over 120 years was beyond a natural increase. That also underscores the fact that most of the instrumental-record temperatures were measured to a precision of only 0.5°C.


Figure 1

Other basic data, including precipitation, barometric pressure, and wind speed and direction, are even worse than the temperature data. For example, in Africa there are only 1,152 weather watch stations, which is one-eighth the World Meteorological Organization (WMO) recommended minimum density. As I noted in an earlier paper, the lack of data for all phases of water alone guarantees the failure of IPCC projections.

The models attempt to simulate a three-dimensional atmosphere, but there is virtually no data above the surface. The modelers think we are foolish enough to believe the argument that more layers in the model will solve the problem, but it doesn’t matter if you have no data.

Major Mechanisms

During my career as a climatologist, several mechanisms of weather and climate were either discovered or measured, supposedly with sufficient accuracy for application in a model. These include El Niño/La Niña (ENSO), the Pacific Decadal Oscillation (PDO), the Atlantic Multidecadal Oscillation (AMO), the Antarctic Oscillation (AAO), the North Atlantic Oscillation (NAO), the Dansgaard-Oeschger Oscillation (D-O), the Madden-Julian Oscillation (MJO), and the Indian Ocean Dipole (IOD), among others.

Despite this, we are still unclear about the mechanisms associated with the Hadley Cell and the Inter-tropical Convergence Zone (ITCZ), which together essentially constitute the tropical climate mechanism. The Milankovitch Effect remains controversial and is not included in IPCC models. The Cosmic Theory appears to provide an answer to the relationship between sunspots, global temperature, and precipitation but is similarly ignored by the IPCC. They do not deal well with the monsoon mechanism, as they note,

In short, most AOGCMs do not simulate the spatial or intra-seasonal variation of monsoon precipitation accurately.

There is very limited knowledge of the major oceanic circulations at the surface and in the depths. There are virtually no measures of the volumes of heat transferred or how they change over time, including measures of geothermal heat.

Physical Mechanisms

The IPCC acknowledge that,

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

That comment is sufficient to argue for cessation of the waste of time and money. Add the second and related problem identified by Essex and McKitrick in Taken By Storm and it is confirmed.

Climate research is anything but a routine application of classical theories like fluid mechanics, even though some may be tempted to think it is. It has to be regarded in the “exotic’ category of scientific problems in part because we are trying to look for scientifically meaningful structure that no one can see or has ever seen, and may not even exist.

In this regard it is crucial to bear in mind that there is no experimental set up for global climate, so all we really have are those first principles. You can take all the measurements you want today, fill terabytes of disk space if you want, but that does not serve as an experimental apparatus. Engineering apparatus can be controlled, and those running them can make measurements of known variables over a range of controlled physically relevant conditions. In contrast, we have only today’s climate to sample directly, provided we are clever enough to even know how to average middle realm data in a physically meaningful way to represent climate. In short, global climate is not treatable by any conventional means.
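The "coupled non-linear chaotic system" admission can be made concrete with the Lorenz equations, the textbook toy model of atmospheric convection (Lorenz's original parameters; the crude fixed-step Euler integration below is purely illustrative, not how production models integrate):

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # perturbed by one part in a hundred million

max_sep = 0.0
for _ in range(3000):        # 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, abs(a[0] - b[0]))

# The initially negligible difference grows to the full size of the attractor
print(f"largest x-separation seen: {max_sep:.2f}")
```

An initial-condition error far smaller than any real observation network could deliver still destroys the long-range trajectory, which is the point of the IPCC quotation above.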

Computer Capacity

Modelers claim computers are getting better, and all they need are bigger, faster computers. It can’t make any difference, but they continue to waste money. In 2012, Cray introduced the promotionally named Gaea supercomputer (Figure 2). It has a 1.1 petaflops capacity. FLOPS means Floating-Point Operations Per Second, and peta denotes 10^15, so 1.1 petaflops is just over a thousand trillion floating-point operations per second. Jagadish Shukla says the challenge is,

We must be able to run climate models at the same resolution as weather prediction models, which may have horizontal resolutions of 3-5 km within the next 5 years. This will require computers with peak capability of about 100 petaflops

Regardless of the computer capacity, the model is meaningless without data.
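A crude sizing sketch shows why the quoted resolution target implies such numbers. Every constant here is an assumption chosen only for illustration (vertical levels, operations per cell, timestep, target speed); real models do far more physics per cell:

```python
EARTH_SURFACE_KM2 = 510e6   # Earth's total surface area
CELL_KM = 5.0               # horizontal resolution from the quote
LEVELS = 100                # assumed number of vertical levels
OPS_PER_CELL_STEP = 1e4     # assumed floating-point ops per cell per step
TIMESTEP_S = 30.0           # assumed model timestep, in simulated seconds
SPEEDUP = 1000              # simulate 1000x faster than real time

cells = (EARTH_SURFACE_KM2 / CELL_KM**2) * LEVELS
ops_per_step = cells * OPS_PER_CELL_STEP
# Simulated seconds advanced per wall-clock second = SPEEDUP,
# so steps taken per wall-clock second = SPEEDUP / TIMESTEP_S
sustained_flops = ops_per_step * (SPEEDUP / TIMESTEP_S)

print(f"grid cells: {cells:.2e}")
print(f"sustained rate needed: {sustained_flops / 1e15:.2f} petaflops")
```

Since sustained throughput on real machines is typically only a few percent of peak, even this toy count points to a peak requirement in the tens of petaflops, in the neighborhood of the figure Shukla cites.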


Figure 2: Cray’s Gaea Computer with the environmental image.

Failed Forecasts (Predictions, Projections)

Figure 3 shows the IPCC’s failed forecasts. They call them projections, but the public believes they are forecasts. Either way, they are consistently wrong. Notice the labels added to Hayden’s graph, taken from the Summary for Policymakers. As the error range in the actual data increases, the Summary claims the forecasts are improving. One of the computer models used for the IPCC forecast belongs to Environment Canada. Their forecasts are the worst of all those averaged results used by the IPCC (Figure 4).


Figure 3


Figure 4. Source: Ken Gregory

The Canadian disaster is not surprising, as their own one-year forecast assessment indicates. They make a one-year forecast and provide a map indicating the percentage accuracy against the average for the period 1981-2010 (Figure 5).


Figure 5

The Canadian average accuracy is shown in the bottom left as 41.5 percent. That is the best they can achieve after some thirty years of developing the models. Other countries’ results are no better.
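For context, if the outlook uses the common three-category format (below, near, or above the 1981-2010 normal; an assumption, since the post doesn't specify the format), random guessing against equally likely outcomes scores about 33 percent:

```python
import random

random.seed(0)

CATEGORIES = 3      # below / near / above normal (assumed outlook format)
TRIALS = 100_000

# Guess at random against outcomes that are themselves random and uniform
hits = sum(random.randrange(CATEGORIES) == random.randrange(CATEGORIES)
           for _ in range(TRIALS))

print(f"random-guess accuracy: {hits / TRIALS:.1%}")
```

On that (assumed) format, 41.5 percent is only modestly better than throwing dice.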

In a New Scientist report, Tim Palmer, a leading climate modeller at the European Centre for Medium-Range Weather Forecasts in Reading, England, said:

I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.

The Cost

Joanne Nova has done the most research on the cost of climate research to the US government.

In total, over the last 20 years, by the end of fiscal year 2009, the US government will have poured in $32 billion for climate research—and another $36 billion for development of climate-related technologies. These are actual dollars, obtained from government reports, and not adjusted for inflation. It does not include funding from other governments. The real total can only grow.

There is no doubt that number grew, and the world total is likely double the US amount, as this commentator claims:

However, at least I can add a reliable half-billion pounds to Joanne Nova’s $79 billion – plus we know already that the EU Framework 7 programme includes €1.9 billion on direct climate change research. Framework 6 runs to €769 million. If we take all the Annex 1 countries, the sum expended must be well over $100 billion.

These are just the computer modeling costs. The economic and social costs are much higher and virtually impossible to calculate. As Paul Driessen explains,

As with its polar counterparts, 90% of the titanic climate funding iceberg is invisible to most citizens, businessmen and politicians.

It’s no wonder Larry Bell can say,

The U.S. Government Accounting Office (GAO) can’t figure out what benefits taxpayers are getting from the many billions of dollars spent each year on policies that are purportedly aimed at addressing climate change.

If it is impossible for a supposedly sophisticated agency like the US GAO to determine the costs, then there is no hope for a global assessment. There is little doubt the direct cost is measured in trillions of dollars. That does not include the lost opportunities for development and the lives continuing in poverty. All this because of the falsified results from completely failed computer model predictions, projections, or whatever they want to call them.

Is it time to stop the insanity, which in climate science is the repeated creation of computer models that don’t and can’t work? I think so.

“Those who have knowledge don’t predict. Those who do predict don’t have knowledge.” Lao Tzu (6th century BC)


Note: this article was updated shortly after publication to fix a text formatting error.

manicbeancounter
September 14, 2015 2:17 pm

Unlike Roy Spencer’s graph reconciling actual data to the CMIP5 models, there is one that shows the actual temperature range within that of the hindcast models. Reproduced below, it is from Prof Ed Hawkins of Reading University, near London.
There are problems.
1. It is only from 1985,
2. That is at least 20 years BEFORE the models were devised.
3. The model range through to 2050 is huge, and the bottom end is virtually no warming. So we have both the dire apocalypse and the trivial all within the range of the climate models.
4. There is less than a decade of actual forecast – and that is bumping along the bottom. Hardly something to get all in a tizzy about.
5. The “actual” data is HADCRUT4, which shows a greater recent warming trend than HADCRUT3. This in turn shows a greater recent warming trend than the satellites.
http://www.climate-lab-book.ac.uk/wp-content/uploads/fig-nearterm_all_UPDATE_2014.png

manicbeancounter
Reply to  manicbeancounter
September 14, 2015 2:29 pm

A way of evaluating the evidence for climate catastrophism is to consider whether, over time, climatology is making statements in support of the CAGW hypothesis with increasing empirical content, or whether it is degenerating towards more banal statements, which do not exclude milder versions of the AGW hypothesis or even random or natural variations.
http://manicbeancounter.files.wordpress.com/2015/09/090515_1425_degeneratin11.png?w=600
Explained in detail at http://manicbeancounter.com/2015/08/14/can-climatology-ever-be-considered-a-science/

JK
Reply to  manicbeancounter
September 14, 2015 3:29 pm

At the link manicbeancounter gives his favourite Richard Feynman quote:
‘You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.’
This reminded me of a sentence in Tim Ball’s post:
‘The Cosmic Theory appears to provide an answer to the relationship between sunspots, global temperature, and precipitation but is similarly ignored by the IPCC.’
It seems to me that the way Ball refers to the theory here is a bit vague. I can understand that a more precise explanation would be off topic for the post but, as with the criticisms of weather forecasting, a reference or two would go a long, long way.
It also seems that his complaint about the IPCC in this sentence is imprecise. The report of Working Group I, Chapter 7, on Clouds and Aerosols does provide some assessment of the connection between cosmic rays and clouds (section 7.4.6 Impact of Cosmic Rays on Aerosols and Clouds, in http://www.ipcc.ch/pdf/assessment-report/ar5/wg1/WG1AR5_Chapter07_FINAL.pdf ). It might be argued that they are dismissive, or wrong, or don’t pay it enough attention. But they do discuss over 40 papers. I may be mistaken, but I think that’s probably more than Wattsupwiththat have discussed.
This may seem like a small complaint, but if Ball wants to really win the argument then he needs to be sharper, and not give opponents a reason to dismiss what he says as ‘not literally true’.
This is especially so since I don’t think it really stands up to say that the scientific establishment have ignored the cosmic ray theory. It would be one thing if the IPCC had simply had to scrape together every cursory mention they could find. But this is not so. If you look up papers by Svensmark or Kirkby on Google Scholar you will find them cited many hundreds of times. Admittedly, many are from sceptics who might be considered ‘outsiders’, but they are by no means dominant.
So if you want to claim that there is a theory which is being ignored, you really need to point to it. Stop being so vague.

Reply to  manicbeancounter
September 14, 2015 9:10 pm

In figure 7 of http://www.drroyspencer.com/2015/04/version-6-0-of-the-uah-temperature-dataset-released-new-lt-trend-0-11-cdecade , radiosonde data indicate that the lowest 100-200 or so meters of the lower troposphere outwarmed the “satellite-measured lower troposphere as a whole” (my words) from 1979-2015 by about the same amount that HadCRUT3 did. I think HadCRUT3-global is closer to the truth of global surface temperature than any satellite dataset of the global lower troposphere, any global surface dataset obsoleted before 2005 (since post-2005 matters), any surface one newer than 2008, and any version of GISS and NCDC/NCEI global temperature from around or after 2008. HadCRUT3 even correlates better, in terms of fluctuations, with all versions of both major satellite-measured global lower troposphere temperature datasets than every other global surface temperature dataset that was current around or after 2008. I think HadCRUT3’s accuracy was so good that it should be restored.

Marcuso8
September 14, 2015 2:53 pm

Give me a high-speed computer and I can make it tell you whatever you pay me to tell you !!!! Funny how that works !!!

Marcuso8
Reply to  Marcuso8
September 14, 2015 2:54 pm

The ONLY model that counts is REALITY !!

MarkW
Reply to  Marcuso8
September 14, 2015 3:12 pm

The only model that counts is Heidi Klum.

Reply to  Marcuso8
September 14, 2015 8:47 pm

The only near-model that could count was Carol Vordeman…
https://en.wikipedia.org/wiki/Carol_Vorderman

Matt
September 14, 2015 2:53 pm

In this whole scamming industry how many are involved. How much of the trillions of dollars does each scammer get? Are these some of the highest paid people on earth?
Just curious.

Marcuso8
Reply to  Matt
September 14, 2015 3:03 pm

The trail is hidden because it’s government !!! Just ask the EPA for backup data !!!

knr
September 14, 2015 3:04 pm

In the end, the ‘models’ are merely automated ways to manipulate numbers; they have no real intelligence and can do nothing but what they are told to do. And that is where the ‘trick’ is: by starting with the ‘right’ assumptions you can get the ‘right’ results, even if those results are in reality worth nothing.

Marcuso8
Reply to  knr
September 14, 2015 3:09 pm

Actually , that would be ” Left ” results, because it is mostly liberals that push this lie !!!

pat
September 14, 2015 3:06 pm

14 Sept: CarbonBrief: Daily Briefing | 2015 and 2016 set to break global heat records, says Met Office
Scientists confirm there’s enough fossil fuel on Earth to entirely melt Antarctica
A new “blockbuster” study finds that burning all the world’s fossil fuel resources would raise global temperatures enough to eliminate the Antarctic ice sheet. The process would likely take up to 10,000 years but we would be committing ourselves to more than 50 metres of sea-level rise, enough to submerge major cities from Shanghai to New York, says RTCC. Most of the scientific focus has been on west Antarctica and this is the first research to look at the impact of fossil fuel burning on the entire sheet, says The Independent. The New York Times’s Andy Revkin has a video chat with the authors over at his Dot Earth blog. The Guardian, Reuters and TIME all have more on the new study. You can also read Carbon Brief’s write-up. The Washington Post
http://www.carbonbrief.org/blog/2015/09/daily-briefing-2015-and-2016-set-to-break-global-heat-records/
btw link also has several links to UK media’s exaggerated headlines of Met Office claims that 2015 and 2016 COULD break global heat records…OR NOT.
hedging their bets:
14 Sept: UK Telegraph: Emily Gosden: World is warming again, says Met Office – but Britain could see cooler summers
Pause in global warming could be about to end, yet changes in the Atlantic Ocean could bring colder weather to northern Europe
“The world is warming again,” Professor Adam Scaife, head of monthly to decadal forecasting at the Met Office, said. “We can’t be sure this is the end of the slowdown, but decadal warming rates are likely to reach late-twentieth century levels within two years.”…
However, other likely changes to the climate, in the North Atlantic Ocean, could still lead to cooler and drier summers in the UK and northern Europe, scientists said…
***He said he thought it was “likely” that “we would see an absolute cooling – actually summers getting cooler than they have been recently” but insisted this was not a “forecast” because other factors – such as rising global surface temperatures – could counteract the impact…
A cooler North Atlantic could result in some “temporary recovery” in sea-ice in the North Atlantic in the Labrador and possibly Nordic seas, he said.
“This does not mean we are headed for the next ice age – absolutely not. We are talking about a modest cooling… but it is potentially enough to affect weather patterns in Europe and elsewhere.”…
http://www.telegraph.co.uk/news/earth/environment/climatechange/11862392/World-is-warming-again-says-Met-Office-but-Britain-could-see-cooler-summers.html

Marcuso8
Reply to  pat
September 14, 2015 3:11 pm

In other words , Toss a set of dice and you would be more accurate !!!

Gamecock
Reply to  pat
September 14, 2015 5:10 pm

Ahhh, the ubiquitous “new study.”

Gary Pearse
Reply to  pat
September 14, 2015 7:48 pm

Only a social scientist could imagine us burning fossil fuels for 10,000 yrs more. The Neanderthal component in these authors is huge. We already have a replacement and it is thermonuclear – hopefully fusion soon to follow – it probably would already be here if the linear thinkers weren’t shouting nuclear down for over half a century. Imagining what will happen in a quarter century has never been accomplished. It is what is wrong with the doomsters that keep doing this.
They have one hand tied behind their backs because they never consider the elephant in the room factor – human ingenuity!! They never see it because they don’t possess ingenuity. They are students of fixed systems – how the lion behaves or the sex life of a flea. Thomas Malthus (late 18th, early 19thC) had us all buried in horse manure by the middle of the 19th Century; Jevons had the industrial revolution starving itself out with coal resource decline by the end of the 19th Century. The club of Rome in 1972 (Ehrlich, Holdren,…) had us starving to death before the end of the 20th Century. By the first decade of the 21st Century, the 1972 population had been doubled and the number of people in poverty was fewer than that in 1972!
These people are biological accountants whose thoughts about the future are worthless, because human ingenuity is not within the purview of their considerations. Ceteris paribus thinking disqualifies them from the business of pronouncements about the future. Humankind has done some awful things, but with countless doom scenarios that NEVER, EVER came to pass, I think it safe to say that it is axiomatic that humans cannot destroy the planet – they can do some temporary damage.
I don’t wish to make small the horror of Hiroshima, but radiation levels had decreased to background levels in one year and the city was rebuilt. Thinkers on this remarkable fact ask, why is this the case in Hiroshima (and Nagasaki) and not in Chernobyl? Well the answer is supplied by nature. The vast ‘exclusion zone’ of Chernobyl has sprung up as a remarkable wildlife refuge with wild pigs, wolves, bears, beavers and all sorts of creatures thought to have been extirpated in much of Europe. Anti nuclear energy and lefty activists hate that this has happened and snipe about how the animals are all sick and other BS. Have a look:
https://ca.search.yahoo.com/search?fr=mcafee&type=C111CA662D20141029&p=Hiroshima+today
http://www.slate.com/articles/health_and_science/nuclear_power/2013/01/wildlife_in_chernobyl_debate_over_mutations_and_populations_of_plants_and.html
Looks bad, eh? But:
“Chernobyl’s abundant and surprisingly normal-looking wildlife has shaken up how biologists think about the environmental effects of radioactivity. The idea that the world’s biggest radioactive wasteland could become Europe’s largest wildlife sanctuary is completely counterintuitive for anyone raised on nuclear dystopias.”

MarkW
Reply to  Gary Pearse
September 16, 2015 11:40 am

Thermonuclear is fusion.

MarkW
Reply to  Gary Pearse
September 16, 2015 11:41 am

The types and amount of radiation put out by Chernobyl and the Hiroshima/Nagasaki bombs were vastly different.

Reply to  pat
September 14, 2015 9:18 pm

The catastrophe assumes feedbacks that I think are unrealistically high, and CO2 absorption by the oceans that I think is unrealistically low. More likely, I think, burning all available fossil fuels will merely delay the next ice age glaciation by a few thousand years. And I think we should pursue reasonably economic energy conservation (mainly, I prefer, by energy-efficiency improvement, because that is largely economically practical) for many reasons: improvable inefficiencies of homes, motor vehicles and appliances waste money and resources and increase pollution. Also, postponing the burning of fossil fuels postpones the oceans removing their CO2 barrier to the next ice age glaciation.

Marcuso8
September 14, 2015 3:12 pm

More Humans die from cold than from heat around the world !! REALITY !!!

Phaedrus
September 14, 2015 3:24 pm

It seems very simple to me: rather than waste money chasing the Global Warming solution (there isn’t one, and that’s why supporting researchers love the endless funding), why don’t we look at ways that genuinely make life more energy efficient? Better housing, use of waste heat, fantastic public transport, etc. The list is endless and will result in a solution, rather than the Armageddon of energy poverty for all but the few.

Reply to  Phaedrus
September 14, 2015 3:52 pm

Or close a hurricane’s predicted landfall by 50+ miles when it’s 2 days out?
Or how about giving a tornado warning or alert 15 minutes sooner?
Or even just reliably tell a farmer he can drive his tractor out into the field and it won’t get stuck in the mud?
Climate models might have a place in research. But invest in getting tomorrow right before warning us about 20, 50, 100 years from now and pretending you can do something about it today (with our dollars and freedoms as the price).

MarkW
Reply to  Gunga Din
September 16, 2015 11:43 am

Basic research, such as trying to figure out exactly how clouds “work” would benefit both weather models and climate models.

Reply to  Phaedrus
September 14, 2015 10:22 pm

I would like reduction of wasteful electricity consumption by the “vampires”. Reasonable reduction of electricity consumption by those will save consumers more within a couple of years than the cost increase of improved energy efficiency of the “vampires”. Take, for example, my cable modem and another box that my cable company put into my home to separate phone and Internet signals from my cable feed. These two devices combined have 13 LEDs, of which 9 are on nearly all the time and another is on about half the time. These LEDs are very likely cheapies using cheap LED chemistries developed in the 1970s which have seen little improvement since the mid 1980s. I have means to examine their spectra, and I know LEDs. I’m fairly sure these LEDs and their associated losses in dropping resistors, driving circuitry and power supplies come to around 0.7 to 1.5 watts. At the US national average residential electricity cost of $0.12 per kWh, over 10 years this is around $7 to $16 – in 2015 dollars, and assuming residential electricity does not have inflation exceeding the official inflation index.
I know of LEDs where these losses can be reduced by 90-plus percent at a cost of a few cents each more at the factory. And even if five times that is what the cable companies have to pay for cable modems and similar “boxes” with more-modern, higher-efficiency LEDs, it would cost my cable company about $5 more to provide me both “boxes” if those were made with LEDs that would save me $6-$15 in my electric bill over 10 years. And one of these boxes I had for more than 10 years, and the other was imposed onto me last year to make its function use my grid electricity as opposed to my landlord’s, with loss of economy of scale, and probably due to a modernization project of my cable company (to make apartment renters more similar to homeowners) as opposed to something that my landlord asked for.
Meanwhile, my PC has another of these LEDs for power-on indication. I know a fairly easy hack, something so easy that “hack” may be an unfairly strong word, that I can do to reduce its impact on my electric bill by around .1 watt. I recently got some InGaN green LEDs that are suitable for this, with some surplus to get a better quantity discount that reduced the cost of a R&D project that I am working on. With one of these and a resistor that cost me maybe 2 cents, and maybe 5 minutes of personal labor (I can do other things while my soldering iron is warming up), I will soon reduce the .1 watt load to about .005 watt, for electricity savings of .095 watt, which would reduce a US-national-average residential electric bill by about 10 cents per year if the PC is always on.
For that matter, my monitor has a power-on LED of the same old-tech chemistry, as do my printer and my computer’s loudspeakers. My landline answering machine uses old-tech LED chemistry in an always-on digital display, and it is powered by a “wall wart” that consumes about a watt more than it would with the energy-efficient technology typical of modern cellphone chargers.
More in that area: why should there be incandescent nightlights that consume 3 to 7 watts? Why should many of the 5-to-7-watt ones have shades that block a fair amount of the already-inefficiently-produced light? Due to known economies of scale, a good 7-watt 120V incandescent has half or less the efficiency of a good 75W 120V incandescent. Effective LED nightlights consuming .4 to 1.25 watts have been available for a few years already, and ~3-watt LED lightbulbs with light output like a 25W incandescent’s have been on the market in brick-and-mortar home centers throughout 2015, along with an 8.5W LED lightbulb that matches or slightly outperforms even a better 75W non-halogen incandescent (now largely unavailable) for nighttime porch lighting – available from Lowes in the US for about $5–$6.
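The dollar figures in the comment above can be checked with a short script. The $0.12/kWh rate and the always-on assumption are the commenter’s; the function is just a sketch of that arithmetic:

```python
# Sketch of the commenter's arithmetic: cost of a small constant load
# at the quoted US national average residential rate of $0.12/kWh.
RATE_PER_KWH = 0.12        # dollars per kWh (figure used in the comment)
HOURS_PER_YEAR = 24 * 365  # device assumed always on

def cost_dollars(watts, years):
    """Electricity cost of a constant load of `watts` over `years`."""
    kwh = watts * HOURS_PER_YEAR * years / 1000.0
    return kwh * RATE_PER_KWH

# The two cable boxes: 0.7 to 1.5 W of LED losses over 10 years
print(round(cost_dollars(0.7, 10), 2))   # ~7.36, the "around $7" end
print(round(cost_dollars(1.5, 10), 2))   # ~15.77, the "to $16" end

# The PC power-LED hack: saving 0.095 W comes to about 10 cents/year
print(round(cost_dollars(0.095, 1), 2))
```

The 10-year range of $7–$16 and the 10-cents-per-year savings in the comment both come out of this calculation.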

tabnumlock
September 14, 2015 3:33 pm

I can easily predict the climate and with high accuracy. It’s going to get very cold for a very long time.
http://www.scottcreighton.co.uk/images/Spiral-Precession/Glacial_eras.jpg

Reply to  tabnumlock
September 14, 2015 4:31 pm

“…going to get very cold for a very long time.”
A -8 C anomaly is not all that cold. Here the annual real value swings from -15 F to 100 F. Somehow we all survived.

Reply to  tabnumlock
September 14, 2015 4:49 pm

tabnumlock,
Here’s another, going back 4.5 billion years:
http://www.newscientist.com/data/images/archive/2839/28392301.jpg
Notice that we’re now at the cold end of the range. We could use a few degrees more warming. Unfortunately, you’re probably right. The odds favor a future that’s colder, rather than warmer.

tabnumlock
Reply to  dbstealey
September 15, 2015 6:17 am

We’re at the end of a small warming period, within a declining series of warming periods, at the end of an interglacial, within an ice age, and in one of only two CO2 crashes in earth’s history.
http://www.dandebat.dk/images/1365p.jpg
http://www.climate4you.com/images/VostokTemp0-420000%20BP.gif
Red box:
http://www.climate4you.com/images/GISP2%20TemperatureSince10700%20BP%20with%20CO2%20from%20EPICA%20DomeC.gif

RoHa
September 14, 2015 4:04 pm

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”
Not a problem. We change the records to fit the results.
“…the necessary data are lacking …”
Not a problem. We’ve got the theory. What do we need the data for? And if we had it, someone would only use it to try to prove us wrong.

stevek
September 14, 2015 4:15 pm

I would not be the least bit surprised if the models these institutions have come up with do in fact accurately reproduce the temperature record with few input parameters. The problem is that those models do not show much global warming, so they are rejected.
The same type of thing happens in boardrooms. An analyst comes along and shows the company that his model predicts declines in profits and revenue over the years if something is not done. The board rejects the analyst’s views because too much political power depends on the status quo. The egos are too attached to a false reality; they won’t even entertain the idea that it is false, and they dismiss any attacks out of hand.
Letters were written to the SEC warning that Bernie Madoff’s results were impossible. Nobody did anything. They were attached to a false reality and could not accept that they had been duped.

September 14, 2015 4:34 pm

stevek
See “Trip to Abilene” paradox, a lesson in “go along to get along” mis-management.

Bill Illis
September 14, 2015 4:41 pm

The biggest problem is that 100,000 climate scientists, 50 governments, $300 billion per year in green energy proponents and 100 million people …
… have staked their reputations on the CO2 global warming theory being correct.
How do they back down? Why would they, when they can just keep adjusting the temperature record every single month and then take another upward adjustment step every few years with a new methodology backed by a flimsy paper?
And how many climate model results indicating 3.0C per doubling do we really need anyway? Hasn’t there been enough already? Do we need to keep funding 50 climate models for the next 100 years?
Answer is: As long as they can keep getting away with it.

Reply to  Bill Illis
September 14, 2015 4:45 pm

Bill Illis,
Excellent comment, best one I’ve seen today.

kim
Reply to  Bill Illis
September 14, 2015 11:05 pm

The BRICS are on to the sham, still in it for the scam.
===============

September 14, 2015 5:29 pm

There is a very good book about economic forecasting called “The Death of Economics” by Paul Ormerod. It has a lot of information about chaotic systems and why forecasting their future state is impossible. One of the points he makes is that unless a model is perfect, it cannot give an accurate forecast. But even more disconcertingly, making the model closer to reality can actually make the forecast worse! Loads of well-explained maths about attractors, cycles, etc. Recommended.
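Ormerod’s point about chaotic systems can be illustrated with a toy example (this sketch is mine, not from the book): the logistic map. A “model” whose parameter is wrong by one part in a million tracks the real system at first, then diverges completely:

```python
# Two logistic maps: the "real" system and a model whose parameter
# is wrong by roughly one part in a million. Early forecasts agree;
# later ones are uncorrelated -- sensitivity to tiny model error.
def trajectory(r, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

truth = trajectory(3.9, 0.5, 50)        # the "real" chaotic system
model = trajectory(3.9000039, 0.5, 50)  # a near-perfect model

print(abs(truth[5] - model[5]))    # after 5 steps: still tiny
print(abs(truth[50] - model[50]))  # after 50 steps: no longer small
```

No amount of computing power fixes this; the error growth is a property of the system, which is the book’s argument against long-range forecasts of chaotic systems.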

amirlach
September 14, 2015 5:53 pm

Had an interesting back and forth with Ed Wiebe, who works at UVic’s School of Earth and Ocean Sciences.
http://climate.uvic.ca/people/ewiebe/
Link to the UVic Model here. http://climate.uvic.ca/model/
Interesting because he was utterly clueless when it came to actual real-world observations and how badly the models have actually performed. He repeatedly claimed “the Models work surprisingly well”, yet was unable to produce a single example.
His go-to sources were SkS and Desmog. All he could really muster was repeated appeals to authority, like this gem:
“Just read the IPCC reports, the summaries at least, they are the reference documents you should use.”
The fact that every single IPCC model has been invalidated by observation was ignored. Reference documents? By whom?
Too Funny.

dmh
Reply to  amirlach
September 14, 2015 6:48 pm

“Just read the IPCC reports, the summaries at least, they are the reference documents you should use.”
That’s funny. The reason I know the models don’t match observations is because I HAVE read the IPCC reports.
I don’t have the link handy, but one of the amazing things about AR5 was that they set aside model results in regard to sensitivity in favour of “expert opinion”. It would be enlightening to get a comment from Ed Wiebe on the matter.

ScienceABC123
September 14, 2015 7:09 pm

I think an accurate climate model would be very useful. However, the first step to developing an accurate climate model must be not adjusting the observational data to match the model!

September 14, 2015 8:06 pm

IPCC AR5 TS.6 Key Uncertainties is where climate science “experts” admit what they don’t know about some really important stuff. The IPCC is uncertain about the connection between climate change and extreme weather, especially drought. The IPCC is uncertain about how the ice caps and sheets behave: instead of going missing, they are bigger than ever. The IPCC is uncertain about heating in the ocean below 2,000 meters, which is 50% of it, but they “wag” that that’s where the missing heat of the AGW hiatus went, maybe. The IPCC is uncertain about the magnitude of the CO2 feedback loop, which is not surprising since, after 18-plus years of rising CO2 and no rising temperatures, it’s pretty clear that whatever the magnitude, CO2 makes no difference.
In IPCC AR5 text box 9.2 they admit the models don’t work. The full box is several pages of weasel words.
“Model Response Error
The discrepancy between simulated and observed GMST trends during 1998–2012 could be explained in part by a tendency for some CMIP5 models to simulate stronger warming in response to increases in greenhouse gas (GHG) concentration than is consistent with observations (Section 10.3.1.1.3, Figure 10.4).”
IPCC’s CO2/climate sensitivity is too high. See “Climate Change in 12 Minutes.”
“Their staff is too long, they are digging in the wrong spot.”
“Almost all CMIP5 historical simulations do not reproduce the observed recent warming hiatus. There is medium confidence that the GMST trend difference between models and observations during 1998–2012 is to a substantial degree caused by internal variability, with possible contributions from forcing error and some CMIP5 models overestimating the response to increasing GHG and other anthropogenic forcing. The CMIP5 model trend in ERF shows no apparent bias against the AR5 best estimate over 1998–2012. However, confidence in this assessment of CMIP5 ERF trend is *low*, primarily because of the *uncertainties* in model aerosol forcing and processes, which through spatial heterogeneity might well cause an undetected global mean ERF trend error even in the absence of a trend in the global mean aerosol loading.”
Translation: IPCC AR5 admits to the hiatus/pause/stasis/lull and the GCMs don’t have a clue!

willhaas
September 14, 2015 9:26 pm

I believe that they started with a weather simulation and hard-coded in that an increase in CO2 over time causes warming. As such, the simulation begs the question and is hence worthless. A general circulation simulation will numerically include predictor-corrector loops that can become unstable if one increases the spatial and temporal step sizes. To generate climate predictions in finite time from what is essentially a weather simulation program, they would have had to increase those step sizes. That alone could make the simulation unstable no matter what the detailed modeling assumptions are, so the results may be more a function of processing instability than of the actual model, and hence meaningless. I think that for climate prediction the entire GCM simulation approach is a big mistake and for many reasons worthless. A totally different approach is required.
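The step-size instability described above is easy to demonstrate on a toy problem. This is a sketch of the general phenomenon, not of any actual GCM scheme: explicit Euler on a simple decay equation behaves at a small time step and blows up once the step is enlarged:

```python
# dy/dt = -k*y has a true solution that decays smoothly to zero.
# Explicit Euler reproduces that only while dt < 2/k; enlarge the
# step to cover more simulated time per iteration and it explodes.
def euler(k, dt, steps, y0=1.0):
    y = y0
    for _ in range(steps):
        y = y + dt * (-k * y)   # explicit Euler update
    return y

k = 1.0
print(euler(k, 0.1, 100))   # small step: decays toward 0, as it should
print(euler(k, 2.5, 100))   # dt > 2/k: the "solution" blows up
```

The blow-up is purely an artifact of the numerics, which is the commenter’s point: an unstable scheme produces output that reflects the processing, not the model.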

kim
September 14, 2015 10:06 pm

The models drone on, they’ve got a following wind.

David Cage
September 14, 2015 11:59 pm

I have only ever seen one accurate forecast of climate. That was done using the assumption that there is no change in climate FROM ITS NORMAL COMPLEX PATTERN. This pattern was determined using some very restricted data analysis software designed to look for hidden information in signals. It managed to reproduce the five-year rolling average and was done for 64 zones before being averaged. It made correct predictions for any six-hundred-year period with any of the randomly selected start points.
Averaging the whole earth first appeared to lose too much information about the patterns to produce any useful information.

William Astley
September 15, 2015 2:29 am

CGC will end the CAGW paradigm and stop the insanity. The long-range climate forecasts are completely incorrect, as climatology is chock full of zombie theories.
The past is a guide to what will happen in the future. What is missing in paleo climatology is even the most basic understanding as to what is the physical cause of the glacial/interglacial cycle, what causes cyclic abrupt warming and cooling, and what caused the warming in the last 150 years.
There is cyclic warming and cooling (sometimes abrupt warming and cooling) in the paleo record. Interglacial periods end abruptly not gradually.
The Younger Dryas abrupt cooling period (11,900 years ago), when the planet went from interglacial warm to glacial cold with 70% of the cooling occurring in less than a decade and the cooling lasting 1200 years, came at a time when summer solar insolation at 65N was at a maximum. Solar insolation at 65N does not drive the glacial/interglacial cycle, and atmospheric CO2 levels have nothing to do with the glacial/interglacial cycle or the warming in the last 150 years.
There is a massive forcing function that causes the changes in the paleo record. The planet’s climate does not ‘jump’ or ‘tip’ from one state to another; rocks do not occasionally jump uphill. Each and every time there was an abrupt slowdown in the solar cycle, the planet cooled.
The North American warm blob is starting to disappear as wind speeds pick up and Pacific Ocean cloud cover increases. The same region of the Pacific Ocean will cool, as has already occurred in the Atlantic Ocean.
Greenland ice temperature for the last 11,000 years, determined from ice core analysis (Richard Alley’s paper). William: the Greenland ice data shows there have been 9 warming and cooling periods in the last 11,000 years. There was abrupt cooling 11,900 years ago (the Younger Dryas, when the planet went from interglacial warm to glacial cold with 75% of the cooling occurring in less than a decade) and abrupt cooling 8,200 years ago during the 8200 BP climate ‘event’.
http://www.climate4you.com/images/GISP2%20TemperatureSince10700%20BP%20with%20CO2%20from%20EPICA%20DomeC.gif
http://www.hidropolitikakademi.org/wp-content/uploads/2014/07/4.gif
http://www.climate4you.com/images/VostokTemp0-420000%20BP.gif
http://wattsupwiththat.com/2012/09/05/is-the-current-global-warming-a-natural-cycle/

“Does the current global warming signal reflect a natural cycle”
…We found 342 natural warming events (NWEs) corresponding to this definition, distributed over the past 250,000 years… The 342 NWEs contained in the Vostok ice core record are divided into low-rate warming events (LRWEs; < 0.74 °C/century) and high-rate warming events (HRWEs; ≥ 0.74 °C/century) (Figure)… The current global warming signal is therefore the slowest and among the smallest in comparison with all HRWEs in the Vostok record, although the current warming signal could in the coming decades yet reach the level of past HRWEs for some parameters. The figure shows the most recent 16 HRWEs in the Vostok ice core data during the Holocene, interspersed with a number of LRWEs… We were delighted to see the paper published in Nature magazine online (August 22, 2012 issue) reporting past climate warming events in the Antarctic similar in amplitude and warming rate to the present global warming signal. The paper, entitled “Recent Antarctic Peninsula warming relative to Holocene climate and ice-shelf history” and authored by Robert Mulvaney and colleagues of the British Antarctic Survey (Nature, 2012, doi:10.1038/nature11391), reports two recent natural warming cycles, one around 1500 AD and another around 400 AD, measured from isotope (deuterium) concentrations in ice cores bored adjacent to recent breaks in the ice shelf in northeast Antarctica…
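The classification rule in the quoted study is simple enough to express directly. A minimal sketch: only the 0.74 °C/century threshold comes from the quote; the sample rates below are hypothetical, for illustration:

```python
# The quoted paper splits warming events at 0.74 degrees C per century
# into low-rate (LRWE) and high-rate (HRWE) warming events.
THRESHOLD = 0.74  # deg C per century, from the quoted study

def classify(rate_c_per_century):
    """Label a warming event by its rate, per the quoted definition."""
    return "HRWE" if rate_c_per_century >= THRESHOLD else "LRWE"

sample_rates = [0.30, 0.74, 1.20]  # hypothetical event rates
print([classify(r) for r in sample_rates])  # ['LRWE', 'HRWE', 'HRWE']
```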

ulriclyons
September 15, 2015 3:00 am

“Global and or regional forecasts are often equally incorrect. If there were a climate model that made even 60 percent accurate forecasts, everybody would use it.”
My solar based NAO/AO long range forecasts have been achieving over 90% in recent years. Most people ignore it, and some attack it viciously, like Willis E did in 2013. The only sincere interest that I am getting is from farmers.

mwh
September 15, 2015 4:14 am

I watched a documentary not so long ago on how the BBC (it could have been the Met Office) compiled its forecasts – I wish I had been concentrating, then I might have found the link. From it, it would appear that every forecast is derived from multiple possible scenarios fed into their supercomputer and allowed to run. They then remove the extreme outliers and assess the most likely outcome. Seeing as this computer cost many millions of pounds and is still operated on a GIGO basis (the scenarios), is it any better or less costly than, say, 25 qualified and unbiased meteorologists predicting the weather and taking the mean, or even 25 seasoned farmers and fishermen? That would probably be just as accurate, since the Met Office has preconceived ideas about which way the climate is going, and therefore their scenario inputs are very unlikely to be neutral; a predisposition to CO2 forcing and CAGW will definitely influence any human interface.
If forecasting past 5 days into the future is around 50% accurate, then surely pure educated guesswork will not be a lot less accurate!
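The ensemble procedure described above – run many scenarios, discard the extreme outliers, average the rest – amounts to a trimmed mean. A minimal sketch, with a fixed trimming fraction that is my assumption, not the Met Office’s, and invented forecaster numbers:

```python
# Trimmed mean: drop the top and bottom `trim` fraction of forecasts,
# then average what remains -- one way to blunt wild outliers.
def trimmed_mean(forecasts, trim=0.1):
    """Mean after dropping the top and bottom `trim` fraction."""
    s = sorted(forecasts)
    k = int(len(s) * trim)
    kept = s[k:len(s) - k] if k else s
    return sum(kept) / len(kept)

# 25 hypothetical forecasters' rainfall guesses (mm), two wild outliers
guesses = [12, 14, 15, 15, 16, 16, 17, 17, 17, 18, 18, 18, 18,
           19, 19, 19, 20, 20, 21, 21, 22, 23, 24, 80, 2]
print(round(trimmed_mean(guesses, trim=0.1), 1))
```

Whether the inputs come from a supercomputer’s scenarios or from 25 farmers, the averaging step itself is this cheap; the expensive part is generating forecasts worth averaging.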

September 15, 2015 7:27 am

The science is settled.
No need for climate scientists.
No need for climate models.
Fire all the scientists.
Delete all the model software and sell the computers.
Use the money to buy gondolas for NYC so everyone can get to work on Wall Street.

Darkinbad the Brighdayler
September 15, 2015 7:40 am

Modelling is not Mirroring!
Part of the purpose of a model is to express a thread of current knowledge that can be used as a benchmark to measure future incoming data against.
There can be numbers of concurrent models that explore different features or threads of incoming data, some of which eventually get ruled out by an increasing gulf between the observed data and the model’s predictive data.
From a scientific perspective, it is as important to know what isn’t the case as what is. A discounted model isn’t necessarily therefore wasted time or money, it all adds to the overall picture.
Winnowing the signal from the noise takes a lot of time, effort and data.
Perhaps the problem lies in the expectations of the observers on the sidelines more than in the process itself?
The answer to Life, The Universe and Everything is 42!

chapprg1
Reply to  Darkinbad the Brighdayler
September 20, 2015 5:34 pm

A model, computer or otherwise, is not likely to add to understanding if it is not responsive to real-world measurements, since it can go off in a direction presupposed by its programmers and continue to bias thinking in a direction deviating from reality. It can allow one to answer ‘what if’ questions in a quantitative way, which may or may not reflect reality unless constrained by accurate and statistically valid data.

terrence
September 15, 2015 9:54 am

“Nearly every single climate model prediction, projection or whatever else they want to call them has been wrong.” I think they are more appropriately called “prophecies”, a la Harold Camping. But, “When the Judgment Day he foresaw did not materialize, the preacher revised his prophecy, saying he had been off by five months.”
The big difference between Camping and the AGW money grubbers is that “His independent Christian media empire spent millions of dollars – some of it from donations made by followers who quit their jobs and sold all their possessions – to spread the word.” So he used his and his followers’ MONEY; the AGW money grubbers use OUR money (TAXPAYER money).