From NOAA News, Susan Solomon predicts the future with certainty. In other news: on the same day that Caterpillar, Sprint, Texas Instruments, and Home Depot announce massive layoff plans totaling 50,000 people, unemployed climate modelers get a government bailout courtesy of our new president to the tune of 140 million dollars. That should be just enough to pay the electric power bill for the new supercomputer I’m sure NOAA will just “have to have” now to keep up with the new toy for the Brits at Hadley. (h/t to Ed Scott for the NOAA PR)
New Study Shows Climate Change Largely Irreversible
January 26, 2009
A new scientific study led by the National Oceanic and Atmospheric Administration reaches a powerful conclusion about the climate change caused by future increases of carbon dioxide: to a large extent, there’s no going back.
The pioneering study, led by NOAA senior scientist Susan Solomon, shows how changes in surface temperature, rainfall, and sea level are largely irreversible for more than 1,000 years after carbon dioxide (CO2) emissions are completely stopped. The findings appear during the week of January 26 in the Proceedings of the National Academy of Sciences.
“Our study convinced us that current choices regarding carbon dioxide emissions will have legacies that will irreversibly change the planet,” said Solomon, who is based at NOAA’s Earth System Research Laboratory in Boulder, Colo.
“It has long been known that some of the carbon dioxide emitted by human activities stays in the atmosphere for thousands of years,” Solomon said. “But the new study advances the understanding of how this affects the climate system.”
The study examines the consequences of allowing CO2 to build up to several different peak levels beyond present-day concentrations of 385 parts per million and then completely halting the emissions after the peak. The authors found that the scientific evidence is strong enough to quantify some irreversible climate impacts, including rainfall changes in certain key regions, and global sea level rise.
If CO2 is allowed to peak at 450-600 parts per million, the results would include persistent decreases in dry-season rainfall that are comparable to the 1930s North American Dust Bowl in zones including southern Europe, northern Africa, southwestern North America, southern Africa and western Australia.
The study notes that decreases in rainfall that last not just for a few decades but over centuries are expected to have a range of impacts that differ by region. Such regional impacts include decreasing human water supplies, increased fire frequency, ecosystem change and expanded deserts. Dry-season wheat and maize agriculture in regions of rain-fed farming, such as Africa, would also be affected.
Climate impacts were less severe at lower peak levels. But at all levels, added carbon dioxide and its climate effects linger because of the ocean.
“In the long run, both carbon dioxide loss and heat transfer depend on the same physics of deep-ocean mixing. The two work against each other to keep temperatures almost constant for more than a thousand years, and that makes carbon dioxide unique among the major climate gases,” said Solomon.
The scientists emphasize that increases in CO2 that occur in this century “lock in” sea level rise that would slowly follow in the next 1,000 years. Considering just the expansion of warming ocean waters—without melting glaciers and polar ice sheets—the authors find that the irreversible global average sea level rise by the year 3000 would be at least 1.3–3.2 feet (0.4–1.0 meter) if CO2 peaks at 600 parts per million, and double that amount if CO2 peaks at 1,000 parts per million.
“Additional contributions to sea level rise from the melting of glaciers and polar ice sheets are too uncertain to quantify in the same way,” said Solomon. “They could be even larger but we just don’t have the same level of knowledge about those terms. We presented the minimum sea level rise that we can expect from well-understood physics, and we were surprised that it was so large.”
Rising sea levels would cause “…irreversible commitments to future changes in the geography of the Earth, since many coastal and island features would ultimately become submerged,” the authors write.
Geoengineering to remove carbon dioxide from the atmosphere was not considered in the study. “Ideas about taking the carbon dioxide away after the world puts it in have been proposed, but right now those are very speculative,” said Solomon.
The authors relied on measurements as well as many different models to support the understanding of their results. They focused on drying of particular regions and on thermal expansion of the ocean because observations suggest that humans are contributing to changes that have already been measured.
Besides Solomon, the study’s authors are Gian-Kasper Plattner and Reto Knutti of ETH Zurich, Switzerland, and Pierre Friedlingstein of Institut Pierre Simon Laplace, Gif-Sur-Yvette, France.
NOAA understands and predicts changes in the Earth’s environment, from the depths of the ocean to the surface of the sun, and conserves and manages our coastal and marine resources.
Still waiting for those facts and figures. Several of you seem to be confused about the difference between the time constant for a return to equilibrium levels after an increase in atmospheric CO2 and the expected lifetime of a given CO2 molecule in the atmosphere. George Smith seems to think that the time constant can be calculated by examining the seasonal variability of the Mauna Loa CO2 measurements, but I’m not quite sure how that would apply unless the seasonal variability (due to photosynthesis, or to uptake changes from seasonal ocean temperature changes?) ends.
BTW, a link to an actual peer reviewed paper would be helpful.
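To make that distinction concrete, here is a minimal one-box sketch in Python. All the numbers are round illustrative values I have picked for the example, none of them from the Solomon paper: the residence time of an individual molecule is set by the huge gross exchange flux with the ocean and biosphere, while an excess pulse of CO2 decays on the much longer time scale set by the small net uptake.

```python
# Toy one-box model: residence time of a molecule vs. adjustment
# time of an excess pulse. All numbers are round illustrative
# values, not measurements or results from the paper.

atm_carbon = 800.0        # atmospheric carbon stock, GtC (round number)
gross_exchange = 200.0    # GtC/yr swapped both ways with ocean + biosphere
net_uptake_frac = 0.01    # assumed fraction of an excess pulse removed per year

# How long a given molecule stays before being swapped out:
residence_time = atm_carbon / gross_exchange     # ~4 years

# How long an excess concentration takes to decay (e-folding):
adjustment_time = 1.0 / net_uptake_frac          # ~100 years

pulse = 100.0   # GtC added above equilibrium
for year in range(0, 301, 50):
    remaining = pulse * (1.0 - net_uptake_frac) ** year
    print(f"year {year:3d}: excess = {remaining:6.1f} GtC")

print(f"molecule residence ~{residence_time:.0f} yr; "
      f"pulse adjustment ~{adjustment_time:.0f} yr")
```

The seasonal wiggle at Mauna Loa tells you about the gross exchange (the fast swapping both ways), which is why it says little by itself about how long an added excess lingers.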
Definitely; funding for projects addressing long-term and more devastating effects should be prioritized.
See above; it wasn’t that he owed much tax. They used “new Math” to calculate the penalties and interest.
Redemption
Just recently, two NASA scientists showed that sea surface water temperature was the primary forcing that drives climate and that CO2 has little or no effect on climate.
Don’t have the link but you can find it over at ICECAP.
Yes, indeed they relied on some measurements as well as models. Isn’t that contradictory?
Redemption
Just because it is ‘freakin hot’ today where you live doesn’t mean it hasn’t been even more freakin hot in the past, does it?
Even our instrumental temperature records show we have been this way before. Perhaps you believe Michael Mann, who thought ‘the MWP was an outdated concept’, when the reality is that it would have been much freakin hotter than today; and of course the Roman warm period was even hotter, which even then wasn’t as warm as the two Holocene maximums.
Are you aware that following the last ice age, temperatures jumped from -10 C to +10 C in fifty years, so we can’t even claim our climate is warming quicker than ever before?
Perhaps you might learn some manners and contribute to this blog in a more reasoned manner, and realise we are in a paranoid age when terrorism, financial problems, and so-called climate change are causing many less informed people to succumb to a condition which H. L. Mencken observed as follows:
“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”
tonyB
So some people are complaining about bad science and at the same time complaining that money is being spent to make better science. There’s no pleasing some people.
Don’t listen to the Luddites.
Joe, sure, we’ll give you a peer reviewed estimate of time to revert to “normal” CO2 levels just as soon as there is some actual accurate long-run data of reduced total CO2 emissions to calibrate our models against.
What do you mean, there isn’t any such data?
Why can’t I find a reference to this article on the PNAS web site?
ATTN: Redemption
RE : Global Warming: A Closer Look at the Numbers
GO : http://www.geocraft.com/WVFossils/greenhouse_data.html
NB : Monte Hieb is a mine safety engineer, and is knowledgeable about the chemistry
and physics of gases. His calculations assume an absolute humidity of 1% and
a temperature of 15 C.
I’m curious: how did you come to be able to review the code for GISSTemp? Is it publicly available?
I’m also curious: how do we test code when we don’t know what the correct answer is? I use test cases where the results are known, given specific inputs. Don’t these models just make assumptions about how parts of the climate system interact?
I’ve been told these climate models are extremely complex. From reviewing the code, what’s your take on this? Can you give us an idea of how many lines of code are in GISSTemp? What programming language is used and what system does it run on?
Thank you
Harold Pierce Jr says:
Actually, their paper doesn’t really address what effect CO2 has, because they assumed that the sea surface temperatures had risen as measured… but this raises the question of why the sea surface temperatures rose. In other words, they essentially found that if you force the temperatures to be higher over the 70 percent of the globe covered by oceans, then you will also find the temperatures to be higher over land.
Oh, and by the way, this study was all done with climate models… You know, those things you don’t trust anyway.
tonyb says:
Do you have a cite for those numbers? Even the ice age to interglacial transition only resulted in a global temperature change of ~4-7 C. (The changes were almost double that in the polar regions.) These changes generally took place over thousands of years, at an average rate of maybe 0.1 C per century for warming (generally slower for cooling).
It is true that there were some much more rapid changes in climate recorded in the ice cores although these were likely not global changes (or were even see-saw changes where some places warmed and others cooled) and I don’t think you have the magnitude right.
Would someone from this site please contact me? I can’t find a contact e-mail.
Thank you!
Reply: info [ at ] surfacestations dot org
God, I love these kinds of stories. So, as we know from other such stories, we have only 10 years to avoid this fate… and now we know that if we don’t, it will last for 1,000 years.
Well, well. The 1,000 years bit certainly gives me an idea. I have a modest proposal for a new Tag here at WUWT for such stories. My suggestion is they all be Tagged as “Reichstag Fire” to commemorate another famous morally bankrupt attempt to herd humanity into a destructive 1,000 year cultural revolution.
Joel Shore
Try these references; one refers to a BBC programme you can listen to and the other is a bibliography. The programme was fascinating, as it deals with Doggerland, the area between Britain and Europe flooded after the ice age.
http://www.biab.ac.uk/A4volume9-2005.pdf
http://www.bbc.co.uk/programmes/b00gw18s
TonyB
ATTN: Joel Shore
What caused the sea surface water temperature to increase was probably the shift of the PDO into a warm phase in 1977, which Don Easterbrook has called the “Great Climate Shift”.
You’re right. Current climate models are all fatally flawed. But climate models built upon a distributed heating system with water as the working fluid are probably more realistic and don’t require the participation of GHGs, including water vapor as a GHG. Water vapor is steam coming from warm water, and it transfers heat from the warm water into the continents when wind blows air over the surface of the water from which it originates.
It seems that atmospheric CO2 was slightly higher in the past: click
Anthony
You missed the $400 million for Earth sciences in the NASA budget from the Stimulus package.
tonyb (00:26:47) : causing many less informed people to succumb to a condition which H. L. Mencken observed as follows:
“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”
Wow. Maybe I need to read some Mencken…. I’d heard the name before but just ‘wrote it off’ as ‘trendy whatever’. This leads me to believe the guy ‘Has Clue’….
Thanks! You have pointed me at a path for new understanding…
dennis ward (01:29:41) :
So some people are complaining about bad science and at the same time complaining that money is being spent to make better science. There’s no pleasing some people.
Some of us are complaining about handing huge buckets of money to the same folks who gave us the bad science in the first place. Giving more money to folks without clue does not give them clue…
Want better science? Give me the money! (only 1/2 😉 unfortunately. I’d rather stay retired, but the present crop of, er, “stuff” being produced really needs some adult supervision and I know of at least one person who could provide it…)
John Galt (06:32:03) :
ragingcaveman (15:20:20) :
It was actually me, E.M. Smith, who wrote most of what you quoted…
I’m curious: how did you come to be able to review the code for GISSTemp? Is it publicly available?
I came to review it by my own choice. (One of ‘The Team’ that regularly supports AGW here was silly enough to post a pointer to the place where I could download it.) From my docs of it:
To download the GISSTEMP source code go to:
http://data.giss.nasa.gov/gistemp/sources/
and click on the download link.
Unpack the archive, and read gistemp.txt
I’m also curious: how do we test code when we don’t know what the correct answer is? I use test cases where the results are known, given specific inputs. Don’t these models just make assumptions about how parts of the climate system interact?
As near as I can tell, no testing as you are familiar with was ever done.
And yes, they make gross assumptions about ‘how parts of the climate system interact’ that are clearly false, then they run with them. No questions asked…
In particular there are two ‘brokennesses’ that I’ve found.
1) Temperatures from the GHCN (no TOB or equipment correction) and USHCN (with TOB and equipment correction) data sets are compared, and a difference is calculated by subtracting GHCN from USHCN. This is limited to at most 10 years of data. The idea is to make a ‘delta’ that is used to ‘correct’ USHCN, removing its Time of Observation Bias and equipment-change corrections to better match GHCN. Nice in theory, wrong in practice.
Why? If, say, Reno had a thermometer change 8 years ago that resulted in an increased temp reading, what sense does it make to subtract this from the temperatures recorded in 1910 or 1940 or 1970 or… Perhaps this is why GISS data often re-write past temperatures lower…
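To show the mechanics, here is a schematic re-creation of that delta step in Python. This is my own toy version, not the actual FORTRAN, and every station value in it is invented:

```python
# Schematic of the GHCN/USHCN 'delta' step described above.
# Toy annual means in deg C, all invented; the real code is
# FORTRAN and works on the full monthly station records.

# Up to 10 years of recent overlap between the two data sets:
ushcn_recent = [15.3, 15.4, 15.2, 15.5, 15.6, 15.4, 15.3, 15.5, 15.6, 15.7]
ghcn_recent  = [15.0, 15.1, 14.9, 15.2, 15.3, 15.1, 15.0, 15.2, 15.3, 15.4]

# One constant offset computed from that recent overlap only:
delta = sum(u - g for u, g in zip(ushcn_recent, ghcn_recent)) / len(ushcn_recent)

# ...then applied to the ENTIRE record, including years long
# before the equipment change that produced the offset:
ushcn_1910 = 14.8
adjusted_1910 = ushcn_1910 - delta

print(f"delta = {delta:.2f} C; 1910 reading rewritten from "
      f"{ushcn_1910} C to {adjusted_1910:.2f} C")
```

A constant derived from the last decade reaches back and moves 1910, which is exactly the Reno example above.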
2) The “Reference Station Method” is used to ‘adjust’ for the urban heat island effect in cities. Unfortunately, they look at ‘rural’ stations up to 1000 km away and use those to change the data for cities. Just what do Lodi and Reno tell you about San Francisco?
This matters because most population is on the coasts. There is not much ‘rural’ out to sea, so the ‘corrections’ come from inland. OK, where’s the problem? Inland temps have a steeper slope. Yes, the ‘anomalies’ correlate, but non-linearly. The code uses a linear adjustment… (either a straight line or a line with a single ‘knee’ in it, but never a proportional percentage or other slope-adjusted correlation…)
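For anyone who wants to see the shape of the complaint, here is a toy version of a linear urban adjustment in Python. The data are invented and this shows only the mechanics; the real method distance-weights ‘rural’ stations within 1000 km and allows at most one ‘knee’ in the fitted line:

```python
# Toy 'reference station' style urban adjustment, invented data.
# Fit ONE straight line to the urban-minus-rural difference and
# subtract it; a straight line (or one with a single knee) is
# the only shape the adjustment ever allows.
import numpy as np

years = np.arange(1900, 2001)
rng = np.random.default_rng(0)

# Invented series: a noisy rural composite, and an urban record
# carrying an extra linear trend on top of it.
rural = 0.005 * (years - 1900) + rng.normal(0.0, 0.1, years.size)
urban = rural + 0.010 * (years - 1900)

diff = urban - rural
slope, intercept = np.polyfit(years, diff, 1)      # the single straight line
urban_adjusted = urban - (slope * years + intercept)

before = np.polyfit(years, urban, 1)[0] * 100
after = np.polyfit(years, urban_adjusted, 1)[0] * 100
print(f"urban trend: {before:.2f} C/century before, {after:.2f} after")
```

Anything in the difference that is not a straight line (the non-linear part of the inland versus coastal relationship) simply stays in the ‘adjusted’ city record.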
I’ve been told these climate models are extremely complex. From reviewing the code, what’s your take on this? Can you give us an idea of how many lines of code are in GISSTemp? What programming language is used and what system does it run on?
GISStemp is not a climate model. It is a ‘gather temperature data from many sources and change it’ process. Unfortunately, that is an exact statement of function and not a political comment.
By the time GISS temperature data reaches the models it is already fatally flawed and not in touch with reality. The models don’t have a chance…
The code is fairly trivial. Most of it is minor format changes (change ‘data missing flag’ from “-” to “-9999”) and file concatenations / deletions. There are about 6000 lines of code in GISStemp, of which I would estimate about 1000 are truly ‘functional’. It consists of 6 coded “steps” in 5 directories, plus a couple that are not coded (manual data download, for example). These are numbered STEP0, STEP1, STEP2, STEP3, and STEP4_5 (plus the un-numbered steps of manual data download and subroutine compilation / installation…)
The code in STEP1 is Python (with two major function libraries in “C” that Python calls). All the other STEPs are FORTRAN.
It ought to run on any system with an f77 or similar compiler, Python, and “C”. Unix or Linux ought to be your best bet. So far I’ve seen nothing in the code that is tied to a particular architecture. I have seen a lot of ‘crufty practices’, such as writing scratch files in the same place where the source code is ‘archived’ and treating FORTRAN like an interpreted language (compile in line in scripts, run the binary, delete the binary; an example of why so many lines are ‘non-functional’, IMHO).
(Apologies to anyone not a programmer. “Cruft” is clearly understood by programmers to mean “crud, not good, junk accumulated over time and never swept up, junky style” as an approximation; but seems to be a word that is not known to standard English. I’ve used it for about 40 years professionally and to this day don’t know where I learned it… Isn’t jargon fun?)
I have also downloaded the ModelE simulation code but have not looked at it… yet.
I have posted pointers to the data and source code downloads in the comments section of the ‘resources’ tab on this site.
Hope this is helpful to you…
E.M. Smith (21:40:00)
Excellent post. One of my particular bugbears is ‘global temperatures to 1850’.
I have a great problem with the notion of global temperatures anyway, as they are so wildly inconsistent in methodology even before they are adjusted. However, the data back to 1850 become even murkier, with so few stations and frequent changes of location, equipment, and operator. The operator was usually untrained and was not averse to making up temperatures he might have missed (we call it interpolation these days). The location of the thermometers (often uncalibrated) was sometimes bizarre, chosen more for the convenience of the operator than for the science. This practice seems to go on to the present day, judging by surfacestations.org.
I wondered if you had looked at this subject, and what your opinion on ‘1850’ was.
It is absolutely crucial to the science and the basic proposition of AGW to be able to point to temperatures from 1850 and to state, to fractions of a degree, that we are the cause and that it’s unprecedented. (Let’s not get into the MWP or Roman warm periods et al. for the moment.)
TonyB
TonyB (12:04:54) :
Excellent post. One of my particular bugbears is ‘global temperatures to 1850’.
Thanks!
I see a couple of problems with 1850. Since missing gaps are ‘made up’, any tendency for older data to have more gaps results in more ‘made up’ past. Also, since all past temps are re-written based on a recent anomaly, a TOB bias change in 2008 re-writes temperatures recorded in 1851. This resulted in a 1.75 C movement for the not-too-distant past of Pisa. Who knows what it would do to temperatures 100 years earlier!
I fail to see how either of these is an improvement in accuracy…
Since the NOAA directions I posted some threads ago still advise that missing highs / lows can be ‘estimated’ (i.e. made up), I don’t see that the 1850 data are any more made up than the recent ones!
That’s the real data I’m talking about, not the created, er, adjusted, no, make that interpolated, I mean fabricated, oh, just call it the GISS ‘data’…
“” DaveE (16:46:37) :
George E. Smith (09:27:02) :
Climate modeling is old hat; so now they are going to make up the data as well.
LMFAO too true George, but I’m afraid to say they’ve been doing that already!
On top of that, I have a question.
Could our current CO2 rise just be the MWP coming to visit?
DaveE. “”
Well Dave, that IS what one would deduce from Al Gore’s ice core data that he waved around in his sci-fi movie. 800 years ago was 1200 AD; seems like the middle of the MWP to me.
I learned some new Physics today, from a climate scientist who does active research in “remote sensing”. He may be in the Northern California area, so I’m guessing he might work at RSS, in Santa Rosa.
Anyhow, here is the jewel that he threw at me on another web forum, where the subject is anything but climate; but if I dare make mention of some “climate issue” that might relate to the subject (fishing), all manner of real environmental climate scientists emerge from the woodwork and immediately slam me for being anti-environment and uneducated to boot. One even complained that he felt insulted because someone else suggested a “follow the money” scenario.
I was actually commenting on the Eric Steig / Michael Mann et al. paper on Antarctic warming, mainly to point out that it was a storm in a teacup, because it would take 5,000 years for the place to get up to the melting point at the observed rate of warming; and that the satellites only measure the surface temperature, and since ice and snow are good thermal insulators, the warmed layer wouldn’t be very much total ice mass compared to the 6,000-10,000 feet of ice that is there.
So here is the new Physical Dogma:-
Ice by definition at atmospheric pressure is at zero C/32F; no matter what.
So all those gigatons of ice covering Antarctica are at zero deg C, no matter what; even if the air immediately above it, where the Owl box is, is at -90 deg C, the ice itself remains at zero C, by definition, he says.
Now I only have a Bachelor’s degree, in Physics, and maths, and radiophysics, and mathematical physics; so my education got cut off before I got to learn that ice is at zero deg C by definition at one atmosphere pressure.
No wonder I have been figuring everything wrong all these years.
So there you have it; scoop of the day.
George