Fossils cast doubt on climate-change model projections of habitats

From the University of Oregon

Mammals didn’t play by the rules of modeling on where they migrated to survive the last ice age, says UO researcher

EUGENE, Ore. — Nov. 18, 2014 — Leave it to long-dead short-tailed shrews and flying squirrels to outfox climate modelers trying to predict future habitats.

Short-tailed shrew

Evidence from the fossil record shows that the gluttonous insect-eating shrew didn’t live where a species-distribution model drawn up by biologists put it 20,000 years ago to survive the reach of glaciers, says University of Oregon geologist Edward B. Davis. The shrew is not alone.

According to a new study by Davis and colleagues, fossil records of five ancient mammalian species that survived North America’s last glacial period point to weaknesses in the use of ecological niche models and hindcasting to predict future animal and plant habitats. As a result, Davis says, the modeling needs to be fine-tuned for complexities that might be harvested from fossils.

Ecological niche models use modern habitat distributions and climate; hindcasting adds predictive power by folding major past climate shifts into the models. That modeling combination — as seen in a 2007 study led by Eric Waltari, then of the American Museum of Natural History in New York — had the short-tailed shrew surviving the last ice age mostly in Texas and the Deep South. Conclusions drawn in other studies, Davis noted in the new study, are also biased toward southern locations for the ice-age-surviving mammals of the Pleistocene Epoch.
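For readers unfamiliar with the approach, the niche-plus-hindcast workflow can be caricatured in a few lines of code. This is a deliberately toy sketch, not the method used in the Waltari or Davis studies (real work uses dedicated tools such as Maxent and many climate variables); every site value and fossil record below is invented purely for illustration.

```python
# Toy sketch of the niche-model-plus-hindcast workflow described above.
# All numbers are hypothetical.

# Step 1: "fit" a climate envelope from modern occurrences -- here just
# the min/max mean annual temperature (deg C) at occupied sites.
modern_sites = [(10.0, True), (12.0, True), (15.0, True),
                (22.0, False), (4.0, False)]
occupied = [t for t, present in modern_sites if present]
t_min, t_max = min(occupied), max(occupied)

def predicted_present(temp):
    """Niche model: species predicted wherever climate fits the envelope."""
    return t_min <= temp <= t_max

# Step 2: hindcast -- apply the same envelope to ice-age climate at the
# same sites (hypothetically ~6 deg C cooler).
ice_age_temps = [t - 6.0 for t, _ in modern_sites]
hindcast = [predicted_present(t) for t in ice_age_temps]

# Step 3: validate against the fossil record. Davis's point is that for
# the shrew the fossils fell largely outside the hindcast range, so a
# comparison like this one scores poorly.
fossil_presence = [True, True, False, False, False]  # hypothetical
agreement = sum(h == f for h, f in zip(hindcast, fossil_presence)) / 5
print(f"model/fossil agreement: {agreement:.0%}")
```

The fossil comparison in step 3 is exactly the test the Davis paper argues should be run routinely before trusting a forward projection.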

Short-tailed shrews, according to fossil records, did not live in the predicted ranges. Instead they lived across the north-central and northeastern United States, closer to the glaciers and where they are widely found today.

“It’s almost as though it is living in all of the places that the model says it shouldn’t be living in and not in any of the places that the model says it should be living in,” said Davis, who also is manager of the paleontological collection at the UO Museum of Natural and Cultural History. “This suggests to me that whatever the model is keying on is not actually important to the shrew.”

Nor to the American marten, two species of flying squirrels and the Gapper’s red-backed vole, all of which lived mostly outside of predicted ranges, according to the fossil record. Northern and southern flying squirrels, the Davis study found, shared a compressed geographic region. It may be, Davis said, that some species tolerate competition under harsh conditions but separate when abundant resources are available.

Davis noted that an important but under-cited 2010 paper on rodents by Robert Guralnick of the University of Colorado and Peter B. Pearman of the Swiss Federal Research Institute also showed problems with hindcast projections. Those for lowland rodents in the last ice age did not hold up, but those for a higher elevation species did.

“Our findings say that we need to pay more attention to the potential problems we have with some of our modern methods, and the way that we can improve our understanding of how species interact with the environment,” said Davis, who added that his study was inspired by Waltari’s. “The way to improve our forecasting is to include data from the fossil record. It can give us more information about the environments that species have lived in and could live in.”

The findings appear in the November issue of the journal Ecography. In a special section of the journal, the Davis paper is packaged with four papers on research initially presented in a symposium on conservation paleobiogeography in 2013 at a biennial meeting of the International Biogeography Society. The Davis paper is co-authored by Jenny L. McGuire, now at Georgia Tech, and former UO doctoral student John D. Orcutt, who is now at Cornell College in Iowa.

Davis and McGuire co-hosted the symposium, edited the special issue and penned an editorial that accompanies the five papers. Conservation paleobiogeography, Davis said, “is the idea that we can help people understand questions that arise from conservation needs using data from the fossil record.” Doing so, he said, may explain how species shift their ecological roles, or evolve, to survive amid abrupt changes in their habitats.

“Our paper raises questions about some of the work on projecting future ranges of mammals, and we suggest some directions forward,” Davis said. “We have concerns about the precision of the modeling techniques now being used. We don’t have any concerns about climate change happening and that it is going to cause geographic range shifts for mammals and plants. The thing I want to do, as a scientist, is to have the best models possible so that we’re making informed decisions as a society.”

###


85 Comments
phlogiston
November 19, 2014 11:32 am

Can anyone name for me a single important scientific discovery that has
(a) Benefited humanity in a non-trivial way,
(b) Unlocked meaningful advance in understanding,
(c) Altered the paradigm of a field of science (in a positive way) and
(d) Come from computer modelling / simulation?
Just curious. (Maybe this will end up looking like a “what have the Romans ever done for us?” type of question.)

Reply to  phlogiston
November 19, 2014 11:43 am

Does the H-Bomb count?
http://vanemden.wordpress.com/2013/04/02/the-h-bomb-and-birth-of-the-computer-part-iii-triumph-of-brains-and-slide-rules/
Von Neumann’s brain had sufficed for calculations needed for the A-Bomb, but electronic computers were invented to “model” the Super, or thermonuclear device.
It’s debatable if invention of Monte Carlo simulation (by von Neumann & Ulam) counts as a “model”, but that statistical technique was another spinoff from work on the Super.

george e. smith
Reply to  sturgishooper
November 19, 2014 12:00 pm

Well I wouldn’t exactly call Monte Carlo an “invention.”
All you are doing is taking some model of some system, inserting valid values for all of the variables, computing the result, doing that for a large number of cases, and then doing statistics on the results. I do what is tantamount to MC all day long, simulating optical systems with models. They could be imaging (lenses) or non-imaging (as for illumination), and in the latter case I might trace 100 million rays through a complex system of objects, which will produce showers of daughter rays, and then I plot where they go on suitable maps.
We used to do the same sort of job, particularly in imaging optics, by simply tracing (laboriously by hand with log tables) maybe three rays in two different colors, and then fine tuning the design from those three rays.
The trouble with tracing 10^8 rays, is that if you don’t get the result you want, the computer can’t tell you what the hell is wrong with your system, nor how to fix it. So in the end, you actually have to know optics.
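The procedure described in the comment above, drawing plausible inputs, running the model, repeating, and then doing statistics on the pile of results, fits in a few lines. A minimal sketch using the textbook illustration of estimating pi (none of this comes from the comment itself; the sample count is arbitrary):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Monte Carlo in the sense described above: run a trivial "model" many
# times on randomly drawn inputs, then do statistics on the results.
# The model asks whether a random point in the unit square falls inside
# the quarter circle; the hit fraction estimates pi/4.
N = 100_000
hits = sum(1 for _ in range(N)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)
pi_estimate = 4 * hits / N
print(f"pi estimate after {N} samples: {pi_estimate:.3f}")
```

The statistical error shrinks only as the square root of the sample count, which is why practical MC runs, like the 10^8-ray traces mentioned above, need so many trials.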

Reply to  sturgishooper
November 19, 2014 12:05 pm

Von Neumann and Ulam invented the process. What do you call their being the first to think up and use that statistical process? IMO, invention is the best word, but I’m open to suggestions for terms that better describe their achievement.
Ulam came up with the idea while playing solitaire during convalescence after severe illness, as the link describes.

george e. smith
Reply to  sturgishooper
November 19, 2014 12:16 pm

So MC is simply statistical examination of the results of a large number of “experiments” which could be actual physical experiments, or computer simulations of a “good enough” model.
In manufacturing industries they have done that for eons in the QA testing of the manufactured output, to determine their “yield to spec.”
So doing it in simulation (MC) is simply doing what you already have been doing, with a computer model.
In the language of the patent office, that is something that is “obvious to one of ORDINARY skill in the art.”
Therefore it is NOT patentable, so not an invention.
The use of MC required the invention of computers; that WAS an invention. Using them on a task already being done is not an invention.
Von Neumann’s role in the origin of the computer is well known.

Reply to  sturgishooper
November 19, 2014 3:10 pm

What types of inventions are not eligible for patent protection?
Some types of inventions will not qualify for a patent, no matter how interesting or important they are. For example, mathematical formulas, laws of nature, newly discovered substances that occur naturally in the world, and purely theoretical phenomena — for instance, a scientific principle like superconductivity — have long been considered unpatentable. In addition, the following categories of inventions don’t qualify for patents:
processes done entirely by human motor coordination, such as choreographed dance routines or a method for meditation
most protocols and methods used to perform surgery on humans
printed matter that has no unique physical shape or structure associated with it
unsafe new drugs
inventions useful only for illegal purposes, and
non-operable inventions, including “perpetual motion” machines (which are presumed to be non-operable because to operate they would have to violate certain bedrock scientific principles).

george e. smith
Reply to  sturgishooper
November 19, 2014 4:21 pm

You’re probably correct on that. My short term memory has been shot for a while, and I’m not great with names anyhow.
But my mention of the computer, was simply to point out that MC computations are hardly practical without a computer, so the USE of MC methods HAD to wait till the invention of the computer. Doesn’t mean the idea did.
Space travel had to wait till suitable rockets were invented. The idea of space travel was known for eons before that.
But thanks for that Turing fix there.
G

george e. smith
Reply to  sturgishooper
November 19, 2014 4:30 pm

An “invention” to be patentable, has to be novel (nobody else thought of it yet), useful (screen doors for submarines or ejection seats for helicopters, are not useful), and it has to be “non obvious to persons of ordinary skill in the art.”
So the test of obviousness is not for “experts” to whom everything (of course) is obvious.
And in the USA you typically have only one year to file for it, from the time of first disclosure, publication, or attempt to commercialize it (sell it). The new international patent law may have changed some of that. Now whoever files first gets the patent; so presumably you don’t have to invent. Just stealing someone’s idea and beating him to the patent office works these days.
Makes for great co-operation between engineers and scientists.

tty
Reply to  sturgishooper
November 20, 2014 6:12 am

“ejection seats for helicopters, are not useful”
Actually there are ejection seats for helicopters. Though you have to blow the rotor blades off first.

george e. smith
Reply to  sturgishooper
November 20, 2014 3:15 pm

Many many (too many) years ago, there was a short piece in an electronics industry magazine, in their “Ideas for Design.” column, about how you could use Monte Carlo analysis to determine manufacturing yields.
This engineer had designed a circuit and somebody had donated him an hour or so on some IBM mainframe (time shared) computer, so he thought he would do an MC analysis of his circuit.
Now his circuit was a two transistor amplifier, consisting of an emitter degenerated common emitter gain stage, followed by an emitter follower output stage.
He had done a WORST CASE DESIGN on this to ensure that this amplifier would have a voltage gain of 10.0 +/- 1.0 using 5% tolerance resistors for the design.
So his MC analysis set values for each of the resistors and transistor betas and the like, within the data sheet spreads, and 5% resistor tolerances he had used for his worst case design.
Now a worst case design always gets 100% yield to the spec, assuming live components, or else it is not a worst case design.
So he ran his Monte Carlo on his IBM freebie, and lo and behold, the computer told him that the midpoint gain was only 9.5 and not 10.0, and that moreover he could get gains that were outside the bounds of his worst case design.
The computer told him that the emitter degenerating resistor was the most critical component, the collector resistor was second, and the first transistor beta was third most critical. It then told him to reduce the emitter resistor by about 5% and re-run it.
Wow such artificial intelligence.
Unfortunately, the computer did not tell him: “Hey dude ! if you make the second stage an emitter coupled gain stage as well, and make those emitter resistors much smaller or zero, then you will get a whacking great gain of maybe 500 to 1,000, and then you can enclose the whole thing in a negative feedback loop, and set the gain to 10.0 with maybe a couple of 2% resistors, and then you can use 20% resistors for all the rest, and the transistor betas will have little or no effect.”
Well you see, computers and MC and statistics in general, can certainly tell you how lousy your circuit performs, but no computer can come up with a good design for you.
That’s why you learn circuit design in class, so that you can design circuits that are inherently good no matter what the computer thinks.
Statistics is good for telling you stuff you already know, because you have all the observed data, which is the most you can ever know about your experiment.
But statistics is good for convincing others that you know what you are doing; well what you have done, anyway.
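The yield analysis in the anecdote above can be sketched in a few lines. This is a hypothetical reconstruction, not the original program: the gain of the emitter-degenerated stage is idealised as the collector-to-emitter resistor ratio, transistor beta is ignored entirely, and the 10k/1k nominal values are my own choice to hit the gain-of-10 target with 5% parts.

```python
import random
import statistics

random.seed(1)  # reproducible draws

# Monte Carlo yield analysis of an idealised emitter-degenerated stage,
# where gain ~ Rc/Re (beta ignored). Hypothetical parts: 10k collector
# and 1k emitter resistors, both 5% tolerance, spec gain 10.0 +/- 1.0.
def draw(nominal, tol=0.05):
    """Draw one part value uniformly within its tolerance band."""
    return nominal * random.uniform(1 - tol, 1 + tol)

gains = [draw(10_000) / draw(1_000) for _ in range(10_000)]
in_spec = sum(9.0 <= g <= 11.0 for g in gains) / len(gains)
print(f"mean gain {statistics.mean(gains):.2f}, yield to spec {in_spec:.1%}")
```

In this idealised version the gain ratio can only range from about 9.05 to 11.05, so the worst case sits just inside the spec limits; the anecdote's point stands, though, that the statistics describe the design you gave the computer and nothing more.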

Duster
Reply to  phlogiston
November 19, 2014 12:02 pm

Just curious. (Maybe this will end up looking like a “what have the Romans ever done for us?” type of question.)
Yep. Just consider the object you are reading these posts on and then think how similar discussions were carried on by Thomas Jefferson, or even Albert Einstein. We tend to cease crediting science once it has been converted to commonplace engineering.

phlogiston
Reply to  Duster
November 19, 2014 3:26 pm

Not sure you understood the question. I’m 100% for science and engineering. I was talking about attempts to recreate reality in complex computer simulations, and then describing the output of such simulation as a “discovery” about the real world, or as “data”. Climate science is the foremost exponent of this delusion.
You mention computers. Was their invention brought about by, err, computer simulation? This sounds like the Terminator storyline, where artificial intelligence time travels to reinvent itself.
If anyone invented computers it was Turing, not von Neumann. There was no simulation involved, just things like logic gates and digital memory.
I can think of one answer to my question: Daisyworld by Jim Lovelock. This is a very simple conceptual model which unlocked a very important fact about the world and its biosphere. The biosphere regulates climate to its own advantage. This is a true advance in scientific understanding, not recognised by a climate research community too busy with fruitless computer gaming.

Reply to  Duster
November 20, 2014 3:35 pm

Don’t want to get into a trans-Atlantic priority match, but von Neumann actually wrote his design for an electronic computer before Turing. More importantly, because of the thermonuclear bomb project, he actually got to build one.
But Great Britain can go us one better with Charles Babbage and Ada Lovelace, although of course the Analytical Engine was mechanical rather than electric.

george e. smith
Reply to  phlogiston
November 19, 2014 12:06 pm

Is that an AND or an OR set of criteria? The AND set probably eliminates ALL candidates.

phlogiston
Reply to  george e. smith
November 19, 2014 3:00 pm

AND

Reply to  phlogiston
November 19, 2014 10:40 pm

Well, supply chain management for starters. Your chances of dropping into a major grocery store to find that they are out of canned peas for example is pretty low. Without computer models that track everything from this year’s crop, to current inventory levels by location versus consumption patterns tracked in realtime by location, and make predictions that are accurate enough for businesses to make supply chain decisions about, that just wouldn’t be possible, not to mention that the cost of the can of peas would be several times as high.
Actually, I would submit to you that you take for granted all sorts of things that are the results of computer modeling that meet your criteria. Drive a modern car? It has on board computer systems that model the physics of your car and based on the results of that model combined with realtime data input, make actual decisions ranging from turning on ABS to skid control and so on. Skyscrapers and highways are modeled first on computers, and many a bad design idea proven to be bad and so excluded from consideration. (That’s one of the nuances of modeling, it doesn’t necessarily produce a good idea as much as it facilitates the swift identification of bad ones). Same is true of everything from designing airplanes to nuclear power plants to mines. They are all major beneficiaries of computer modeling that in fact are non trivial, unlock meaningful (applicable) understanding, and altered many a paradigm.
We discuss so much of the misuse of computer models on this site that I think we sometimes lose sight of the fact that the positive aspects of modeling are all around us in the things that we take for granted. It is like asking if guns are good or bad. Guns have neither morals nor ethics; they are neither good nor bad. What they get used for, that’s a different question.

November 19, 2014 11:38 am

Hmm, we all know what happens when actual data disagrees with model output: adjust the data! In this case such an adjustment could be called the blaming of the shrew. /rimshot

thinair
November 19, 2014 11:50 am

Einstein built mathematical models (i.e., sets of equations) of the implications of his theory of General Relativity for the universe. But at least he had the intelligence not to believe their results until the empirical evidence came in, decades later. He really thought his math was wrong (or not applicable to reality) in many cases (e.g., black holes).

Dawtgtomis
Reply to  thinair
November 20, 2014 8:04 am

Einstein demonstrated intelligence tempered by wisdom, unfettered by greed or arrogance. That’s what’s MIA today.

Kon Dealer
November 19, 2014 12:00 pm

In Green Psiense, models trump reality every time.

Robert W Turner
November 19, 2014 12:40 pm

Manbearpig displaced those fossils to test our faith.

u.k.(us)
November 19, 2014 1:48 pm

Davis said. “We have concerns about the precision of the modeling techniques now being used. We don’t have any concerns about climate change happening and that it is going to cause geographic range shifts for mammals and plants. The thing I want to do, as a scientist, is to have the best models possible so that we’re making informed decisions as a society.”
======
When did society become a scientist’s concern?

Reply to  u.k.(us)
November 19, 2014 3:11 pm

Not familiar with governments, are you?

lee
November 19, 2014 5:55 pm

The shrew is found in the fossils for those areas because they were kept as pets. It is from this we get the notion of “The Taming of the Shrew”.

November 19, 2014 10:44 pm

tty – excellent and informative comments, tx.

November 19, 2014 10:54 pm

Mammals didn’t play by the rules of modeling on where they migrated to survive last ice age, says UO researcher
Given that the climate models are seriously deficient in terms of regional climate change, they may have simply modeled a climate change that didn’t occur, and that is why the flora and fauna existed where the climate models said they shouldn’t be. It strikes me as odd that they went to the trouble of comparing the existence of fossil shrews to an area but didn’t bother to seek evidence of climate change from that area in other proxies, rather than relying exclusively on climate models (or at least do some proxy evaluation to confirm the models’ output).

November 21, 2014 11:08 am

Animals do learn.
Difficult to tell even today with living specimens how much is instinct and how much learning, and there’s the possibility that rogue genetic changes get rewarded with greater reproduction as conditions change.
A fish expert used the term “adaptive behavior” in a discussion about herring, which aren’t as smart as mammals.
My favourite example is Gray whales, which normally feed only in the Bering Sea. Some years ice clogs it, which is bad for the survival of nursing females, who head north with calves having not eaten since the previous summer. The species will survive, though, as some mavericks feed off the BC coast (about 1% of the total in the eastern Pacific), and about 10% skip the commute and stay off the Oregon coast.