Scientists core into California's Clear Lake to explore past climate change

From the University of California – Berkeley

Deep sediments are an unparalleled record of biotic changes over the past 200,000+ years

Map of Clear Lake.
Clear Lake, which is about 19 miles by 8 miles at its widest, is located about 100 miles north of San Francisco. (Google Earth)

University of California, Berkeley, scientists are drilling into ancient sediments at the bottom of Northern California’s Clear Lake for clues that could help them better predict how today’s plants and animals will adapt to climate change and increasing population.

The lake sediments are among the world’s oldest, containing records of biological change stretching back as far as 500,000 years.

The core drilling is part of a unique, multifaceted effort at UC Berkeley to determine how Earth’s flora and fauna responded to past changes in climate in order to improve models that project how life on Earth will adapt to today’s environmental pressures. What the researchers learn from their look-back in time will be crucial for state or local planners clamoring for better predictive tools to guide policies crucial to saving ecosystems threatened by climate change.

“We are reconstructing the past to better forecast the future, because we need to know what’s coming in order to adequately prepare for it,” said project leader Cindy Looy, UC Berkeley assistant professor of integrative biology.

Looy and 16 other UC Berkeley faculty members – including paleontologists, pollen experts, botanists, ecologists and climate modeling experts – will examine the lake cores for pollen grains, charcoal and fresh-water organisms going back at least 130,000 years, long before humans arrived in the area. Using isotope and chemical analysis as well as carbon dating, the researchers will obtain a long series of detailed snapshots – ideally, every 10 years – of the plant and animal communities in the Clear Lake area and how the communities changed in response to “natural” global warming events. The analysis will also provide a measure of the temperature, oxygen content and nutrient levels of the lake, which reflect rainfall and water level.
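Radiocarbon dating covers only the younger part of such a record – roughly the last 50,000 years – so older layers rely on other isotopic and chemical methods. As a minimal, hypothetical sketch of the conventional radiocarbon-age calculation (the standard Libby half-life formula; the sample value below is illustrative only, not project data):

```python
import math

# Conventional radiocarbon ages use the Libby half-life of 5568 years,
# giving a mean life of 5568 / ln(2) ≈ 8033 years.
LIBBY_MEAN_LIFE = 5568 / math.log(2)

def radiocarbon_age(fraction_modern):
    """Uncalibrated radiocarbon age (years BP) from the measured 14C
    activity expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical example: a pollen-bearing layer retaining 25% of modern 14C
print(round(radiocarbon_age(0.25)))  # ~11,140 years BP, before calibration
```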

“One way to check our predictions is to go back in time to a state very similar to today, with the same plants and animals and about the same temperature. The fossilized plant and animal remains from Clear Lake will give us a baseline for what this region of California looked like under similar climatic conditions, and when it was colder or warmer. We use that information to fine-tune predictive models being developed today,” Looy said. “Rates of global warming almost as fast as what we see today last happened during the shift from the last glacial to the current interglacial roughly 12,000 years ago, so that is one time interval we will focus on.”

Focusing on two glacial-to-interglacial transitions

Looy and her team also will look at an even earlier transition from a glaciated Earth 130,000 years ago to a time 113,000 years ago when it may have been locally warmer than today. Learning what the area looked like during that time will help Northern Californians anticipate how conditions will change as global temperature continues to rise over the coming decades.

“There are indications from ice cores and ocean drilling cores that the beginning of the previous interglacial may have been warmer than it is now, which is where it becomes interesting,” said Looy. “We know what the Earth is like at today’s temperature, but a lot of people are trying to predict what will happen if the earth warms 1 or 2 degrees Celsius (2-4 degrees Fahrenheit), or even more.”

Charcoal in the lake sediments will also tell the researchers how Native Americans altered the environment through deliberate fires designed, for example, to increase acorn production by oaks.

One member of the team, Anthony Barnosky, UC Berkeley professor of integrative biology, will correlate this information with mammalian fossils collected from cave deposits in the area that have been stored for decades in the Museum of Paleontology.

“You can view the core as a time machine by which we can define a continuous record of change, both climatic and vegetational, through the past 130,000 years, and then we have all these floating snapshots of the ecosystem – the mammal communities – from cave deposits around here,” Barnosky said. “We can put names on these fossils and radiocarbon-date them and begin to build a 3-D picture of change through time from the late Pleistocene, some 130,000 years ago, through the last glacial/interglacial transition 13,000 to 11,000 years ago, all the way up to the present.”

The study will help to evaluate and refine current models that predict how plants and animals will adapt to a changing world by testing predictions of the models against what actually happened during past times of climate change. Such models are important for state and local planning agencies that must deal with future consequences of climate change, including sea level rise, water shortages and increasing fire incidence that can threaten ecosystems.

“Based on this type of research at UC Berkeley, we want to make the case that adaptation to a changing climate is an issue we have to take more seriously, we have to bring it more into the mainstream of Bay Area planning,” said Bruce Riordan, director of the Bay Area Joint Policy Committee, which coordinates regional planning agencies in responding to climate adaptation. “By starting planning now and understanding the problems, the strategies we need to implement and the costs involved, we may find less costly solutions today rather than later. The research can really help inform about both the problems and about the solutions.”

Half million years of sediment

Clear Lake is unusual in having survived the advance and retreat of glaciers that scoured and obliterated most lakes outside the tropics, including the large lakes in California’s Sierra Nevada. Previous coring in Clear Lake by the U.S. Geological Survey (USGS) in 1973 and 1980 revealed lake sediments half a million years old, with only three breaks in continuity. At the site where UC Berkeley plans to obtain cores, in the upper arm of the lake about 1-3 miles west-southwest of the town of Lucerne, the USGS obtained a continuous core in 1973 going back 130,000 years.

Looy and her team hired Utah-based DOSECC (Drilling, Observation and Sampling of the Earth’s Continental Crust), a non-profit scientific drilling company, to obtain two 120-meter-long (400-foot) cores, each about 8 centimeters (3 inches) in diameter. The cores are obtained in 3-meter (10-foot) chunks that are capped and labeled at the site and will be shipped to a cold-storage facility in Minnesota operated by LacCORE (National Lacustrine Core Facility), a non-profit organization funded by the National Science Foundation and the University of Minnesota. In the facility’s cold lab, the team will split each chunk longitudinally, photograph the halves, and then bring one half of each chunk back to UC Berkeley for analysis.

While the USGS sampled its cores once every meter, Looy and her team will sample parts of their cores every centimeter, the equivalent of about 10 years of sediment.
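A back-of-the-envelope check of what that sampling plan buys, as a minimal sketch assuming the roughly 10 years of sediment per centimeter cited above:

```python
YEARS_PER_CM = 10          # approximate deposition rate cited in the article

usgs_spacing_cm = 100      # USGS sampled its cores roughly once per meter
berkeley_spacing_cm = 1    # the UC Berkeley team plans to sample every centimeter

usgs_resolution_yr = usgs_spacing_cm * YEARS_PER_CM          # ~1,000 years per sample
berkeley_resolution_yr = berkeley_spacing_cm * YEARS_PER_CM  # ~10 years per sample

print(usgs_resolution_yr // berkeley_resolution_yr)  # 100x improvement in time resolution
```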

“We will get 100 times better time resolution, and can follow what happens when you rapidly warm the Earth up,” Looy said.

“The detail we can get from Clear Lake is really impressive,” she added. “The material is well preserved, and the USGS did a great job in describing the whole time interval so that now we know what the interesting areas are to focus on. We know this is not a shot in the dark.”

###

The Clear Lake drilling project is one of seven research projects involving global change forecasting funded by a $2.5 million grant from the Gordon and Betty Moore Foundation to UC Berkeley’s Berkeley Initiative in Global Change Biology, or BiGCB. Each project focuses on a particular California environment and leverages UC Berkeley’s unique museum collections of vertebrates, insects, plants and fossils to provide details about past changes in plant and animal populations.

May 4, 2012 7:30 pm

When it was warmer? How dare they challenge the meme!

DocMartyn
May 4, 2012 7:33 pm

Going to be so much better than ice cores. They will have age, temperature, ecosystem, precipitation and should even have dust levels.

Philip Bradley
May 4, 2012 7:48 pm

“Rates of global warming almost as fast as what we see today last happened during the shift from the last glacial to the current interglacial roughly 12,000 years ago, so that is one time interval we will focus on.”
12k years ago was the end of the Younger Dryas. The current inter-glacial is generally agreed to have begun 17k years ago.
But I will be interested to see the history of charcoal deposits as a measure of large scale burning.

May 4, 2012 7:50 pm

Jeez, why didn’t they just resample the USGS cores at greater resolution?
I suppose it’s harder to get a mega-grant for retro-work.
[sigh]

Byron
May 4, 2012 7:58 pm

“Rates of global warming ALMOST as fast as what we see today” WTF? Some estimates put the rate of warming at the end of the Younger Dryas at +10°C a decade. I think we’d notice if, since say 1982 for example, temps had jumped 30°C.

John F. Hultquist
May 4, 2012 8:04 pm

Dr. Roy has a post on “U.S. Temperature Update for April, 2012” that Cindy Looy and the other 16 researchers ought to read. Note the ‘U.S.’ part. And as this study is in a tiny, point-like part of the US, the Berkeley 17 might find it very difficult to say anything about global warming. I’ll suggest that each of them use their expertise and construct temperature proxies for the full term and only when done publish the comparative results.

David, UK
May 4, 2012 8:06 pm

“Such models are important for state and local planning agencies that must deal with future consequences of climate change, including sea level rise, water shortages and increasing fire incidence…”
She missed out the plagues of locusts. But hang on. She knows these consequences already? Then WTF is “fine tuning” (heh, right!) the models going to do? I can’t wait that long, I’m wetting the bed NOW.
“We will get 100 times better time resolution, and can follow what happens when you rapidly warm the Earth up,” Loony said.
So, she’ll give us a Hockey Stick in super HD?

May 4, 2012 8:12 pm

Isn’t it interesting that the avowed focus is not on the interglacial/glacial intervals? You know, given the Holocene is now half a precession cycle old and all…….

johanna
May 4, 2012 8:24 pm

“We are reconstructing the past to better forecast the future, because we need to know what’s coming in order to adequately prepare for it,” said project leader Cindy Looy, UC Berkeley assistant professor of integrative biology.
[snip]
“One way to check our predictions is to go back in time to a state very similar to today, with the same plants and animals and about the same temperature. The fossilized plant and animal remains from Clear Lake will give us a baseline for what this region of California looked like under similar climatic conditions, and when it was colder or warmer. We use that information to fine-tune predictive models being developed today,” Looy said.
——————————————————————————————–
This is just incoherent. No serious biologist would suggest that what was happening with plants and animals 130,000 (or 120k, or 100k) years ago in any location has predictive power for what might happen to the biosphere at that location in the immediate future, even if the climate was roughly similar.
It is a pity that such potentially interesting and valuable research is completely contaminated by the CAGW meme. Apparently the purpose is to provide more inputs for models – ye gods and little fishes! I mean, suggesting that tracking vegetation and Native American burning thousands of years ago as an input to bushfire impact models is just garbage. There are so many dodgy assumptions it is hard to know where to begin.
What a shame that a keen young biologist in California today would have to subscribe to a bunch of illogical cant to get in on the most fascinating project around. Does anyone know what a Professor of Integrative Biology (Dr Looy) is, BTW?

crosspatch
May 4, 2012 8:35 pm

The sad thing is that scientists lately have a pattern of finding what they set out to find or what validates any hypothesis they had going in and are less likely these days to objectively look at what they find.

TGSG
May 4, 2012 8:53 pm

“because we need to know what’s coming in order to adequately prepare for it”
?? I thought the science was settled ??

May 4, 2012 8:53 pm

I have been reading a lot about Fracking recently and I’m very worried that if they drill a hole in the bottom of this lake all the water will drain out.

May 4, 2012 8:57 pm

That’s nice. This has been done before in lots of other places. One of the most interesting was on Baffin Island. If these people think they can properly calibrate their models with proxies from a lake in California they are dreaming.

May 4, 2012 8:59 pm

Going into what could be an interesting study with preconceived ideas on what you are going to find kind of closes your mind to new knowledge, doesn’t it? I hope they really look at the data and report what they can correctly interpret. As to the models, it is still garbage in, garbage out.

theduke
May 4, 2012 9:02 pm

There it is again: the obligatory assumption that climate change will necessarily be something negatively skewed by man and that scientists can somehow figure out how to avoid its inevitable disastrous effects.
What would we do if we didn’t have these people to save us?

DesertYote
May 4, 2012 9:45 pm

What a shame. Clear Lake is one of the most interesting (and oldest) lakes in North America and well worth scientific inquiry. It’s too bad that researchers now usually do studies to provide ammunition to support a political agenda. Once upon a time, scientists studied things for the sake of science.

Ted Clayton
May 4, 2012 9:54 pm

Very interesting study-proposal. Thanks for the heads-up!
I will try to find what sort of schedule is anticipated … and will see if the previous USGS Clear Lake core study results can be identified & are available.
A ten-year resolution is not-quite capable of revealing what we really want to be able to discern in the climate/weather record. Ten year data is certainly good & valuable, but it would eg not-quite be able to expose the warmer-cooler fluctuations that we know to have occurred in the 20th C.
At a resolution of about 4 years, we would expect to be able to discern the influence of normal 11 year solar cycles. Sampling at 7 or 8 year intervals is required to define the shorter decadal climatic variations (say, of 20 or so years).
It would be interesting, to consider & test just what the resolution-potential of the cored material and the sampling-method might be. If, as the plan here describes, researchers can do a good job of sampling every 10 millimeters (1 centimeter), could they also sample every 5 mm? How about 3?
Short of doing the whole core-length at minimum intervals, how about doing a few short stretches, just to check the potential?
Native use of fire is a very compelling field. There are limits & complications galore, particularly with sediments in a large lake. Certainly within relatively recent Native epochs, “burning” was an extremely highly-developed art-form which produced drastically different results than “wildfire”.
To illustrate, in contemporary agricultural ‘cultural burning’, which bears certain close parallels with ‘old native’ burning … one is obliged to carefully prevent any damage to wooden fence posts (sustained combustion), and even greater care must pertain with fence-wire (some ‘paradigms’ will call for no post-burn rusting, while others address de-tempering of the steel or loosening of the strands).
These are very “low-intensity” fires which can do their intended work over very considerable areas with little or no ‘macroscopic’ charcoal resulting. Only “fines” are supposed to be burning, and these leave only “soft” char-material – very different from ‘forest fires’.
I will be keen to see if a close focus can be made, to bracket the Spanish entry into Mexico. Did disease sweep overland, native-to-native? ‘Careless’ burning, with hotter fires and more large-material char, may be a characteristic result of reduced populations trying to maintain operations that require more people to perform, in the closely-controlled optimum way.
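On the sampling-resolution arithmetic earlier in the comment above, here is a minimal sketch (an assumption-laden illustration, not part of the project plan). It uses the article’s roughly 10 years of sediment per centimeter and the bare Nyquist rule of at least two samples per cycle, which makes the commenter’s 4-year and 7-8-year figures slightly more conservative than strictly necessary:

```python
YEARS_PER_CM = 10  # approximate deposition rate cited in the article

def max_sample_spacing_cm(cycle_period_years):
    """Coarsest core-sampling interval (cm) that still captures at least
    two samples per cycle (Nyquist) for a climate cycle of the given period."""
    max_spacing_years = cycle_period_years / 2
    return max_spacing_years / YEARS_PER_CM

print(max_sample_spacing_cm(11))  # ~0.55 cm for the ~11-year solar cycle
print(max_sample_spacing_cm(20))  # 1.0 cm for ~20-year decadal variations
```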

davidmhoffer
May 4, 2012 10:46 pm

You can tell from the temperature in one lake, in California, a few thousand years ago, how much the oceans are going to rise on a global basis? How is this done? By tossing the sediment samples into a pan and mixing them with chicken entrails? Can they predict how soon the northwest passage will be completely ice free too? All from sediment in one lake?
On the other hand, they did admit that the earth was warmer at times than it is now, went through periods of rapid warming in the past, and without SUV’s too. Sounds to me like there might be some real science buried in this project with some global warming propaganda painted on top in order to secure grant money.

Ted Clayton
May 4, 2012 10:48 pm

Is Bias a Problem in the Clear Lake Core-Study?
This is clearly a case of “applied research”. The study-participants have accepted funding that comes with the expectation of a warming-oriented focus. “We are concerned about possible Warming-effects on our local-regional environment, and we want you University people to use your expertise to look into the matter”.
The Bay Area, the Central Valley, and the Northern Californian watershed system all share an ‘integrated’ water-management crisis of the first magnitude, no pooh-pooh.
I disagree with Californian assumptions that human-released CO2 is driving climate-change, and that it must cook our goose. But so long as the main/key assumptions & desired forms of focus are laid bare & spelled out, as they are right up-front in this case, then there isn’t a “scientific” problem.
“Pure research” is not common, and in pragmatic cases such as this core-project, attempting to design a program without assumptions and focus and goals and objectives (all of which involve “valuations” that will display variations limited only by the number of individuals who care to weigh in on them) is pie in the sky.
The real concern is not “Do these guys put their pants on one leg at a time” (no surprise), but will the actual data be open to the general community, for our cross-examination pleasures [cracks knuckles], and when.

jorgekafkazar
May 4, 2012 10:49 pm

Why do I have this feeling that the preconceived notions of the researchers will perpetuate themselves all the way to the reported results, regardless of the raw data?

May 4, 2012 11:31 pm

Hmmm. I agree that this could be a very interesting study if they can manage to quit with the assumptions. And I can give them a workable hypothesis, even, which as I tell my students is a prediction based on what you already know or can find out: if the climate changed then the living organisms adapted quite well.
Because, duh, that’s what living organisms do. Even my ninth grade biology students could tell you that.
Go Bears?!

Bob
May 5, 2012 3:52 am

This is a very serious matter. There are publications to get. Advanced degrees to get. And I’m sure the local pols are just waiting for the enhanced ability to plan (tax?) the future.

David A
May 5, 2012 4:16 am

“Rates of global warming almost as fast as what we see today last happened during the shift from the last glacial to the current interglacial roughly 12,000 years ago, so that is one time interval we will focus on.”
By “rates of global warming today”, did they mean 1912 to about 1942, where the rate of warming was about the same as from 1975 to 2005? If not, well, they did not need to go back 12,000 years. (Beyond the likely fact that it was as warm as, or warmer than, the present at least three times in the past 4,000 years.) As others have pointed out, some of the warming from the past glacial to the current interglacial far exceeds our modern warming. Other than the PC spin, it sounds like a good study.

tty
May 5, 2012 4:53 am

The lake sediments are among the world’s oldest, containing records of biological change stretching back as far as 500,000 years.
That is good for lake sediments, but by no means extreme. Tenaghi Philippon in Greece goes back 1.35 million years and Lake Baikal in Siberia 12 million years.
a time 113,000 years ago when it may have been locally warmer than today
By 113,000 years ago the last interglacial was already over and it was distinctly colder than today. Earlier, about 125,000 years ago it was indeed considerably warmer than today, and worldwide too.
Press-release “science” at its best….

Louis Hooffsteter
May 5, 2012 5:49 am

Ted Clayton says:
Is bias a problem in the Clear Lake Core-Study? The study participants have accepted funding that comes with the expectation of a warming-oriented focus.
These researchers may have unwittingly placed themselves between a rock and a very hard place. Their data can be easily verified (or not) from the USGS core taken from the same general area, and/or the other half of their own core that will be retained at the University of Minnesota. Given that, and the ‘sea change’ in opinion on CAGW, how easy will it be for the project team consisting of 17 UC Berkeley faculty members, to reach a “consensus” on their conclusions? I’d love to be a fly on the wall during that debate!
If they examine the data objectively and produce an unbiased report (unlikely, I know) would journals refuse to publish their findings? Would the Gordon and Betty Moore Foundation ask for their $2.5 million back? Would Berkeley fire them? I hope they’re all tenured. Time to make a batch of popcorn!

Bruce Cobb
May 5, 2012 6:51 am

Wow, so they have their Warmist funding, they have their Alarmist assumptions about what our climate is doing and will do in the future, now they just need to produce “science” that will “inform” policy makers about what to expect. What could possibly be wrong with that?

May 5, 2012 6:54 am

“What the researchers learn from their look-back in time will be crucial for state or local planners clamoring for better predictive tools to guide policies crucial to saving ecosystems threatened by climate change.”
This sounds like a slight overstatement to me!

Bill Wood
May 5, 2012 7:00 am

Since they have implied that their research will confirm the “settled science” of CAGW, what do they expect to find that will confirm their hypothesis? Indeed, what is the hypothesis that they are testing? It would appear that if the end result is other than catastrophic disruption of the biosphere due to rapid warming, the research will be deemed a “failure” and obviously flawed.

Kelvin Vaughan
May 5, 2012 9:06 am

The problem with assumed global warming is that everyone is putting all their eggs into one basket! What happens if we cool down? The human race never learns!

GogogoStopSTOP
May 5, 2012 9:40 am

They will be able to discern the lightning-induced burn deposits from the Indian-initiated scorching by the residual match sticks thrown into the lake… after lighting the Peace pipes, right? I mean, I saw Tonto do that on an episode of The Lone Ranger, on the radio, in the ’50s… really, no, really I did.
Ok, you don’t believe that. How about… they have the report’s conclusion written already, it’s all due to the rise of CO2 & Mankind is at fault. QED

Pamela Gray
May 5, 2012 9:56 am

I wish for the long gone days when Ivory Tower folks did field research. Tell you what. If the core stuff doesn’t pan out, we’ve got wolves to collar and study here. They went after deer and elk first, then found cows were easier to catch. Now they are killing sheep 50 yards from houses. They will soon find sheeple are even easier to catch.
Will keep the light on for ya.

Taphonomic
May 5, 2012 9:57 am

Previous study on old Clear Lake cores. A lot of new methods have been developed since the 1970s when these cores were taken.
Palynology of two upper Quaternary cores from Clear Lake, Lake County, California, with a section on dating
Adam, D. P.; Robinson, S. W.
1988
http://pubs.usgs.gov/pp/1363/report.pdf

Clay Marley
May 5, 2012 11:02 am

Slightly OT possibly, but I read a report recently where some Israeli scientists used core samples from the Dead Sea to reconstruct the history of earthquakes in the region back about 6000 years. Seems an EQ disrupts and mixes the top layer of sediment. It was possible to date the EQ and estimate the magnitude by the location and thickness of the disruption.
I hope the Berkeley folks are at least aware of this. It could affect their results if they see it and misinterpret it.

DonS
May 5, 2012 12:04 pm

Clearly, from the introductory remarks, the science is already settled.

Dipchip
May 5, 2012 12:20 pm

In order to have a future to worry about we must first get our fiscal house in order.
If our scientific math is 20 times more accurate than our fiscal math we are still doomed!!

JRR Canada
May 5, 2012 1:27 pm

Admitting in advance that it is rude to mock a person’s name, given what follows Looy’s name are you sure an N was not dropped? This list of assumptions in advance of the data does not / can not be called science.

Interstellar Bill
May 5, 2012 3:55 pm

Johanna said:
“This is just incoherent. No serious biologist would suggest that what was happening with plants and animals 130,000 (or 120k, or 100k) years ago in any location has predictive power for what might happen to the biosphere at that location in the immediate future, even if the climate was roughly similar.”
It goes beyond incoherent to incompetent when they deliberately overlook Earth’s huge orbital eccentricity of 4% during the Eemian. This causes a semiannual insolation ratio of [(1+e)/(1-e)]^2 = 1.174 between the Eemian summer perihelion and winter aphelion.
This 17% difference is 3 times today’s 6% level, so the climate regime will necessarily be quite different from today’s, with a semi-annual ‘forcing’ of 236 watts per square meter. People would have to stay in the shade all summer or collapse from heat stroke, and today’s crops would wilt.
In fact, the Eemian climate regime was probably too unstable for agriculture to start then. In contrast, the low-eccentricity Holocene regime was greatly favorable to the origin of agriculture. Just imagine if eccentricity had stayed high, and we’d been kept in the Paleolithic because of an Eemian-style Holocene climate regime (yet another answer to the Fermi Paradox).
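For readers who want to check those figures, a minimal sketch of the perihelion/aphelion insolation ratio (eccentricity values: the commenter’s ~0.04 for the Eemian, and roughly 0.0167 for the present; 1361 W/m² is the modern satellite-era solar constant):

```python
SOLAR_CONSTANT = 1361.0  # W/m^2, mean total solar irradiance at 1 AU (modern value)

def perihelion_aphelion_ratio(e):
    """Ratio of top-of-atmosphere insolation at perihelion vs. aphelion for
    orbital eccentricity e: distance scales as (1 -/+ e), flux as 1/r^2."""
    return ((1 + e) / (1 - e)) ** 2

eemian = perihelion_aphelion_ratio(0.04)    # ~1.174, matching the comment above
modern = perihelion_aphelion_ratio(0.0167)  # ~1.069 for the present-day orbit

print(round(eemian, 3), round(modern, 3))
print(round((eemian - 1) * SOLAR_CONSTANT))  # ~236 W/m^2 perihelion-aphelion swing
```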

Avgjoe
May 5, 2012 11:04 pm

Already, the trumped-up expectations of the seas rising up in walls of water are assumed and being forced on average people in the form of increased insurance costs for lands along the SF Bay. Everyone now lives in a ‘flood’ area just by government fiat, because they believe – no, they operate on a fixed assumption of – massive inundations from sea level rises of multiple feet! So AGW hysteria isn’t just a pet political agenda/lever, it’s impacting avg joe in the pocketbook right here, right now.

DaveR
May 6, 2012 2:06 am

“We are reconstructing the past to better forecast the future, because we need to know what’s coming in order to adequately prepare for it,” said project leader Cindy Looy….
And when we discover that the natural variation of the earth’s climate is far more than we are currently admitting, we will…..????………………have to stop the project quickly!

Gail Combs
May 6, 2012 4:05 am

crosspatch says:
May 4, 2012 at 8:35 pm
The sad thing is that scientists lately have a pattern of finding what they set out to find or what validates any hypothesis they had going in and are less likely these days to objectively look at what they find.
________________________________
That means they are not scientists; they are Lysenkoists.

Tom Bakewell
May 6, 2012 7:08 am

It would be interesting to look at the USGS cores to get some sort of estimation of compaction vs depth. And somehow that figure of 1 cm representing 10 years of deposition bothers me. Generally lacustrine sedimentation is episodic, especially in times of great climatic variation. Someone has not done her homework with any attention to the details.
Tom Bakewell B. Sc Geology RPI 1971

MikeH
May 6, 2012 8:21 am

geologyjim said on May 4, 2012 at 7:50 pm
Jeez, why didn’t they just resample the USGS cores at greater resolution?
I suppose it’s harder to get a mega-grant for retro-work.
[sigh]

Hey, those USGS cores are stale by now. Gee, they’re 130,039 years old now. The UC Berkeley work requires fresh 130,000 year old samples.

Unattorney
May 7, 2012 9:45 am

Global warming to climate change to global change. How fast language is manipulated.

SteveSadlov
May 7, 2012 12:53 pm

I wonder what may be implied about nearby mountains (especially to the north of the lake) during the pre Holocene Pleistocene? There is little discussion of glaciation in the coast ranges. If there is glaciation in the Trinity Alps today, then what of the higher Coast Range and southern Klamath Mtn areas?

Brian H
May 9, 2012 3:18 pm

Ted Clayton says:
May 4, 2012 at 10:48 pm

The real concern is not “Do these guys put their pants on one leg at a time” (no surprise), but will the actual data be open to the general community, for our cross-examination pleasures [cracks knuckles], and when.

That was my instant thought: “PROTECT THE RAW DATA!” Put it in accessible form immediately, allow open access.