Guest essay by Jim Steele
In 2012 the National Academies published A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Although the framework often characterizes climate change as a negative and disruptive phenomenon, it clearly states that students need to understand all the factors causing climate change, writing, “Natural factors that cause climate changes over human time scales (tens or hundreds of years) include variations in the sun’s energy output, ocean circulation patterns, atmospheric composition, and volcanic activity.”
However, instead of promoting textbooks that critically analyze and debate the relative contributions of a diverse array of climate factors, the American Association for the Advancement of Science (AAAS) has been attacking any state that wants to adopt textbooks promoting climate debate. Alan Leshner, a psychologist and CEO of the AAAS, and Camille Parmesan (whose debunked climate claims have already been published in textbooks) argued, “From the scientific perspective, there are simply no longer ‘two sides’ to the climate-change story: The debate is over. The jury is in, and humans are the culprit.”
Whatever the outcome of that political battle, the science framework is absolutely correct to state, “The actual doing of science or engineering can pique students’ curiosity, capture their interest, and motivate their continued study.” So I am creating a series of activities (which educators can download for free from my website) to supplement any climate literacy science text. Hopefully these lessons will engage students in critical thinking and simulate how NOAA and NASA climate scientists think and “do science” as it is currently practiced.
The current pedagogy argues that teachers should not be the “Sage on the Stage” spewing dogma for students to regurgitate. A teacher should instead be the “Guide on the Side,” alternating short bits of input with time slots (such as “Think, Pair and Share”) that allow students to investigate and discuss the issue with their peers. The first supplemental piece is my beta version of the “Temperature Homogenization Activity,” and I would appreciate any comments (and typo corrections) to help improve this and subsequent lessons before I post it to my website. If you know of any high school or college educators who might give this lesson a trial run, please pass it on.
“Temperature Homogenization Activity.”
Teaching Objective I: Understanding Trends and Anomalies
Students must be able to read and create quality graphs.
Teacher input: Give the students the graph “Maximum Temperature USHCN Raw Data,” illustrating trends from 3 quality weather stations in the U.S. Historical Climatology Network (USHCN), all located at a similar latitude in southern California. In pairs or small groups, have them discuss the following questions and formulate questions of their own.
Think, Pair, Share:
1) What factors might make Brawley so much warmer than the other stations?
2) Which station(s) experienced the warmest temperatures between 1930 and 1950?
3) Which station(s) experienced the most similar climate trend from 1900 to 2010?
4) Is climate change affecting all stations equally?
Teacher input: Brawley is farther inland and, unlike stations closer to the coast, doesn’t experience the ocean’s cooling effect. Drier desert conditions with few clouds and little vegetation create a microclimate that heats up much more quickly than the other stations. Brawley and Cuyamaca shared the most similar trends, but this may have been difficult to see due to micro-climate differences. To better extract climate trends that can be compared between stations experiencing varied micro-climates, scientists graph anomalies.
Instruct students to visit the USHCN website and download the raw data for the 3 stations into a spreadsheet such as Excel. To determine anomalies relative to the 1951-1980 period, calculate each station’s average temperature for that period, then subtract the station’s average from the raw data for each and every year. This will produce negative values for years cooler than average and positive values for warmer years. Have students create their own anomaly graph and compare their charts with the anomaly graph below.
(Teacher note: Do not use an average from years later than 1980. During the mid-1980s there was a massive change in equipment that also required relocations that brought the weather stations closer to buildings.)
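For classes comfortable with a bit of scripting, the same anomaly calculation can also be done outside a spreadsheet. The sketch below is a minimal Python example; it assumes the downloaded raw annual values have already been saved to a file named brawley_tmax.csv with “year” and “tmax” columns (the file name and column names are illustrative, not part of the USHCN download format).

```python
# Minimal sketch: compute annual anomalies relative to the 1951-1980 baseline.
# Assumes a CSV with columns "year" and "tmax" (annual mean of maximum temperature).
import csv

def read_station(path):
    """Read year -> annual maximum temperature from a simple CSV file."""
    data = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            data[int(row["year"])] = float(row["tmax"])
    return data

def anomalies(data, base_start=1951, base_end=1980):
    """Subtract the station's 1951-1980 average from every year."""
    base = [t for yr, t in data.items() if base_start <= yr <= base_end]
    baseline = sum(base) / len(base)
    return {yr: t - baseline for yr, t in sorted(data.items())}

if __name__ == "__main__":
    station = read_station("brawley_tmax.csv")  # illustrative file name
    for year, anom in anomalies(station).items():
        print(year, round(anom, 2))  # negative = cooler than baseline, positive = warmer
```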
Think, Pair, Share:
Have students discuss the value of using anomalies to extract regional trends.
Brainstorm: what factors could cause only Chula Vista’s micro-climate to suddenly warm relative to the other stations?
Teaching Objective II: Understanding Artificial Inhomogeneities
Teacher input: Because each weather station exists in a unique micro-climate, individual differences cause each station to exhibit small cooling or warming trends that might not be seen at the other stations. For example, changes in vegetation, sheltering or waste heat from various building configurations, or differences in topography that funnel wind differently can all affect short-term temperature trends. Changes to the landscape, such as removal of trees, a fire, or increased pavement, affect how the soil holds moisture and how much moisture is transpired into the air. The resulting differences between weather stations are called inhomogeneities. Natural inhomogeneities are expected and are an integral part of local climate change. However, scientists must eliminate any artificial inhomogeneities caused by a growing population that alters the surface and adds waste heat, or by stations relocating to a different micro-climate. All 3 stations exhibited trends that were reasonably similar until 1982. So what caused Chula Vista to artificially warm, or conversely, did Brawley and Cuyamaca suddenly cool?
To answer that question, instruct students to first visit the USHCN website and acquire the ID number for each station. Then have them visit NOAA’s Historical Observing Metadata Repository, plug in the ID number, and look for information regarding any changes at that station and the year in which those changes occurred.
Think, Pair, Share:
Which station(s) moved, and in which year? Compare the changes in temperature at any station that moved with the temperature changes at stations that did not move, then determine whether the relocation caused a warming or a cooling. How did the relocation affect the temperature trend?
Teacher input: Confirm the students’ research. According to the Historical Observing Metadata Repository, Chula Vista moved 2.5 miles in 1982, from a location situated along salt evaporation ponds to an urban setting surrounded by buildings. In 1985, new instruments were installed that required new cable connections, so the weather station was moved another 190 feet, presumably closer to a building.
After the 1982 relocation, the temperature at Chula Vista rose by 6°F, in contrast to a drop in temperatures at the other 2 stations, so Chula Vista’s move very likely caused its temperature to rise artificially. An interpretation of artificial warming is also consistent with the relocation to a warmer urban setting.
There were no verifiable relocations or changes of instrumentation at the other 2 stations. However, 7 months of temperature data for Brawley in 1992 were reported via a hygrothermograph (HTG), but that would not affect the 1982 comparisons.
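One way to make the relocation’s effect concrete for students working with the downloaded numbers rather than the printed graphs is to examine the difference series between the moved station and a neighbor: a step change in that difference around 1982 points to an artificial inhomogeneity rather than regional climate. The sketch below assumes the two stations’ annual values are already in Python dictionaries keyed by year (as produced by the earlier anomaly sketch); the example temperatures are made-up placeholders, not real USHCN values.

```python
# Sketch: compare the mean difference (Chula Vista minus Cuyamaca) before and after 1982.
# A jump in this difference suggests an artificial inhomogeneity at one of the stations.

def mean(values):
    return sum(values) / len(values)

def step_change(station_a, station_b, break_year=1982):
    """Mean of (a - b) before the break year versus from the break year onward."""
    common_years = sorted(set(station_a) & set(station_b))
    diffs = {yr: station_a[yr] - station_b[yr] for yr in common_years}
    before = [d for yr, d in diffs.items() if yr < break_year]
    after = [d for yr, d in diffs.items() if yr >= break_year]
    return mean(before), mean(after)

# Example usage (temperatures here are invented placeholders, not real USHCN values):
chula_vista = {1980: 75.1, 1981: 75.3, 1982: 80.9, 1983: 81.2}
cuyamaca = {1980: 68.0, 1981: 68.4, 1982: 67.9, 1983: 68.1}
before, after = step_change(chula_vista, cuyamaca)
print(f"Mean difference before 1982: {before:.1f}F, from 1982 on: {after:.1f}F")
```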
Teaching Objective III: Homogenizing Data to Create Meaningful Regional Climate Trends
Teacher input: Good scientists do not blindly accept raw data. Data must undergo quality-control analyses that adjust for documented changes known to create effects unrelated to climate change. After accounting for artificial inhomogeneities, scientists adjust the data to create what they believe is a more realistic regional trend.
Based on what they have learned so far, ask the students to create a graph that best exemplifies southern California’s regional climate change. Simplify their task by using the graph (below) for just Chula Vista and Cuyamaca (which are only 15 miles apart). Students are free to adjust the data in whatever manner they feel best represents real climate change and corrects for artificial inhomogeneities.
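If students want to try an explicit, documented adjustment rather than an eyeballed one, one simple classroom approach (by no means the official USHCN method) is to shift the moved station’s post-relocation values by the step change estimated from its difference with the unmoved neighbor. The sketch below continues the previous sketch, reusing its step_change helper and example dictionaries.

```python
# Sketch: adjust the moved station by removing the estimated 1982 step change.
# This is one simple classroom approach, not the official USHCN homogenization algorithm.

def adjust_for_break(moved_station, neighbor, break_year=1982):
    """Subtract the estimated relocation step from the moved station's later years."""
    before, after = step_change(moved_station, neighbor, break_year)
    step = after - before  # estimated artificial jump caused by the relocation
    return {yr: (t - step if yr >= break_year else t)
            for yr, t in sorted(moved_station.items())}

adjusted_chula_vista = adjust_for_break(chula_vista, cuyamaca)
print(adjusted_chula_vista)
```

Students can then graph the adjusted series alongside Cuyamaca and debate whether this kind of single-break correction represents real regional climate better than the raw data.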
Teacher input: After students have graphed their own temperature trends, have them compare their results with the graph below illustrating how USHCN climate experts actually homogenized the data. (The comparison should promote lively discussion, as most students will create trends for both stations that resemble Cuyamaca’s.)
Think, Pair, Share: Discuss why climate experts created such different trends. Why did scientists lower the high temperatures at Cuyamaca during the 1930s to 1950s by 3 to 5°F? What other concerns may affect scientists’ expectations about how best to homogenize data?
Teacher input: Clearly the data was adjusted for reasons other than those explained by Chula Vista’s relocation. Adjusting data for unknown reasons is different from quality-control adjustments and is called homogenization. The use of homogenization is contentious because a change in a station’s trend is often assumed to be caused by unknown “artificial” causes. However, the natural climate is always changing due to cycles of the sun, ocean oscillations like El Niño and the Atlantic Multidecadal Oscillation that alter the direction and strength of the winds, or natural landscape successions. So how can scientists reliably separate natural climate changes from undocumented “artificial” changes?
One method suggests comparing the data from more reliable weather stations that have undergone the fewest known artificial changes to determine a “regional expectation.” That regional expectation can then serve as a guide when adjusting trends at other, less reliable stations. However, as we have seen, the most reliable stations can undergo the greatest adjustments. So what other factors are in play?
Many scientists working for NOAA and NASA believe that rising CO2 explains recent temperature trends. Other scientists, however, suggest that the proximate cause of regional climate change is natural changes in ocean circulation. In 2014, climate scientists published a peer-reviewed paper (Johnstone 2014) suggesting that climate change along the coast of North America could be best explained by natural cycles of the Pacific Decadal Oscillation (PDO), due to its effects on sea surface temperatures in the eastern Pacific. Give the students the graph below from Johnstone 2014 and ask them to compare changes in sea surface temperatures (SST, in red) with the raw and recently homogenized temperature data from southern California.
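For a more quantitative comparison than eyeballing the graphs, students who did the earlier spreadsheet or Python work can compute a simple correlation between the SST series and each temperature series (raw and homogenized). The sketch below uses a plain Pearson correlation over whatever overlapping years are available; the dictionaries are assumed to hold annual values keyed by year, and no actual numbers are supplied here.

```python
# Sketch: Pearson correlation between two annual series stored as {year: value} dicts.
import math

def pearson(series_a, series_b):
    """Correlation over the years the two series have in common."""
    years = sorted(set(series_a) & set(series_b))
    a = [series_a[y] for y in years]
    b = [series_b[y] for y in years]
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# Usage idea: compare pearson(sst, raw_station) with pearson(sst, homogenized_station)
# to see which version of the station record tracks the ocean temperatures more closely.
```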
Think, Pair, Share: Which data set (the raw or the homogenized trends) best agrees with the hypothesis that ocean temperatures drive regional warming trends? Which best agrees with the hypothesis that rising CO2 drives regional warming trends? Could a belief in different hypotheses affect how temperature trends are homogenized?
Teacher input: Have students compare the temperature trends for the northern hemisphere (below; created by the Japanese Meteorological Society and published by the National Academy of Sciences in 1977) with the new global trends presented by NASA’s Gavin Schmidt, who argues 2014 was the warmest year on record. Point out that the earlier scientific record suggested temperatures dropped by 0.6°C (1.1°F) between 1940 and 1980, with 1980 temperatures similar to those of 1910. Compare those temperatures with Schmidt’s 2014 graph, which suggests 1980s temperature anomalies were 0.5°C higher than the 1910s.
Think, Pair, Share: Why do the two graphs disagree so dramatically for the period between 1940 and 1980? Does the new graph by NASA’s Gavin Schmidt represent real climate change or an artifact of homogenization? If the difference is due to homogenization, is that a valid reason to alter older trends? If Gavin Schmidt’s starting point for the temperature data from 1980 to 2014 were lowered, so that 1980 temperatures were still similar to 1910 as suggested by earlier research, how much higher than the 1940s would the 2014 global temperature be?
Teaching Objective IV: In‑filling
Teacher input: As seen for the USHCN weather stations, raw data is often missing. Furthermore, extensive regions around the world lack any weather stations at all. To create a global temperature, climate scientists must engage in the art of in-filling. A recent scientific paper, Cowtan and Way (2014), used in-filling to contradict other peer-reviewed research that had determined there was a pause in global warming over the past 15 years or more. By in-filling, these scientists argued that there was no hiatus and that the warming trend continued.
Give the students a map of the world and an amber colored pencil. Instruct them to lightly shade all the continents to show they have all warmed. Now provide the map (below) of Global Historical Climatology Network stations showing the station locations. Instruct students to simulate in-filling by darkening the regions on the continents wherever weather stations are sparse (the whitest areas). Then give them NOAA’s 1950-2014 map modeling the continents’ warmest regions.
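To show numerically what in-filling does, the sketch below estimates a temperature anomaly for a location with no station from the values at nearby stations, using inverse-distance weighting. This is only a toy illustration of the general idea; it is not the kriging method used by Cowtan and Way, and the coordinates and anomalies in the example are invented.

```python
# Toy sketch of in-filling: estimate an anomaly at an unobserved point from nearby
# stations using inverse-distance weighting (not the kriging used by Cowtan and Way).
import math

def distance(p, q):
    """Rough great-circle distance in km between (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    return 6371 * math.acos(
        min(1.0, math.sin(lat1) * math.sin(lat2)
            + math.cos(lat1) * math.cos(lat2) * math.cos(lon1 - lon2)))

def infill(target, stations):
    """Inverse-distance-weighted average of station anomalies at the target point."""
    weights = [(1.0 / max(distance(target, loc), 1.0), anom) for loc, anom in stations]
    total = sum(w for w, _ in weights)
    return sum(w * a for w, a in weights) / total

# Invented example: three distant stations "vote" on an unobserved point's anomaly.
stations = [((60.0, -45.0), 1.2), ((70.0, -40.0), 0.8), ((65.0, -20.0), 0.3)]
print(round(infill((66.0, -35.0), stations), 2))
```

Students can then discuss how much the in-filled value depends on which stations happen to be nearest, which is the same question the shading exercise raises on paper.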
Think, Pair, Share: Can in-filling reliably represent local temperature trends? The warmest regions appear to be related to in-filling. What other reasons could cause greater warming in the in-filled regions? Should regions without a sufficient density of weather stations be included in a global average temperature?
Extended Lesson Activities:
Have students pick a group of stations within a 500-mile area with similar ecosystems (e.g., the Great Plains, Southeastern forests, or New England forests) and examine the differences between raw and homogenized temperature data for all those stations. (See an example of such a comparison for Massachusetts in an analysis of moose migrating southwards.) Use the metadata for those stations to check for documented changes that might explain any adjustments.
I suggest looking at just the maximum temperatures in this extension activity because minimum temperatures are much more sensitive to landscape changes and other microclimate changes. The next activity will examine differences between maximum and minimum temperatures and the effects of the landscape on temperature trends.
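For the extended activity, a short script can tabulate raw-minus-homogenized values station by station so students can see where and when the adjustments concentrate. The sketch below assumes each station’s raw and adjusted annual values have already been loaded as {year: value} dictionaries (for example by adapting the CSV reader from the first sketch); the station names are placeholders.

```python
# Sketch: tabulate raw-minus-homogenized differences for several stations.
# Assumes each station has two {year: value} dicts already loaded (raw and adjusted).

def adjustment_series(raw, adjusted):
    """Return {year: raw - adjusted} for the years present in both versions."""
    return {yr: raw[yr] - adjusted[yr] for yr in sorted(set(raw) & set(adjusted))}

def summarize(stations):
    """Print the largest adjustment and the year it occurs for each station."""
    for name, (raw, adjusted) in stations.items():
        diffs = adjustment_series(raw, adjusted)
        worst_year = max(diffs, key=lambda yr: abs(diffs[yr]))
        print(f"{name}: largest adjustment {diffs[worst_year]:+.1f}F in {worst_year}")

# stations = {"Amherst": (amherst_raw, amherst_adjusted), ...}  # placeholder names
# summarize(stations)
```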
Related Teaching moment: Tolerance and Respectful Debate
Have students write an essay about how our nation is fostering greater tolerance and compassion toward different ethnicities, races, religions and people with different sexual preferences. Then contrast that tolerance with the condoned hatred that’s been hurled at people for having different climate beliefs. Often those very different beliefs are simply a result of trusting different virtual realities created by statistical temperature trends. Would more respectful debate, between scientists who trust homogenized trends and those who don’t, help the public better understand climate change?
Very few, if any, of the global index manufacturers ever conducted any intelligent analysis of regional data, preferring to homogenize relative to urban records.
Thanks, Dr. Steele, for a valiant effort.
I think the mechanics of the data-gathering stage should by simplified by the use of downloadable spreadsheets in LibreOffice format.
Sorry, I meant “be simplified”.
And these downloadable spreadsheets should contain the data.
A few weeks ago, I showed the adjustments done to Reykjavik, Iceland over the past year. Checked it again, and guess what, they have adjusted it yet again in just the last two weeks.
Quality controlled station temperature trend since 1900 (Iceland Met says nothing should be done to this data) —> +0.2C of warming.
NCDC adjusted trend as of February 21, 2014 —> +1.4C of warming.
NCDC adjusted trend as of January 17, 2015 —> +2.0C of warming.
NCDC adjusted trend as of January 29, 2015 —> +2.3C of warming.
What happened in the last two weeks that made 1900 temperatures 0.3C colder than they were as of just two weeks ago? Nothing. What changed in the last few years that made Reykjavik 1.1C colder in 1900 than the records indicate and 1.0C warmer today than the records of today indicate? Nothing.
These guys will not stop. They are getting away with it and are even being encouraged by their fellow climate scientists. We need a new law and a prosecutor or something.
Good catch, Bill. We need to create a page that compares all the homogenizations over the past 3 years.
I made a Gif animation of the changes.
The top right panel is the quality-controlled temperatures from the Iceland Met, which they insist require no further adjustment for any station moves, TOBs, polar bears, or anything. The second right panel is the temperatures the NCDC reports to the public, and the bottom right panel is the adjustment they apply to each year. And by “reports to the public” I mean a government agency supplying temperature data to the whole world. 75% of the stations from the NCDC have this same pattern.
Note there are only 13 months of change recorded here and only 10 days in the last two parts of the animation. We need to stop these people now.
http://www.loogix.com//img/res/1/4/2/2/6/3/14226332222823775.gif
“Temperature Homogenization Activity” … that’s NASA/NOAA speak for “fudging the data”!
If the figures of a corporation are “massaged” and “cooked”, it’s called fraud.
“If the figures of a corporation are “massaged” and “cooked”, it’s called fraud.”
In climatology it gets you grants!
The overall concept is good but strictly college level. There are far too many advanced concepts (anomaly, micro-climate, inhomogeneities, USHCN, NOAA, etc.) for this to be done in high school. Moreover, it would take several class sessions to get through all of this. The average high school science class meets only about 60 hours a year and has to cover everything called for in the state standards, so time is precious.
These are the standard problems with most advocacy lessons. They are far too advanced and way too long. A good lesson teaches just one basic concept in 35 minutes or less. I have designed a number of skeptical lessons that do this, but I had the help of high school teachers doing it. Plus I have my catalog of all the technical concepts that are taught in high school as a guide. One major concept in 35 minutes should be the standard.
David
http://www.stemed.info/index.html
One of my high school/middle school lessons, on solar activity, is here: http://climatescienceinternational.org/images/pdf/lesson_on_solar_activity_and_gw-20-06-2013.pdf My view is that the scientific debate is too technical for K-12 so I show them the debate rather than involving them in it.
Just to amplify what I said above, if you use any technical term or procedure that the students have not already been taught, then you must teach it. Jim’s lesson probably uses dozens of terms and procedures that are not normally taught in high school. I suggest looking carefully at the standards to see what has been taught. This lesson might work for a sophomore or junior college course in climate science, certainly not in high school.
If anyone is interested in developing skeptical teaching materials for high school (or middle school) I will be happy to advise them. Email me at . My team did a $600K project for DOE in which we cataloged the technical terms used in K-12 and estimated the average grade in which each is taught. Our catalog is used to rank teaching materials by grade level in the search engine on http://www.scienceeducation.gov/. We call it grade level stratification.
The web is larded with alarmist K-12 teaching materials, many Federally funded. Skeptics have next to nothing and I would like to change that. But the materials have to be grade level appropriate.