Improving Climate Literacy: 'Temperature Homogenization Activity'

Guest essay by Jim Steele

In 2012 the National Academies published A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Although the framework often characterizes climate change as a negative and disruptive phenomenon, it clearly states that students need to understand all the factors causing climate change, writing, “Natural factors that cause climate changes over human time scales (tens or hundreds of years) include variations in the sun’s energy output, ocean circulation patterns, atmospheric composition, and volcanic activity.”

However, instead of promoting textbooks that critically analyze and debate the relative contributions of a diverse array of climate factors, the American Association for the Advancement of Science (AAAS) has been attacking any state that wants to adopt textbooks promoting climate debate. Alan Leshner, a psychologist and CEO of the AAAS, and Camille Parmesan (whose debunked climate claims have already been published in textbooks) argued, “From the scientific perspective, there are simply no longer ‘two sides’ to the climate-change story: The debate is over. The jury is in, and humans are the culprit.”

Whatever the outcome of that political battle, the science framework is absolutely correct to state, “The actual doing of science or engineering can pique students’ curiosity, capture their interest, and motivate their continued study.” So I am creating a series of activities (which educators can download for free from my website) to supplement any climate literacy science text. Hopefully these lessons will engage students in critical thinking and simulate how NOAA/NASA climate scientists think and “do science” as it is currently practiced.

The current pedagogy argues teachers should not be the “Sage on Stage” spewing dogma for students to regurgitate. A teacher should be the “Guide on the Side,” alternating short bits of input with time slots (such as “Think, Pair and Share”) that allow students to investigate and discuss the issue with their peers. The first supplemental piece is my beta version of the “Temperature Homogenization Activity,” and I would appreciate any comments (and typo corrections) to help improve this and subsequent lessons before I post the activity to my website. If you know of any high school or college educators who might give this lesson a trial run, please pass it on.

“Temperature Homogenization Activity”

Teaching Objective I: Understanding Trends and Anomalies

Students must be able to read and create quality graphs.

Teacher input: Give the students the graph “Maximum Temperature USHCN Raw Data” illustrating trends from 3 quality weather stations in the US Historical Climate Network, all located at a similar latitude in southern California. In pairs or small groups, have them discuss the following questions and formulate questions of their own.

Think, Pair, Share:

1) What factors might make Brawley so much warmer than the other stations?

2) Which station(s) experienced the warmest temperatures between 1930 and 1950?

3) Which station(s) experienced the most similar climate trend from 1900 to 2010?

4) Is climate change affecting all stations equally?

[Figure: “Maximum Temperature USHCN Raw Data” — raw maximum temperatures for Brawley, Chula Vista, and Cuyamaca, 1900-2010]


Teacher input: Brawley is farther inland and, unlike stations closer to the coast, doesn’t experience the ocean’s cooling effect. Drier desert conditions with few clouds and little vegetation create a microclimate that heats up much more quickly than at the other stations. Brawley and Cuyamaca shared the most similar trends, but that similarity may be difficult to see due to micro-climate differences. To better extract climate trends that can be compared between stations experiencing varied micro-climates, scientists graph anomalies.

Instruct students to visit the USHCN website and download the raw data for the 3 stations into a spreadsheet program such as Excel. To determine anomalies relative to the 1951-1980 period, calculate each station’s average temperature for that period, then subtract the station’s average from the raw data for every year. This will produce negative values for years cooler than the average and positive values for warmer years. Have students create their own anomaly graph and compare their charts with the anomaly graph below. (A scripted version of this calculation follows the graph.)

(Teacher note: Do not use an average from years later than 1980. During the mid-1980s there was a massive change in equipment that also required relocations, which brought the weather stations closer to buildings.)

[Figure: Maximum temperature anomalies for the three stations, relative to the 1951-1980 average]
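For classes working outside a spreadsheet, here is a minimal Python sketch of the same anomaly calculation. It assumes the raw USHCN values have been exported to a CSV file; the file and column names are hypothetical.

```python
# Compute anomalies relative to the 1951-1980 baseline for one station.
# Assumes a CSV exported from the USHCN site; names are hypothetical.
import pandas as pd

df = pd.read_csv("brawley_tmax.csv")  # columns: year, tmax (deg F)

# Average maximum temperature over the 1951-1980 baseline period.
baseline = df.loc[df["year"].between(1951, 1980), "tmax"].mean()

# Anomaly = raw value minus the station's own baseline average.
df["anomaly"] = df["tmax"] - baseline
print(df.tail())
```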


Think, Pair, Share:

Have students discuss the value of using anomalies to extract regional trends.

Brainstorm: what factors could cause only Chula Vista’s micro-climate to suddenly warm relative to the other stations?

Teaching Objective II: Understanding Artificial Inhomogeneities

Teacher input: Because each weather station exists in a unique micro-climate, individual differences cause each station to exhibit small cooling or warming trends that might not be seen at the other stations. For example, changes in vegetation, sheltering or waste heat from various building configurations, or a difference in topography that funnels wind differently can all affect short-term temperature trends. Changes to the landscape, such as the removal of trees, a fire, or increased pavement, affect how the soil holds moisture and how much moisture is transpired into the air. The resulting differences between weather stations are called inhomogeneities. Natural inhomogeneities are expected and an integral part of local climate change. However, scientists must eliminate any artificial inhomogeneities caused by a growing population that alters the surface and adds waste heat, or by stations relocating to a different micro-climate. All 3 stations exhibited trends that were reasonably similar until 1982. So what caused Chula Vista to artificially warm, or, conversely, did Brawley and Cuyamaca suddenly cool?

To answer that question, instruct students to first visit the USHCN website and acquire the ID# for each station. Then have them visit NOAA’s Historical Observing Metadata Repository, plug in the ID#, and look for information regarding any changes at that station and the year in which those changes occurred.

Think, Pair, Share:

Which station(s) moved, and in which year? Compare the changes in temperature at any station that moved to the changes at stations that did not move. Then determine whether the relocation caused a warming or a cooling, and how it affected the temperature trend.

Teacher input: Confirm the students’ research. According to the Historical Observing Metadata Repository, Chula Vista moved 2.5 miles in 1982, from a location situated along salt evaporation ponds to an urban setting surrounded by buildings. In 1985, new instruments were installed that required new cable connections. So the weather station was moved 190 feet, presumably closer to a building.

After the 1982 relocation, the temperature at Chula Vista rose by 6°F, in contrast to a drop in temperatures at the other 2 stations, so Chula Vista’s move very likely caused its temperature to rise artificially. An interpretation of artificial warming is also consistent with the relocation to a warmer urban setting.

There were no verifiable relocations or changes of instrumentation at the other 2 stations. However, 7 months of Brawley’s 1992 temperature data were reported via a hygrothermograph (HTG), but that would not affect the 1982 comparisons. (A scripted version of this station comparison appears below.)
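The neighbor comparison described above can also be scripted. Here is a minimal sketch that quantifies the 1982 step by differencing the moved station against a station that did not move; it assumes per-station anomaly series like those computed earlier, and all file and column names are hypothetical.

```python
# Estimate a relocation step change by differencing station anomaly series.
# Assumes per-station anomaly CSVs; names are hypothetical.
import pandas as pd

chula = pd.read_csv("chula_vista_anom.csv", index_col="year")["anomaly"]
cuyamaca = pd.read_csv("cuyamaca_anom.csv", index_col="year")["anomaly"]

diff = chula - cuyamaca  # station-minus-neighbor difference series

# Mean difference in the decade after the documented 1982 move minus the
# mean difference in the decade before it.
step = diff.loc[1983:1992].mean() - diff.loc[1972:1981].mean()
print(f"Estimated 1982 step at Chula Vista: {step:.1f} deg F")
```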

Teaching Objective III: Homogenizing Data to Create Meaningful Regional Climate Trends

Teacher input: Good scientists do not blindly accept raw data. Data must undergo quality-control analyses that adjust for documented events known to produce changes unrelated to climate change. After accounting for artificial inhomogeneities, scientists adjust the data to create what they believe is a more realistic regional trend.

Based on what they have learned so far, ask the students to create a graph that best exemplifies southern California’s regional climate change. Simplify their task by using the graph (below) for just Chula Vista and Cuyamaca (which are only 15 miles apart). Students are free to adjust the data in whatever manner they feel best represents real climate change and corrects for artificial inhomogeneities.

[Figure: Raw maximum temperature data for Chula Vista and Cuyamaca]
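One defensible student adjustment is simply to remove the documented 1982 relocation step from Chula Vista. Here is a minimal sketch of that correction, using the roughly 6°F step described earlier; the file and column names are again hypothetical.

```python
# Remove a documented step change so the record is comparable across a move.
import pandas as pd
import matplotlib.pyplot as plt

chula = pd.read_csv("chula_vista_anom.csv", index_col="year")["anomaly"]
step = 6.0  # deg F, the approximate 1982 relocation step discussed above

adjusted = chula.copy()
adjusted.loc[1983:] -= step  # shift post-move years back down

plt.plot(chula.index, chula, label="Chula Vista raw")
plt.plot(adjusted.index, adjusted, label="adjusted for 1982 move")
plt.ylabel("Tmax anomaly (deg F)")
plt.legend()
plt.show()
```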

Teacher input: After students have graphed their own temperature trends, have them compare their results with the graph below illustrating how USHCN climate experts actually homogenize the data. (The comparison should promote lively discussions as most students will create trends for both stations that resemble Cuyamaca.)

[Figure: USHCN homogenized temperature trends for Chula Vista and Cuyamaca]

Think, Pair, Share: Discuss why climate experts created such different trends. Why did scientists lower the high temperatures at Cuyamaca during the 1930s to 1950s by 3 to 5°F? What other concerns may affect scientists’ expectations about how best to homogenize data?

Teacher input: Clearly the data were adjusted for reasons other than Chula Vista’s relocation. Adjusting data for unknown reasons is different from quality-control adjustments and is called homogenization. The use of homogenization is contentious because a change in a station’s trend is often assumed to be caused by unknown “artificial” causes. However, the natural climate is always changing due to cycles of the sun, ocean oscillations like El Niño and the Atlantic Multidecadal Oscillation that alter the direction and strength of the winds, or natural landscape succession. So how can scientists reliably separate natural climate changes from undocumented “artificial” changes?

One method compares data from the more reliable weather stations, those that have undergone the fewest known artificial changes, to determine a “regional expectation.” That regional expectation can then serve as a guide when adjusting trends at other, less reliable stations. However, as we have seen, the most reliable stations can undergo the greatest adjustments. So what other factors are in play?
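A “regional expectation” of this kind is straightforward to sketch: average the anomalies of the most trusted neighbors and examine how far a target station departs from that average. The following is illustrative only, with hypothetical file and column names.

```python
# Build a crude "regional expectation" from trusted neighbor stations and
# flag a target station's departures from it. Names are hypothetical.
import pandas as pd

neighbors = ["brawley_anom.csv", "cuyamaca_anom.csv"]
series = [pd.read_csv(f, index_col="year")["anomaly"] for f in neighbors]
regional = pd.concat(series, axis=1).mean(axis=1)  # regional expectation

target = pd.read_csv("chula_vista_anom.csv", index_col="year")["anomaly"]
departure = target - regional  # persistent departures suggest inhomogeneity
print(departure.loc[1975:1990])
```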


Many scientists working for NOAA and NASA believe that rising CO2 explains recent temperature trends. However, many other scientists suggest that the proximate cause of regional climate change is driven more by natural changes in ocean circulation. In 2014 climate scientists published a peer-reviewed paper (Johnstone 2014) suggesting that climate change along the coast of North America is best explained by natural cycles of the Pacific Decadal Oscillation (PDO), due to its effects on sea surface temperatures in the eastern Pacific. Give the students the graph below from Johnstone 2014 and ask them to compare changes in sea surface temperatures (SST, in red) with the raw and recently homogenized temperature data from southern California.

[Figure: From Johnstone 2014 — changes in sea surface temperatures (SST, in red)]


Think, Pair, Share: Which data set (the raw or the homogenized trends) best agrees with the hypothesis that ocean temperatures drive regional warming trends? Which best agrees with the hypothesis that rising CO2 drives regional warming trends? Could a belief in different hypotheses affect how temperature trends are homogenized?


Teacher input: Have students compare the temperature trends for the northern hemisphere (below; created by the Japanese Meteorological Society and published by the National Academy of Sciences in 1977) with the new global trends presented by NASA’s Gavin Schmidt, who argues 2014 was the warmest year on record. Point out that the earlier scientific record suggested temperatures dropped by 0.6°C (1.1°F) between 1940 and 1980, with 1980 temperatures similar to 1910. Compare those temperatures with Schmidt’s 2014 graph, which suggests 1980s temperature anomalies were 0.5°C higher than the 1910s.

[Figure: Northern hemisphere temperature trend created by the Japanese Meteorological Society, published by the National Academy of Sciences in 1977]

[Figure: NASA global temperature trend presented by Gavin Schmidt, showing 2014 as the warmest year on record]


Think, Pair, Share: Why does the period between 1940 and 1980 in the 2 graphs disagree so dramatically? Does the new graph from NASA’s Gavin Schmidt represent real climate change or an artifact of homogenization? If the difference was due to homogenization, is that a valid reason to alter older trends? If the starting point for Gavin Schmidt’s 1980-2014 temperature data were lowered, so that 1980 temperatures were still similar to 1910 as suggested by the earlier research, how much higher than the 1940s would the 2014 global temperature be?

Teaching Objective IV: In-filling

Teacher input: As seen for the USHCN weather stations, raw data is often missing. Furthermore, extensive regions around the world lack any weather stations at all. To create a global temperature, climate scientists must engage in the art of in-filling. A recent scientific paper (Cowtan and Way 2014) used in-filling to contradict other peer-reviewed research that had determined a pause in global warming for the past 15 years or more. By in-filling, these scientists argued that there was no hiatus and the warming trend continued.

Give the students a map of the world and an amber colored pencil. Instruct them to lightly shade all the continents to show they have all warmed. Now provide the map (below) of Global Historical Climate Network stations showing the station locations. Instruct students to simulate in-filling by darkening the regions on all the continents wherever weather stations are sparse (the whitest areas). Then give them NOAA’s 1950-2014 map modeling the continents’ warmest regions.

[Figure: Global Historical Climate Network station locations]

[Figure: NOAA map of modeled temperature trends, 1950-2014]
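For older students, the in-filling idea can also be demonstrated numerically. Below is a minimal sketch of one common approach, an inverse-distance-weighted average of nearby stations. It treats degrees of latitude and longitude as planar distances for simplicity, and all coordinates and values are invented for illustration.

```python
# Inverse-distance-weighted in-filling of an unobserved grid point.
# Coordinates and anomalies below are illustrative, not real data.
import numpy as np

# (lat, lon, anomaly) for a few hypothetical stations
stations = np.array([
    [32.7, -117.0, 0.4],
    [33.0, -115.5, 1.1],
    [33.0, -116.6, 0.2],
])

def infill(lat, lon, stations, power=2):
    """Estimate an anomaly at (lat, lon) from surrounding stations."""
    d = np.hypot(stations[:, 0] - lat, stations[:, 1] - lon)  # planar approx.
    w = 1.0 / d**power
    return np.sum(w * stations[:, 2]) / np.sum(w)

print(infill(34.0, -116.0, stations))  # estimate for a station-free cell
```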


Think, Pair, Share: Can in-filling reliably represent local temperature trends? The warmest regions appear to be related to in-filling. What other reasons could cause greater warming in the in-filled regions? Should regions without a sufficient density of weather stations be included in a global average temperature?


Extended Lesson Activities:


Have students pick a group of stations within a 500-mile area in similar ecosystems (e.g., the Great Plains, Southeastern forests, or New England forests) and examine the differences between raw and homogenized temperature data for all those stations. (See an example of such a comparison for Massachusetts in an analysis of moose migrating southwards.) Use the metadata for those stations to identify any documented changes.

I suggest looking at just the maximum temperatures in this extension activity because minimum temperatures are much more sensitive to landscape changes and other microclimate changes. The next activity will examine differences between maximum and minimum temperatures and the effects of the landscape on temperature trends.

Related Teaching moment: Tolerance and Respectful Debate

Have students write an essay about how our nation is fostering greater tolerance and compassion toward different ethnicities, races, religions and people with different sexual preferences. Then contrast that tolerance with the condoned hatred that’s been hurled at people for having different climate beliefs. Often those very different beliefs are simply a result of trusting different virtual realities created by statistical temperature trends. Would more respectful debate, between scientists who trust homogenized trends and those who don’t, help the public better understand climate change?


175 Comments
Sweet Old Bob
January 28, 2015 6:25 pm

Great material. Hope “Party Line” instructors are brave enough to use this.
Not holding my breath….(8>()

Lance Wallace
January 28, 2015 6:29 pm

Very thoughtful. I suggest adding figure numbers to make it easier to discuss. Under your Objective 3, the figure might benefit from an arrow showing the move of Chula Vista station in 1982.

Reply to  Lance Wallace
January 28, 2015 9:02 pm

Agreed

u.k.(us)
January 28, 2015 6:51 pm

Wait….what ?
Your final paragraph, reads like a call to arms.

clipe
January 28, 2015 6:56 pm

Related Teaching moment: Tolerance and Respectful Debate

Not likely
http://bishophill.squarespace.com/blog/2015/1/28/tol-on-radical-greens.html

Reply to  clipe
January 28, 2015 9:05 pm

clipe, I understand your pessimism, but I can only hope that as students witness the tremendous complexities and uncertainties of climate change they will be more sympathetic toward diverging viewpoints, and help stave off the growing attempts to impose intellectual tyranny.

January 28, 2015 6:58 pm

2 + 2 = global warming

Reply to  Max Photon
January 28, 2015 7:36 pm

no. In the Church of CAGW, 2+2 = 5, just as Orwell predicted.

garymount
January 28, 2015 7:20 pm

Typo : XCEL should be Excel.

Global cooling
January 28, 2015 7:27 pm

Well, it is always a good idea in science to go to the original raw data and research. In this case, students should think about whether the given samples (of weather stations) represent the whole population.
I have been wondering if we really need the global average temperature if we are only interested in trends. Is one tree in Yamal enough, or should we infill missing data in paleoclimatology as well?
Confirmation bias is very strong. IPCC needed research to support its political agenda and found Mann. Researchers noticed the demand of results showing warming and created them. Selection bias in funding and publicity makes the rest. No need for conspiracy.

True Conservative
January 28, 2015 7:34 pm

Brilliantly done, Mr. Steele, brilliant!

John Pickens
January 28, 2015 7:34 pm

In addition to the correct spelling of Excel, you could suggest a free spreadsheet program like Open Office. Something like: “import the data to a spreadsheet program like Excel or Open Office”. Encouraging schools to use open source operating systems and software could save thousands of dollars per student over their 13 year public school cycle.

Reply to  John Pickens
January 28, 2015 7:56 pm

That’s a good idea John. I would need to try out the open source programs first, before I recommend them. Is Open Office the best open source?

garymount
Reply to  jim Steele
January 28, 2015 8:26 pm

This is what William Briggs writes in his book “Breaking the Law of Averages” :
If you want to create your own data set, open a spread sheet (like OpenOffice.org) …
Please don’t mention open source operating systems.
Microsoft will be offering free Office software to buyers of inexpensive computing devices in the near future.

John Andrews
Reply to  jim Steele
January 28, 2015 8:29 pm

Open Office and Libre Office are essentially the same set of programs. Libre Office is a fork of Open Office and is frequently used in Linux distributions. Both and others are available at no or low cost. I would select Libre Office first because many of the contributors to that program set have left Open Office to escape the control the commercial supporter of Open Office exerts over how and why they do things. Software politics going on.

Kit
Reply to  jim Steele
January 29, 2015 3:16 am

libreoffice is better than open office, altho either will work for this purpose

Paul Jackson
Reply to  jim Steele
January 29, 2015 9:18 am

OpenOffice and its fork LibreOffice are both seemingly about the same quality-wise and are generally considered better than other open source office suites. OpenOffice and LibreOffice definitely open Excel documents better than Excel does with Open Document Format files; frequently OpenOffice and LibreOffice will handle older versions of Excel documents better than Excel will. Puritanical open source advocates will recommend LibreOffice over OpenOffice mostly for political reasons.
I use it at work as much as I can. Excel has some annoying bugs, where you can’t display blanks in cells with zero or not-a-number values as easily as you should be able to.

John Pickens
Reply to  jim Steele
January 29, 2015 1:14 pm

I use Libre Office, which is the version of Open Office used in the Ubuntu Linux operating system I use. Libre Office and Apache Open Office both share much of the same code. Libre Office seems to be updated by its user base more often. They are both free and open source.

January 28, 2015 7:35 pm

“Why does the period between 1940 and 1980 in the 2 graphs disagree so dramatically? Does the new graph by NASA’s Gavin Schmidt’s represent real climate change or an artifact of homogenization?”
A very reasonable answer would be that one is Northern Hemisphere, the other is global. If, that is, the students are ever told that.

Michael 2
January 28, 2015 7:55 pm

Really excellent. I can appreciate adjusting data when you have a good reason to do so and a reasonable baseline; in other words, an obvious step change in the trend.
It would be better not to adjust data but one then has to discard much of the record entirely. So, “adjusted data” can be used with caution.
“Infilling” is in my opinion improper science. If you don’t have data, don’t invent it.

jimmyjoe
January 28, 2015 7:56 pm

You probably should include an explanation of what an ‘anomaly’ is and how to calculate it for the high schoolers.

Reply to  jimmyjoe
January 28, 2015 9:07 pm

Agreed a definition of an anomaly would improve things.

January 28, 2015 8:00 pm

Nick Stokes says “A very reasonable answer would be that one is Northern Hemisphere, the other is global. If, that is, the students are ever told that.”
That seems a tad snarky Nick in light of the fact that I said, “Have students compare the temperature trends for the northern hemisphere (below; created by the Japanese Meteorological Society and published by the National Academy of Sciences in 1977) with the new global trends presented by NASA’s Gavin Schmidt”
Also consider that all climate scientists acknowledge that the most reliable data exists only for the northern hemisphere.
2 demerits for you Nick.

Reply to  jim Steele
January 28, 2015 8:09 pm

It’s still a very reasonable answer. Especially as one is NH land only, the other global land/ocean. It’s a very obvious difference. Why would it be attributed to homogenisation?
I and others have made indices of non-homogenised data. Zeke Hausfather surveys that here. Homogenisation makes very little difference. Students might want to get quantitative instead of hand-waving.

Mike the Morlock
Reply to  Nick Stokes
January 28, 2015 8:17 pm

The purpose of the exercise is to get students to think for themselves just as you do. By the way Nick Congratulations you are becoming a skeptic. You that came here drank the water breathed the air …the change is inevitable it is just a matter of time…
michael

Reply to  Nick Stokes
January 28, 2015 8:39 pm

Nick, My lesson emphasizes scrutinizing the data, yet you label it hand waving?!? The activity simply asks them to explore and discuss the data. You get 2 more demerits for false accusations.
Nick, it was not clear to me that the National Academies of Science graph of 1977 was for land only. Where did you find that information? I agree it is an important distinction. However, many climate scientists will suggest that changes in land temperatures should follow changes in ocean surface temperatures. I remember a video of Phil Jones justifying altering data to make land and ocean temperatures more homogeneous. So based on your criticism I will ask students to discuss correlations between ocean and land temperatures, and add it to the discussion of the Johnstone paper.
Of additional interest might be that the National Academies of Science discussed Budyko’s climate model in some detail, writing, “Budyko assumed that the release of waste heat increased by 4 percent each year so that after 200 years waste heat, rather than solar energy, would be the controlling factor in climate.” Such a historic scientific discussion of waste heat is a valuable addition to the CO2 warming debate.
The National Academies of Science’s discussion of waste heat preceded arguments on WUWT by 20 years. Curiously, like CO2 advocates, they also argued that waste heat could melt Arctic ice, writing, “If we follow through Budyko’s arguments, we would expect the polar ice covers to retreat and eventually disappear.”

Reply to  Nick Stokes
January 28, 2015 9:14 pm

Nick, how much infilling was required to generate ocean surface temperatures before the advent of Argo? And after Argo? I would expect greater interannual variations due to El Niño/La Niña. I went to your link and didn’t see any error bars on your graphs. Would your lack of error bars qualify as “hand-waving”? And lastly, what is the method of quantifying error bars for infilled data that may easily differ 100% from reality?

Reply to  Nick Stokes
January 28, 2015 9:26 pm

Jim,
There were no suitable collections of SST data available to Budyko in 1967. There was precious little land data. Very little had been digitised. I think he mentioned 260 stations. Even in the late 1980’s, GIStemp was using a Met Stations only index.

Reply to  Nick Stokes
January 28, 2015 9:55 pm

Jim,
“Would your lack of error bars qualify as “hand-waving”? “
There are no error bars on any of your graphs here either.
In fact that exercise was about methods. Different methods of calculating an index. We showed that alternative methods to GISS etc gave the same result. With or without homogenisation.

Reply to  Nick Stokes
January 28, 2015 10:08 pm

Nick, forgive my skepticism regards your sincere efforts. However, based on the gross dissimilarities between raw data trends and homogenized trends, especially the gross dissimilarities between raw and homogenized data for stations that have not relocated or changed instrumentation, plus the gross dissimilarities between homogenized data in 2011 vs 2015, I honestly don’t understand how you can truly claim, “We showed that alternative methods to GISS etc gave the same result. With or without homogenisation.”

Reply to  Nick Stokes
January 28, 2015 10:22 pm

Nick says “There were no suitable collections of SST data available to Budyko in 1967.”
I totally understand. But the lack of data for the pre-1970s could not improve by much without a time machine. How much did digitization add? Whatever the case may be, the absolute lack of data suggests any homogenization of trends before the 1970s should be viewed with suspicion. On the other hand, Budyko’s graph, and NAS’ updated graph, are supported by many proxy studies showing that temperatures have not exceeded the 1930s to 50s, depending on location. To some, a homogenization process that lowers the peak warming suggested by so many sources is just another example of “hiding the decline.”

Reply to  Nick Stokes
January 28, 2015 10:38 pm

The differences seem to only occur for land temperatures in the NH after 2000. The problem from 1910 to 1980 is still there.
http://www.woodfortrees.org/graph/crutem3vnh/plot/hadsst3nh/plot/hadcrut4gl

Reply to  Nick Stokes
January 28, 2015 10:51 pm

Jim,
“How much did digitization add?”
It’s the whole thing. Before digitisation, the information sat on handwritten log books and typed forms. Current indices like CRUTEM use of order a million daily data a year. Easy if you can get it into a computer, even in the 1970’s. But a lot of typing, even if you can get the docs together. It was Jones and GISS in the ’80s that made land data usable, although the Met Offices did the brunt of the typing. And more slowly, Folland etc for SST. The GHCN project of the early ’90s capped it off and put the land data on CD.
‘I honestly don’t understand how you can truly claim, “We showed that alternative methods to GISS etc gave the same result. With or without homogenisation.”’
Well, we did it. Those are the results.
The point is that again, the global average is made up of thousands of stations. Some are adjusted up, some down. It evens out. In fact, more mathematically, much homogenisation involves partly replacing station data with a mean of neighbors. In the average, that is just a minor reweighting of the same data. That is what would have happened to your Cuyamaca data. It’s a mess, and the adjustment would have introduced a lot of other station data. That makes a big difference to the Cuy plot, but not to the average.
The second point, of course, is that the land/ocean data is dominated by ocean.

Reply to  Nick Stokes
January 29, 2015 1:28 am

“Homogenisation makes very little difference.”
Then stop doing it!

Solomon Green
Reply to  Nick Stokes
January 30, 2015 6:25 am

I do not see anything in the site to which we are referred by Nick Stokes that indicates Zeke was using raw data.

Mike the Morlock
Reply to  jim Steele
January 28, 2015 8:21 pm

oops cut the “that” after “You” I really must proof read. Tsk

Reply to  Mike the Morlock
January 28, 2015 8:40 pm

Mike we share the same affliction.

mothcatcher
Reply to  jim Steele
January 29, 2015 1:23 am

Jim – it would do no harm to add a pointer to (or lead the students to find) the cautionary logic that Nick seems to be pointing out (especially about the SH uncertainties). Would add a bit of balance in my view, and wouldn’t detract at all from your excellent piece. In fact, I think it would add something to the whole. Sure, Nick is primarily picking holes, but that’s what he’s there for!

rooter
Reply to  jim Steele
January 29, 2015 4:13 am

This shows that this teacher does not know much about the subject at hand. He does not know that there is a difference between global land-ocean indexes and land-only. He also just seems to ignore the fact that the number of stations is much lower in the first graph than in the second. And there is a different geographical distribution of those stations. How could the students know this?
And considering the teacher’s stressing of the sparsity of stations later, this becomes even more puzzling. The teacher could better ask why the indexes are not more different than they are.

Reply to  rooter
January 29, 2015 10:31 am

Rooter, I am not sure why your emphasis is on denigrating “the teacher.” When you say the teacher does not know the difference between land and ocean-plus-land indexes, that is your own fabrication. You must hail from the Slandering Sou school of sniping, as she likes to create such straw dogs and false attributions. People erroneously validate their anger by hurling different graphs, for different time periods and different conditions, at each other.
So I want to focus your attention on the last extension activity regards tolerance and respectful debate, “contrast that tolerance with the condoned hatred that’s been hurled at people for having different climate beliefs. Often those very different beliefs are simply a result of trusting different virtual realities created by statistical temperature trends. Would more respectful debate, between scientists who trust homogenized trends and those who don’t, help the public better understand climate change?”
Perhaps writing down your thoughts will help you be sincerely introspective and help you understand that snarky words of anger will not improve the lesson.

rooter
Reply to  rooter
January 29, 2015 11:59 pm

Jim Steele wants to focus on tolerance and respectful debate. By writings like this:
” You must hail from the Slandering Sou school of sniping as she likes to create such straw dogs and false attributions.”
Thanks to Steele for the lecture.
It is a fact that the teacher in question did not know this difference between data for land only and data for land-ocean.
Or perhaps the teacher wanted to draw attention to the fact that even with these differences in data type and difference between amount of data the difference was actually small?

Reply to  jim Steele
January 29, 2015 7:29 am

Jim,
What Nick (and I) did was simply download the raw GHCN-M data and use it to calculate global land temperatures. We got results similar (though not identical) to NASA/NOAA/Hadley results. So did skeptics Jeff Id and Roman M. You can see a discussion of all the land temp reconstructions that were being done back in 2010 here: http://rankexploits.com/musings/2010/another-land-temp-reconstruction-joins-the-fray/
We’ve improved methods a bit since then, but the general conclusion remains. While the impact of homogenization is particularly strong in the U.S, its impacts globally are much smaller, as shown in the first figure here: http://judithcurry.com/2014/07/07/understanding-adjustments-to-temperature-data/
Finally, I’m not entirely sure what you are trying to imply with your discussion of infilling. Land temperature stations are not evenly distributed (in fact, about 2/3rds are in the U.S. or Western Europe). Simply averaging all the anomalies would give you a representation of a world where 2/3rds of the global land area is in the U.S. or Western Europe, something I hope you’d agree is rather incorrect. That’s why groups like NCDC or Hadley use grid cells (usually 5×5 lat/lon) to ensure that each station’s temperatures are weighted proportionate to the land area they represent in global reconstructions. Groups like GISS, Cowtan and Way, and Berkeley Earth do more complex interpolation and tend to infill areas further from stations, though this mainly matters in the Arctic where station coverage is sparse.
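A minimal sketch of the grid-cell weighting described here (an editorial illustration, not Zeke’s actual code): bin station anomalies into 5×5 degree cells, average within each cell, then weight each cell by the cosine of its latitude. The station values below are invented.

```python
# Grid-cell averaging: bin stations into 5x5 degree cells, average within
# cells, then area-weight by cos(latitude). Values are illustrative.
import numpy as np

# (lat, lon, anomaly) for a few hypothetical stations
obs = np.array([[40.1, -105.3, 0.8], [41.2, -104.9, 0.6], [12.5, 30.2, 0.3]])

cells = {}
for lat, lon, anom in obs:
    key = (int(np.floor(lat / 5)), int(np.floor(lon / 5)))  # 5x5 cell index
    cells.setdefault(key, []).append(anom)

num = den = 0.0
for (ilat, _), vals in cells.items():
    w = np.cos(np.radians(ilat * 5 + 2.5))  # weight by approximate cell area
    num += w * np.mean(vals)
    den += w

print("area-weighted mean anomaly:", num / den)
```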

Reply to  Zeke Hausfather
January 29, 2015 8:39 am

Zeke says “While the impact of homogenization is particularly strong in the U.S, its impacts globally are much smaller”.
Another way of conceptualizing the problem is that in places with a greater concentration of stations, as well as more stations that extend over longer time periods, like the USA and western Europe, those are the stations most distorted by homogenization. That suggests homogenization is distorting the most reliable data. It is a red herring that anyone is suggesting “averaging all the anomalies” so “2/3rds of the global land area is in the U.S. or Western Europe”.
It also suggests the global average is now dominated by stations that have shorter life spans of 60 years or less, and averaging those stations will indeed give a nice warming trend. But those stations cannot provide information about the previous warming trends in the 20s through 50s, so they do not suffer from homogenization like Cuyamaca, where earlier warming peaks are shaved off more and more with each round of homogenization. The short-lived stations cannot provide insight about the cyclical nature of climate change that one would expect from dominating influences like the Pacific and Atlantic Oscillations.
It seems obvious when we examine the California raw data that there has been a definitive impact from the PDO, but the homogenization process obliterates it. Short-term stations are tainted by the population boom. Tucson comes to mind, and other more urban Arizona stations. The heat waves of the 30s fostered a boom in air conditioning, which in turn opened much of Arizona to rapidly expanding retirement communities beginning in the 60s, and right about that time we see rapid warming of minimum temperatures in those regions.
To repeat, the homogenization process obscures local dynamics and creates the greatest errors in the most reliable stations. Assuming the errors will simply cancel out is folly, obscures the real underlying issues, and leads to incorrect diagnoses. It reminds me of the joke about the man with his head in the oven and feet in the freezer. The doctor attending to his pain suggested his problem was psychological because on average his temperature was just right. That’s what canceling out gets you.

Reply to  Zeke Hausfather
January 29, 2015 10:11 am

Jim,
Most of the net effect of homogenization in the U.S. is to deal with two things: CRS to MMTS transitions, and TOBs changes. Both of these introduce strong (and measurable) cooling biases. Other more minor factors that tend to be picked up during homogenization are station moves and urbanization biases. This is why the residual impact of homogenization on U.S. temperature trends is small when you exclude TOBs-related adjustments and the post-1980s MMTS transition, as shown in the figures here: http://judithcurry.com/2014/07/07/understanding-adjustments-to-temperature-data/
Homogenization doesn’t have nearly as large an effect in Europe as in the U.S. The major reason is that temperature measurement systems in Europe were traditionally state-run rather than volunteer-run, and didn’t have large-scale TOBs changes or other system-wide changes that introduce systemic biases like in many U.S. stations. Generally speaking, homogenization has the largest impact where individual stations tend to do things radically different than their near neighbors. This is unrelated to station density, apart from the fact that denser networks allow for better pair-wise comparisons. This is why the Berkeley record (with 42,000 stations and 16,000 in the U.S. alone) ends up with a nearly identical homogenized series as the NCDC record (~7,000 stations and 1,218 in the U.S.).

Reply to  Zeke Hausfather
January 29, 2015 10:12 am

Jim,
If you object to what you term “infilling”, how exactly do you propose to calculate global mean land temperatures from a discrete number of stations that are not evenly geographically dispersed? It seems like you would have to use some form or another of either gridding or more explicit spatial interpolation.

RACookPE1978
Editor
Reply to  Zeke Hausfather
January 29, 2015 10:23 am

Zeke Hausfather (replying to Jim)

If you object to what you term “infilling”, how exactly do you propose to calculate global mean land temperatures from a discrete number of stations that are not evenly geographically dispersed?

Why use in-filling at all? UNLESS it is your specific political goal to “spread out” as many – and as few! – hot (red) areas across as many parts of an exaggerated Northern-hemisphere Cartesian-coordinate map as possible to present the most effective propaganda possible?
The methods chosen by Hansen and Mann and Nye and the others need to show “red areas” across as much of a map as possible to create their hysteria and thus their power. A Cartesian coordinate map is the worst possible way to show scientific data or impacts or relationships, and it skews all interpretation of all secondary data and all data trends over time.
For example, what are the actual “grids” that GISS uses as they skew data and smear it across the tundra and oceans? Where are their data “points” and in what areas are there no data points? We don’t know, and they use this deliberately.

Reply to  Zeke Hausfather
January 29, 2015 10:57 am

Zeke asks, “If you object to what you term “infilling”, how exactly do you propose to calculate global mean land temperatures from a discrete number of stations ”
As an ecologist, there is no denying that all organisms respond locally. The importance of understanding climate change is at the local and regional level. That’s where changing weather exerts its impact. Teach people about the more concrete examples of regional climate change. If you have concerns about heat stress and heat accumulation, then talk about the maximum temperatures, instead of averaging in a minimum temperature that is tainted by landscape changes and amplifies the average without accumulating heat. So I propose looking at climate change from a local and regional perspective based on actual data.
If you want to create a global average for the purpose of illustrating the effect of CO2, then make it absolutely clear that much of the data is hypothetical. Subtract the warming from ventilating Arctic heat, instead of using it to amplify and adjust for theoretical missing heat.
Indeed, regions with greater station density will yield greater inhomogeneities. But that proves my point that more stations with smaller life spans create short-term trends that obscure the dynamics of the past 100 years. Furthermore, studies have shown homogenization has skewed European data similarly, removing peak warming in the 40s and creating a similar warming trend, which suggests a systematic bias in the way homogenization is applied.
http://landscapesandcycles.net/image/88852915.png
Chylek 2006 used the data from Greenland’s only two long-term stations, so no homogenization was possible, and he illustrated the cyclical nature of Greenland’s climate and a faster rate of warming in the 30s.
http://landscapesandcycles.net/image/78283826.jpg

Reply to  Zeke Hausfather
January 29, 2015 11:18 am

Zeke says, “Most of the net effect of homogenization in the U.S. is to deal with two things: CRS to MMTS transitions, and TOBs changes. Both of these introduce strong (and measurable) cooling biases.”
First, your claim does not explain why the homogenized trend in 2015 is much steeper than the homogenized trend in 2011, long after the MMTS moves in the 1980s. It does not explain the change in trends at stations with no MMTS moves.
Second, I do not see your claim manifest itself in all weather stations, so homogenization needs to be done station by station. Perhaps it is a sampling bias, but many stations show a warming after MMTS deployment. The Doesken Ft. Collins study illustrated a warming bias for minimum temperatures and a cooling bias for maximums, but the degree of that bias varied with each month and over the years.

DHF
Reply to  Zeke Hausfather
January 29, 2015 1:36 pm

Zeke,
The intentions of most actors may have been the best, but anyhow, now there is near perfect linear correlation between adjustments from raw to final temperatures for USHCN and the rise of the CO2 content in the atmosphere.
Ref: NCDC Breaks Their Own Record For Data Tampering In 2014
https://stevengoddard.wordpress.com/2015/01/05/ncdc-breaks-their-own-record-for-data-tampering-in-2014/
In my opinion data trumps good explanations. I think that scientific minds should feel inclined to stop telling how fabulous the adjustments are, and start asking: How can it be?
Do you have an understanding of why there can be so good correlation between the increase in adjustments and the CO2 increase?

Reply to  Zeke Hausfather
January 29, 2015 7:35 pm

“If you object to what you term “infilling”, how exactly do you propose to calculate global mean land temperatures from a discrete number of stations that are not evenly geographically dispersed?”
Or one might ask people like Zeke why they take temperature readings of a tiny point in space and then smear that temperature reading over huge distances. If the temperature reading is in a valley or on a hill, or near the sea, or inland, or in an urban area, or in the country, you’ve completely destroyed any meaning that temperature reading might have had. Stop infilling. It’s STUPID.

rooter
Reply to  Zeke Hausfather
January 30, 2015 12:17 am

Jim Steele is still wrong:
“It seems obvious when we examine the California raw data that there has been a definitive impact from the PDO, but the homogenization process obliterates it.”
The only land temperature data he has shown is homogenized data. It is the homogenized data that shows the close match to sea surface temperatures.
Why does Steele continue in repeating the same error?

rooter
Reply to  Zeke Hausfather
January 30, 2015 12:25 am

Steele is trying this one:
“Indeed regions with greater station density will yield greater inhomogeneities. ”
He lost the explanation from Zeke. In the US the stations were run by volunteers and there were changes in time of day for reading the measurements that created the largest inhomogeneities. In Europe with great station density that is not so and the inhomogeneities in those data are smaller. Steele’s claim is just not true.

rooter
Reply to  Zeke Hausfather
January 30, 2015 12:32 am

Steele writes:
“Chylek 2006 used the data from Greenland’s only two long term stations, so no homogenization possible and he illustrated the cyclical nature of Greenland’s climate and a faster rate of warming in the 30s.”
What is Steele trying to say by this? Two stations is enough for Greenland?
Might be. But whatever happened to the problem with small density of stations and infilling? Not a problem in Greenland?
(.. and we do not want to use updated data, do we?)

Reply to  Zeke Hausfather
January 30, 2015 11:43 am

Rooter bombs away: “The only land temperature data he has shown is homogenized data. It is the homogenized data that shows the close match to sea surface temperatures.
Why does Steele continue in repeating the same error?”
Rooter, you are truly a Slandering Sou reincarnation. Why do you keep ranting about the wrong point? Over the past year the homogenization has increasingly amplified the warming trend. Instead of critically examining the sequential changes in the trend due to homogenization, you blather about nebulous homogenization as if you are proving a point. Your false accusations and choice of words will not change the documented amplification of warming trends that after serial homogenization obliterated the cyclical nature of climate change. Somehow you think if you keep bombing away that someone will believe that 2015 homogenization looks like the PDO and NE Pacific SST? You’re biting off your nose trying to save your homogenized face.

Reply to  Zeke Hausfather
January 30, 2015 2:41 pm

Zeke Hausfather
January 29, 2015 at 7:29 am

Finally, I’m not entirely sure what you are trying to imply with your discussion of infilling. Land temperature stations are not evenly distributed (in fact, about 2/3rds are in the U.S. or Western Europe). Simply averaging all the anomalies would give you a representation of a world where 2/3rds of the global land area is in the U.S. or Western Europe, something I hope you’d agree is rather incorrect. That’s why groups like NCDC or Hadley use grid cells (usually 5×5 lat/lon) to ensure that each station’s temperatures are weighted proportionate to the land area they represent in global reconstructions.

Well, I think it’s closer to what the measurements say, and I think you can make more use of the measurements than what’s done.
For a baseline I use the same station’s prior day’s reading. So I calculate a station’s day-to-day min temp change and day-to-day max temp change. Now with the NCDC GSOD data set I have ~120 million samples of how much the temp changed day over day. I average this anomaly by the day for the stations in an area, from 1 x 1 degree cells all the way up to the globe. I also take a year’s average for the same areas.
With the daily data for the 2 hemispheres you can see the average daily rate of temperature change as the length of day changes, and with a larger area you can average out a lot of the effect of weather. Then you can do things like look at the rate of change in an area as the amount of Sun per day changes, for each year, for both warming and cooling.
which would look like this
http://www.science20.com/sites/all/modules/author_gallery/uploads/543663916-global.png
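A minimal sketch of the day-over-day averaging described in this comment (an editorial illustration, not the commenter’s actual code); the file layout and column names are hypothetical.

```python
# Day-over-day anomalies: today's reading minus yesterday's, per station,
# then averaged by day across a region. Names are hypothetical.
import pandas as pd

gsod = pd.read_csv("gsod_daily.csv", parse_dates=["date"])
# expected columns: station_id, date, tmax

gsod = gsod.sort_values(["station_id", "date"])
gsod["d2d"] = gsod.groupby("station_id")["tmax"].diff()  # day-over-day change

# Average the day-to-day change across all stations for each calendar day.
regional_daily = gsod.groupby("date")["d2d"].mean()
print(regional_daily.head())
```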

rooter
Reply to  Zeke Hausfather
January 31, 2015 4:45 am

Jim Steele asks:
“Why do you keep ranting about the wrong point?”
Answer: Because you keep on repeating your error after you should know it is wrong. Johnstone compares SST (of course adjusted and homogenized), SAT (land surface, homogenized) and SLP (I believe those are actually reanalysis, i.e. modelled). And you keep on repeating like this:
“It seems obvious when we examine the California raw data that there has been a definitive impact from the PDO, but the homogenization process obliterates it.”
You should change that part of section 1:
“Give the students the graph below from Johnstone 2014 and ask them to compare changes in sea surface temperatures (SST, in red) with the raw and recently homogenized temperature data from southern California”
There is no raw unadjusted land data in that graph. No raw unadjusted SST or SLP data either.

rooter
Reply to  Zeke Hausfather
January 31, 2015 4:53 am

Jim Steele writes:
“Your false accusations and choice of words will not change the documented amplification of warming trends that after serial homogenization obliterated the cyclical nature of climate change. Somehow you think if you keep bombing away that someone will believe that 2015 homogenization looks like the PDO and NE Pacific SST? You’re biting off your nose trying to save your homogenized face.”
Jim Steele did not know that he was trying to show his point about the cyclical nature of climate change by using homogenized and model data. So how could homogenization (and modelling..) remove the cyclical nature of climate change when those were the types of data Steele used?

Geoff Shorten
January 28, 2015 8:16 pm

“Teaching Objective III: Homogenizing Data to Create Meaningful Regional Climate Trends
Teacher input: Good scientists do not blindly accept raw data. Data must undergo quality control analyses that adjust data for documented changes known to create changes unrelated to climate change. After accounting for artificial inhomogeneities, scientists adjust the data to create what they believe is a more realistic regional trend.”
Suggest replacing the first occurrence of ‘changes’ with e.g. ‘events’.

John Andrews
January 28, 2015 8:21 pm

A couple of thoughts after reading about Cuyamaca where I went fishing as a boy, and mostly skimming the rest of the article. First, Cuyamaca is in the mountains behind San Diego and the campgrounds there are at 4000 and 5000 feet elevation. Cuyamaca Lake is lower. I don’t know where the weather station is located, but elevation differences should be noted in the teaching materials. Chula Vista is at sea level along the coast right beside San Diego Bay.
Homogenization is a tricky subject and I think it is misused when it is used to change recorded data. Proper use might be to estimate what the temperature might be at an unmonitored location between similar monitored sites. This is a computed value and not raw data. Emphasis is needed here. The data is the raw measurements, and this must be preserved. Important point. Anomalies are not data because they are based on an average (a calculated value, not data) subtracted from each reading.
In-filling is useful but there are a number of ways to do it. The choice of method must be defended. Again, the in-filled results are not data, but are only estimates.
Very good start. I hope it is accepted widely.

Reply to  John Andrews
January 28, 2015 8:53 pm

John writes “but elevation differences should be noted in the teaching materials.”
Absolutely agree. I am in the midst of developing additional activities evaluating how small changes in topography, elevation, vegetation, etc. create different microclimates and affect maximum and minimum temperatures in different ways. The great complexities of local and regional climate change cannot be addressed all at once. Nonetheless, I can simply add that when students visit the USHCN website they note the elevation difference as an important variable. The next activity in this sequence will ask students to zoom in with the satellite view for each USHCN station to view the surroundings.

Mike the Morlock
Reply to  jim Steele
January 28, 2015 9:25 pm

John, Jim, the problem you are going to run into is class time. How many hours can a teacher devote to this? One, two, three periods? Just to teach the basics requires a full semester. Plus not all kids have computers. That means they would have to use school ones. And you know who blocks the sites at school….
For this to work skeptics must have a presence on school boards and in the local PTAs.
If teachers are going to use this they must first be safe and protected.

Bill_W
Reply to  jim Steele
January 29, 2015 3:02 am

I like the exercise but it seems as if it would only be suitable for an advanced high school class. It would be beyond the comprehension of most high school students and you will be lucky to find 1 in 5 teachers who could handle it either.

Dr. S. Jeevananda Reddy
January 28, 2015 8:34 pm

Excellent job — one of my friends from IITM/Pune did such a job with rainfall data. He published the data series [monthly, seasonal, annual] sub-division-wise for 1871 to 1994. He is now retired.
Dr. S. Jeevananda Reddy

mebbe
January 28, 2015 8:40 pm

“Teacher input: Good scientists do not blindly accept raw data.”
This enjoinder is too moralistic for my taste.
It’s not good science to accept unquestioningly raw data.
A scientist should not accept raw data without subjecting it to scrutiny.

Richard G
Reply to  mebbe
January 29, 2015 2:30 pm

Students should be asked to collect their own data. This should sensitize them to the problems involved in collecting good data, and how easily field work can yield bad data through bad technique.

Marty Cornell
January 28, 2015 8:47 pm

Brilliant sarcasm! I especially liked the infilling section. I’m still laughing at the absurdity of politically correct climate science that your piece so wonderfully exposes.

mebbe
January 28, 2015 8:55 pm

Is climate change affecting all stations equally?
The only thing climate change can change(affect) is the mental state of those that conceive it.
Climate is an abstraction, so a revised version of that abstraction is also an abstraction and cannot cause one molecule anywhere to zig where it would have zagged. (This is slightly overstated since brains are molecules, too).
Are the changes in all stations equally indicative of climate change?

Reply to  mebbe
January 29, 2015 5:09 am

“Is climate change affecting all stations equally?”
No.
It affects the stations that don’t exist more than it does the ones that do.
On the other hand, the Urban Heat Island effect seems to affect stations not under the UHI.
Amazing how that works, isn’t it?
/sarc

masInt branch 4 C3I in is
January 28, 2015 9:00 pm

“Climate Modeling” by GISS, NOAA, the UK Met Office and UEA, and arithmetic fudging by GISS, NCDC and NOAA are giving Applied Physics a very bad name.
The immediate result is that the Applied Physics community will be forced to rise up and kill GISS, NOAA, the UK Met Office and NCDC (including Executive and Operative Personnel), stating that their activities amount to Government Sponsored Fraud that includes defrauding the IRS.
“Obama’s Last Stand at Little Big Horn” is taking shape.
Good!

January 28, 2015 9:02 pm

Typo corrections:
What factors might make Brawley so much warmer than the other stations.
Have a ?
3: Which station(s) experienced the most similar climate trend from 1900 to 2010? 4. Is climate change affecting all stations equally?
Start 4 on a new line.
Give the students a map of the world and an umber colored pencil. 
amber colored pencil?
by darkening the regions on all te continents where 
the continents
[Done. .mod]

Reply to  Werner Brozek
January 28, 2015 9:18 pm

Thanks Werner

mebbe
Reply to  Werner Brozek
January 29, 2015 8:38 am

Umber is as much a colour as amber!

Reply to  mebbe
January 29, 2015 10:05 am

… and in the spirit of Global Warming wasn’t there a “burnt umber” in the Crayola box?

Reply to  mebbe
January 29, 2015 11:25 am

Actually I was thinking of burnt umber, and I need to dig out a color chart and compare umber and amber to see which best describes the chart’s colors.

January 28, 2015 9:58 pm

I want to thank everyone for the helpful comments. Within 24 hours I will make all the needed corrections and revise the final lesson plan on my website at:
http://landscapesandcycles.net/temperature-homogenization-activity.html
After that time please forward this first activity to any educators that you think might use it in their classroom and encourage them to provide feedback in order to improve the lesson. Thanks again.

Jake
Reply to  jim Steele
January 29, 2015 7:24 am

Jim; I am an AP Chemistry teacher in the Northeast, where our school schedule leaves a substantial gap of time between the test and the end of our school year. I usually fill that time doing cool experiments so that the kids leave high school with a thirst for more science, but I’m certainly entertaining the idea of working on this. I’m blessed to work with truly talented/gifted kids, so the technical content would provide a valuable learning experience for them. If I do have an opportunity to follow through, I will do my best to provide feedback for you.
As an aside, the “religious” aspect of the CAGW movement has had an amazing negative impact on the students who I work with today. The brainwashing I have observed in the students I work with over the last five years, as far as I can tell, will be debilitating for them. Questioning and free thought are at an all-time low, and blind acceptance is the norm. I wonder how this young generation will go about research, and I fear bias-driven conclusions will be a larger issue than we already understand them to be. In this sense, the fanatics have won, but I am hopeful that (over time) a greater acceptance of all angles of research will once again become the norm.

Reply to  Jake
January 29, 2015 8:00 am

I love to hear you may try this. And I agree about the brainwashing problem. Throughout the USA the standards promote a one-size-fits-all type of teaching so kids can score high on standardized tests. Because there is so much depth and breadth to be covered, rarely can teachers cover the whole suggested curriculum. So students memorize the key jargon and move to the next topic with little time to dig deeper.

Victor Frank
January 28, 2015 10:09 pm

For what age group are your lesson plans intended? My experience is that most students in public colleges taking their first introductory meteorology or oceanography labs are woefully unprepared for reading or drawing graphs or using Excel, or solving formulas, or using statistics. You would hope they had picked up these skills in high school, but this is unfortunately not the case, at least in the SF area. They even have trouble interpreting the simple graphs in the CAHSEE Math tests (which start next week).

Reply to  Victor Frank
January 28, 2015 10:36 pm

Victor, indeed there is a large segment of the college population that is “woefully” unprepared. When I taught the introductory biology labs for majors at San Francisco State University, I would estimate 30% were clueless regards using a spreadsheet. But for my purposes I could bring them up to speed within a period. Regards this lesson, I think any student from high school on up can quickly be shown how to use the spreadsheet functions.
They downloaded data in text form from USHCN, which can be converted quickly via the text-to-columns menu. Students can then copy and paste select data from 1951 to 1980 into a new column and easily get the average. To get the anomaly, they can quickly be shown how to create a new column of anomalies by writing an equation that subtracts the calculated average from the first year’s temperature and then dragging that formula down the entire length of the column from, say, 1900 to 2015. Graphing that data can be easily taught as well. It is an easy enough procedure that I think an 8th grader could master it all in no more than two 50-minute periods.

Juan Slayton
Reply to  jim Steele
January 28, 2015 10:51 pm

Hmmm…. Could you get Phil Jones to enroll?
: > )

Chuck Bradley
January 28, 2015 10:44 pm

I think you will get better comments by being clear about what skills and knowledge the students have. Some 9th grade students are proficient with elementary algebra and some 12th graders are not.
Do something with smoothing, using real data. Hourly data vs daily high and low. Weekly, monthly, annual averages. Where did the wild swings go?
Look at the basics of data integrity. Where is the original data? What was changed, in detail, and why? Then repeat the questions for each subsequent change of the previous data.
What do you think of someone that will not show you the data or how they changed the data, kids?
Explain the scientific method, using Bacon, Einstein, Feynman, others.
If you make it available, make sure the downloader can not be traced by anything on your site. I expect most of the downloads will be by pseudonyms. Be prepared for denial of service attacks.
Thanks for the effort, and good luck with it.

January 28, 2015 10:46 pm

Just a note for any budding teacher
“The current pedagogy argues teachers should not be the “Sage on Stage“ spewing dogma for students to regurgitate. A teacher should be the “Guide on the Side,” alternating short bits of input followed by time slots”
Sounds good in theory. The truth is that you need to be a Sage on Stage for classroom control. You need to feed them material and get them to rote learn to control them, where there is zero chance of cocking the activity up.
You also need to give them activities to discover in order to learn but you risk rebellion because they risk cocking things up. The timing and balance is not only difficult to get right, it changes for the same class during the day.

MikeB
Reply to  Robert B
January 29, 2015 3:57 am

The Seven Myths of Modern Teaching
* Facts prevent understanding
* Teacher-led instruction is passive
* The 21st century fundamentally changes everything
* You can always just look it up
* We should teach transferable skills
* Projects and activities are the best way to learn
* Teaching knowledge is indoctrination.

Reply to  Robert B
January 29, 2015 7:52 am

Robert, I agree with your concerns. There is a need for some rote learning from a sagely teacher to prevent students from “cocking” things up, and I am not advocating against that. The new pedagogy is aimed at secondary teachers who lecture too much. The Teacher Inputs are the needed short sage-on-stage moments. BTW, “cocking things up” suggests males are the problem. In America we use the F word, as it has a broader application. Or is your phrase a way of skirting the censors?

1 2 3