“It comes from the law of propagation of uncertainty; not the CTL.”
My main point was that there is a systematic bias present. The 1/sqrt(N) scaling normally increases precision because, as the number of measurements N increases, so does the denominator.
“c) It is unnecessary to classify components as random or systematic (or in any other manner) when evaluating uncertainty because all components of uncertainty are treated in the same way. Benefit c) is highly advantageous because such categorization is frequently a source of confusion; an uncertainty component is not either random or systematic. Its nature is conditioned by the use made of the corresponding quantity, or more formally, by the context in which the quantity appears in the mathematical model that describes the measurement. Thus, when its corresponding quantity is used in a different context, a random component may become a systematic component, and vice versa.”
You didn’t interpret this section correctly. Here’s an excerpt from this section:
“It thus exemplifies the viewpoint adopted in this Guide and cited in E.1.1, namely, that all components of uncertainty are of the same nature and are to be treated identically.”
If you read section E.1.1, it reads:
“This Guide presents a widely applicable method for evaluating and expressing uncertainty in measurement. It provides a realistic rather than a ‘safe’ value of uncertainty based on the concept that there is no inherent difference between an uncertainty component arising from a random effect and one arising from a correction for a systematic effect.”
“What did you ask ChatGPT?
This is the formula for the variance of the sum of i’s. It is not the formula for the variance of the average of i’s. Thus, why I want to know what you asked ChatGPT.”
I’m referring to ∑^N(i=1)σ^2(i), not σavg=∑^N(i=1)σ^2(i)+NS^2σ^2(s)/N^2. ∑^N(i=1)σ^2(i) is just notation that represents the contribution of uncertainty from all individual measurements. This could be human interpretation of the scale, for example, factoring out the environmental extreme’s impact on observer subjectivity that I mentioned earlier.
Regarding ChatGPT, I asked it to correct my grammar and punctuation in my sentence. I was editing my writing on here; I copied and pasted the corrected version of my writing directly from ChatGPT and forgot to take out:
“ChatGPT
ChatGPT
Your sentence is well-constructed, but there’s a minor grammatical suggestion for clarity:”
walter: You didn’t interpret this section correctly. Here’s an excerpt from this section
My interpretation of the section is that it is possible for a systematic component to become a random component when there is a context switch.
An example of a context switch is as follows. There are multiple measurement devices. Each device has its own systematic component of error. Device 1 is biased by the amount x1, device 2 by x2, and so on until you get to device N, which is biased by xN. The set of biases {x1, …, xN} forms a probability distribution of biases. Some biases may be negative while others may be positive. So in the context of a single device the measurements are all contaminated by the same bias. But in the context of all devices the measurements are contaminated by different amounts. Again, some may be negative and some may be positive. Therefore measurement models built in the context of a single device will have the error appearing as systematic, while measurement models built in the context of the group will have the error appearing as random.
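For what it’s worth, here is a minimal numerical sketch of that context switch (all numbers are made up for illustration): each instrument gets one fixed bias, so within a single instrument the bias never averages away, while across the fleet the same biases show up as scatter.

import numpy as np

rng = np.random.default_rng(0)

N_DEVICES = 1000            # hypothetical fleet of instruments
READINGS = 50               # repeated readings of the same measurand
TRUE_VALUE = 20.0           # arbitrary "true" temperature, deg C

# Each device gets its own fixed (systematic) bias, drawn once and then frozen.
biases = rng.normal(loc=0.0, scale=0.3, size=N_DEVICES)

# Context 1: a single device. Every reading carries the SAME bias,
# so averaging more readings never removes it.
single = TRUE_VALUE + biases[0] + rng.normal(0.0, 0.1, READINGS)
print("single-device mean error:", single.mean() - TRUE_VALUE)

# Context 2: the whole group. Each reading comes from a different device,
# so the per-device biases enter the sample as a spread and look random.
group = TRUE_VALUE + biases + rng.normal(0.0, 0.1, N_DEVICES)
print("group mean error:", group.mean() - TRUE_VALUE)
print("group spread (std):", group.std(ddof=1))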
walter: I’m referring to ∑^N(i=1)σ^2(i)
This is the familiar root sum square (RSS) formula. Using a more accepted format I’ll write it as u(sum)^2 = Σ[u(x_i)^2, 1, N].
This formula describes the combined variance of the individual uncertainties when the individual measurements are summed together.
For example, if I have two strings with lengths of 1.2 ± 0.3 m and 2.1 ± 0.6 m, then their combined length is 1.2 m + 2.1 m = 3.3 m with an uncertainty of sqrt[(0.3 m)^2 + (0.6 m)^2] = 0.67 m, so we write it as 3.3 ± 0.7 m.
It is important to reiterate that the formula you present is for measurement models that computes sums. It is not the correct formula for measurement models that compute other quantities like products, averages, etc.
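As a quick sanity check of the string example above (same numbers, in Python):

from math import sqrt

# Combined uncertainty of a SUM is the root-sum-square of the individual
# standard uncertainties (the formula discussed above).
lengths = [1.2, 2.1]     # metres
u = [0.3, 0.6]           # standard uncertainties, metres

total = sum(lengths)                        # 3.3 m
u_total = sqrt(sum(ui**2 for ui in u))      # 0.6708... m
print(f"{total:.1f} ± {u_total:.1f} m")     # 3.3 ± 0.7 m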
walter: Regarding ChatGPT, I asked it to correct my grammar and punctuation in my sentence.
Yeah, no worries there. I don’t have an issue with the use ChatGPT. I’m just wondering why ChatGPT gave you the formula for the combined uncertainty of sums as opposed to the combined uncertainty of averages.
“My interpretation of the section is that it is possible for a systematic component can become a random component when there is a context switch.”
True, and not overstated. Yes, the more systematic components you have, the more randomized they become. But this condition is met, with the millions of measurements processed over decades using many different processes.
The deflectors rely on one misconception built on another to claim that systematic errors result in modern trends that are qualitatively different from those we know. First, denial of the fact that the larger a systematic component of error, the more likely it is to be spotted and corrected. Second - and most awe-inspiring - that those errors would magically line up over time to qualitatively change the resulting trends. The last always brings to mind the scene of the monkeys at their typewriters who will - sooner or later - type out the encyclopedia for us.
Can’t hold a candle to the depth/breadth of yours or bdgwx’s know how. You both post in multiple tech venues, while this is the only one where even I can see and comment on the bad wiring exhibited.
This chapter does not assert that systematic effects can undergo a context switch to behave more like random errors; instead, it suggests that a correction for the systematic error may be treated as such.
Just because systematic biases manifest in both negative and positive directions across a large quantity of instruments doesn’t transform them into random errors. Their influence is likely not scattered around the true value. However, we do know that their deviation would be consistent; that’s all that is needed to be classified as a systematic bias.
walter: This chapter does not assert that systematic effects can undergo a context switch to behave more like random errors
That is what it says…literally in section E.3.6.
walter: it suggests that a correction for the systematic error may be treated as such.
The word “correction” appears nowhere in section E.3.6. That’s not to say that a correction cannot have a random and/or systematic effect itself. It can. But that doesn’t preclude systematic effects behaving like random effects when there is a context switch.
walter: Just because systematic biases manifest in both negative and positive directions across a large quantity of instruments doesn’t transform them into random errors.
The GUM isn’t saying that systematic error transforms into random error when there are a large quantity of instruments. It is saying that when you view those systematic errors in a different context (like as a group instead of individually) they appear as random errors. They do so because those systematic errors are different and thus form a probability distribution function.
Said another way…when you randomly sample measurements from a group of instruments as opposed to one instrument you won’t necessarily get the same error every time like you would if you randomly sample measurements from a single instrument.
walter: Their influence is likely not scattered around the true value.
Not necessarily. There is no fundamental reason why the mean of the set of systematic errors could not be zero. That is not to say that it is always zero. It’s not. But this is a complication that can only be dealt with after more basic fundamental truths have been understood.
walter: However, we do know that their deviation would be consistent; that’s all that is needed to be classified as a systematic bias.
Correct. See section B.2.22. It should be obvious then that when you sample measurements from a group of instruments you will get different (as in not consistent) deviations from the true value of the same measurand even if there is no random error. So if you truly accept the statement you just made then the epiphany that switching the context to that of a group of instruments with different systematic errors as opposed to a single instrument with the same systematic error will necessarily result in systematic errors behaving as random errors should be immediate and obvious.
This is the first paragraph of the entire section (E.3):
“The focus of the discussion of this subclause is a simple example that illustrates how this Guide treats uncertainty components arising from random effects and from corrections for systematic effects in exactly the same way in the evaluation of the uncertainty of the result of a measurement. It thus exemplifies the viewpoint adopted in this Guide and cited in E.1.1, namely, that all components of uncertainty are of the same nature and are to be treated identically.”
“Not necessarily. There is no fundamental reason why the mean of the set of systematic errors could not be zero. That is not to say that it is always zero. It’s not. But this is a complication that can only be dealt with after more basic fundamental truths have been understood.”
“Correct. See section B.2.22. It should be obvious then that when you sample measurements from a group of instruments you will get different (as in not consistent) deviations from the true value of the same measurand even if there is no random error. So if you truly accept the statement you just made then the epiphany that switching the context to that of a group of instruments with different systematic errors as opposed to a single instrument with the same systematic error will necessarily result in systematic errors behaving as random errors should be immediate and obvious.”
There is a fundamental reason: the viscosity of mercury. If a rapid cold front passes through an area and the mercury doesn’t contract quickly enough to keep up with the air mass, there can be consistent, large positive deviations. Climate science demands precision to the tenth decimal place. To say that these errors average out to zero or near zero, and that they are scattered, is a huge assumption and wishful thinking. We don’t know the true value because we get one chance to record the measurement at a given time.
“There is a fundamental reason: the viscosity of mercury.”
Do you instead mean the specific heat and heat conductivity of mercury? Given the tiny volumes and column thickness of that mercury, I can’t imagine convection to be any better with water, for the conditions you’re describing. And also for those conditions, given those tiny volumes, the fact that mercury conducts electricity quite well, and the engineering rule of thumb that good electrical conductors are also good heat conductors, I can’t imagine the difference in temp between the mercury and ambient being meaningfully different for more than a minute or 2.
No, no backup data. Do you have any to support your claim?
[MONKEY MAN] If a rapid cold front passes through an area and the mercury doesn’t contract quickly enough to keep up with the air mass, there can be consistent, large positive deviations
[ALSO MONKEY MAN] Carl Mears tried to claim that the surface data is more accurate than satellite data, which is nonsense. He’s just a little bitch, who felt the pressure of having a dataset with the Pause.
walter: This is the first paragraph of the entire section (E.3):
That has nothing to do with the points being made in E.3.6.
walter: if a rapid cold front passes through an area and the mercury doesn’t contract quickly enough to keep up with the air mass, there can be consistent, large positive deviations.
A better example is when a tree grows in the vicinity of a thermometer: the thermometer will start reading lower and lower over time, resulting in a large negative deviation. At least in this case it is truly systematic, since the negative deviation is there all of the time.
Cold fronts don’t happen all of the time and when they do happen the temperature tends to remain low allowing the float to eventually catch up and record the actual minimum temperature regardless. BTW…the effect you’re talking about has a name…it’s called the time constant. For LiGs the typical time constant is on the order of 1-3 minutes.
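On the time-constant point, here is a rough first-order-lag sketch (the exponential model and all the numbers are my assumptions, just to illustrate the scale of the effect):

import numpy as np

# Toy first-order lag model of a liquid-in-glass thermometer:
# the reading approaches the air temperature exponentially with time constant tau.
tau = 120.0                             # seconds; ~2 min, within the 1-3 min range above
dt = 1.0                                # time step, seconds
t = np.arange(0.0, 900.0, dt)           # 15 minutes

T_air = np.where(t < 60.0, 10.0, -5.0)  # hypothetical sharp cold-front step at t = 60 s
T_read = np.empty_like(t)
T_read[0] = 10.0
for i in range(1, len(t)):
    T_read[i] = T_read[i - 1] + (dt / tau) * (T_air[i] - T_read[i - 1])

# Five minutes after the step the reading lags the air by only ~1 C,
# so a minimum that persists for several minutes is still captured.
print("lag 5 min after the step:", T_read[t == 360.0][0] - T_air[-1])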
AI has been very unhelpful when providing these equations. Do you get them from the GUM? If so, in the future, alongside your provided equation, please provide the specific section.
The familiar root sum square (RSS) equation is a derivation of JCGM 100:2008 equation 10 when the measurement model is y = Σ[x_i, 1, N] since the partial derivative ∂y/∂x_i = 1 for all x_i. Notice that y combines measurements via a trivial sum here.
Nobody sums temperatures though. They average them. The measurement model that computes an average is y = Σ[x_i, 1, N] / N. In that case the partial derivative ∂y/∂x_i = 1/N for all x_i which means the uncertainty u(y) = sqrt[Σ[u(x_i)^2, 1, N]] / N. And if u(x) = u(x_i) for all x_i then it simplifies further to u(y) = u(x) / sqrt(N). That is the familiar standard error of the mean (SEM) equation.
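And a quick check that the algebra above does what it says (equal u(x_i) assumed):

from math import sqrt

# y = (x_1 + ... + x_N) / N, so each sensitivity coefficient is 1/N and
# u(y) = sqrt( sum( (u_i / N)^2 ) ).  With equal u_i = u_x this reduces to u_x / sqrt(N).
N = 100
u_x = 0.5                                 # same standard uncertainty for every x_i (assumption)
u_y = sqrt(sum((u_x / N) ** 2 for _ in range(N)))
print(u_y, u_x / sqrt(N))                 # both print 0.05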
*
Much more relevant, however, is the fact that the uncertainty gang ‘operating’ on this blog always asks ‘Where is your uncertainty calculation?’ but never provides anything of the like – let alone would they ever be able to generate the data you present, and then add the uncertainty calculation they say is missing from your presentation.
Even worse…the gang’s challenges are entirely based on making numerous algebra mistakes, some so trivial that even elementary-age students could spot them. I don’t know how many times I’ve had to explain to the Gormans that addition (+) and division (/) are different operations that produce different results, and that sums and averages are different operations that produce different results. I even tried to convince them to use a computer algebra system to check their work and the response was along the lines of “computer algebra systems are wrong”. That’s what we’re dealing with here.
The monkey forgot to mention that if you see that a post of yours wasn’t sent successfully, you can get it back by clicking on the browser’s ‘history back’ arrow; move to the thread’s end if you don’t see it. There you find the rejected text and can edit it again, e.g. by inserting a blank between d and c or between p and t.
NOAA sent me a list of 655 GHCN stations for CONUS with < 10% missing data from 1900 to 2020. Using that list, I plotted out the “days with Tmax>= 95F” and “days with Tmax>=90F” per active station, through 2023. NOAA used this same list for the 2022 state report for WY, Fig 2.
A link to that WY report follows. CONUS=contiguous (or coterminous) US. The CONUS plot in the NOAA report was provided for better context considering the shorter record for Wyoming.
https://statesummaries.ncics.org/chapter/wy/
I will also post plots for Tmin>=70F and Tmin>=65F.
Tmax>=90F
Correction – the y axis says “Tmin”. It should say “Tmax”.
Those charts are remarkably similar to the ones from Tony Heller. 🙂
Yes, he often uses the USHCN list of 1,218 stations for similar analyses and plots. I would expect very similar results. I used the list of 655 stations here because that is how NOAA recently generated the plots in fig 2 and fig 3 of the report linked in my original comment. On X, Matthew Wielicki posted last month about the NOAA fig 2 plot, which uses data through 2020, and wondered about an update. So that is the direct reason why I did this using NOAA’s list and first posted the results on X yesterday.
BTW, it was because of Tony Heller’s work that I was motivated to develop my own methods using R scripts to independently confirm his results. It’s quite complicated to extract the values correctly from the data files, but I eventually worked it out.
I don’t think people realise just how much of a computer programming geek Tony Heller is. 🙂
It is worth going to his site to see some of his newer stuff in action. 🙂
Tmin>=70F
Tmin>=65F
Please note that I did not use a lat/lon grid averaging method as NOAA did for their plots in Fig 2 and Fig 3. (As understood from the metadata posted for each of the figures.) These plots I have posted simply show the total number of reported hot days or warm nights divided by the number of active stations for each year. The overall sense of the history is the same either way.
Conclusions:
The contiguous US experienced far more “hot days” in the early part of this record. And “warm nights” are about at the same numbers as earlier.
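For anyone who wants to reproduce the simple tally described above (total exceedance days divided by active stations, per year), here is a hypothetical Python sketch; the file name and column names are my assumptions, not the author’s actual R code.

import pandas as pd

def hot_days_per_active_station(df: pd.DataFrame, threshold_f: float = 95.0) -> pd.Series:
    # Assumes a long-format table with columns ['station_id', 'date', 'tmax_f'],
    # already restricted to the 655-station list extracted from GHCN-Daily.
    df = df.copy()
    df["year"] = pd.to_datetime(df["date"]).dt.year

    # Total count of threshold exceedances per year across all stations.
    hot = (df["tmax_f"] >= threshold_f).groupby(df["year"]).sum()

    # A station counts as "active" in a year if it reported at all that year.
    active = df.groupby("year")["station_id"].nunique()

    return hot / active          # mean hot days per active station, by year

# Hypothetical usage:
# df = pd.read_csv("conus_655_tmax.csv")
# hot_days_per_active_station(df, 95.0).plot()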
David,
The way I eyeball that, I see two takeaways:
Global milding, not exciting enough! Let’s talk about sea level and hyper-hurricanes or something!
Oh, let’s talk about extreme mildness instead.
Or maybe:
“The U.S. coast is in an unprecedented hurricane drought – why this is terrifying” – The Washington Post, August 4th, 2016
I would think the increase in average temperatures is caused more by increase in minimum temperatures than by increase in maximum temperatures. Which makes sense as warming caused by CO2 would show up as an increase in sky temperature which would have more of an effect at night than during the day. Note that clear nights tend to be cooler than cloudy nights.
Yes, I would agree. It’s consistent with a minor enhancement in the greenhouse effect and also consistent with the statement that there is NO CLIMATE EMERGENCY!
David, thanks for the plots, which confirm my impression that the 1930s had the highest daytime temps but current night-time temps have been going up over the past few decades. That seems to be true in my neck of the woods.
It is clear there is a change in the charts in the mid 1970’s. The most logical explanation is that prior to 1975 most stations had their temperatures recorded by people rather than computers, and it is known that human readings and recording are simply inaccurate and need to be “adjusted” because none of those weather station operators had training in thermometer reading. /s
Thanks for this David. I suspect that if you do similar plots of number of days with less than 0°, -10° etc. you will see some equally interesting trends. Winter minimum temperatures seem to have been slowly but steadily rising since the early 1970’s, at least where I live and work in Ontario. One tangible effect is the length of the trucking season on winter ice roads serving remote communities in the north.
I look forward to a “winter” version of your analysis. I strongly suspect that a lot of the “warming” reported by the climate-industrial-media complex is due to an increase in winter minimum temperatures in the northern hemisphere. From my (anecdotal) experience, I’ve seen little or no evidence of summers getting hotter in the boreal forest, despite the media’s attempts to assert just that.
However, it may be part of the natural climate variability.
Here is a plot for days with Tmin<=0F from the same records.
And here is a plot for days with Tmin<=-10F from the same records.
The ‘market’ for Climastrology research grants is saturated. We need a new bogus crisis to research! Something that nobody can see or smell or taste, but it’s ominous! And caused by the scourge of human flourishing.
Eureka! How about ‘microplastics’?
And best of all, there are no known effects, so there is an infinite number of “Does X affect Y?” opportunities for papers.
https://futurism.com/the-byte/microplastics-sediment-layers
Unbelievable! https://www.cnn.com/videos/tv/2024/02/14/amanpour-michael-mann.cnn
It was tough but I watched all of it. My conclusion is that M. Mann gave the questions he wanted to speak about to Ms. Amanpour who has no concept of the truthfulness of his rants nor the intention to give him a difficult question.
Didn’t he do a similar thing in the Penn State investigation?
That is, gave the investigators the questions to ask him?
You’re expecting to get unbiased reporting from Christiane Amanpour???
An interesting point:
https://www.cfact.org/2024/02/23/why-californias-climate-disclosure-law-should-doom-green-energy/
Think renewables disclosing the emissions from their supply chain all the way back to mining.
Dave,
The BS starts at the very top, and filters down to all the states
Levelized Cost of Energy Deceptions, by US-EIA, et al.
Most people have no idea wind and solar systems need grid expansion/reinforcement and expensive support systems to even exist on the grid.
With increased annual W/S electricity percent on the grid, increased grid investments are needed, plus greater counteracting plant capacity (MW), especially when it is windy and sunny around noon-time.
Increased counteracting of the variable W/S output places an increased burden on the grid’s other generators, causing them to operate in an inefficient manner (more Btu/kWh, more CO2/kWh), which adds more cost/kWh to the offshore wind electricity cost of about 16 c/kWh, after 50% subsidies.
The various cost/kWh adders start with annual W/S electricity at about 8% on the grid.
The adders become exponentially greater, with increased annual W/S electricity percent on the grid
The US-EIA, Lazard, Bloomberg, etc., and their phony LCOE “analyses”, are deliberately understating the cost of wind, solar and battery systems
Their LCOE “analyses” of W/S/B systems purposely exclude major LCOE items.
Their deceptions reinforced the popular delusion that W/S are competitive with fossil fuels, which is far from reality.
The excluded LCOE items are shifted to taxpayers, ratepayers, and added to government debts.
W/S would not exist without at least 50% subsidies
W/S output could not be physically fed into the grid, without items 2, 3, 4, 5, and 6. See list.
1) Subsidies equivalent to about 50% of project lifetime owning and operations cost,
2) Grid extension/reinforcement to connect remote W/S systems to load centers
3) A fleet of quick-reacting power plants to counteract the variable W/S output, on a less-than-minute-by-minute basis, 24/7/365
4) A fleet of power plants to provide electricity during low-W/S periods, and 100% during high-W/S periods, when rotors are feathered and locked,
5) Output curtailments to prevent overloading the grid, i.e., paying owners for not producing what they could have produced
6) Hazardous waste disposal of wind turbines, solar panels and batteries. See image.
In the jargon of cost benefit analysis this is the difference between direct and indirect costs. LCOE is the direct cost — the lifetime cost of a generator divided by its output. It is supposed to include your #6 by the way, being the life cycle cost divided by the lifetime generation. I suspect that annual figures are sometimes used, which would be a deception.
The real deception is knowingly ignoring the indirect costs that you correctly list. Thus the system cost is far greater than the LCOE. Many people are in fact fooled by this deception. But I think that most skeptics who care about this issue understand the point.
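To make the distinction concrete, a bare-bones LCOE calculation looks like this (placeholder numbers, not EIA or Lazard figures); everything in items 1-6 above sits outside this per-plant ratio:

def lcoe(capex, annual_opex, annual_mwh, years=25, discount_rate=0.07):
    # Direct cost only: discounted lifetime cost divided by discounted lifetime generation.
    cost = capex + sum(annual_opex / (1 + discount_rate) ** t for t in range(1, years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** t for t in range(1, years + 1))
    return cost / energy                       # $/MWh

# Example: a 1 MW plant, $1.5M capex, $40k/yr O&M, 3000 MWh/yr -> roughly $56/MWh.
print(round(lcoe(capex=1.5e6, annual_opex=40e3, annual_mwh=3000), 1))
# Grid reinforcement, backup capacity, curtailment payments and disposal are
# system costs that never show up in this number.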
Reacting to French farmers’ protests against the French version of “WOTUS” in particular, and against administrative control of farming in general (in a word, against the Sovietisation of French agriculture), “ultra liberal” Macron went full commie: he wants price controls on agricultural goods.
Macron is a French Trudeau: he sounds very smart – but only for the ultra deluded and hyper unintelligent.
Why Right Wingers Are Going Crazy About Meat
https://www.thedailybeast.com/why-right-wingers-are-going-crazy-about-meat?ref=scroll
I have always eaten meat. This is entirely natural.
This fact makes me right wing?
The article and these weird biases you have are absurd.
I found it interesting because it points out the culture-wars left vs right wing bullshit we see in many other topics.
And the article states that, although there is a bias, most on the left don’t care for vegetarian / vegan diets either.
Just another useless polarization to split and distract.
I eat meat too and, surely surprising for many here, am not right-wing.
“”it points out the culture wars left vs right wing bullshit “”
This is based on some of the lunatic theories beloved by academia in the US:
critical race theory, queer theory, wokeism etc etc and a real sense of not knowing what a woman actually is.
In short it’s UNhinged.
most on the left don’t care for vegetarian / vegan diets either.
That may be, but the push for it, which includes finding ways to legislatively reduce meat consumption, comes from the left, and there isn’t a lot of visible push back from others on the left.
You do seem though to be one of those people who feel motivated to tell others what your “feelings” are about inconsequential topics.
Is it part of the “identity fad” epidemic that’s being inflicted on the world at the moment?
Just for your ejamakashun –
most people dgaf about identities or fad ideologies.
There is a drive on the part of some climate extremists to abolish meat eating. What’s very interesting in this is the extent to which attitudes to a number of things seem to be forming into sets, which are accepted or rejected as a set. So the promotion of veganism on climate grounds does seem to be associated with ‘progressive’ views on race and on gender, though there is no logical connection, they are logically independent.
It’s interesting to wonder why this should be.
You go eat Gates’ bugs, leave normal people alone.
“”Compared to the standard American diet of highly processed, low-fiber, high-calorie, sugary foods, vegan diets have some health advantages. However, researchers found that avoiding all animal foods may lead to nutritional deficiencies in vitamin B12, omega-3, calcium, zinc, iron, magnesium, and high-quality protein.
These deficiencies may be associated with increased risk for certain types of cancer, stroke, bone fractures, preterm birth, and failure to thrive. Avoiding consumption of animal-sourced food may also be related to higher rates of depression and anxiety. Hair loss, weak bones, muscle wasting, skin rashes, hypothyroidism, and anemia are other issues that have been observed in those strictly following a vegan diet.””
Research Shows Vegan Diet Leads to Nutritional Deficiencies, Health Problems; Plant-Forward Omnivorous Whole Foods Diet Is Healthier | Saint Luke’s Health System (saintlukeskc.org)
Blackrock and Vanguard rated companies on their level of greenness and ESG compliance so people could invest with their conscience. What if a company provided ratings and investment opportunities for companies based on real-life contributions towards making less of an impact on the environment or providing higher levels of liveability at lower cost? There was quite a burst of interest in people getting together and building buildings that would house multiple people, with shared facilities. I recall seeing a lot of retirees putting up group homes with nice living units and shared recreational facilities. The small home movement uses less material and less energy for heating and cooling. I remember MoPeds being advertised when I was a kid that touted 175 mpg. Talk about affordable transportation. What about investments in companies that built components for passive solar homes or earth-sheltered homes? Sign me up.
I don’t know Paul. I like my solo residence with a large, private yard. I like being outside working on it, sitting out there in the evening enjoying the fruits of my labors. I wouldn’t like being lumped in with a bunch of old geezers all living and playing together. This is about me being happy and doing what I like, as I like, when I like.
As for mopeds, what Florida doesn’t need is old people on those things. But then, it might just thin the aged population faster than COVID did.
Ha Ha. I am not suggesting people be forced into anything. I just thought that there are companies that are innovating and making user-friendly and energy-helpful products that could benefit from additional funding. If I want to support that kind of activity it would be helpful if there was a fund I could direct my money to. If the products being produced are truly useful, companies producing them should grow and my investment would grow. Win win.
Paul, if that’s what makes you feel good about yourself then have at it, I suppose. Don’t ask me to do the same and I won’t begrudge you. It’s clear you come at it from a heart of gold.
Objectively speaking though, it’s irrational to think that enforcing poverty on yourself helps anyone. It is nihilism writ small. Not consuming leads to not producing leads to unemployment leads to not consuming. Rinse and repeat.
Nothing in the universe is sustainable. There’s a hydrogen crisis. The sun will eventually use its supply up turning it into helium and so on to iron.
Today’s handwringing is as boneheaded as the whale oil crisis and the horse manure crisis of the nineteenth century.
El Niños, Hunga Tonga Volcanic Eruption, and the Tropics
https://www.windtaskforce.org/profiles/blogs/hunga-tonga-volcanic-eruption
.
Refer to this URL to see additional images
https://www.windtaskforce.org/profiles/blogs/natural-forces-cause-periodic-global-warming
.
EXCERPTS
The Tropics receive about 48.5% of the world’s energy from the sun.
.
WV, Tropics, is about 24811 ppm at 27 C and 70% humidity on land; 35,912 ppm at 27 C and 100% humidity near the ocean surface
.
WV molecules are about 24811/423 = 58.66 times more prevalent than CO2 molecules on land, and 84.90 times more prevalent near ocean surface
.
Weighted average molecule ratio = 66/208 x 58.66 + 142/208 x 84.90 = 76.57
Weighted average ppm = 66/208 x 24811 + 142/208 x 35912 = 32,389 ppm
WV molecules absorb 32389/(32389+ 423) = 98.7%, and CO2 molecules 1.3% of available photons
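Checking the arithmetic above in Python (taking the stated ppm values as given; whether molecule counts translate one-for-one into shares of absorbed photons is the comment’s own assumption):

co2 = 423
wv_land, wv_ocean = 24811, 35912
w_land, w_ocean = 66 / 208, 142 / 208          # weights used above

print(wv_land / co2, wv_ocean / co2)           # ~58.66 and ~84.90
wv_avg = w_land * wv_land + w_ocean * wv_ocean
print(wv_avg)                                  # ~32,389 ppm
print(wv_avg / (wv_avg + co2))                 # ~0.987 share of WV molecules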
WV much Better Than CO2: Whereas CO2 absorbs energy (gets warmer), transports that energy (convection) and distributes it by collision (conduction) and radiation, WV does all that much more effectively by adding phase change, liquid-to-vapor (constant temperature), transport, collision and radiation, and adding vapor-to-rain/snow/hail (constant temperature).
.
It’s not merely the molecule-count of WV vs CO2 that makes WV the dominant global warming gas.
.
WV is much more efficient in absorbing, transporting and distributing energy than CO2, and it is far more abundant than CO2, especially in the Major League Tropics compared to the Minor League temperate zones.
.
NOTE: At 16 C and 50% humidity, WV in air is 0.0056 lb H2O/ lb dry air, or 2.5424 g H2O/ 454 g dry air. After converting to moles, 0.009022 mole H2O/mole dry air, or 9022 ppm.
A mole of WV is 18 g, a mole dry air is 29 g
.
https://www.engineeringtoolbox.com/water-vapor-air-d_854.html
http://www.uigi.com/psychrometry.html
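The NOTE’s conversion checks out with the stated molar masses:

M_H2O, M_AIR = 18.0, 29.0         # g/mol, as stated above
w = 0.0056                        # lb H2O per lb dry air at 16 C and 50% RH

mole_ratio = (w / M_H2O) / (1.0 / M_AIR)       # mol H2O per mol dry air
print(mole_ratio, mole_ratio * 1e6)            # ~0.009022 and ~9022 ppm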
From an MIT Conference:
WV (human, 2% and natural, 95%) causes 97% of global warming. Reducing fossil fuels would reduce the human 2%. The world surface temperature has increased about 1 C in 100 years. That caused about 7% more WV in the atmosphere, which caused some global warming. What factors are the cause of the 1 C increase? Was it fossil CO2, natural CO2, or other causes? See URL
https://globalchange.mit.edu/news-media/in-the-news/greenhouse-gases-water-vapor-and-you
Because warmer is better, why care about how we get there.
Tom,
I agree. It was 0 F at my house in Vermont this morning
Important Role of CO2 for Flora Growth
.
Many plants require greater CO2 than 400 ppm to survive and thrive, so they became extinct, along with the fauna they supported. As a result, many areas of the world became arid and deserts. The current CO2 needs to at least double or triple to reinvigorate the world’s flora and fauna.
CO2 has increased from about 280 ppm in 1900 to 423 ppm at end 2023, partly due to burning fossil fuels. That increase contributed to:
.
1) Greening of the world by at least 10 to 15%, as measured by satellites since 1979.
2) Higher crop yields per acre.
.
https://www.windtaskforce.org/profiles/blogs/new-study-2001-2020-global-greening-is-an-indisputable-fact-and
https://www.windtaskforce.org/profiles/blogs/co2-is-a-life-gas-no-co2-no-life
https://www.windtaskforce.org/profiles/blogs/co2-is-not-pollution-it-s-the-currency-of-life
Funny, it was 4 F at my cabin 3 days ago, the coldest temperature this winter. It got up to the mid 40s that day, then the next morning it was 8 F before again warming to the 40s.
Of course there were clear skies and no wind on both days, so there was direct radiation to space on the cold nights, and direct radiation from the sun to the land during both days.
Must have been the CO2.
One reason for water vapor being a more potent radiative gas is that the water molecule is polar and thus the vibrational and rotational modes make a much better antenna than the non-polar CO2 molecule. The folks at the MIT Rad Lab in WW2 found out the hard way when the 1.25cm radars had very poor range due to frequency being almost on top of the water molecule’s rotational frequency of ~21GHz. That resonance is very broad, which is why microwave ovens can cook at 2.45GHz, despite being nearly a factor of 10 off resonance.
OTOH, the much longer time for an excited CO2 molecule to emit a photon is why CO2 lasers work as well as they do.
Erik,
I have been looking for info regarding WV molecules being more likely to absorb a photon and release it by collision with N2, O2, etc.
I would like to compare photon absorb-to-release times of WV and CO2 molecules.
If, as you say, CO2 is slow to absorb and slow to release, it would greatly damage CO2’s reputation as a greenhouse gas, especially compared to the much more versatile WV molecules; like comparing a sports car to a family sedan.
First, the CO2 molecule is not slow to absorb; it is more accurate to say that it is less likely to absorb a passing photon and more picky about the wavelength (narrower lines than water). At sea level, the mean free path of a photon within a CO2 resonance is less than a meter. A very important point: CO2 will not absorb IR photons outside of the CO2 resonances.
“Slow to emit” means that a CO2 molecule excited by absorbing a photon will be orders of magnitude more likely to transfer that extra energy to another molecule than to re-radiate it. That means the absorbed photon’s energy goes into warming the air. Having said that, a CO2 molecule can be excited by a collision and thus would have a small chance of radiating.
Bottom line is that an increase in CO2 concentration causes warming by raising the altitude at which the CO2 IR lines are being radiated into space and thus requiring the earth to get a little bit warmer to compensate.
The best resource for the information you are looking for is van Wijngaarden and Happer’s 2023 primer on greenhouse gases. Bear in mind that an understanding of how antennas work and of amplitude modulation will make the paper easier to understand.
Climate change is essentially all natural, but humanity has contributed a little by adding to water vapor (TPW). About 90% of the human contribution to water vapor is from irrigation. The correlation is 0.789, with the probability of it occurring from random chance (null hypothesis) of 0.00000031. Burning fossil fuels (CO2 increase) has no significant effect on climate.
Good point; irrigation, plus a whole host of man-made surface effects: roads, blacktop, runways, buildings and other condensation-related processes?
I have a hangnail that was caused by climate change.
Where can I get some justice and compensation?
Yes, it’s absurd. Just like all the other claims that say they are caused by climate change.
Climate alarmists say/think every bad thing is caused by climate change.
Climate alarmists are delusional.
Back in the olden days, the excuse for a kid not having his homework done was “the dog ate my homework”. Would not at all be surprising if kids these days who fail to do their homework tell the teacher “sorry, I couldn’t do my homework because of climate change”! AKA it was too hot, too cold, the lights went out in a storm, etc., etc…
The kids do not need to do homework, because they get moved to the next grade level anyway, until they “graduate” knowing very little about anything.
The US “graduates” rank about 15th in the world?
Look we get how our discounting helped murder the horrible residual value of your Tesla coming off lease so howsabout we let you bring the exorbitant FSD package you paid in full for over to a shiny new one eh?
Tesla now offering Full Self-Driving transfers to boost demand in Australia (thedriven.io)
Best we don’t insult you with a trade-in price and you flog the old one sans FSD privately OK?
Why are comments limited to only 1000 characters?
Such low limits greatly reduce flexibility regarding comments.
Adding just one image eats up most of the limit
Please consider having 5000 characters as the limit
done. although this is literally not the place to make these suggestions and you’re lucky I saw this.
Charles – I can’t find anywhere else that would make sense to make such suggestions/requests. I would have done the same. Could you point us to the correct place for future reference?
Thank you, Charles
A recent study highlights the main problem with climate science.
https://notrickszone.com/2024/01/22/new-journal-of-climate-study-reduces-doubled-co2-climate-sensitivity-by-40-to-0-72c/
Chen et al. show that doubling CO2 from 380 to 760 ppm yields only 0.55 W/m² of forcing in the troposphere, which represents a 0.18°C temperature differential.
How do they get this number? They use ERA5 reanalysis data with a radiation model. This leads to water vapor values that are realistic.
This is similar to the work of Miskolczi, Gray/Schwartz and Christy. When realistic water vapor numbers are input to radiation models, most of the warming disappears.
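For context on how a forcing number maps to a temperature number, here is a one-line R check using the commonly cited no-feedback (Planck) sensitivity of roughly 0.3 K per W/m². That sensitivity value is my own assumption for illustration, not something taken from the study:

lambda <- 0.3          # ASSUMED no-feedback climate sensitivity, K per (W/m^2)
dF     <- 0.55         # tropospheric forcing quoted above, W/m^2
lambda * dF            # about 0.17 K, in line with the 0.18°C figure quoted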
ERA5 uses the RRTM for its radiative transfer model. The RRTM says 2xCO2 generates a radiative forcing of 4.0 W/m2.
Global warming found to increase the diversity of active soil bacteria
Warmer soils harbor a greater diversity of active microbes, according to a new study from researchers at the Centre for Microbiology and Environmental Systems Science (CeMESS) at the University of Vienna.
https://phys.org/news/2024-02-global-diversity-soil-bacteria.html
I thought this was supposed to be an extinction event.
The largest diversity of “living things” is generally found in “HOT and WET” areas.
I bet they cancel you everywhere you go. It’s such nonsense to state the obvious.
The rainy season in the Amazon basin.

I have studied thermodynamics for years now. On another post, I stated that the second law refers to isolated systems only. A Mr. Bo said it was hogwash. I found a statement that seemed to support his position. I admitted that he was correct and quoted that statement. That statement is:
“The second law of thermodynamics for a closed system states that the total entropy change must be positive for an irreversible system and surroundings and zero for a reversible system and surroundings.”
I expected a response from Mr. Bo. He has been very quiet and non-responsive. The quote actually refers to isolated systems. Whenever you include a system and its surroundings, you are dealing with an isolated system. So, I’m confused. Do trained scientists of thermodynamics pretend they don’t do thermodynamics? Why doesn’t Mr. Bo respond?
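For reference, the way I read the quoted statement in symbols (my own paraphrase, not a quotation):

ΔS_total = ΔS_system + ΔS_surroundings >= 0

with ΔS_total > 0 for an irreversible process and ΔS_total = 0 for a reversible one. Because the system plus its surroundings exchanges neither energy nor matter with anything else, that combination is by definition an isolated system, so the “closed system plus surroundings” wording and the “isolated system” wording amount to the same statement.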
bdgwx,
“It comes from the law of propagation of uncertainty; not the CTL.”
My main point was that there is a systematic bias present. The scaling by 1/sqrt(N) normally increases precision because the denominator grows as the number of measurements N increases.
“c) It is unnecessary to classify components as random or systematic (or in any other manner) when evaluating uncertainty because all components of uncertainty are treated in the same way. Benefit c) is highly advantageous because such categorization is frequently a source of confusion; an uncertainty component is not either random or systematic. Its nature is conditioned by the use made of the corresponding quantity, or more formally, by the context in which the quantity appears in the mathematical model that describes the measurement. Thus, when its corresponding quantity is used in a different context, a random component may become a systematic component, and vice versa.”
You didn’t interpret this section correctly. Here’s an excerpt from this section:
“It thus exemplifies the viewpoint adopted in this Guide and cited in E.1.1, namely, that all components of uncertainty are of the same nature and are to be treated identically.”
If you read section E.1.1, it reads:
“This Guide presents a widely applicable method for evaluating and expressing uncertainty in measurement. It provides a realistic rather than a ‘safe’ value of uncertainty based on the concept that there is no inherent difference between an uncertainty component arising from a random effect and one arising from a correction for a systematic effect.”
“What did you ask ChatGPT?
This is the formula for the variance of the sum of i’s. It is not the formula for the variance of the average of i’s. Thus, why I want to know what you asked ChatGPT.”
I’m referring to Σ[σ_i^2, 1, N], not σ_avg^2 = (Σ[σ_i^2, 1, N] + N^2·σ_s^2) / N^2. Σ[σ_i^2, 1, N] is just notation that represents the contribution of uncertainty from all individual measurements. This could be human interpretation of the scale, for example, factoring out the environmental extreme’s impact on observer subjectivity that I mentioned earlier.
Regarding ChatGPT, I asked it to correct my grammar and punctuation in my sentence. I was editing my writing on here; I copied and pasted the corrected version of my writing directly from ChatGPT and forgot to take out:
“ChatGPT
ChatGPT
Your sentence is well-constructed, but there’s a minor grammatical suggestion for clarity:”
My interpretation of the section is that it is possible for a systematic component to become a random component when there is a context switch.
An example of a context switch is as follows. There are multiple measurement devices. Each device has its own systematic component of error. Device 1 is biased by the amount x1, device 2 by x2, and so on until you get to device N, which is biased by xN. The set of biases x{1..N} forms a probability distribution; some biases may be negative while others may be positive. So in the context of a single device the measurements are all contaminated by the same bias, but in the context of all devices the measurements are contaminated by different amounts. Therefore measurement models built in the context of a single device will have the error appearing as systematic, while measurement models built in the context of the group will have the error appearing as random.
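A minimal R sketch of that context switch, with made-up numbers purely for illustration (the device count, bias spread and true value are all hypothetical):

set.seed(42)
n_dev  <- 1000                              # hypothetical number of instruments
biases <- rnorm(n_dev, mean = 0, sd = 0.5)  # fixed systematic error of each device (made up)
true_T <- 20                                # true value of the measurand

# Context 1: many readings from a single device (random error ignored for clarity)
one_dev <- rep(true_T + biases[1], 100)
mean(one_dev) - true_T                      # stays at biases[1]; averaging within one device never removes it

# Context 2: one reading from each device in the group
group <- true_T + biases
sd(group)                                   # ~0.5: the per-device biases now look like random scatter
mean(group) - true_T                        # near 0 here only because the assumed bias distribution is centred on 0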
This is the familiar root sum square (RSS) formula. Using a more accepted format I’ll write it as u(sum)^2 = Σ[u(x_i)^2, 1, N].
This formula describes the combined variance of the individual uncertainties when the individual measurements are summed together.
For example, if I have two strings with lengths of 1.2 ± 0.3 m and 2.1 ± 0.6 m, then if I want to know their combined length it would be 1.2 m + 2.1 m = 3.3 m with an uncertainty of sqrt[(0.3 m)^2 + (0.6 m)^2] = 0.67 m, so we write it as 3.3 ± 0.7 m.
It is important to reiterate that the formula you present is for measurement models that compute sums. It is not the correct formula for measurement models that compute other quantities like products, averages, etc.
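For what it’s worth, a two-line R check of the string example above (no numbers beyond those already given):

u_sum <- sqrt(0.3^2 + 0.6^2)   # root sum square of the two length uncertainties, metres
u_sum                          # 0.6708..., reported as 0.7 m on the 3.3 m combined length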
Yeah, no worries there. I don’t have an issue with the use of ChatGPT. I’m just wondering why ChatGPT gave you the formula for the combined uncertainty of sums as opposed to the combined uncertainty of averages.
“My interpretation of the section is that it is possible for a systematic component to become a random component when there is a context switch.”
True, and not overstated. Yes, the more independent systematic components you have, the more randomized their combined effect becomes. And this condition is met with the millions of measurements processed over decades using many different processes.
The deflectors rely on one misconception built on another to claim that systematic errors result in modern trends that are qualitatively different from those we know. First, denial of the fact that the larger a systematic component of error, the more likely it is to be spotted and corrected. Second, and most awe-inspiring, that those errors would magically line up over time to qualitatively change the resulting trends. The last always brings to mind the scene of the monkeys at their typewriters who will, sooner or later, type out the encyclopedia for us.
You touch on an interesting point. That is, the larger the systematic error, the more likely it is to be noticed and corrected.
Thank you for this outstanding remark.
Can’t hold a candle to the depth/breadth of yours or bdgwx’s know-how. You both post in multiple tech venues, while this is the only one where even I can see and comment on the bad wiring exhibited.
bdgwx,
This chapter does not assert that systematic effects can undergo a context switch to behave more like random errors; instead, it suggests that a correction for the systematic error may be treated as such.
Just because systematic biases manifest in both negative and positive directions across a large quantity of instruments doesn’t transform them into random errors. Their influence is likely not scattered around the true value. However, we do know that their deviation would be consistent; that’s all that is needed to be classified as a systematic bias.
That is what it says…literally in section E.3.6.
The word “correction” appears nowhere in section E.3.6. That’s not to say that a correction cannot have a random and/or systematic effect itself. It can. But that doesn’t preclude systematic effects behaving like random effects when there is a context switch.
The GUM isn’t saying that systematic error transforms into random error when there are a large quantity of instruments. It is saying that when you view those systematic errors in a different context (like as a group instead of individually) they appear as random errors. They do so because those systematic errors are different and thus form a probability distribution function.
Said another way…when you randomly sample measurements from a group of instruments as opposed to one instrument you won’t necessarily get the same error every time like you would if you randomly sample measurements from a single instrument.
Not necessarily. There is no fundamental reason why the mean of the set of systematic errors could not be zero. That is not to say that it is always zero. It’s not. But this is a complication that can only be dealt with after more basic fundamental truths have been understood.
Correct. See section B.2.22. It should be obvious then that when you sample measurements from a group of instruments you will get different (as in not consistent) deviations from the true value of the same measurand even if there is no random error. So if you truly accept the statement you just made, the conclusion should be immediate and obvious: switching the context to a group of instruments with different systematic errors, as opposed to a single instrument with the same systematic error, necessarily results in the systematic errors behaving as random errors.
This is the first paragraph of the entire section (E.3):
“The focus of the discussion of this subclause is a simple example that illustrates how this Guide treats uncertainty components arising from random effects and from corrections for systematic effects in exactly the same way in the evaluation of the uncertainty of the result of a measurement. It thus exemplifies the viewpoint adopted in this Guide and cited in E.1.1, namely, that all components of uncertainty are of the same nature and are to be treated identically.”
There is a fundamental reason: the viscosity of mercury. If a rapid cold front passes through an area and the mercury doesn’t contract quickly enough to keep up with the air mass, there can be consistent, large positive deviations. Climate science demands precision to the tenth decimal place. To say that these errors average out to zero or near zero, and that they are scattered, is a huge assumption and wishful thinking. We don’t know the true value because we get one chance to record the measurement at a given time.
“There is a fundamental reason: the viscosity of mercury.”
Do you instead mean the specific heat and heat conductivity of mercury? Given the tiny volumes and column thickness of that mercury, I can’t imagine convection being any better with water for the conditions you’re describing. And also for those conditions, given those tiny volumes, the fact that mercury conducts electricity quite well, and the engineering rule of thumb that good electrical conductors are also good heat conductors, I can’t imagine the temperature difference between the mercury and the ambient air remaining meaningful for more than a minute or two.
No, no backup data. Do you have any to support your claim?
Compare and contrast:
[MONKEY MAN] If a rapid cold front passes through an area and the mercury doesn’t contract quickly enough to keep up with the air mass, there can be consistent, large positive deviations
[ALSO MONKEY MAN] Carl Mears tried to claim that the surface data is more accurate than satellite data, which is nonsense. He’s just a little bitch, who felt the pressure of having a dataset with the Pause.
https://www.drroyspencer.com/2024/03/uah-global-temperature-update-for-february-2024-0-93-deg-c/#comment-1638409
Who will tell him about how satellites extract their “measurements” or about Da Paws in 2016?
That has nothing to do with the points being made in E.3.6.
A better example is a tree growing in the vicinity of a thermometer: the thermometer will start reading lower and lower over time, resulting in a large negative deviation. At least in this case it is truly systematic, since the negative deviation is there all of the time.
Cold fronts don’t happen all of the time and when they do happen the temperature tends to remain low allowing the float to eventually catch up and record the actual minimum temperature regardless. BTW…the effect you’re talking about has a name…it’s called the time constant. For LiGs the typical time constant is on the order of 1-3 minutes.
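Here is a small R sketch of that lag: a first-order response with an assumed 2-minute time constant. The step size, starting temperature and time axis are all made up for illustration:

tau   <- 2                                  # ASSUMED time constant, minutes (within the 1-3 minute range above)
t     <- seq(0, 15, by = 0.5)               # minutes after the cold front arrives (made up)
T0    <- 10                                 # reading just before the front, deg C (made up)
T_air <- 2                                  # new, colder air temperature, deg C (made up)
T_read <- T_air + (T0 - T_air) * exp(-t / tau)   # first-order lag response of the thermometer
round(T_read[t %in% c(0, 2, 5, 10)], 2)          # the reading is essentially caught up well inside 10 minutes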
AI has been very unhelpful when providing these equations. Do you get them from the GUM? If so, in the future, alongside your provided equation, please provide the specific section.
The familiar root sum square (RSS) equation is a derivation of JCGM 100:2008 equation 10 when the measurement model is y = Σ[x_i, 1, N] since the partial derivative ∂y/∂x_i = 1 for all x_i. Notice that y combines measurements via a trivial sum here.
Nobody sums temperatures though. They average them. The measurement model that computes an average is y = Σ[x_i, 1, N] / N. In that case the partial derivative ∂y/∂x_i = 1/N for all x_i which means the uncertainty u(y) = sqrt[Σ[u(x_i)^2, 1, N]] / N. And if u(x) = u(x_i) for all x_i then it simplifies further to u(y) = u(x) / sqrt(N). That is the familiar standard error of the mean (SEM) equation.
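A short R illustration of that difference, with made-up numbers (u(x_i) = 0.5 for every one of N = 100 measurements):

u_x <- 0.5
N   <- 100
u_sum  <- sqrt(N * u_x^2)       # uncertainty of the sum:  sqrt(Σ[u(x_i)^2, 1, N])     = 5.0
u_mean <- sqrt(N * u_x^2) / N   # uncertainty of the mean: sqrt(Σ[u(x_i)^2, 1, N]) / N = 0.05
c(u_sum, u_mean, u_x / sqrt(N)) # the last entry is the SEM shortcut u(x)/sqrt(N)      = 0.05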
An interesting hint:
https://stats.stackexchange.com/questions/311728/why-is-it-called-standard-error-and-not-standard-uncertainty
*
Much more relevant, however, is the fact that the uncertainty gang ‘operating’ on this blog always asks ‘Where is your uncertainty calculation?’ but never provides anything of the kind, let alone would they ever be able to generate the data you present and then add the uncertainty calculation they say is missing from your presentation.
Even worse…the gang’s challenges are entirely based on making numerous algebra mistakes, some so trivial that even elementary-age students could spot them. I don’t know how many times I’ve had to explain to the Gormans that addition (+) and division (/) are different operations that produce different results, and that sums and averages are different operations that produce different results. I even tried to convince them to use a computer algebra system to check their work and the response was along the lines of “computer algebra systems are wrong”. That’s what we’re dealing with here.
Hogle
All posts sent to Roy Spencer’s blog containing the letter sequences ‘dc’ or ‘rpt’ are automatically ignored.
Thus you can’t write posts containing ‘absorption’ or ‘excerpt’, etc. But ‘option’, for example, isn’t a problem.
This is of course valid for links as well, e.g.
https://www.ncdc.noaa.gov/cdo-web/datasets
Thank you, Bindidon.
The monkey forgot to mention that if you see that a post of yours wasn’t sent successfully, you can get it back by clicking on the browser’s ‘history back’ arrow; move to the thread’s end if you don’t see it. There you find the rejected text and can edit it again, e.g. by inserting a blank between d and c or between p and t.