Guest Post by Bob Vislocky

From the department of “use garbage temperature data to show causation by correlation” comes this recent Washington Post article, which cites a report of a 60-fold decrease in the number of insects in Puerto Rico’s El Yunque rainforest … all because of climate change! Of course, this is “more widespread than scientists realized,” the article states!

https://www.washingtonpost.com/science/2018/10/15/hyperalarming-study-shows-massive-insect-loss/?utm_term=.0515c8ceaf77

According to the report the maximum temperature increased by 4°F over the 38-year period of study (1976-2013), and that was the lone cause cited for the decline of insects. Of course this immediately set off the BS detector, so further investigation was initiated. First, maximum temperature data (summarized by month & year) was obtained for the closest NWS station to the El Yunque rainforest (San Juan) and for the closest active NCDC coop station, which happens to be located just outside the rainforest (Juncos). These data are available from the San Juan NWSFO web site:

https://w2.weather.gov/climate/xmacis.php?wfo=sju

While there are no active NWS or Coop stations reporting within the El Yunque rainforest, there is one NWS Coop station inside the rainforest (Pico del Este) that reported maximum temperatures through 2004 (29 of the 38 years in the study). These data were compiled by NCDC over the years and maintained by the Southeast Regional Climate Center at the following web site:

http://www.sercc.com/climateinfo/historical/historical_pr.html

Maximum temperature data for the three stations were organized in a spreadsheet, uploaded to the following link, and graphed as Figures 1-3 below.

https://drive.google.com/open?id=1EdASW0qYTVCNkNcYAyAdqMxI-FPLSg5l

Figure 1.

Figure 2.

Figure 3.

Results show the max temperature trend in Juncos for the 1976-2013 study period was +0.0164°F per year. This amounts to a whopping gain of +0.62°F after 38 years. Meanwhile, the max temperature trend in San Juan for the same period actually **declined**, by 0.024°F per year! Finally, the trend from an actual site within the El Yunque rainforest (Pico del Este) was significantly negative, at -0.075°F per year from 1976 to 2004. Clearly, there is no obvious indication that the maximum temperature in the rainforest actually increased by the 4°F claimed in the report, at least when using trusted weather data from nearby reliable sources.
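For reference, the per-year trends quoted above are ordinary least-squares slopes fit to the annual max-temperature series. A minimal sketch of the calculation (the series below is synthetic, with a known slope built in; it is not the actual spreadsheet data):

```python
# Ordinary least-squares trend (deg F per year) for an annual max-temperature series.
# The series here is synthetic with a known slope; it is NOT the actual station data.
def trend_per_year(years, temps):
    """Least-squares slope of temps against years."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    var = sum((y - mean_y) ** 2 for y in years)
    return cov / var

years = list(range(1976, 2014))                      # the 38-year study period
temps = [85.0 + 0.0164 * (y - 1976) for y in years]  # synthetic, slope built in
slope = trend_per_year(years, temps)
print(f"trend: {slope:+.4f} F/yr  ->  {slope * 38:+.2f} F over 38 years")
```

With the synthetic series the fit recovers the built-in slope exactly; pointed at the real Juncos, San Juan, or Pico del Este series from the spreadsheet, the same function gives the per-year trends discussed above.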

So what weather data was used in the study? According to the original PNAS journal article that the Washington Post cited, the maximum temperature data was obtained from two research stations in the El Yunque rainforest (El Verde and Bisley), currently maintained by the USFS and USGS; that data is displayed in Figure 4 below.

https://www.pnas.org/content/115/44/E10397

Figure 4.

The problem is that the data from those two stations should never have been used in this research study. For example, Bisley only started reporting in 1994, which means its temperature record covers only the second half of the 38-year period used to study the insects. Unless the authors had temperature data going back to the start of their study period in 1976, how in the world can they claim that the insect decline from their first expedition in 1976-77 to the second one in 2012-13 was due to climate change?

The situation for the observing station in El Verde is outright comical. This station has had a checkered history, beginning with its odd choice of thermometer location (on top of a concrete roof). Roughly 25% of the data between 1976 and 2013 is reported as missing, with no values provided. From 1976-1978 and 1987-1989 the daily observations were not real; rather, long-term average values were substituted. Moreover, prior to 1992 many of the max temperature observations were extrapolated from surrounding locations. Additionally, from 1989-1992 the temperatures reported were often the current reading at observation time instead of the maximum temperature, so adjustments had to be applied to correct those values. To top all of that, there were instrument issues cited in the early 1990s that prompted replacement of the thermometer in September of 1992. The amount of corrupt data prior to 1992 was so extensive that the caretakers of the data set specifically state that this data is **suspect and not valuable for interpreting long-term trends**, as it would pollute the data record after 1992. Here are the links to the historical El Verde max temperature spreadsheets and, more importantly, the data descriptions where the issues regarding the data sets are exposed.

https://luq.lter.network/data/luqmetadata181

https://luq.lter.network/data/luqmetadata16
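A basic completeness screen is the kind of sanity check that should precede any trend fit on a record like this. A hypothetical sketch (the record format, flag labels, and 75% threshold are my own illustrative assumptions, not anything from the LTER metadata):

```python
# Screen a daily max-temperature record for completeness before trusting a trend.
# The record format, flag labels, and the 75% threshold are hypothetical choices.
def completeness(record, threshold=0.75):
    """record: list of (value, flag) pairs; value is None when missing, and
    any flag other than "ok" marks substituted/extrapolated/adjusted values."""
    usable = sum(1 for value, flag in record if value is not None and flag == "ok")
    frac = usable / len(record)
    return frac, frac >= threshold

record = [(30.1, "ok"), (None, "missing"), (29.8, "extrapolated"), (30.4, "ok")]
frac, fit_a_trend = completeness(record)
print(f"usable fraction: {frac:.2f}, safe to fit a trend? {fit_a_trend}")
```

By the numbers quoted above (roughly 25% missing before 1992, plus substituted and extrapolated values on top of that), the pre-1992 El Verde record would fail even a lenient screen of this kind.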

But wait, there’s more! After the instrument change in September 1992, recorded maximum temperatures increased substantially at the station! As a result, a correction factor was applied to the data beginning in September of 1992 to make it compatible with the previous data record; however, the exact nature of the correction was not documented. Unfortunately, the correction factor ceased to be applied starting in 1997, so who knows what impact this had on the temperature trend. Lastly, even after the 1992 instrument change, over 35% of the daily readings are still reported as missing.

Despite all the warts on the post-1992 data, the caretakers of the El Verde max temperature database provided a handy tool to plot the post-1992 data on a graph (see link below) as a complement to the spreadsheet data.

https://climhy.lternet.edu/plot.pl

Interestingly, when the data is plotted (see Figure 5) it reveals a cooling trend, contrary to what the authors describe for El Verde!!

Figure 5.

By now it should be obvious that the max temperature data used by the researchers in this study was total garbage. There is no way they can justify a 4°F temperature rise from 1976 to 2013 and claim that climate change is killing off the insects inside the rainforest. Moreover, for the researchers to use the El Verde and Bisley temperature data blindly when more reliable weather data was available from nearby trusted sites, at least as a double-check, is nothing short of gross incompetence. This is especially true considering that their entire thesis that insects are affected by climate change rests on the integrity of the temperature data. The authors should also understand that correlation does not equal causation. Rather than blame climate change, perhaps the researchers should have looked elsewhere, such as land development, deforestation, tourist encroachment, pollution and invasive species, as the following articles point out.

https://elyunquetourism.weebly.com/environmental-issues.html

https://youandyunque.weebly.com/water-pollution.html

Will Mosquito still be a State Bird of Alaska?

Already is.

From what I know of how such insect counts are done, it involves fogging a tree with insecticide, then collecting and classifying everything that falls out. The procedure seems prone to artifacts of method: which species of tree, what kind of insecticide, and the rigor of the collection process, to name a few possible glitches.

That is on top of the dubious temperature records noted in the post.

That was my first thought: were the collection techniques the same? How could they be? Next thought: this technique was never designed to give population counts so much as to opportunistically identify species types.

I’ve been involved in biosurveys in the past, and the technique is haphazard at best and is prone to being misused to extrapolate things and draw conclusions that should never be made. For example, a species could be observable in the plot right beside the marked plot but not inside the plot being measured; ergo, it’s “extirpated” from the area, and the next thing you know it’s being proposed that the thing is ‘going extinct’.

Lastly I wonder if I’d be allowed to pour toxins into a river today as I was 25 years back to do a survey.. if not, how the heck can I compare anything using that technique to the modern Greenpee approved biodegradable, vegan, gluten free, eco-friendly, e-commerce, diversity and gender neutral method used today?

A 60-fold decline in insects!

It’s a tourist site with around 600,000 visitors in 2005. Gee, I wonder if that could have anything to do with a decline in insects.

With that much decline in the number of insects, I wonder how the amazingly lush jungle plants are pollinated.

So many questions …

Maybe the insects are being harvested for human consumption.

I see about 1 C deg of warming at that location.

https://www.esrl.noaa.gov/psd/cgi-bin/data/timeseries/timeseries.pl?ntype=1&var=Air+Temperature&level=1000&lat1=18.&lat2=19&lon1=65&lon2=66&iseas=0&mon1=0&mon2=1&iarea=0&typeout=2&Submit=Create+Timeseries

Re-analyses are good?

Pretty good eye, on my part. I downloaded the NOAA data and plotted the annual average with a trend. The warming there is 1.15 C deg/century. Notably, there has been cooling there since 1998.

Can someone remind me how to embed a chart?

The temps there have fallen since 1998. It’s been flat since the mid-80s.

Period of study in the article was from 1976 to 2013, so that’s the pertinent trend.

Temps are rising slightly over the period 1976-2013. Less than 1 deg/century, or about 1/3 of a degree over the study. Temps at the start and end are nearly identical.

Keep in mind that the authors specifically blamed the rise in *maximum* temperatures. The reanalysis data just shows the *average* temps. It’s a subtle but important distinction, since AGW has more influence on nighttime surface temps than daytime. Nonetheless, no matter how you look at it, there’s still no way it rose by what the authors claimed over the 38-year study period.

Les, the authors were specific in blaming the increase in maximum temperature. The reanalysis data link provided does not display the max temp.

Tourists to Puerto Rico’s El Yunque rainforest area would be very, very grateful for the decline in insect numbers. However the decline is probably as fictitious as the temperature report. It takes more than a bit of warmth to knock out insects.

Most insects thrive on extra warmth, their breeding cycle shortens.

Not finding a link to actual report with the claims of insect reduction. I will assume the collection time over the years used for comparisons was the same; otherwise that introduces skewed comparisons.

My surmise, taking the temperature correlation as flawed (i.e., the O.P.’s discovery), is that the culprit for the measured insect decline is changes in the pathogen levels they host. As a consequence, bugs are vulnerable to substantial colony die-offs; blaming water pollution is too simplistic.

https://www.pnas.org/content/115/44/E10397

Having now read the report, I will address its mention of the walking stick insect as an example of my above comment. These bugs do have an entomopathogen they are known to sometimes host.

Certain leaf mites can latch onto the larval walking stick and establish themselves on the walking stick’s abdomen. These mites can themselves host a helical bacterium with no proper cell wall, called Spiroplasma; there are different kinds of Spiroplasma, and not all are problematic for the insect they get into, but some are (e.g., GenBank’s unnamed AY569829).

What happens is that Spiroplasma can cross the walking stick’s gut barrier and get into the bug’s hemolymph (circulatory system), with fatal results depending on which species of Spiroplasma is involved. Alternatively, the Spiroplasma can enter the walking stick’s reproductive system, which can result in male bugs being killed (walking stick females can still provide the colony with young, but I posit at a smaller replacement rate); again, depending on which species of Spiroplasma is involved.

Quite possibly the mites can transfer their Spiroplasma species’ genes to the walking stick. The evidence for this gene transfer is that the above-referenced AY569829 (a sequenced GenBank accession) Spiroplasma species was found in walking sticks that were progeny of a laboratory colony that was not reared on wild leaves and, being a lab colony, had no parasitic mites. If part of the studied Puerto Rico region’s walking stick population carries “bad” Spiroplasma species’ genes via transfer, this would have an even more dramatic impact on bug colony survivorship than parasitic mites infecting random bugs with a “bad” Spiroplasma.

Dragonflies are another insect with a reported fatal Spiroplasma species. To be precise, not all Spiroplasma species are bad for bugs; some confer benefits on their host bugs.

If their ‘science’ (and honesty) is as good as this post suggests, how much should we trust their bug counting?

Maybe there was a bug in their software.

thats not a bug

its a feature;-)

I’m leery of the bug count. A 60-fold decline just doesn’t sound plausible to me unless there were significant changes in the local flora and fauna.

I have personally seen a significant decline in insects. I lived in the last suburb before active and abandoned farmland started. As development caught up to the area, insects such as grasshoppers and praying mantises became scarce. The hatches of seventeen-year locusts declined substantially. The farms, fields, and forests declined as the dominant land cover.

Now I’m not saying their study area went from rainforest to shopping malls between the counts, but I’m not so sure the two studies were under the same conditions, with the only thing that changed over time being the temperature (which was pointed out as being most likely negligible and opposite in sign).

This study strikes me as somehow being less than rigorous.

and the cycles of the bugs as well

butterflies have been few here for years

but this year? I would say i counted 10 in my yard the other day.

grasshoppers and mantises are rare

locusts not so much

little browny-gold dung beetles were in epic numbers in 07 to 11

so were rutherglen bugs

this year so far anyway very few to zero gold ones

last yr a few rutherglen ones

they always appear as mulberry tree ripens;-(

rutherglen bugs ruin the fruit

“However, as climate warming continues, the frequency and intensity of hurricanes in Puerto Rico are expected to increase (133), along with the severity of droughts and an additional 2.6–7 °C temperature increase by 2099 (134), conditions that collectively may exceed the resilience of the rainforest ecosystem.”

It is statements like this, which are found in almost every climate science study report, that exasperate me. There is no mechanism to take one’s PhD away from these fools. And there is no law against willingly telling a lie. How this is possible in the year 2018 is mind boggling. Any database in the world on extreme weather events shows that there are no more extreme weather events now than there ever were. How can PhDs get away with this? How can anybody seriously think that the average global temperature will increase 7C even if we burned all the fossil fuels in the ground?

applet-magic.com/cloudblanket.htm

Clouds overwhelm the DWIR produced by CO2. At night, with and without clouds, the temperature difference can be as much as 11C. The amount of warming provided by DWIR from CO2 is negligible. The pyrgeometers assume an emission coefficient of 1 for CO2, but CO2 is NOT a blackbody. Clouds contribute 85% of the DWIR; GHGs contribute 15%. See the analysis in the link. The IR that hits clouds does not get absorbed. Instead it gets reflected. When IR gets absorbed by GHGs it gets reemitted, either on its own or via collisions with N2 and O2. In both cases, the emitted IR is weaker than the absorbed IR. Don’t forget that since CO2 is an isotropic molecule, it emits in all directions. Therefore a little less than 50% of the absorbed IR by the CO2 gets reemitted downward to the earth surface.

Reflected IR from clouds is not weaker. Water vapour is in the air and in clouds; even without clouds, water vapour is in the air. No one knows the ratio of the amount of water vapour that has condensed to water/ice in the clouds compared to the total amount of water vapour/H2O in the atmosphere, but the ratio can’t be very large. Even though clouds cover on average 60% of the lower layers of the troposphere, since the troposphere is approximately 8.14 x 10^18 m^3 in volume, the total cloud volume in relation must be small, certainly not more than 5%. H2O is a GHG. Water vapour outnumbers CO2 by a factor of 50 to 1, assuming 2% water vapour. So of the original 15% contribution by CO2 of the DWIR, we have 0.15 x 0.02 = 0.003, or 0.3%, to account for CO2. Now we have to apply an adjustment factor to account for the fact that some water vapour at any one time is condensed into the clouds. So add 5% onto the 0.003 and we get 0.00315, or 0.315%. CO2 therefore contributes 0.315% of the DWIR. We will neglect the fact that the IR emitted downward from the CO2 is a little weaker than the IR that is reflected by the clouds. Since, as above, a cloudy night can be 11C warmer than a clear-sky night, CO2 contributes 0.00315 of 11C = 0.03465 C. Without clouds, there is no physical reason why CO2’s effect would be any greater than this 0.03465 C.

So how would any average global temperature increase by 7C, or even 2C, if the maximum warming effect of CO2 from downward back-IR is only 0.03465 C? Sure, if we quadruple the CO2 in the air, which at the present rate of increase would take 278 years, we would increase the effect of CO2 (if it is a linear effect) to 4 x 0.03465 = 0.1386 C. Whoopedy doo!
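For what it’s worth, the chain of numbers in the two comments above multiplies out as stated; the sketch below simply reproduces the commenter’s arithmetic step by step and is not an endorsement of the underlying physical assumptions:

```python
# Step-by-step reproduction of the comment's arithmetic (inputs exactly as stated;
# this is the commenter's chain of numbers, not a validated physical model).
ghg_share = 0.15              # claimed: GHGs contribute 15% of DWIR (clouds 85%)
co2_vs_h2o = 0.02             # claimed: CO2 is outnumbered by water vapour 50 to 1
base = ghg_share * co2_vs_h2o             # 0.15 x 0.02 = 0.003
adjusted = base * 1.05                    # +5% for vapour condensed into clouds
cloudy_clear_delta = 11.0                 # deg C, cloudy vs clear-night difference
effect = adjusted * cloudy_clear_delta    # claimed maximum CO2 warming effect
print(f"{base:.5f} -> {adjusted:.5f} -> {effect:.5f} C")
```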

”Therefore a little less than 50% of the absorbed IR by the CO2 gets reemitted downward to the earth surface”.

I thought radiating back to earth was impossible due to second law of TD. I thought the theory was that radiation transfer to space was slowed because of capture by co2. Would not the atmosphere need to be COOLER as it approaches the Earth’s surface for IR to move back down?

Please excuse my ignorance if I misunderstand…

Second Law is about the **net** transfer of energy – more energy will always go from “hot” to “cool” than from “cool” to “hot.” That doesn’t mean that a single quantum of energy, if we could observe it, won’t go from “cool” to “hot.” (Several do, in fact – just some figure less than 50% of them.)

By the way, there is a confusion up there – yes, carbon dioxide is an isotropic molecule, but that is not significant in any way. Even if it only emitted absorbed IR in a single direction (relative to the structure), the molecules themselves are randomly arranged in the atmosphere, so they would be emitted in all directions anyway, relative to the surface of the Earth. That less than 50% are emitted towards the surface (DWIR) is due to the fact that the Earth is an approximate sphere; from the “viewpoint” of an atmospheric molecule, the Earth’s surface covers less than half of the “view.”

It is considerably less. Imagine the co2 molecule as a cube. Only one side of six could face the Earth at one time. Or imagine it as a randomly spinning gun. The earth as a target gets rapidly smaller with distance and the odds of hitting it increasingly small.

Now imagine the molecule as a full glass of water and infrared radiation as water. Pour more water in and the same amount spills out. Once full none is retained.

Now you can imagine that molecule as a spherical mirror from which all radiation is reflected. Since most would miss the Earth the effect would be cooling.

I am certain some physicist with a pencil can explain why common sense does not apply but have I seeded a little doubt?

Very first lecture in Heat Transfer class, first paragraph after the introductions, the professor said: Heat moves **only** from the higher temperature to the lower temperature. It cannot go from colder to hotter. There are no exceptions. And 40 years of my experience and other peoples’ research still has revealed **no** exceptions.

Infrared radiation isn’t heat. There is no law to prevent it going from a colder to a hotter object. But more radiation will be going the other way.

Yes, infrared **is** heat. That was paragraph 2 of that same lecture: heat moves by 3 different methods, 1) conduction, 2) convection, and 3) radiation (that’s where that difference of T^4 comes in). Why do you think **all** heat transfer equations have some form of ∆T? I repeat, heat can only move from the higher temperature to the lower temperature; it cannot move from any lower temperature to a higher temperature.

heat energy moves by 4 ~~3~~ different methods: 1) conduction (proportional to delta T^1), 2) convection (proportional to delta T^1), 3) radiation (that’s where that difference of T^4 comes in), and 4) evaporation (proportional to delta T^1).

Where did you learn Heat Transfer? And I don’t mean that sarcastically; I mean what year, what school, and what textbook? I don’t think evaporation is a separate method of heat transfer, at least not the way I learned it. I think the equation for convective heat transfer still applies, and evaporation is addressed as merely part of the heat capacity of the fluid. But I also thought for convection to happen, heat moved from a solid to a fluid only by what fluid was in contact with the solid, i.e., conduction. I think the professor explained that by saying conduction was solid-solid while convection was solid-fluid.

No, infrared radiation **moves** heat but **isn’t** heat. It is longwave photons. Heat is atomic/molecular movement. And a photon can perfectly well be emitted by a colder source and be absorbed by a warmer target. The photon has no memory, and a single photon transmits very little information about the temperature of the source (though the frequency distribution of a large number of photons emitted by a source does).

If we make a thought experiment: two isolated parallel metal plates in interstellar space, one at 500 K and one at 300 K. Both will emit IR radiation into the very cold (3 K) space, but they will also both emit IR radiation to the other plate (albeit with different IR spectra) and absorb IR radiation from the other plate, and both will therefore cool slower than if they were alone in space. However, the **net** heat flow will of course be from the hotter to the colder plate.

It is very strange that this simple physical phenomenon seems completely incomprehensible to a lot of people.
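The two-plate thought experiment has a standard textbook form: each black plate emits σT⁴ per unit area, and the net exchange is σ(T_hot⁴ − T_cold⁴). A quick sketch with the temperatures given above (idealized: emissivity 1, plates large enough that view-factor details drop out):

```python
# Net radiative exchange between two large parallel black plates (Stefan-Boltzmann).
# Idealized: emissivity 1, view-factor effects ignored.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emission(T):
    """Blackbody emission per unit area, W/m^2."""
    return SIGMA * T**4

hot, cold = 500.0, 300.0            # plate temperatures, K
hot_to_cold = emission(hot)         # radiation leaving the hot plate
cold_to_hot = emission(cold)        # radiation going the "wrong" way, cold -> hot
net = hot_to_cold - cold_to_hot     # net flow is still hot -> cold
print(f"hot emits {hot_to_cold:.0f}, cold emits {cold_to_hot:.0f}, net {net:.0f} W/m^2")
```

The cold plate really does send energy to the hot one; the Second Law is satisfied because the net flow is from hot to cold, which is exactly the point of the comment above.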

Seems you studied engineering, because there are few scientists who have any understanding of heat transfer, certainly not those who call themselves climate scientists. Engineers work with facts and use equations based on empirical data. The 4th postulate of thermodynamics (often called the 2nd law) is based on entropy, which is defined in the 3rd postulate. Boltzmann wrote a proof. There is nothing “net” about it. It is a fact even at an atomic scale.

Another fact is that the Stefan-Boltzmann equation only applies to surfaces in a vacuum. CO2 in the atmosphere is a gas; it has no surface. Prof. Hottel made tests which showed one could use a volume represented by partial pressures and a path length for the radiation. His equation uses an emissivity which, for CO2 at atmospheric temperatures, is close to zero, as is its partial pressure in the atmosphere.

RA Cook, the fourth heat transfer mechanism is called phase change, which involves latent heats, e.g. ice going to liquid water and liquid water going to vapour or steam. The boiling point of a liquid depends on the pressure, or more correctly the partial pressure in the gas over the liquid. A vacuum aids evaporation. For example, in a cyclone or tornado the pressure is reduced, and this sucks up water which feeds the cyclonic action (the heat comes from the ocean or lake surface).

Everything else being equal, there is a real effect of DWIR. You can see it on a cloudy night versus a clear-sky night. As I said, the difference in temperature can be as much as 11 C. However, clouds are transitory, and water vapour is never as evenly mixed in the atmosphere as CO2 is. Therefore CO2 may well have a constant DWIR component, as I calculated above. Even that is not clear, as the collisions with N2 and O2 and convection may completely negate the CO2 DWIR effect. My calculations were really only to show the maximum effect that CO2 may have. This maximum effect of 0.03465 C may well be ZERO, as explained above, but since the science is unsettled, I strove to calculate the maximum possible effect CO2 could have. The reason why clouds don’t cause runaway global warming over the oceans is twofold: 1) there is not enough upward IR from the ocean surface to worry about, and 2) the clouds are always moving on; they never stay in the same spot. The CO2 effect would be the same over oceans as on land, but if we are worrying about CAGW from those numbers we are fools. See the link below for the derivation of the 85% cloud effect versus the 15% CO2 effect.

“The IR that hits clouds does not get absorbed. Instead it gets reflected.”

I think you will find it is refracted, not reflected. I can’t vouch for your mathematics but I agree about the clouds.

I was out last night with an IR camera. The temperature of the clouds over the small city of Waterloo was -17 C while the air temperature was zero. Without the clouds the night sky would have been much colder.

http://applet-magic.com/cloudblanket.htm

I mistyped the URL

http://www.sjsu.edu/faculty/watkins/resume2.htm

DON’T UNDERESTIMATE THIS MAN. His resume is absolutely amazing. If you go to his website you will find a list of his mathematical and physical (etc.) essays that is truly mind boggling. In one of his essays he shows where even Einstein made a mistake in his 1905 theory of special relativity. He has written at least three articles on global warming, one of which I alluded to in the above.

http://www.applet-magic.com/newpages.htm

I took his conclusion to the bitter end to show the real maximum warming effect of CO2 which isn’t much.

He has written many articles on global warming

http://www.sjsu.edu/faculty/watkins/normalvariation.htm

Here is an argument against using normal-distribution statistical techniques when analyzing climate data.

Does it apply to temperature data?

The following is one of Thayer Watkins shorter essays.

“Normal Variable Statistics and the Assessment of Variation

One of the big issues in the policy debate concerning the impact of human activities on climate is how to assess the extent of variation in weather conditions under the existing climate. Everyone accepts that weather is variable, and that even if there is no change in the climate there will be surprises, even disastrous surprises, from time to time. The question is how far out of line an event can be and still be within the normal variation of the weather.

The usual approach is to compile a record of the past values of the condition in question and compute the mean and standard deviation. Those statistics are then used to compute how many standard deviation units away from the mean is the condition in question. Assuming a normal bell-shaped distribution the probability of getting an event that far or farther from the mean can be computed. If that probability is very, very low then the usual conclusion is that the event in question was not due to normal variation but the result of some change in climate; i.e., the underlying distribution of the weather variables.
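The “standard deviation units from the mean” procedure Watkins describes reduces to a z-score against the normal tail. A minimal sketch (the rainfall numbers are made up for illustration):

```python
# Tail probability of an event z standard deviations above the mean, assuming
# a normal distribution (the very assumption Watkins goes on to question).
import math

def normal_exceedance(x, mean, sd):
    """P(X >= x) for a normal variable with the given mean and sd."""
    z = (x - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# hypothetical example: a month with rainfall 10 in. where history gives mean 2, sd 2
p = normal_exceedance(10.0, 2.0, 2.0)   # a 4-sigma event
print(f"P(>= 4 sigma) = {p:.2e}")
```

Under normality a 4-sigma event has a probability of about 3 in 100,000; Watkins’s point is that when the true distribution is fat-tailed, this normal-tail figure drastically understates the chance of extremes.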

There is often good reason to assume normal bell-shaped distributions because of the Central Limit Theorem. The Central Limit Theorem is a powerful, marvelous mathematical result that says that sums of statistically independent random variables tend toward normal distributions. The larger the number of independent variables, the closer the distribution of the sum is to a normal distribution. The Central Limit Theorem is applied to random sampling. Usually if the sample size is 30 or greater it is presumed that the sample means can safely be taken to have a normal distribution. A normal distribution is completely characterized by two parameters: the mean and standard deviation. Once those two parameters are known, the probabilities are completely given. Normal statistics, meaning standard statistics, is founded on the normal bell-shaped distribution. The point being made here is that such normal statistics cannot be safely applied to weather condition statistics.

The Central Limit Theorem (CLT) was discovered in the 19th century after normal distributions were found in empirical investigations. Statisticians thought the CLT was of universal application. By the 20th century mathematicians were discovering limitations on its applicability. There was a hidden assumption. The independent variables involved in a sum had to be of finite variance.

The Stable Distributions

One significant property of normal distribution variables is that sums of such variables also have a normal distribution. The French mathematician Paul Lévy investigated this property. He defined a stable distribution as one such that if two variables have a stable distribution then their sum will also have a stable distribution. He found that there is a family of stable distributions characterized by four parameters. One parameter represents the central tendency of the distribution; for a normal distribution this is the mean value. Another represents the dispersiveness of the distribution; for a normal distribution the dispersiveness parameter is equal to its standard deviation. Another parameter represents skewness; normal distributions are symmetrical about their means, and their skewness is equal to zero. The fourth parameter is one that represents the shape of the distribution. It is usually called the α parameter. For normal distributions this shape parameter α is equal to 2. For all other stable distributions α is less than 2. The graph below shows the influence of α on the distribution.
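To see what the α parameter does to the tails, one can simulate the two extremes of the stable family: the normal distribution (α = 2) and the Cauchy distribution (α = 1, generated here by the inverse-CDF tangent trick). A quick sketch (sample size and threshold are arbitrary choices of mine):

```python
# Tail weight for two members of the stable family: normal (alpha = 2) versus
# Cauchy (alpha = 1). Counts samples landing more than 10 dispersion units out.
import math
import random

random.seed(0)
N = 100_000
normal_tail = sum(1 for _ in range(N) if abs(random.gauss(0, 1)) > 10)
cauchy_tail = sum(1 for _ in range(N)
                  if abs(math.tan(math.pi * (random.random() - 0.5))) > 10)
print(f"normal: {normal_tail} of {N}, Cauchy: {cauchy_tail} of {N}")
```

A normal variable essentially never strays 10 units from the centre; the Cauchy does so several thousand times in 100,000 draws, which is the heavy tail that α < 2 buys you.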

The Fat-Tailed Distributions

A decade or so before Paul Lévy’s theoretical work, an economist named Wesley Clair Mitchell discovered that rates of return in stock markets did not truly have normal distributions even though the distributions looked more or less like normal distributions.

Because a higher proportion of the probability was in the tails of the distribution compared with the case of the normal distribution such distributions were called fat-tailed distributions. They were also given a name based upon Greek, leptokurtic.

As seen above, the leptokurtic distributions deviate from normal distributions not only in being fat-tailed but also in being more peaked. This means the leptokurtic distributions not only have more extreme deviations from the mean but also more cases of small deviations from the mean. They differ from the normal distributions in having fewer moderate deviations from the mean. The fact that there is an excess of small deviations would tend to lead observers to underestimate the volatility of such variables; at least until a very large deviation comes along.
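“Leptokurtic” is measurable as positive excess kurtosis: zero for a normal distribution, positive for fat-tailed ones. A sketch comparing a normal sample with a Laplace (double-exponential) sample, a textbook leptokurtic case; the distributions chosen here are illustrative, not from Watkins’s essay:

```python
# Excess kurtosis: ~0 for a normal sample, ~+3 for a Laplace (double-exponential)
# sample, the classic fat-tailed comparison.
import random

random.seed(1)

def excess_kurtosis(xs):
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2**2 - 3.0

normal = [random.gauss(0, 1) for _ in range(100_000)]
laplace = [random.expovariate(1) * random.choice((-1, 1)) for _ in range(100_000)]
print(f"normal: {excess_kurtosis(normal):+.2f}, Laplace: {excess_kurtosis(laplace):+.2f}")
```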

The leptokurtic distributions are just special cases of the Levy stable distributions. Furthermore, there is a generalization of the Central Limit Theorem that says that the sum of a large number of independent random variables will have a stable distribution. Thus if some phenomenon such as changes in stock prices or rain from a storm is the result of a large number of independent influences then it would be expected that the distribution would be a stable distribution but not necessarily a normal distribution.

If a distribution is fat-tailed then that fact would account for unexpected extreme cases, and consequently for large changes in variables, the sort of occurrences associated with catastrophes.

An Application of Stable Distributions

The notion of Lévy stable distributions was applied to the rainfall statistics for the San Jose, California area. The resulting estimates of the parameters for the stable distributions for the monthly statistics are found at San Jose Rainfall and Parameter Estimates. San Jose does not ordinarily have extreme events but in September of 1918 there was one such. September is normally a low rainfall month. The rainy season does not normally start until November. The estimated values of the parameter α for most months were in the range 1.7 to 1.9, but for September the value of α was 1.1. September is therefore a prime candidate for an extreme weather event.

Extreme Weather Events

In 1918 the major industry of what is now called Silicon Valley was drying plums to make prunes. The Santa Clara Valley was the prune capital of the world. In September the plums were laid out in wooden flats to dry in the sunshine. On September 11th through 13th there were six inches of rain. The whole prune crop for the year was ruined.

Although that September deluge was a disaster for the San Jose area, it could not compare with the disaster that occurred in north central China in August of 1975. There were two major dams and about sixty smaller dams built on the river systems of north central China. The two major dams were each built to handle a maximum of about a half meter of rainfall over a three-day period. In early August a typhoon moved over China in the south and traveled north to where its warm, humid air encountered the cold air of the north. On the very first day, August 5th, the area received a half meter of rainfall, and the storm continued raining for another 13 hours the second day and 16 hours the third day. Because of planning errors and operational policy errors the dams could not hold the water or even pass it through. Instead the two major dams collapsed, along with about sixty of the smaller dams on the river system. It was a colossal disaster in which about eighty-five thousand people were killed outright and about eleven million severely affected.

Clearly a major source of the disaster was the underestimate of the chances of a severe storm. The major dams were built to withstand storms with annual probabilities of only 1/500 and 1/1000. These were drastic underestimates, based upon a limited record and theoretical distributions which did not take into account the probabilities of storms of a severity not yet seen. The stable distributions allow the shape of the distribution fitted to the past record to provide information about the probabilities of storms so severe that they have not yet occurred in the record.

The Standard Deviation as a Measure of Variation

Normal statistics uses the standard deviation as the measure of variability, and this is appropriate when the distributions involved are normal bell-shaped ones. For the non-normal stable distributions the standard deviation is infinite: the standard deviation of a finite sample never settles down to a single value and is basically meaningless. The dispersion parameter for a non-normal stable distribution is finite, but it is not the same as the standard deviation. It is therefore meaningless to evaluate an unusual event in terms of the number of standard deviation units away from the mean.

It should be noted that there are many perfectly legitimate probability distributions for which the standard deviation is infinite. The standard deviation for a distribution is finite only if the probability goes to zero faster than 1/x² where x is the deviation from the mean value.
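The claim that a sample standard deviation "never settles down" is easy to illustrate with a quick simulation. The Cauchy distribution below (a stable law with α = 1) is chosen only because NumPy ships a sampler for it; the sample sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def running_std(x):
    """Standard deviation of the first n values, for each n."""
    n = np.arange(1, len(x) + 1)
    csum = np.cumsum(x)
    csq = np.cumsum(x * x)
    var = csq / n - (csum / n) ** 2
    return np.sqrt(np.maximum(var, 0.0))  # clip tiny negatives

# Normal data: the running std settles near the true value (1.0).
s_norm = running_std(rng.normal(0.0, 1.0, 100_000))

# Cauchy data (stable with alpha = 1, infinite variance): the running
# std never converges; each new extreme observation resets it upward.
s_cauchy = running_std(rng.standard_cauchy(100_000))

print(s_norm[-1])                      # close to 1
print(s_cauchy[9_999], s_cauchy[-1])   # keeps drifting
```

The normal series gives essentially the same answer at any large n, while the Cauchy series gives a "standard deviation" that depends mostly on the largest value seen so far.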

Sample Means

Where the Central Limit Theorem applies, the mean of a random sample is an unbiased estimate of the population mean. For a sufficiently large sample size the distribution of the sample mean can be taken to be a normal distribution. The standard deviation of that normal distribution of sample means can be estimated from the sample values.

If the data has a non-normal stable distribution the sample means will also have a non-normal stable distribution and that will hold true no matter how large the sample size. For this case the sample standard deviation is meaningless.
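A minimal sketch of that point, again using the Cauchy distribution as the non-normal stable example (trial counts and sample sizes are arbitrary): the spread of the sample mean of normal data shrinks like 1/√n, while for Cauchy data the mean of n values is again Cauchy with the same scale, so averaging buys nothing.

```python
import numpy as np

rng = np.random.default_rng(2)

def iqr_of_sample_means(sampler, n, trials=5000):
    """Interquartile range of the distribution of the sample mean."""
    means = sampler((trials, n)).mean(axis=1)
    q75, q25 = np.percentile(means, [75, 25])
    return float(q75 - q25)

normal = lambda size: rng.normal(0.0, 1.0, size)
cauchy = lambda size: rng.standard_cauchy(size)

# Normal: averaging 100x more values shrinks the spread about 10x.
n10, n1000 = iqr_of_sample_means(normal, 10), iqr_of_sample_means(normal, 1000)
# Cauchy: the spread of the sample mean does not shrink at all.
c10, c1000 = iqr_of_sample_means(cauchy, 10), iqr_of_sample_means(cauchy, 1000)

print(n10, n1000)
print(c10, c1000)
```

Both Cauchy interquartile ranges come out near 2 (the IQR of a standard Cauchy) regardless of sample size, which is the stable-distribution behavior the paragraph above describes.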

Conclusions

Some weather/climate statistics have been shown to have non-normal stable distributions. Therefore it can never be safely assumed that any arbitrary statistic will have a normal distribution. This applies to sample means as well as to the original variables. Therefore statistical tests based upon the standard deviation cannot be validly applied. This means that statistical tests purporting to show that some observation is beyond the normal variation are unlikely to be valid.”

DOES THIS APPLY TO TEMPERATURE DATA? I THINK IT DOES.

The rub lies here:

“sums of statistically independent random variables tend toward normal distributions”

Climatic time series are very often (not always) strongly autocorrelated and thus not independent. The temperature tomorrow is not independent of that of today. Let us say that we have a time series of daily temperatures with a long-term average of 60 F. On day 1 we measure a temperature of 95 F. If the Day 2 temperature were independent of Day 1, the expected value for Day 2 would be 60 F. However it isn’t; it is much closer to 95 F. This is autocorrelation, and it means that using statistical methods requiring independent samples will give completely erroneous results.
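That argument can be sketched numerically. The AR(1) persistence of 0.8 below is an illustrative assumption, not a fitted value for any real temperature record; the point is how much an autocorrelated series shrinks the effective number of independent samples:

```python
import numpy as np

rng = np.random.default_rng(3)

def lag1_autocorr(x):
    """Lag-1 autocorrelation coefficient of a series."""
    x = x - x.mean()
    return float(np.sum(x[:-1] * x[1:]) / np.sum(x * x))

# Independent daily "temperatures": tomorrow has no memory of today.
iid = rng.normal(60.0, 10.0, 20_000)

# AR(1) series: tomorrow = mean + phi * (today's anomaly) + noise,
# a crude sketch of persistent weather (phi = 0.8 is an assumption).
phi = 0.8
noise = rng.normal(0.0, 10.0 * np.sqrt(1 - phi**2), 20_000)
ar = np.empty(20_000)
ar[0] = 60.0
for t in range(1, len(ar)):
    ar[t] = 60.0 + phi * (ar[t - 1] - 60.0) + noise[t]

r_iid, r_ar = lag1_autocorr(iid), lag1_autocorr(ar)
print(r_iid, r_ar)  # near 0 vs near 0.8

# Effective sample size under AR(1): n * (1 - r) / (1 + r).
n_eff = len(ar) * (1 - r_ar) / (1 + r_ar)
print(n_eff)  # far fewer independent samples than 20,000
```

With a lag-1 correlation of 0.8, twenty thousand daily observations carry roughly the information of only a couple of thousand independent ones, which is exactly why naive confidence intervals on autocorrelated climate series come out far too narrow.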

Very elementary statistics theory, but seemingly unknown to many “climate scientists” who blithely use methods and formulas only applicable to normally distributed data. For example, that 2 SD equals a 95% confidence level is only true for normally distributed data.

Here’s a suggestion for a “hands on” experiment to test the hypothesis. Camp out in the bush near (say) Sydney, Australia, latitude 33S, and then do the same near Lae, Papua New Guinea, latitude 6S. To make things fair, do the Sydney bush in December (midsummer) and the Lae bush in June (midwinter). Whilst in these places note the quantity and variation of insects. You don’t have to collect any data – it will be quite obvious by observation. Try a night or so sleeping without an insect net. See how you go there. Hint: you won’t last a night up there near Lae.

So what’s the difference between the two locations? Well, Lae is hotter, far hotter than Sydney. And there are more insects, far more insects, in Lae. Insects love hot climates and are abundant in them.

Just so you know I have camped in both locations.

But insect counts vary as generations come and go so fast. A ripple of rain in a warm place will activate many species, so the insect count is a proxy for the preceding weather, but not so easily a proxy for a climatic max-temperature trend.

Moisture is at least as important as heat. And by the way, I have camped in PNG, in NSW, and in northern forests in Canada, Scandinavia and Russia. The taiga up north is by far the worst for mosquitoes (though PNG does have chiggers and leeches to make up for the more modest mosquito density).

Am I the only one to recall that Hurricane Sandy ripped through just prior to the later insect count?

I’m sure the researchers controlled for hurricanes. After all, this was a scientific study and being a scientificky thing, all possible variables must have been accounted for, right?

😜

Insects. Are they supposed to stay constant? Definitely not. They can change in the absence of any external ’cause’: https://compphys.go.ro/chaos/

I found the article on insects some months ago and it didn’t pass the smell test. Another misappropriation of causes to “prove” the effects of CO2. The author was stating that a 2C rise in temperature was killing off insects.

What I found was that Puerto Rico hosts a great many pharmaceutical and chemical companies, and apparently pesticide use is prevalent: http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1063&context=rurals “Today, the majority of Puerto Rican farmland is dominated by industrial agriculture: a system of chemically-intensive food production implemented in enormous single-crop farms. This method depends on the purchase of pesticides and chemical fertilizers that have a negative impact on the environment and consumer health.”

In addition, the banned insecticide Naled was used in 1987, if I recall correctly, to control mosquitoes in San Juan, not so far from the forest in question, and I wonder if the forest had been targeted as well?

The role of insecticides in insect decline in the forest is speculation. But, it makes a bit more sense to investigate this as a potential contributor to insect declines than to conclude that a 2C rise is responsible.

If I were a chemical company in Puerto Rico and if I wanted to deflect attention away from the impact of insecticides on the environment– perhaps from an experiment gone wrong in the forest, which serves as a research area, or from some accident or over-zealous spraying– then I’d want to blame it on global warming. Just sayin’.

Don132

Out of all the surface area being represented in the study, just two very different “plots” were used to collect temperature data. Those plots are generally very small, certainly not more than 10 ft by 10 ft. Two plots, with likely multiple uncontrolled variables unique to each one. And the results are supposed to make us want to buy into what?

What I want to know is why grantors and university research oversight committees are so uninterested in policing the researchers. Do they turn a blind eye on purpose? It seems likely to me.

If their temperature data are so wacky, the next obvious question is how good their insect counts are.

There may well be GIGO going on with both sides of their bivariate correlation.

One tenth of “normal” to ten times “normal” is where Mother Nature tends to “control” populations of wildlife, including insects. Grad students looking for a trip away from their subsidized university housing just need to invent a good climate change hypothesis to get a department approved paid vacation these days. And if their results sound like a crisis, it’s good for another all expense paid trip next year too.

I want to know just what on earth is that supposed to mean anyway? Is that supposed to be like 600% reduction? Which just can’t happen! It’s nonsensical and not just useless but confusing! You’re better off to not say anything at all, than to use terms like that! /rant

I read the article and this is the sentence that introduces the 60-fold decrease:

If the “60-fold decrease” was worse than the 75% to 87.5% decreases quoted, then it can’t be a misquote of “60-percent decrease.” And it seems reasonable to assume that the numbers can’t be negative (although in climate science anything is possible). The only candidate for a “60-fold decrease” that I can see is the later value being one sixtieth of the earlier, or about 1.67 percent, i.e. a 98.33 percent decrease. Still a striking number, and one readily understood by anyone who actually reads a newspaper, so the “60-fold decrease” is just an attempt at sensationalism.

In trying to reconstruct the logic behind the term (perhaps logic isn’t the right word) I realised that, if the numbers were reversed, it would be a 60-fold increase, so there is some logic in it, especially for those who think backwards. And that would include climate scientists who work from conclusions to data.
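The arithmetic behind that reading is easy to check (the starting value of 100 is arbitrary):

```python
# "60-fold decrease" read as the later value being 1/60 of the earlier:
earlier = 100.0            # hypothetical earlier count (arbitrary units)
later = earlier / 60       # one sixtieth of the original

pct_remaining = 100 * later / earlier   # percent of the original left
pct_decrease = 100 - pct_remaining      # percent lost

print(round(pct_remaining, 2), round(pct_decrease, 2))  # 1.67 98.33
```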

The mean max temps are below 90 F. What is an increase, if there really was one, supposed to mean? Mosquitoes suck up blood from animals at 100 F (some excrete liquids to evaporatively cool while doing it), while I was under the impression that temperatures of 115 F were required to severely affect insects but not kill them. Surely some index of days above a very high temperature would be better. Means can go up without more extreme days, especially since most of the warming is due to a more humid atmosphere, and humid regions do not normally get above 100 F.

A 60 fold decline in insects!

In mathematical terms what in the hell is 60 fold?

Oh there it is. Sorry, I said the same thing.
