Guest Post by Kevin Kilty
Planning is hardly possible without calculations of what the future may hold, but such calculations are fraught with uncertainty when they involve exponential processes. Indeed, as the author of one chapter in a recent book states:
“One characteristic of an exponential growth process that humans find it really difficult to comprehend is how fast such a process actually is. Our daily experiences do not prepare us to judge such a process accurately, or to make sensible predictions.” [emphasis is mine.]
Quests to reveal a future governed by exponential processes, or what people guess to be exponential processes, run through many themes here at WUWT — future climate, energy demand, economics, epidemics. This guest contribution takes a selected look at exponential growth. Two examples are historical, and perhaps obscure, but pertinent. The third one, which comprises the bulk of this essay, is an examination of R0, which dominates the present imagination.
Failure on the Plains
Cattle arrived on the Northern Plains of the U.S. frontier first in the mid-1860s. The industry was infested with promoters, people with interests in railroads and such, who promoted using tales of how to get rich on the plains to Eastern and European investors. Some early investors made money selling to bigger cattle corporations. But the industry was based on cattle herds rather than titles to real property, and cattle counts were notoriously difficult to carry out. Thus, much of the promotion and accounting became based on “book” counts. These were not credible, but had the effect of a stampede to the plains financed by people who little understood the business or its risks.
Figure 1 shows an actual book count against a Fibonacci series representing a hypothetical rabbit population. The exponential behavior of the book count is obvious.
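For readers who want to see the geometry hiding inside the rabbit series, here is a small sketch (the series length is arbitrary): successive Fibonacci terms grow by a nearly constant ratio, the golden ratio, which is exactly what "exponential" means in practice.

```python
# The Fibonacci "rabbit" series is asymptotically geometric: the ratio of
# successive terms converges to the golden ratio, about 1.618 per generation,
# i.e. roughly 62% growth per step -- exponential by any practical measure.
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 1, 1."""
    series = [1, 1]
    while len(series) < n:
        series.append(series[-1] + series[-2])
    return series[:n]

fib = fibonacci(20)
ratios = [b / a for a, b in zip(fib, fib[1:])]
print(fib[:8])               # [1, 1, 2, 3, 5, 8, 13, 21]
print(round(ratios[-1], 3))  # converges to (1 + 5**0.5) / 2, about 1.618
```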
The hard winter of 1886-1887, which was an instance of weather not climate change, wiped out many live cattle, but it wiped out even larger numbers of cattle that existed only in the book counts. It provided an opportunity for the range managers to adjust plainly inaccurate inventories and save face at the same time. The story today, for the few who know anything of the story at all, is of millions of cattle perishing in blizzards. It is much more acceptable to be bankrupted by weather than by foolish belief in an exponent.
Is there a modern equivalent? Well, the strangely smooth curve of Chinese deaths from COVID19 looks like one. It resembles a calculated curve with a certain goal in mind, rather than a measured curve with all the wiggles back and forth like the comparison curves from other countries.
Projection of Electric Energy Demand
Electric energy demand grew at an exponential rate after WWII, especially during the 1960s, when the grid expanded into every conceivable corner of North America and new uses, such as the mercury vapor light, expanded into every conceivable market. The near perfect fit of geometrical growth of 7.13% per annum to electrical demand over 1960-1972, as Figure 2 shows, led to wild predictions of future demand and its consequences. A simple projection of constant geometrical growth (Figure 3) arrives at a staggering demand of 12 trillion kWh in the year 2000, and one might be tempted to dismiss it. However, in 1972 a workshop held at Cornell, sponsored by NSF, produced a “consensus” estimate of 10.25 trillion kWh, which is not much lower. These estimates were driven by exponential growth in usage and population.
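The arithmetic of such a projection is trivially simple, which is part of its seduction. A minimal sketch of compound ("geometrical") growth follows; the 1972 baseline of roughly 1.75 trillion kWh of U.S. electricity consumption is my own assumption for illustration, not a figure taken from the original sources.

```python
# Compound growth projection of the kind described above.
baseline_1972 = 1.75    # trillion kWh in 1972 (assumed for illustration)
rate = 0.0713           # 7.13% growth per annum
years = 2000 - 1972     # 28 years of projection

projection_2000 = baseline_1972 * (1 + rate) ** years
print(round(projection_2000, 1))  # roughly 12 trillion kWh
```

Twenty-eight years of 7.13% growth multiplies demand nearly sevenfold, which is why the projections looked so staggering.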
What occurred in the 1970s was a constant drumbeat of future shortages, the decimation of free-flowing rivers, the needed changes to society and the economy, and the need for government mandates because only government was big enough to deal with the crisis. Untold amounts of taxpayer and private money poured into schemes long forgotten (magnetohydrodynamics) or schemes that should have been (geothermal). The crisis prompted everyone to push their preferred hobby horse. Sound familiar?
What actually happened post-1970s? Actual electrical energy consumption never reached 40% of these projections. Figure 4 shows electric consumption to the present time along with the supply available from selected sources. Note the supply from petroleum: it provided a large source of electrical energy pre-1973. However, the two oil price shocks (1973 and 1979) each immediately halted the growing use of petroleum to generate electricity and then diminished it. People may not comprehend the speed of exponentials, but they respond quickly to prices.
More interesting still, not only did demand not grow exponentially after 1972, but post-2008 it has not grown at all, as Figure 4 also shows. We appear to have reached a point where slowing economic growth has enabled innovation such as outsourcing, container ships, LED light bulbs and myriad other things to provide an increased standard of living without the use of more energy. Can it continue? Time will tell.
Trajectory of a Pandemic
In times of crisis, real or imagined, people become fixated on certain technical measures or parameters of the problem, which become something like fetishes. The mean temperature of the Earth, the level of CO2, or its rate of production all play such a role in climate change. The parameter R0, the basic reproductive ratio, plays such a role in the present COVID19 crisis. Let’s explain what R0 describes, and what it has to do with some selected observations about the present pandemic.
What is R0?
The best way to explain R0 is through a simple model of an epidemic involving three populations: X, the population of people who are susceptible to a disease but who are presently not infected; Y, the population of infected (and infectious) people; and Z, the population who have recovered, and are not for the present time likely to fall back into population Y.
Many factors affect population X — births, deaths, migration, and so forth. However, over the short period of an epidemic we might consider only one transition as pertinent: becoming infected and moving into population Y. People often model this transition with a term like -BXY. The product of the populations (XY) indicates something about the probability of a susceptible person encountering an infected one; B is a transmissibility factor describing the probability that an encounter between an X and a Y results in X becoming infected.
It should be obvious that in the short term any person leaving the group of Xs does so by entering the Ys. So, the equation describing the rate of change of Y contains the term +BXY. However, the change of Y also depends on the rate at which the infected become well, and transition to the group of Zs — a rate we call U, and the rate at which infected people die and vanish from the model altogether — a rate we call V. Thus our differential equation for Y is
dY/dt = (BX – (U+V)) Y
Someone familiar with differential equations will recognize the factor (BX – (U+V)) as a sort of rate constant; large BX tends to make it positive, which results in a population Y that grows exponentially, while large (U+V) tends to push it toward negative values, which results in exponential decay.
People don’t like to deal with sums of factors in a rate constant, and in the case of epidemics what people have done is to turn it into a ratio, with the factors tending to make it positive in the numerator and those making it negative in the denominator.
The resulting definition is something like R0 = BX/(U+V).
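A minimal numerical sketch of the three-population model above makes the threshold role of R0 concrete. The parameter values here are invented for illustration, not estimates for any real disease.

```python
# Euler integration of the model: dX/dt = -B*X*Y, dY/dt = (B*X - (U+V))*Y.
# All parameter values are arbitrary illustrations, not COVID-19 estimates.
def simulate(B, U, V, X0, Y0, dt=0.1, steps=2000):
    """Integrate the model forward and return the peak infected population."""
    X, Y = X0, Y0
    peak = Y
    for _ in range(steps):
        dX = -B * X * Y
        dY = (B * X - (U + V)) * Y
        X += dX * dt
        Y += dY * dt
        peak = max(peak, Y)
    return peak

X0, Y0 = 10_000, 10
U, V = 0.09, 0.01                 # recovery and death rates
B_high = 2.5 * (U + V) / X0       # chosen so R0 = B*X0/(U+V) = 2.5
B_low = 0.5 * (U + V) / X0        # chosen so R0 = 0.5

# With R0 > 1 the infected population grows well past its starting value;
# with R0 < 1 it never grows at all -- it simply decays away.
print(simulate(B_high, U, V, X0, Y0) > Y0)   # True
print(simulate(B_low, U, V, X0, Y0) > Y0)    # False
```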
There is a tendency, apparently even among the medical community, to think of R0 as a sort of time constant, but it is not. It is a dimensionless measure more akin to what engineers would call a figure of merit. There is also a tendency to think of it as intrinsically a function of the disease itself. It is not. Let’s discuss each factor in turn and explain what about each factor is important to the epidemic.
The factor B has to do not only with how easily a disease intrinsically jumps from person to person (measles has a large value of B), but also with cultural and social factors among the Xs. Touchy-feely sorts of societies make B larger and push R0 upward; societies with more intrinsic distance push B toward smaller values. All sorts of strategies to increase social distance — lockdowns, isolation of the vulnerable, isolation of the infected, and even disinfecting surfaces — seek to make B smaller in value.
X, the population of susceptible people, doesn’t necessarily include the entire population, because some people have intrinsic immunity to the disease. For example, Willis’s contribution from some time ago pointed out that on the Grand Princess not everyone who was exposed became infected — perhaps only 20-40% did. Obviously X depends on the age distribution and also on the distribution of other morbidities in a population. A common strategy to reduce X is immunization.
Factor U, the rate of recovery, has to do with the virulence of the disease, but also with population characteristics such as age distribution and other morbidities; a larger U (faster recovery) pushes R0 downward. Within my home state we have an unusually large fraction of the known infected who have recovered quickly, which suggests a lower R0 than in places displaying long convalescent periods. Does this tell us anything valuable about COVID19, or does it simply reflect differences in how various state departments of health assess recovery? One strategy for boosting U is to employ treatments such as what New York City is attempting with chloroquine.
What is important about R0?
R0 is not a constant. As a disease progresses through a population, X becomes smaller, which pushes R0 to smaller values. Eventually X becomes small enough that R0 falls below one and the epidemic peters out. This is the principal factor that converts the initial exponential growth of an epidemic into a logistic sort of curve toward its conclusion. Also, just as in the energy example, people change their behavior in a time of stress: they avoid other people and improve hygiene, factors which reduce B. Different ethnic groups and different parts of the U.S. will also display different values of R0. These combined factors probably explain the wiggly behavior of the various graphs on the Daily Coronavirus Graph page.
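The depletion of X can be watched directly in a toy version of the model: define an "effective" ratio BX/(U+V) at each moment. It starts at R0, and the infected population peaks just as it falls through 1, after which the epidemic decays. Again, every parameter value here is an illustrative assumption.

```python
# Track the effective ratio B*X/(U+V) as the susceptible pool X is depleted.
def epidemic_path(B, U, V, X0, Y0, dt=0.1, steps=3000):
    """Euler-integrate the model; return (effective R, infected Y) histories."""
    X, Y = X0, Y0
    R_eff, infected = [], []
    for _ in range(steps):
        R_eff.append(B * X / (U + V))
        infected.append(Y)
        dX = -B * X * Y
        dY = (B * X - (U + V)) * Y
        X += dX * dt
        Y += dY * dt
    return R_eff, infected

X0, Y0, U, V = 10_000, 10, 0.09, 0.01
B = 2.0 * (U + V) / X0            # start the epidemic with R0 = 2.0

R_eff, infected = epidemic_path(B, U, V, X0, Y0)
peak_step = infected.index(max(infected))
print(round(R_eff[0], 2))          # 2.0 at the outset
print(round(R_eff[peak_step], 2))  # about 1.0 at the peak of infections
print(R_eff[-1] < 1)               # True: below 1, the epidemic peters out
```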
Getting a handle on R0
Having an accurate estimate of R0, especially early in an epidemic cycle, would be very useful for public health policies. Here are the hurdles one has to clear to get an accurate value:
First, the most valuable estimate of R0 — the one that lets policy get ahead of an epidemic — is one made early in the epidemic. Without prior experience to draw upon, a person has to rely on observations, and the only population leading to a useful estimate of R0 is the infected, Y. We have no idea how this population is growing at present relative to X.
Second, I have commented elsewhere about individuals local to me who are not only included in the “cases” of two neighboring states (double counted), but who may have been placed within the data at the wrong time of exposure and infection. Early estimates of R0 are made when there are very few infected individuals, so such estimates are very sensitive to errors of observation. Observations placed erroneously too late in the epidemic will make R0 appear too large; those placed too early will make it appear too small.
Because all of the factors involved in R0 keep changing with time, one has to keep collecting timely data to evaluate the effectiveness of strategies. Thus one is always presented with the problem of a limited number of pertinent individuals, and then with decisions about which individuals should be counted and exactly where to place them in the sequence. At no time does estimating R0 become simple.
Third, because R0 is not a time constant, but rather a dimensionless figure of merit, the pertinent observations for its estimate are of the growth generation to generation — that is, growth of Y in the chain of transmission from person to person. In my state public health officials estimate that more than 60% of the infected can explain where they were infected. However, this estimate has to be tempered with knowledge of how faulty people’s memories are.
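The generation-to-generation estimate, and its sensitivity to a single misfiled case when counts are small, can be sketched as follows. The case counts here are invented purely for illustration.

```python
# Estimate R0 as the geometric-mean growth per generation of spread,
# then show how one misplaced case shifts the estimate when counts are small.
def estimate_r0(generation_counts):
    """Geometric-mean growth per generation across the observed chain."""
    first, last = generation_counts[0], generation_counts[-1]
    n_steps = len(generation_counts) - 1
    return (last / first) ** (1.0 / n_steps)

correct = [2, 5, 12]      # cases observed in generations 0, 1, 2 (invented)
misplaced = [3, 4, 12]    # one generation-1 case recorded a generation early

print(round(estimate_r0(correct), 2))    # 2.45
print(round(estimate_r0(misplaced), 2))  # 2.0 -- one misfiled case, big shift
```

With only a handful of early cases, a single record placed in the wrong generation moves the estimate by nearly twenty percent, which is the point of the second hurdle above.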
One MD has spoken elsewhere about observations, such as the spread through the call center at Daegu, South Korea, which suggest an R0 well below 1.0. However much his number may pertain to the special case of this particular call center, the value of R0 cannot be below 1.0 generally. If it were, the plainly obvious growing epidemic across the U.S. at present would require an utterly improbable set of initial conditions.
Similarly, the large values of R0 (2.0 to 2.6) used by Neil Ferguson, along with estimates of generation duration and other parameters, propelled the initial panic. One can tell from the press conferences that Dr. Fauci, Trump’s principal advisor, is still highly influenced by these early estimates. These were guesses, albeit educated ones, and apparently Ferguson is stepping back from them. This is just my opinion, but it appears that we, across the Western world, were unprepared to gather early the sort of data needed to make valuable estimates of R0 at an actionable time — for example, rather than daily counts of infections we need counts by generation of spread, along with estimates of uncertainty. Estimates of deaths in Britain ranging from 20,000 to 500,000 do nothing to aid policy prescriptions.
There is no doubt that we will survive this pandemic, but at great cost. A famous quotation seems apropos:
“If we are victorious in one more battle with the Romans, we shall be utterly ruined.”
— Pyrrhus, 279 B.C.
After this crisis has passed we really need a sober evaluation of strategies versus outcomes, and to decide whether we might have done better. We should also decide whether our goals were even sensible. Nic Lewis’s contribution is an example of sober analysis; so is Alec Rawls’s. We do not need a second such victory over exponents.
Philip Dutre, “Thinking and Conscious Machines?”, in A Truly Golden Handbook, ed. Veerle Achten, Geert Bouckaert, and Erik Schokkaert, Leuven University Press, 2017.
Dan Fulton, Failure on the Plains, Big Sky Books, Montana State University Press, 1982. Throughout Fulton’s early chapters, quotations refer to the book counts as “arithmetic progressions” when in fact they are geometrical. The book count data came from Robert Strahorn, one time superintendent of the Union Pacific Railroad.
This projection, along with the projections of the Federal Power Commission and the National Petroleum Council, was featured in Congressional testimony in May 1972 and in a companion paper (Chapman et al., Science, v. 178, pp. 703-708, 1972) as Table 1. While the authors stated that these projections might prove too high, they emphasized that “…to the extent that past population growth rates continue, the projections of Table 1 are supported…”
All electrical consumption data are from EIA spreadsheets.
Martin Nowak, Evolutionary Dynamics, Belknap/Harvard Press, 2006. Nowak’s definition is not exactly like mine but is functionally the same.