Guest Essay by Kip Hansen
WARNING: This is not a technical essay. There is almost no science in it. It is not about AGW or any issue involved in the Climate Wars. My editor describes it as “chatty”. It does ask two extremely important questions.
We are all constantly bombarded by numbers – in the press, on the radio news, on the TV news, here at WUWT. Numbers as sheer numbers, numbers as graphs, charts, images, and in words and more words. Putting a number with an idea has a magical power over our minds – it makes the idea ‘more true’ – it offers our minds a sort of proof for ideas and concepts.
In this essay, I look at an important question, one we must all ask – ask ourselves and ask the sources of these numbers – What exactly are they really counting? In our little introductory image (cute, huh?) we see they are counting “counting bears”. One bear, two bears. But, what exactly? In the upper panels, they are counting green plastic counting bears. Their count = 1 (in words – one). In the bottom panels, they are still counting plastic counting bears, but one red bear and one blue bear, or 2 (in words – two) bears altogether. Even more exactly, though, the bottom panels have one red bear, one blue bear and zero green bears.
This is not just being fussy. When we have only the information in the top panels, we count one bear (and maybe note that it is green). As far as we know, all bears in this context are the same color, and color doesn’t matter. If these were real seal-eating/fish-eating bears, some might be white, some brown, some grizzled, some a cross-breed mixture. For biologists, the difference is important – refer to the Polar Bear Wars. For a camper on the tundra, one very hungry bear, possibly man-eating, is more than enough, regardless of color.
For this kindergarten example, we see that even in the most elementary types of counting, there are details that may need to be explored and explained.
Just to be clear, all measurement is the same as counting in this regard.
meas·ure /ˈmeZHər/ verb: to ascertain the size, amount, or degree of (something) by using an instrument or device marked in standard units or by comparing it with an object of known size. “the amount of water collected is measured in pints” Some synonyms: count, calculate, compute, quantify.
So, for all measurements offered to us as information – especially if accompanied by a claimed significance, when we are told that this measurement/number means this-or-that – we have the same essential question: What exactly are they really counting?
Naturally, there is a corollary question: Is the thing they counted really a measure of the thing being reported?
For example: Does the drastic reduction in the number of early-morning home-delivery milkmen over the last forty years really mean Americans are drinking proportionally less milk per capita? (For extra credit: The answer is YES and NO. Americans consume about 20% less fluid milk and cream than in 1975, but per capita consumption of all dairy products, including butter, cheese, yogurt, cottage cheese and others, has increased by 12.5%. For more information than you ever wanted to know, see here.)
I have seen the following statement on billboards in our area and in the press: “1 in 5 U.S. children at risk of hunger”. Gee, one might think, that’s terrible in a country as rich as ours – and you’d be right. But the devil is in the details. It takes quite a bit of searching around on the ’Net to find out who said that, and what exactly they found that gets translated into that headline. The original USDA report is summarized here.
In fact, the original report caused a lot of push-back, and push-back on the push-back. The push-back link gives an idea of what is being counted here. It is not what you might think.
They did not interview classrooms full of kids to see if they “were at risk of hunger”. They did not count kids that they deemed “at risk of hunger”.
You see, there are kids who sometimes are not sure there is going to be enough food in the home for them to eat whatever, and however much, they (or their parents) want. In a nutshell, one adult in certain poor families (43,253 households in all) was questioned about food security with a set questionnaire. Any family whose adult reported ever worrying, at any time during the last 12 months, that they would run out of money for food before the end of the month is counted as “Food Insecure”.
Questions like these were used to determine Food Security (each followed by the ANSWER that triggers a label of Food Insecure Household):
“In order to buy just enough food to meet the needs of your household, would you need to spend more than you do now, or could you spend less?” MORE
“In the last 12 months did you ever run short of money and try to make your food or your food money go further?” YES
“The food that we bought just didn’t last, and we didn’t have money to get more.” Was that OFTEN, SOMETIMES or NEVER true for your household in the last 12 months?” OFTEN or SOMETIMES
“We relied on only a few kinds of low cost food to feed the children because we were running out of money to buy food. Was that OFTEN, SOMETIMES or NEVER true for your household in the last 12 months?” OFTEN or SOMETIMES
In extreme cases, there are some children who actually missed one meal, sometime in the last 12 months, because there wasn’t enough money to buy food. This last condition – anyone in the household, adult or child, forced to skip even a single meal at any time in the last year – triggers the federal government’s labeling of the family as having “very low food security”.
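The trigger-answer rule described above – a single qualifying answer, at any point in the last 12 months, flags the whole household – can be sketched in a few lines of Python. This is an illustrative simplification of the rule as characterized in this essay, not the USDA’s actual scoring methodology, and all of the item names are hypothetical:

```python
# Hypothetical sketch of the trigger-based labeling described above.
# Each questionnaire item is paired with the answer(s) that count
# against the household; one trigger answer is enough for the label.

TRIGGER_ANSWERS = {
    "need_to_spend_more_for_enough_food": {"MORE"},
    "ran_short_and_stretched_food_money": {"YES"},
    "food_bought_did_not_last": {"OFTEN", "SOMETIMES"},
    "relied_on_few_low_cost_foods": {"OFTEN", "SOMETIMES"},
}

def classify_household(responses: dict) -> str:
    """Label a household from its questionnaire responses.

    Under the rule described in the essay, any single trigger
    answer is enough to label the household 'Food Insecure'.
    """
    for item, triggers in TRIGGER_ANSWERS.items():
        if responses.get(item) in triggers:
            return "Food Insecure"
    return "Food Secure"

# A household that only *worried* about food money once in 12 months
# is counted the same as one with a chronic, daily problem:
worried_once = {
    "need_to_spend_more_for_enough_food": "MORE",
    "ran_short_and_stretched_food_money": "NO",
    "food_bought_did_not_last": "NEVER",
    "relied_on_few_low_cost_foods": "NEVER",
}
print(classify_household(worried_once))  # Food Insecure
```

The point the sketch makes concrete: the classification has no notion of frequency or severity, so the single-worry household and the chronically hungry one land in the same column of the final count.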
The true root causes of the problem, in the vast majority of cases, are single-parenthood, ignorance, and parental addictions – drugs, alcohol, tobacco, and gambling (State-run lotteries). A lot of parents living on the edge face the question: “Do I buy cigarettes or cereal for the kids?” or “Beer for me or milk for the kids?” Far too many times, those questions aren’t even asked – drugs, alcohol, cigarettes and lottery tickets get purchased first, then the food is bought with whatever money is left over. Ignorance leads to the idea that Pop-Tarts make an adequate breakfast for school-aged children. But even in those homes, few American children actually go hungry as a general rule.
Author’s Aside: My wife and I did serious humanitarian relief work in the Dominican Republic for eight years recently, followed by another three hurricane-safe seasons in the various Virgin Islands. In all honesty, there are entire villages in which it would be difficult to find a single child who did not suffer “food insecurity” by the USDA definition every week – in fact, many children in the bateys (segregated Haitian immigrant slums) would be found to have been forced to skip at least one meal every day. In the west, over against the Haitian border, the public schools provide both breakfast and lunch in the lower grades. Otherwise, the children would go hungry until suppertime. Our program provided de-worming medication (otherwise you are just feeding intestinal parasites instead of the children) and specially formulated children’s poverty vitamins to supplement this program. They were not just “worried that they might have had to skip a meal in the last 12 months.” Teenage boys are often expected simply to feed themselves by hook-or-crook two meals a day and only offered the evening meal – rice and beans mostly, maybe a bit of chicken – at home.
Don’t get me wrong, when children are going hungry, then others – extended family, friends, community and government – need to step in, see that the children are fed and work to correct the problems that created the situation in that home. I don’t mean to downplay the seriousness of the problem where it really exists. No child should go through childhood hungry and undernourished — anywhere.
The point I wish to emphasize is that whenever we are presented with a very certain sounding number – like “1 in 5 U.S. children at risk of hunger” – it is absolutely necessary to ask the burning question: “What exactly are they really counting?”
The Question: “What exactly are they really counting?” Answer: Families that worried they might have – or actually did have – trouble keeping adequate food in the home for a well-balanced diet for all members of the household, for any reason, at any one time, even a single day, over the last 12 months.
The Second Question: “Is the thing they counted really a measure of the thing being reported?” Answer: Not in the press reports. The USDA reports what it finds under its own definitions; however, those definitions do not mean what Advocates and the Press imply they mean. And what was counted is certainly not a measure of what the charities imply it means when they use that headline on a billboard to raise money “for hungry children in America”.
As in the above case, when the press use numbers, the general rule of thumb is: they haven’t counted what you think (and probably not even exactly what they say they counted). I go a bit further with any report offered to me by Single-Issue Fanatics and Advocates-of-All-Stripes: whatever they measured or counted probably does not really represent the thing they claim it represents.
NOAA’s National Weather Service (NWS) produced a report titled: “Summary of 2014 Weather Events, Fatalities, Injuries, and Damage Costs” (they do one each year). They send it out with a Press release and a nifty chart:
The red bars are for 2014, with 10-year averages in pale blue and 30-year averages in yellow. (Categories without a yellow bar were not compiled until 2005.) News reports resulting from the press release point out that Rip Currents were the big killer for 2014. That characterization got me going on this.
For one thing, 57 deaths in a population of about 319 million people – a population which experienced a total of 2,596,993 deaths in 2014 – is a vanishingly small number. For comparison, consider deaths from the flu (despite a spirited campaign to have all of us older folks get flu shots – I got mine!):
There were about a thousand times more flu deaths – despite modern medicine and vaccines – than Rip Current deaths.
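The scale comparison above can be made concrete with a quick back-of-the-envelope calculation. The 57 and 2,596,993 figures come from the essay; the population figure is the essay’s rounded 319 million:

```python
# Back-of-envelope scale check on the 2014 rip current figure.
rip_current_deaths = 57
total_us_deaths_2014 = 2_596_993
us_population = 319_000_000

# What fraction of all 2014 US deaths does 57 represent?
share_of_all_deaths = rip_current_deaths / total_us_deaths_2014
print(f"{share_of_all_deaths:.5%} of all 2014 US deaths")

# Per-capita odds for the year:
per_capita = rip_current_deaths / us_population
print(f"about 1 in {round(1 / per_capita):,} Americans")
```

The fraction works out to roughly 0.002% of all deaths, or about one American in five and a half million – which is the sense in which the number is “vanishingly small”.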
So what’s to be investigated here? A tiny, inconsequential number (57) related to weather. It’s that last bit – related to weather – that interested me. How did they know which Rip Current deaths were weather related?
Rip Currents, in general, are not weather related. NOAA itself correctly describes what causes them:
“Rip currents are a result of complex interactions between waves, currents, water levels and nearshore bathymetry. These [interactions create] current systems such as alongshore and cross-shore (onshore/offshore) water motion. Along all coastlines, nearshore circulation cells may develop when waves break strongly in some locations and weakly in others. These weaker and stronger wave breaking patterns are most often seen on beaches with a sand bar and channel system in the nearshore zone. A rip current forms as the narrow, fast-moving section of water travels in an offshore direction. Rip currents can also result from a wave’s natural variability or when a current traveling along the shoreline encounters a structure such as a groin or jetty and is forced offshore.”
Did you see the word “weather” in there? I didn’t. I grew up in Los Angeles, California, and during one year in my teens spent some part of each of 200 days on the beach – the proverbial California Surfer Boy. We knew rip currents. We didn’t have no stinkin’ ankle tethers in those days – you wiped out and your board headed for the beach. You, on the other hand, were in the water, swimming, at the mercy of the waves and currents – and boy, did we have rip currents. We did not get carried out to sea and drown, because we knew the trick. My current winter beach at Cape Canaveral, Florida, has this sign, which gives the trick:
Now, when the waves get bigger, more water is thrown on the shore, and rip currents – if they exist on that beach due to the topography of the bottom, nearby jetties and the like – do generally get stronger, but they are not caused by the weather.
This prompted me to write to the NWS by email and ask them how they separated out the rip current deaths caused by weather and the rip current deaths that just happen on nice sunny days.
“I am confused by the inclusion of deaths from “Rip Currents” in Weather Fatalities.
Two points result in my confusion:
- Certainly, there are more than ~ 50 fatalities from rip currents in the United States each year — it is the most frequent cause of ocean beach drownings.
- Rip currents are not weather dependent — according to NOAA’s page on the causes of rip currents [ definition above deleted ]
How is it that Rip Currents are listed as the cause of the greatest number of Weather Fatalities for 2014?”
I received this answer:
“Thanks for your interest in rip currents. You are correct in your first point. Rip current fatalities are difficult to track and are often under-reported by all agencies including the NWS. The US Lifesaving Association estimates there are 100+ rip current fatalities per year in the US and that is considered the best guess at a true number.
We are learning there are many other causes of surf zone fatalities as well but rip currents cause the most. Other causes of surf zone fatalities are rough surf, a phenomenon known as sneaker waves (an unusually large wave in a set which suddenly crashes onshore and catches people off guard), and other currents.
To your 2nd point, the NWS tracks surf zone fatalities to better our understanding of the dangers of the surf zone and improve our products, services and outreach. Hopefully, that reduces the number of fatalities.
Rip currents are indirectly weather dependent. Wind speed and direction create waves and influence near shore circulations and waves/near shore circulations are significant factors in rip current development.
The NWS lists rip current fatalities among other weather fatalities because rip currents are weather related and the NWS provides products, services and outreach on surf zone hazards. Hope this answers your questions and clears your confusion.”
Well, that still left me with a question: What exactly are they really counting? So I asked:
“One final question, which I should have asked the first time: Does the NWS count ALL Rip Current deaths reported as Weather Fatalities? or only those that occur during official weather alerts?”
“The NWS counts any surf zone fatality occurring in its area of forecast responsibility as a weather fatality.”
What exactly are they really counting? They are counting any and all surf zone fatalities that occur in any state, PR, Guam or the USVI.
A toddler wanders into foot-deep water, gets knocked down by a wave, and drowns because Mom and Dad have had one too many beers and fallen asleep in the sun = surf zone fatality. Surfer girl gets hit in the head by surfer boy’s board, loses consciousness and drowns = surf zone fatality. A sneaker wave pushes a kid against the sandy bottom, where he panics and sucks in water = surf zone fatality. An over-eager boy from Kansas swims out beyond the breakers, showing off, and finds he can’t swim back in – the more he swims toward the shore, the further away it gets = surf zone fatality. Only the last one is actually due to a rip current.
All of them appear in the Rip Current column in the Weather Related Fatalities chart and on the Weather Related Fatalities graphic.
So, back to The Question: What exactly are they really counting?
Answer: Any Surf Zone fatality anywhere in the 50 States, PR, Guam, or the USVI.
The Second Question: Is the thing they counted really a measure of the thing being reported?
Answer: No. They report “Weather Related Fatalities”, sub-category “Rip Current Fatalities” – but they have not counted rip current deaths, and the deaths they did count are not necessarily weather related.
Those 57 deaths (or closer to the suspected truer number of 100) are not all deaths which were caused by Rip Currents. Even if they had been, they would not necessarily be Weather Related Fatalities. Even if they used the true name of the thing really counted – Surf Zone Fatalities – they still wouldn’t be Weather Related Fatalities (unless you count that they occurred on days during which there was weather, of any kind). Yet, the National Weather Service publishes official reports stating that there were 57 “weather-related rip current deaths” in 2014.
I have no idea why the NWS insists on counting what it counts or reporting what it reports – there doesn’t seem to be any extra profit for them in calling these Rip Current Fatalities instead of using the true characterization, Surf Zone Fatalities. Lumping all Surf Zone Fatalities under the Weather Related Fatalities umbrella cannot be justified, and the reasoning for doing so, as explained to me in the above email, is just nonsensical.
Granted, in the larger picture of government statistics, this is small potatoes. But the Rip Currents Affair is a fine example of why we must always ask: What exactly are they really counting? Is it really a measure of the thing reported?
This point has a broader application than the two cases discussed here today. Anyone who follows any science journalism knows what kind of reporting we see.
Wild claims of certainty are made in psychology research – often based on studies of a couple of dozen university students who are being paid to participate – or, recently, a study in which children’s games were claimed to tell us about the effects of family religiosity on altruism. In the field of psychology, the thing being counted is almost never a measure of the thing being reported, except in the minds of the researchers and like-minded psychologists.
In the health sciences, we hear of the virtues or evils of every type of food or life-style choice – studies based on statistically derived, minute differences in biometric markers for things only vaguely related to the topic being discussed – and yet the health press trumpets these findings as proof that Food X is either Good or Bad for us. Near utter nonsense. They don’t actually measure anyone’s health, let alone measure the health of two large cohorts, some who eat Food X and some who don’t, in a double-blind study capable of actually determining a health effect.
Another aside: The Health Food and Vitamin and Herbal Supplement industries feed this frenzy of invalid attribution of benefits or harms – laughing all the way to the bank with their billions in annual US sales alone – $81 billion for health/natural foods and $37 billion for supplements. These are sales of vitamins that have been found to have no positive effect whatever for the general public who buy them, sales of foods that are no better or worse than any other foods (except that one is allowed to pay a lot more for the “healthy” ones), and sales of herbal supplements that often are not actually in the bottle and whose effects on the human body are for the most part unknown. In any other endeavor, the behavior of the companies promoting and selling “health foods”, vitamin supplements, and herbal supplements would constitute criminal fraud.
In climate science?
Well, you decide…opinions vary wildly and emotions run ahead of intellect. But the questions need to be asked, and asked again.
What exactly are they really counting?
Is the thing they counted really a measure of the thing being reported?
# # # # #
Author’s Reply Policy: I will do my best to answer questions about the main point of this essay and its implications in a broader sense. I would love to read your examples of obvious violations of the principle involved – the principle being, of course, that one should state exactly what one is counting, actually count that, and ensure that what one counts is actually a measure of the thing one claims it is.
If you have a question for me, please address it to me by name, so I can be sure to reply to it.
Please, this is not about the Climate Wars (though there is a lot of this type of thing going around in the climate field – e.g., see here). While you may state your climate opinions here, I will not respond nor will I defend any particular Climate Wars viewpoint.
I hope, most of all, that this essay has made you think about the numbers you read and hear in the news in a slightly different way and to begin asking yourself these two important questions.
# # # # #