Lies, Damn Lies, and Statistics

Separating fact from myth


This morning's San Francisco Chronicle had a major headline: "20,000 in state own guns illegally." Being the contrarian I am, my immediate reaction was, "How do they know that?" Actually, in this case it was a real number, based on a database of registered gun owners who had lost the right to own guns but whose guns had not been surrendered. Still, it got me thinking about how easily we accept statistical data as fact. We forget that the studies or research that produced those facts are often based on estimates, assumptions, and best guesses.

A case in point is the frequent claim that 50% of all businesses fail after a disaster. Or was that just small businesses? Or maybe it's 40%. Wait - it's 25% of small businesses. Actually, 80% of all businesses affected by a major incident will close within 18 months. Or is that 70%?

Ever wonder where this came from? I've read articles claiming that this information came from various reputable sources such as FEMA, the Insurance Information Institute, the Department of Labor, and the Institute for Business and Home Safety. Mel Gosling and Andrew Hiles did a short study for Continuity Central in 2010 and were unable to locate any of the sources these figures were attributed to. The reality is that we don't have a credible source for these oft-quoted "facts."

So what do we know about businesses and disasters? Here's a quote from a 1999 research paper, Businesses and Disasters: Empirical Patterns and Unanswered Questions, by Webb, Tierney, and Dahlhamer:

...although it is commonly assumed on the basis of anecdotal evidence that disasters result in business failures and bankruptcies on a large scale, our research indicates that most businesses, even those that are especially hard hit, do indeed recover following disasters. 

My point is that we frequently accept "facts" without question and repeat them. It is this repetition that makes them appear true.

Another problem with statistics is that they can be manipulated to achieve specific results. Like many of my colleagues, I've been doing a bit of research on mass shootings, and it's interesting to see which events are included in and excluded from various studies. For example, what weapons were used in the Sandy Hook shootings? Media accounts vary depending on the audience: if you're pro-gun control, it was an assault rifle; if you're pro-gun, it was two handguns. Does it matter in the larger discussion on preventing these types of incidents?

Context is another problem when dealing with statistics. Consider this common statement: "200 die in Labor Day car accidents." Aside from the tragic loss of life, what does this really tell us? Is 200 average, above average, or below average for Labor Day? How does it compare with a typical day? During the Northridge earthquake, my plans section was reporting several thousand customers without power until a utility representative pointed out that this was the normal number of customers without power on any given day. Context is everything.

So what can you do about this problem? Stop accepting everything at face value. Verify your sources. Ask questions. Remember, others are relying on you for accurate information in order to make good decisions. Don't disappoint them.
