Confusing Relative and Absolute Risk

The age of the internet has created an explosion of information, both good and bad. Popular media accounts of a recent study by the University of Nevada-Reno and the Harvard T.H. Chan School of Public Health provide a classic example. The result is a misleading headline that confuses relative and absolute risk, and it illustrates the media bias that exists around gun control.

Reviewing the Study

The researchers analyzed mortality data provided by the World Health Organization. Their analysis led them to the conclusion that American mortality caused by firearms is ten times that of other developed countries. But what does that mean? Let’s crunch some numbers.

According to the Centers for Disease Control, there were 33,636 deaths from firearms in 2013. That figure translates into 10.6 deaths per 100,000. Put another way, your chances of dying by a firearm are about 0.01 percent. The popular media headlines implying ten times the risk mean you’re going from 0.001 percent to 0.01 percent. Call me a skeptic, but that doesn’t sound like a much greater chance of being gunned down.
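The arithmetic above can be sketched in a few lines. The only inputs are the figures already quoted (33,636 deaths, 10.6 per 100,000, and the study's tenfold ratio); everything else follows from them.

```python
# A sketch of the relative-vs-absolute risk arithmetic described above.
# Inputs: 10.6 firearm deaths per 100,000 (CDC, 2013) and the study's
# claim that other developed countries have one tenth the US rate.

us_rate_per_100k = 10.6
us_absolute_risk = us_rate_per_100k / 100_000        # about 0.0001, i.e. ~0.01%

# Other developed countries at one tenth the US rate:
other_absolute_risk = us_absolute_risk / 10          # ~0.001%

relative_risk = us_absolute_risk / other_absolute_risk  # the 10x headline

print(f"US absolute risk:    {us_absolute_risk:.4%}")
print(f"Other absolute risk: {other_absolute_risk:.4%}")
print(f"Relative risk:       {relative_risk:.0f}x")
```

The point of the sketch: "10x" is a ratio of two very small absolute risks, and the ratio alone tells you nothing about how small they are.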

And that is a clear example of irresponsible reporting by the media. The 10 times figure sounds scary; the 0.01 percent figure does not. To get the page views, the media opted for the more sensational headline. It relies on the fact that the average reader doesn’t have a handle on statistics.

The Agenda

But, wait, there’s more. The study was funded by an award from The Joyce Foundation. The foundation has admirable goals of better education and a clean environment. However, a mission for gun violence prevention should not include blatantly misleading information. It suggests an agenda by popular media to sway public opinion through deception.

I find this aspect especially hypocritical. The anti-GMO sector is quick to point out a study funded by Monsanto as evidence of bias. However, making such a claim commits its own logical error via the appeal to motive fallacy.

Granted, there is a fine line between biased and unbiased reporting. In the case of Monsanto, federal law requires manufacturers to conduct studies of their products. They cannot opt out and wait for a third party to do the testing required of them. The fact that they publish studies isn’t an immediate accusation. They have no choice but to publish.

In the case of The Joyce Foundation, the award was a choice based on the foundation’s own publicly stated mission. Granted, it cannot control how the media report its findings. However, the misleading nature of the headlines suggests an attempt to deceive. To give everyone the benefit of the doubt, though, some reporters have admitted to being light in the stats department.

The Final Point

There’s one more point to add about the absolute risk of firearms violence. I quoted a figure of 33,636 deaths. However, that figure includes all firearm mortality: suicide, hunting accidents, accidents when handling or cleaning, justifiable homicide (self-defense), and any other way a firearm could cause harm. Your chances of dying from gun violence are in reality much lower than 0.01 percent. Let’s all relax.
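To see how much the category breakdown matters, here is an illustrative sketch. The share of deaths assigned to non-violence categories below is a hypothetical placeholder, not a CDC figure; the point is only that any such subtraction shrinks the gun-violence absolute risk below the all-cause 0.01 percent.

```python
# Illustrative only: the category share below is a hypothetical
# placeholder, not a CDC figure.

total_deaths = 33_636                          # all firearm deaths, 2013 (CDC)
population = total_deaths / 10.6 * 100_000     # population implied by the rate

# Hypothetical share of deaths that are suicides, accidents,
# justifiable homicides, etc. (placeholder value):
hypothetical_non_violence_share = 0.65

violence_deaths = total_deaths * (1 - hypothetical_non_violence_share)
violence_risk = violence_deaths / population   # smaller than the all-cause risk

print(f"All-cause firearm risk:            {total_deaths / population:.4%}")
print(f"Gun-violence risk (hypothetical):  {violence_risk:.4%}")
```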

By Chris DR

photo credit: Sig P226 via photopin (license)


Statistics in Popular Media

Statistics are powerful weapons. Marketers and politicians wield them well when the situation demands it. Unfortunately, this skill often results in misinformation and, sometimes, downright lying.

How Statistics Are Misused

One way they deceive is by blurring the scientific process. Statistics summarize both experiments and studies to reach conclusions, but the two are not synonymous. A study is a situation in which data are merely observed; the investigator cannot control for everything that may influence the results.

An experiment, ideally, is a controlled situation. A good experiment has a randomly selected, representative sample along with a control group. It is big enough to be meaningful. It is also conducted double-blind: neither the researchers nor the participants know who is in which group.

Marketers may use studies and report the results as if they were experiments. The problem with this scenario is that observational studies can establish only correlation, not causation. Yet when journalists and marketers pick up on these stories, that line is blurred.

Cherry Picking the Results

Another devious tactic involves the results. Don’t like what the study or experiment shows? Throw it out! With over a million papers published yearly, you’re likely to find something you like better.

I’m not suggesting all papers are examples of good science. Stinkers get through each year. Just ask Andrew Wakefield or Gilles-Eric Séralini.

Spotting the Fraudsters

There’s one surefire way to spot a fraudster: they break a cardinal rule of science. You probably see this whopper all the time, and it’s blatantly false. Whenever you read or hear someone claim that something is scientifically proven, you’re being had.

Science is provisional. This means, according to Merriam-Webster, “existing or accepted for the present time but likely to be changed.” In science, there is no truth with a capital T.

Some people may find this statement uncomfortable. It’s not meant to be. Rather, this is an accurate assessment of what science is. Don’t be afraid of it. Ignorance is not always bad. The late Richard Feynman offers sage advice.

“I can live with doubt and uncertainty and not knowing. I think it’s much more interesting to live not knowing than to have answers which might be wrong.”

So, the next time a marketer tries to sell his scientifically proven best widget, give him a dose of Mark Twain and tell him,

“A man is never more truthful than when he acknowledges himself a liar.”

By Chris DR

photo credit: Numbers And Finance via photopin (license)