Data Is as Data Does



Science is supposed to be objective and data-based.

Unfortunately, findings aren’t always fact-based and can be skewed, intentionally or not. Premises, like their conclusions, may be questionable.

How great is the risk that reports which dazzle with brilliance are really attempts to baffle with bull—-? Pretty great, when they come from gun control advocates.

Some examples:

1. “Study shows gun violence surged in Missouri after repeal of gun control laws” (PBS, 2/19/15). The 2007 repeal of Missouri’s handgun permitting law ended universal background checks. Daniel Webster, of the anti-gun Johns Hopkins Center for Gun Policy and Research, reported “an immediate upward trajectory to the homicide rates in Missouri” (17% above the U.S. average) from 2008 to 2012, “while the national trend was … trending downward”. The investigators controlled for several other potential causes of these changes. They also found that roughly twice as many recently sold handguns were being recovered from crime scenes or criminals as before. What else could this mean than that easier availability of handguns increased homicide rates and criminal use of handguns?

Well, it could mean the opposite. Why didn’t they compare Missouri’s homicide rate before and after the change in the law? Maybe because during the five years before 2007 the homicide rate was 32% above the U.S. average, so changing the law was actually associated with Missouri improving relative to the nation, from 32% to 17% above average. Then from 2006 to 2012, Missouri’s violent crime rate declined 7% faster than in the rest of the country. Even if criminals were using handguns more frequently, this still translated into increasing safety from violent crime, including homicide, for Missouri residents.

2. What about the recently trumpeted 40% decrease in Connecticut’s gun homicide rate after the 1995 imposition of universal background checks with may-issue permitting? The source is the same Johns Hopkins Center for Gun Policy and Research, so the finding deserves a closer look. Connecticut’s gun homicide rate dropped after 1996 much as it was dropping all across the country, regardless of whether states required background checks.

Estimating what would have happened were it not for the introduction of background checks is fraught with untestable assumptions. The authors effectively assumed that Connecticut’s gun homicides would otherwise have tracked Rhode Island’s experience during the same period, yet their comparison for non-gun homicides relies mostly on New Hampshire’s. Why different comparison sets depending on the weapons used? And once one looks beyond 2005, Connecticut’s and Rhode Island’s gun homicide rates diverge dramatically, with Connecticut’s rising and Rhode Island’s falling. (By the way, both states’ rates dropped from 1993-1995, before Connecticut required the background checks.) Interesting graphs and analysis are here, here, here, and here.

No, there’s no way to conclude from this study that the law made any significant or lasting difference in Connecticut’s homicide rate, especially while nationwide evidence gives every reason to believe that the expansion of concealed carry permitting is associated with no increase, and often some decrease, in violent crime rates, including homicide.

3. A report just out from the anti-gun Violence Policy Center asserts that “Guns are rarely used to kill criminals or stop crimes”, arguing from the low number of homicides found to be justifiable and from annual survey data suggesting that people attempt to defend themselves with guns or other weapons less than 1% of the time. The data in this report are fundamentally unreliable: reporting of justifiable homicides is extremely inconsistent across jurisdictions; justifiable homicides are not equivalent to the vast majority of defensive gun uses, most of which involve no shot fired; and the survey figure is an unquestioned, almost incidental finding extracted from the National Crime Victimization Survey and the FBI’s Uniform Crime Reporting Program. Direct investigations by criminologist Gary Kleck convincingly show at least 760,000 defensive firearm uses per year in the United States.

4. In the above examples, data were cherry-picked to justify unrealistic conclusions. But shouldn’t our well-informed national leaders’ statements about “gun violence” in America be authoritative? In two speeches following the massacre at Emanuel AME Church in Charleston, South Carolina, President Obama asserted that murder and violence on that scale doesn’t happen so frequently in other advanced nations. In fact, it certainly does, and most of the worst such massacres have occurred abroad.

5. “I cannot imagine the horror that could have occurred if people were sitting around with concealed weapons, this thing started, and you have a full-scale gunfight … You might not even have three survivors”. So said a supposedly savvy commentator about the Emanuel shooting, but he is no expert on this subject. (At least he made no data claims.) Plenty of such attacks are prevented or cut short; see examples from GOA and the Washington Times. And the data show that “victims who use guns in self-defense have consistently lower injury rates than victims who use other strategies to protect themselves”.

If gun control proponents presented their case straightforwardly, they would advocate based on the strength of their feelings, like that commentator. But that’s about all they have, besides tortuous logic and selective data syndrome. No wonder they got together in April “to improve their reporting on guns and gun violence”, courtesy of the Dart Center for Journalism and Trauma of the Columbia Journalism School, at a workshop funded by Bloomberg’s Everytown for Gun Safety. Step 2 seems to be inaugurating The Trace online, whose stated mission is “to expand understanding of the policy, politics, culture, and business of guns in America”. Their claim that they “practice journalism” seems as overstated as their belief that there is a “shortage of information” about gun policy.

You don’t need a statistics degree to weigh claims for the validity of research findings. You may have to read between the lines. Think about author bias, why the argument is being made, whether the “facts” and results presented are verifiable and how conclusions are formulated. Common sense and skeptical questioning cut through rhetoric every time.


Robert B. Young, MD

— DRGO editor Robert B. Young, MD is a psychiatrist practicing in Pittsford, NY, an associate clinical professor at the University of Rochester School of Medicine, and a Distinguished Life Fellow of the American Psychiatric Association.

All DRGO articles by Robert B. Young, MD.