Guns and Numbers

Written on 2019-04-29

A couple of days ago, a friend of mine pointed me to an article on gun control which made the following claim: “Armed citizens are successful 94% of the time at active shooter events”. Written by a firearms training company, it analyses shoot-outs in the United States between 2000 and 2017. In short, it argues that armed citizens are highly effective at stopping shootings (and therefore that more citizens ought to be armed). My friend asked me to fact-check the article; here is what I think.

Before we start

First, the obvious disclaimer: I am not a gun crime expert. But I am an ecologist, and analysing data is how I make my living. And since this article is exactly that – data analysis – that's what I'm going to be talking about.

When evaluating the validity of any scientific study, there are a couple of questions one needs to ask:

  1. Are the data themselves valid?

  2. Is the analysis of the data valid?

  3. Is the interpretation of the data valid?

  4. Is the conclusion valid?

These are the questions I am going to consider. (By the way, you're probably going to want to read the original article before continuing.)

Are the data valid?

Most of the data (248 out of 283 events) came from a collection previously analysed by the FBI. They defined an active shooter event as “One or more individuals actively engaged in killing or attempting to kill people in a populated area”, although they excluded domestic and gang-related incidents. I did not have the time to trawl through the entire data set myself, but if the FBI used it, I will assume it was not plucked out of thin air, and therefore answer the first question with a “yes”. (Albeit with one very important reservation; see below.)

Is the analysis valid?

On the whole, the answer to this question is also a yes. However, I have a few quibbles.

The article presents its results in a series of graphs, which I find admirably pretty. They are not up to scientific standard, though: basic information such as the R² value is missing from the lines of best fit. There are also a lot of pie charts, which our professors told us never to use, because they are much worse than other chart types at conveying numerical relationships.
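For what it's worth, an R² value is not hard to produce. Here is a minimal, stdlib-only sketch of the calculation (the numbers are toy data of my own, purely illustrative, not the article's):

```python
# Least-squares line of best fit plus its R^2 value.
# Toy data only -- not taken from the article.
def linear_fit_r2(xs, ys):
    """Return (intercept, slope, R^2) for a simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return intercept, slope, 1 - ss_res / ss_tot

# A perfectly linear toy trend gives R^2 = 1.0; noisy data gives less.
a, b, r2 = linear_fit_r2([0, 1, 2, 3], [1, 3, 5, 7])
```

Any statistics package reports this automatically, which makes its absence from the graphs all the more conspicuous.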

On two graphs I'm a bit iffy about the analysis method the author chose. The first is the “three year moving average” used for the percentage of events at which an armed citizen was present. (Why not just use a regression / line of best fit?) The second is the isolated look at incidents with eight or more fatalities when comparing gun-free and gun-permitted areas. Here, a better approach would have been a boxplot showing the median and spread of the death toll for attacks in each location type, instead of two separate pie charts. That would have shown the actual distribution of the data and given a much more accurate impression. So these two graphs strike me as slightly dodgy, but on the whole it's not a huge deal.

(One graph I found rather condescending, though, was the one that showed that so far, the “citizen saviours” haven't hurt any innocent bystanders. This is smugly silent about the fact that you don't have to be in a firefight to hurt innocent bystanders, as another analysis by the same company of 300 accidental discharges shows.)

Is the interpretation valid?

So, is the author's interpretation valid? Are armed citizens as effective at stopping shooters as he makes out? This is where we strike what I perceive to be the article's biggest problem.

In short, the author comes up against what scientists call the “small N problem” – too small a sample size. In the end, the 94% he quotes as the success rate of armed citizens is based on exactly 33 events. Unfortunately, a large percentage of a small number is still a small number. Although the title and main claim of the article are technically correct, an equally valid (and less misleading) way of stating the results would be: “Between 2000 and 2017, 31 shootings were stopped or mitigated by armed citizens.” Or, phrased differently: “On average, there are two incidents a year at which armed citizens stop or mitigate a shooting.” This suddenly sounds a lot less dramatic.
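The uncertainty that 33 events carry can be made visible with a confidence interval. A quick sketch of my own (assuming the quoted 94% corresponds to 31 successes out of 33 events, and using the standard 95% Wilson score interval; this calculation is mine, not the author's):

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = p + z ** 2 / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return (centre - margin) / denom, (centre + margin) / denom

# 31 "successful" interventions out of 33 events, as discussed above.
low, high = wilson_interval(31, 33)
print(f"{low:.0%} to {high:.0%}")
```

With only 33 events, the plausible range for the “true” success rate stretches from roughly 80% to 98%: hardly the precise figure the headline suggests.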

One especially needs to set this in relation to the total number of shootings in the same time frame. From 2000 to 2017, the Centers for Disease Control and Prevention recorded approximately 217,000 firearm homicides – that's about 12,000 per year. Compared to this, the total number of “active shooter” events considered here (283 events) is laughably small, even with multiple deaths per event. Obviously, the definition of “active shooter” events is much too narrow to give a representative sample of gun crime in the US. (Remember, it includes neither domestic violence nor gang warfare.) So even if armed citizens are effective in this very specific context, that doesn't tell us very much about the vast majority of shootings. The author's analysis of the data may be correct, but he isn't looking at the right data to be able to make a general claim.

To be fair, I should note that he never explicitly claims generality. Nonetheless, the overall and implicit thrust of his argument, both here and elsewhere on the website, is to show that armed citizens make for safer living. This is clearly not an interpretation that can be drawn from these results alone, and the author should have said so.

Is the conclusion valid?

For the sake of argument, let us assume the author has built a strong case for the effectiveness of armed citizens in keeping the peace. What political conclusion does that entail?

Well, the one he and his company are obviously in favour of is to relax gun laws and make sure as many citizens as possible are armed. (Strangely enough, his company sells firearm-related services.)

I see two problems with this conclusion. The first is that increasing the number of guns in circulation doesn't only increase the number of armed well-meaning citizens, but also the number of armed rogue citizens. Which are going to have the greater effect?

The second is a state-philosophical one. One of the most important tasks of any government is to protect its citizens. For this purpose, it establishes what is known as the monopoly on violence. Basically, the only people allowed to exercise physical force in a country are those authorised by the state, such as the police (who are ideally governed by the rule of law). Any other violence is punished with the full force of the law.

Self-defence is a notable exception to this monopoly on violence, reserved for those occasions when the state is incapable of intervening in time to protect its citizens. Therefore, every time somebody acts in self-defence, the state has failed in its duties. So if the state enacts legislation to encourage self-defence, it is handing its citizens a responsibility it should by rights be carrying itself. In effect, it is shirking its duties – or, worded more strongly, admitting a catastrophic defeat.

So yes, under certain conditions, relaxing gun laws may actually be a way to increase citizen safety. But when every man is responsible for his own safety, we are no longer in a civilised country that protects the weak, but in a free-for-all anarchy. Much as Americans romanticise the “Wild West”, I doubt they really want to go back to the chaos and danger of those times. So maybe it would be worth investing in a decent police force, instead?

Tagged as society, politics, statistics


Unless otherwise credited all material Creative Commons License by Daniel Vedder.
Subscribe with RSS or Atom. Powered by c()λeslaw.