ShotSpotter Efficacy Study
In January 2011, ShotSpotter, Inc. asked us how we would conduct an efficacy study to determine how effective its product was at detecting and deterring gunfire. Our primary concern was how we could remain objective while taking money from the company.
We told ShotSpotter CEO Ralph Clark that we would do the project but that we would “find what we find, not necessarily what you want us to find.” And he told us that that was just fine with him.
In fact, that was just what he wanted.
We started by deciding that all source material used in the report – the interview questionnaires, the interview transcripts and even the recordings – would be publicly available for review. This means that anyone interested can listen to us asking the questions and hear the answers given by the officers, detectives, dispatchers, analysts and commanders – and can read and hear those answers in the context of the interview.
Because the sample size was so small, we could not do quantitative research; instead, we did qualitative research. This means that we are unable to say sciency-sounding things like, “73% of users say…”
Instead, we listened to the responses we got, extracted themes which arose when we listened to all the responses, and explored these themes.
The resultant report, we think, is an important document, because we discovered some interesting things about how agencies use ShotSpotter, and how effective the product is.
We are honored and pleased that the report and its findings have been endorsed by the National Organization of Black Law Enforcement Executives, which also believes that we have created a “notable research report”. The report is distributed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. In that way, it, like all CSG Analysis reports, will reach agencies large and small, regardless of budget.
We were surprised to discover that the strategic value the product provides turns out to be as important to customers as the tactical information it gives them for responding to gunfire – and the tactical information is what the company itself emphasizes in its marketing and sales materials. The tactical information is very important to responding officers, who use it to locate, travel to and respond more effectively to gunfire. But the strategic values – better community relations, and an overall better understanding of the community’s gunfire problem – are, again, at least as important.
Here’s a seemingly obvious statement of the sort one can only make when one conducts a study: the product doesn’t work right when it’s not used properly. Unfortunately, most customers weren’t using it properly. Most were not re-classifying “activations” – the term used when the product alerts to suspected gunfire – after investigating, so their gunshot activation databases were full of events that were not gunfire. This does not affect tactical response to gunfire alerts, but it makes accurate analysis of aggregated gunfire data impossible.
This ex post facto analysis is highly valuable. As the report indicates, the ability to state definitively the number of known gunshots in an area, and to compare that to, for example, the number of 9-1-1 calls for service reporting gunfire, is an important metric.