Report Shows It’s Not Just ShotSpotter Underperforming When It Comes To Detecting Gunshots
ShotSpotter may be making the most headlines (and losing the biggest contracts), but it has competitors in the field of gunshot detection. According to this report by Todd Feathers for Wired, though, its closest competitor isn’t any better than the current market leader.
In February 2023, San Jose began piloting AI-powered gunshot detection technology from the company Flock Safety in several sections of the city, including Gonzalez’s neighborhood. During the first four months of the pilot, Flock’s gunshot detection system alerted police to 123 shooting incidents. But new data released by San Jose’s Digital Privacy Office shows that only 50 percent of those alerts were actually confirmed to be gunfire, while 34 percent of them were confirmed false positives, meaning the Flock Safety system incorrectly identified other sounds—such as fireworks, construction, or cars backfiring—as shooting incidents. After Flock recalibrated its sensors in July 2023, 81 percent of alerts were confirmed gunshots, 7 percent were false alarms, and 12 percent could not be determined one way or the other.
Flock Safety is a big name, but not because of gunshot detection. Flock’s history involves being one of the first tech companies to sell automatic license plate readers to essential public safety entities like… homeowners associations. Since this first incursion into the always-on ALPR business, Flock has focused on law enforcement agencies, using many of the same tactics deployed by Ring, the leading marketer of doorbell surveillance cameras: hand out cheap tech to cops, tie them into the storage/access ecosystem, and take over the PR work so only the message Flock wants to send makes its way into the mainstream.
At this point, Flock’s gunshot detection product looks to be just as inaccurate as ShotSpotter. The only difference is that ShotSpotter had a head start. The end result is similar, as well. Cities are thinking of ditching gunshot detection tech because it just doesn’t seem to be doing much to combat crime while simultaneously wasting limited law enforcement resources by sending cops to respond to “alerts” that either aren’t gunshots or don’t provide any meaningful information that might result in arrests or prosecutions.
As the Wired article points out, both companies are overselling and under-delivering. Flock claims it has a 90 percent accuracy rate. ShotSpotter claims its algorithm is 97 percent accurate. But real-world use of these products has proven both claims false, at least in the areas where cops or cities have bothered to examine the data.
Last year, journalists with CU-CitizensAccess obtained data from Champaign, Illinois, showing that only 8 percent of the 64 alerts generated by the city’s Raven system [Flock Safety’s product] over a six-month period could be confirmed as gunfire.
[…]
This week, New York City’s comptroller published a similar audit of the city’s ShotSpotter system showing that only 13 percent of the alerts the system generated over an eight-month period could be confirmed as gunfire.
In Champaign, Illinois, this resulted in the cancellation of the contract with Flock Safety. In Chicago, the same thing happened with ShotSpotter, although there’s been a recent push by legislators to prevent that contract from expiring in September. In New York City, the comptroller has recommended the contract not be renewed, at least until the NYPD can demonstrate the tech is instrumental in combating gun violence and/or in reducing EMS response times.
As for San Jose and its less-than-great experience with Flock Safety’s Raven gunshot detection system, the path forward seems less clear. Echoing ShotSpotter’s recent change in talking points, Flock Safety claims the real value of gunshot detection lies in helping victims, not fighting crime. And, for that reason, it appears city leaders are hesitant to pull the plug on Raven, even though no one in San Jose is tracking this specific metric.
Pointing to the report’s finding that only 6 percent of the confirmed gunshots detected by the system were reported to police via 911 calls or other means, police spokesperson Sergeant Jorge Garibay tells WIRED the SJPD will continue to use the technology. “The system is still proving useful in providing supplementary evidence for various violent gun crimes,” he says. “The hope is to solve more crime and increase apprehension efforts desirably leading to a reduction in gun violence.”
Well, maybe you can’t put a price on hope, but you should be able to establish a baseline measurement, one that keeps the money spent on the “hope” that Flock Safety’s product will eventually prove useful from becoming a buried line item in the SJPD’s budget, funded year after year without any measurable return on investment. And if the PD really wants to keep this tech, it should be leading the way in compiling the data that shows it’s actually worth keeping. Otherwise, this isn’t hope. It’s faith. And relying on faith is no way to run a government.