
Outdoor microphones that detect gunfire, pinpoint its location and alert the police, all in a matter of moments?
It might sound like something out of a science-fiction movie, but it’s very much a reality — a high-tech crime-fighting tool that’s expected to be deployed in Schenectady by year’s end.
Now, using technology to solve and prevent crimes is a great idea.
And if the city’s gunshot-detection technology works as advertised, it will be a great investment.
Unfortunately, it’s not at all clear that the technology will live up to its promise.
A number of cities have ended their contracts with ShotSpotter, the Bay Area company that developed the gunfire-detection technology, saying the system did little to solve or reduce crime.
“It’s a costly system that isn’t working to the effectiveness that we need it to work in order to justify the cost,” Jasiel Correia, the mayor of Fall River, Mass., said when the city cut ties with ShotSpotter in 2018.
According to The Herald News, Fall River’s daily newspaper, administration officials said the system worked less than 50 percent of the time and “even missed all seven shots that were fired when a man was killed two months ago in downtown Fall River.”
That’s a pretty lousy track record.
What’s worrisome is that Fall River doesn’t appear to be an outlier.
Other cities report a similar lack of effectiveness.
In 2016, the city of Charlotte, North Carolina, ended its $160,000-a-year contract with ShotSpotter, saying the system didn't help police make arrests or identify crime victims.
“The system operated as designed,” a city memo noted. “However, based on its experience with the system, CMPD feels the return on investment was not high enough to justify a renewal.”
A Forbes magazine analysis of ShotSpotter alerts from seven different cities found that there were “lots of calls, but few tangible results. Of the thousands of ShotSpotter alerts in these cities, police were unable to find evidence of gunshots between 30%-70% of the time.”
Is that an acceptable return on investment?
I certainly don’t think so.
If Schenectady is going to invest in ShotSpotter, it needs to consider how to measure effectiveness and what steps to take if the system doesn't deliver. How many false positives — loud noises, such as a car backfiring or fireworks exploding, that are mistakenly identified as gunfire — are too many?
Of course, some cities report good experiences with ShotSpotter, and perhaps Schenectady will become one of those cities. (Schenectady plans to deploy ShotSpotter and one other gunfire-detection program; the city will compare the two systems.)
A 2017 Time magazine article on ShotSpotter reported that, “There’s no shortage of happy customers. Agencies in Oakland, Youngstown, Ohio and Wilmington, N.C., have all credited the system with making arrests. Police in Omaha say ShotSpotter has helped reduce gunfire by 45 percent since 2013.”
That’s terrific.
But reports from dissatisfied customers abound, which is … not so terrific.
In this area, Troy adopted ShotSpotter in 2008 and terminated its contract in 2012.
Former Police Chief John Tedesco told Time that the system gave false alerts or failed to report actual gunfire up to one-third of the time. “We weren’t finding physical evidence,” he told the magazine. “It would sometimes take officers to the wrong location.”
The technology has likely improved since Troy decided it wasn’t worth the trouble.
But has it improved enough?
I’d like to believe that it has, and that ShotSpotter will work out wonderfully for Schenectady.
But the plethora of articles detailing ShotSpotter's failure to live up to expectations suggests that skepticism is warranted.
The city needs to take a long, hard look at ShotSpotter's track record before committing to an expensive high-tech system that sounds good but doesn't always deliver as advertised.
Reach Sara Foss at [email protected]