The city quietly renewed its multimillion-dollar contract with ShotSpotter. But a growing body of evidence suggests the technology is ineffective, and activists say it leads to deadly over-policing.
On a humid afternoon in late August, dozens of activists gathered at an intersection in Chicago’s Englewood neighborhood to protest the Police Department’s use of ShotSpotter, the gunshot detection system. Days before, news broke that the city had quietly extended its multimillion-dollar contract with the company, outraging residents and some councilmembers.
Alyx Goodwin, one of the event’s organizers, pointed to a light pole bristling with what looked like microphones. They were acoustic sensors used by ShotSpotter to pick up the sound of gunfire and alert police.
“Once you see one, you start to notice them more,” said Goodwin, who works as a deputy campaign director for the Action Center on Race and the Economy, an advocacy group. “It’s not a coincidence that the communities that are over-policed are the same communities that are demanding and in need of stronger infrastructure and things that actually prevent and interrupt violence, like jobs, housing, healthcare, and grocery stores.”
Chicago’s ShotSpotter system is one of the largest in the nation, covering some 117 square miles, largely in predominantly Black and Latinx communities on the South and West Sides. More than 100 U.S. municipalities have made the technology a centerpiece of their efforts to tackle gun violence. But now, activists and leaders around the country are increasingly questioning whether ShotSpotter is money well spent as evidence mounts that the system is ineffective at preventing shootings and may lead to needless — and sometimes deadly — encounters with police.
In few places is the controversy more pronounced than Chicago, where the Police Department is under a court-ordered federal consent decree mandating reforms after a Justice Department investigation uncovered patterns of discrimination and abuse.
On August 24, the city’s Office of Inspector General — a taxpayer-funded, nonpartisan watchdog agency — released a scathing report that showed ShotSpotter was unreliable and possibly dangerous to communities of color. The OIG analysis found that fewer than 10 percent of ShotSpotter alerts led police to evidence of a gun-related criminal offense, and that some officers stopped and patted down people more often in areas where they perceived the alerts to have been frequent. “ShotSpotter alerts rarely produce evidence of a gun-related crime, rarely give rise to investigatory stops, and even less frequently lead to the recovery of gun crime-related evidence during an investigatory stop,” the OIG concluded.
The Chicago Police Department has depicted ShotSpotter as integral to its crime-fighting strategy since the city signed a contract with the company in 2018. At the time, then-Police Superintendent Eddie Johnson called ShotSpotter “a key component” in rebuilding community trust.
In March 2021, a ShotSpotter alert brought police to Little Village, a predominantly Latinx neighborhood on the West Side, where a responding officer fatally shot 13-year-old Adam Toledo. The shooting prompted activists and some city councilmembers to call for canceling the city’s $33 million ShotSpotter contract, unaware that Mayor Lori Lightfoot’s administration had already extended it through 2023. After news of the extension emerged, one alderman vowed to introduce an ordinance requiring council sign-off for the renewal of any contract over $1 million.
Chicago is not the only city where ShotSpotter has come under scrutiny. The Police Department in Charlotte, North Carolina, chose to jettison the technology in 2016, with officials saying it didn’t produce enough results to justify the expenditure. More recently, the San Diego City Council put off deciding whether to renew its contract after activists expressed concerns about the system’s impact on Black and brown residents.
ShotSpotter has never released any detailed data or peer-reviewed analyses of its technology’s benefits, but there is a growing body of independent research, much of it critical. In a study published by the Journal of Urban Health in April, for example, researchers analyzed 17 years of homicide and arrest data from 68 counties that had adopted ShotSpotter and found that the technology had no significant effect on firearm homicides. “Based on the evidence our study produced, I would say that there is no evidence to suggest that the technology reduces firearm homicides,” said Mitchell Doucette, an injury epidemiologist at Johns Hopkins and the study’s lead author.
While ShotSpotter has billed its algorithms as nearly flawless at distinguishing gunshots from other sounds, it also employs analysts to listen in and make sure nothing is misidentified. According to reporting by Motherboard and The Associated Press, those analysts have sometimes changed the location and number of gunshots in reports at the request of police, who then used the massaged data to charge people with gun crimes — in some cases wrongfully.
ShotSpotter has disputed those findings as false and misleading, but there is other evidence suggesting the process is less accurate than the company claims. In an internal performance overview obtained through a Freedom of Information Act request by the nonprofit transparency group Lucy Parsons Labs and shared exclusively with The Trace, ShotSpotter reported that during the first six months of 2021, its system successfully identified gunshots in Chicago more than 97 percent of the time. That rate was calculated based on details reported to ShotSpotter by the Chicago Police Department, but it included thousands of what the company called “probable” gunshots. There were no listed false positives — instances in which sounds identified as gunshots turned out to be something else.
Experts told The Trace that having no false positives at all was unlikely.
“There probably is a higher false-positive [rate] than is actually being reported out here,” said Daniel Lawrence, a research scientist who studies police and policing technology at the CNA Center for Justice Research and Innovation, referring to the performance overview. “I do think that might be the case; seeing zeros across the board, and then you match it up with the OIG report and it raises concerns.”
There were also discrepancies between the performance overview and the OIG’s findings, which were based on a database of confirmed alerts maintained by Chicago’s Office of Emergency Management and Communications. According to the performance overview, between January and June 2021, ShotSpotter detected 20,504 incidents of gunfire in Chicago — 4,495 more than the number of confirmed alerts the OIG found over the same period.
“It doesn’t seem like there’s any scientific basis for how they’re classifying shootings,” said Freddy Martinez, the director of Lucy Parsons Labs.
“It seems that every time [ShotSpotter] goes off, and they don’t know if it was a gunshot or a firework, they label that as ‘probable gunshot,’” Martinez added. “I don’t know how anyone can claim that they’re tracking accuracy.”
In a statement, a ShotSpotter spokesperson disputed that characterization, saying a “very small percentage” of probable gunshots turned out to be false positives. “Helping to save lives is at the core of ShotSpotter’s mission,” the spokesperson said. “Both ShotSpotter and customers in law enforcement prefer to respond to a ‘Probable Gunshot’ than to ignore it and accept the consequences of a missed, violent gunshot — which may result in the loss of human life. Therefore, we strongly support our current process in publishing probable gunshots.”
Chicago Police have continued to defend their use of the system. “ShotSpotter has detected hundreds of shootings that would have otherwise gone unreported,” a police spokesperson said in a statement, adding that it was “among a host of tools used by the Chicago Police Department to keep the public safe and ultimately save lives.”
The OIG’s findings were similar to ones from a previous study by Northwestern University’s MacArthur Justice Center. There, researchers reviewed data on police deployments triggered by ShotSpotter alerts between 2019 and 2021. They found that 89 percent resulted in no report of a gun-related crime, and 86 percent in no crime of any type. In the 21-month period the researchers looked at, they tallied more than 40,000 such “dead-end” deployments.
ShotSpotter alerts often result in “a very, very serious police response to nothing, which really increases the risk to the people in the area,” said Alexa Van Brunt, the MacArthur Justice Center’s director. “The city has a long history and a pattern of using discriminatory tactics,” Van Brunt said. “ShotSpotter is just yet another one that enhances the risk of discrimination and the risk of excessive force.”
Correction: A previous version of this article contained the incorrect affiliation of Daniel Lawrence.