Independent Analysis of the MacArthur Justice Center Study on ShotSpotter in Chicago

Executive Summary

ShotSpotter commissioned Edgeworth Analytics (“Edgeworth”) to review a study by the MacArthur Justice Center (“MJC”) published in May 2021 and to provide an independent evaluation of the claims contained in it. Based on our analysis, Edgeworth concludes that the MJC study fails to provide a rigorous, balanced, and objective assessment of the use of ShotSpotter in Chicago and is, at best, misleading because of the inappropriate data source used for the study, the selective choice of data, and a fundamental lack of understanding of where ShotSpotter was deployed relative to the areas of Chicago with the highest homicide rates.

Specifically, we conclude the following:

  1. The OEMC data that was the primary data source used to support the MJC study’s conclusions regarding “unfounded” CPD deployments is an inappropriate source on its own to determine the ultimate outcome of an individual incident and, therefore, is not a reliable measure of ShotSpotter’s efficacy. The MJC study’s interpretation is misleading because the data obtained from the OEMC is not designed to capture and account for any subsequent police action resulting from an initial ShotSpotter alert. The conclusion that the lack of a police report is a measure of ShotSpotter’s accuracy is baseless and misleading.
  2. The MJC study mischaracterizes the placement of ShotSpotter technology as unduly burdening Black and Latinx communities. Specifically, it omits important context – that the placement is based upon areas of need across Chicago as measured by incidents of homicide and gun crime.

In addition to this analysis, Edgeworth has conducted an independent review of ShotSpotter’s claims regarding accuracy in gunshot reporting and false positives—sending an alert of gunfire when none occurred. Specifically, Edgeworth examined ShotSpotter’s representation that its system has an aggregated 97 percent accuracy rate that includes a 0.5 percent false positive rate across all customers over the last two years.  Our review confirmed that (1) these claims are valid and based on actual customer feedback from a broad range of ShotSpotter customers and (2) despite substantial variation in the intensity of reporting potential errors across clients, ShotSpotter’s accuracy rate does not appear to be sensitive to differences in clients’ propensity to report potential errors. The details of this analysis are provided in a separate report.

MacArthur Justice Center Report

The MacArthur Justice Center (“MJC”) obtained Office of Emergency Management and Communications (“OEMC”) data on Chicago Police Department (“CPD”) deployments between July 1, 2019 and April 14, 2021 and prepared a study of calls for service (“CFS”) initiated by ShotSpotter alerts and 9-1-1 calls based on these data.[1] The study’s findings were posted on an MJC-created website and included in an amicus brief filed on May 3, 2021 in Cook County Circuit Court (the “Amicus Brief”). The study’s primary conclusions were that: (1) ShotSpotter-initiated alerts resulted in CPD finding no evidence of a gun-related crime or any crime the majority of the time during the period of study; (2) there were more than 40,000 “unfounded” deployments of CPD; and (3) these “unfounded” deployments were disproportionately in Black and Latinx neighborhoods where ShotSpotter is deployed.

Edgeworth Analytics Review

ShotSpotter commissioned Edgeworth Analytics to review the MJC study and provide an independent evaluation of the analysis contained in it.[2] For our analysis, we reviewed: (1) the MJC study and the Amicus Brief that describes it in detail; (2) the same publicly available OEMC data MJC used to draw its conclusions, which were provided to ShotSpotter by the CPD; (3) the academic literature; (4) publicly available CPD data; and (5) analyses conducted by ShotSpotter.

What is ShotSpotter?

According to a report from the Brookings Institution, 88 percent of gunshot incidents go unreported to police.[3] ShotSpotter aims to help solve that problem. According to ShotSpotter, the company offers law enforcement agencies an acoustic gunshot detection service that detects, locates, and alerts police to gunfire, enabling a precise and rapid response to incidents that likely would otherwise have gone unreported. The system uses wireless sensors throughout a coverage area to capture loud, impulsive sounds that may be gunfire. The data are transmitted to a central cloud service that classifies each incident with a gunfire probability percentage along with a location determined by triangulation across multiple sensors. Then, specially trained ShotSpotter employees called “reviewers,” located across two ShotSpotter Incident Review Centers, use playback tools to listen to the recorded pulses from the sensors that detected the incident audio, visually analyze the audio waveforms to see if they match the typical pattern of gunfire, assess the grouping of sensors that participated, and either publish the incident as gunfire or dismiss it as non-gunfire. ShotSpotter said the entire process, from the time of the gunfire to the time law enforcement is alerted, typically occurs in less than 60 seconds, allowing a timely response. The gunfire alerts sent to ShotSpotter customers, including the CPD, include three recorded audio snippets that patrol officers can listen to before they arrive on the scene.

Below are examples of gunshot and non-gunshot audio provided by ShotSpotter that were captured by ShotSpotter sensors from various locations nationwide. Each example of gunshots includes the date of the event, the rounds fired, the audio that was shared with the local police department, and a redacted Investigative Lead Summary (ILS) report for the event. For non-gunshot events, each example includes the date of the event, the type of event, and the audio that was shared with the local police department (ILS reports are not generated for non-gunshot events).

Example Audio of Gunshots Captured by ShotSpotter Sensors

Example Audio of Non-Gunshots Captured by ShotSpotter Sensors

Edgeworth Conclusion: OEMC Data Cannot Be Used To Determine If A ShotSpotter Alert Is In Fact A Gunfire Incident

At the outset, it is important to recognize that the OEMC is not an arm of the CPD, but instead a distinct office within the government of the City of Chicago. According to OEMC’s website, the office manages several functions, including 9-1-1 call intake and dispatch in addition to emergency management, traffic management, and other areas.[4] Consequently, OEMC data do not reflect the ultimate outcomes following subsequent investigations or reports that are created in the hours, days, weeks, and months after a CFS occurs. Only CPD’s own police reports can capture the complete outcome of an investigation. This misapprehension lies at the heart of the MJC study, which used OEMC data for its analysis of police deployments based solely on ShotSpotter alerts. The MJC study erroneously interpreted its results to mean that “the ShotSpotter system generates nearly two-thousand alerts every month that turn up absolutely no evidence of gun crime—or any crime at all.”[5] The MJC study concluded that ShotSpotter alerts in Chicago during this time period are “dead ends” that “reinforce[s] racial disparities in policing.”[6]

  1. Disposition Codes Are Not a Reliable Measure of ShotSpotter’s Efficacy

To identify the outcome of a CFS, the MJC study relied on the “final disposition” code that law enforcement officers enter into the OEMC system when recording their findings at the scene of the reported event. The MJC study identified “unfounded” deployments as those where police assign a final disposition code of “Miscellaneous Incident,” which primarily corresponds to “Other Police Service” or “No Person Can Be Found.”[7] However, as noted above, OEMC data are not designed to contain complete or updated information about investigations into a potential criminal event, and so may capture only a small part of a larger case file.

The MJC study said a Miscellaneous Incident code “did not even prompt police to file a case report.”[8] However, this code does not provide information on whether a police report was filed or whether a criminal event occurred. Instead, it indicates the initial response to a CFS, and that is all. If a report is later filed or if there is follow-up to the initial event, there is no update to the disposition code. One possible scenario: police arrive at the scene of a reported “person shot,” but the injured person has left the scene to seek medical attention. A disposition code of Miscellaneous Incident may be reported to OEMC for the CFS, but a police report may subsequently be filed at a local hospital by officers responding to a call from the hospital. Similarly, police may arrive at the scene of a “shots fired” CFS and find no person of interest or shell casings, but the next day a citizen may report property damage from a gunshot. As these examples illustrate, relying solely on OEMC final disposition data can result in incorrect interpretations of actual events and misleading conclusions about police responses to reports of gunfire.

Therefore, the disposition code alone is not a reliable measure of ShotSpotter’s efficacy, and we conclude the MJC study’s interpretation is misleading because the data obtained from OEMC are not designed to capture and account for any subsequent police actions resulting from an initial ShotSpotter alert.

To illustrate this issue, Edgeworth analyzed OEMC data on events where a call was made to 9-1-1 and a person was reported to have been shot in police districts both with and without ShotSpotter coverage. Between July 1, 2019 and April 14, 2021, there were 963 CFS for a “person shot” in police districts without ShotSpotter coverage.[9] Of these, only 49 percent (469) included a final disposition code relating to a gun event.[10] The same percentage of “person shot” CFS in police districts with ShotSpotter deployed included a final disposition code for a gun event — 2,897 CFS for a person shot with 1,430 gun events, or 49 percent. This occurs because the final disposition code reported to OEMC at the scene of a reported event is not necessarily the end of the story. Using the MJC’s flawed logic, one would conclude that CPD responses to 51 percent of the 9-1-1 calls from the public reporting that a person was shot were “unfounded” and generated “dangerous, unnecessary, and wasteful deployments.”[11]
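The arithmetic behind this comparison can be verified with a short script. The figures below are the CFS counts cited above from the OEMC data; the variable names are illustrative, not part of any dataset.

```python
# "Person shot" CFS, July 1, 2019 - April 14, 2021 (figures cited above).
no_coverage_cfs = 963    # CFS in police districts without ShotSpotter
no_coverage_gun = 469    # ...of those, with a gun-related final disposition
coverage_cfs = 2897      # CFS in police districts with ShotSpotter
coverage_gun = 1430      # ...of those, with a gun-related final disposition

share_without = no_coverage_gun / no_coverage_cfs
share_with = coverage_gun / coverage_cfs

# Both shares round to 49 percent, so by the MJC's logic roughly half of
# 9-1-1 "person shot" calls would be "unfounded" even without ShotSpotter.
print(f"without coverage: {share_without:.0%}, with coverage: {share_with:.0%}")
```

Running this prints 49% for both groups, matching the figures in the text: the disposition-code pattern is the same whether or not ShotSpotter initiated the deployment.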

While the OEMC data can potentially provide useful information on initial responses, a Miscellaneous Incident code in the OEMC data is not sufficient to support the conclusion that a deployment was unfounded or that no crime occurred. The OEMC data, which report information on deployments, are not a substitute for case files and police reports that include details not only on the initial response, but also on any subsequent investigation.

  2. Subsequent Identified Criminal Activity Is Unlikely to Be Connected Back to Police Deployment

Information on the time spent on CFS contained in the OEMC data helps to illustrate why subsequently identified criminal activity is unlikely to be connected back to a police deployment.

Specifically, an OEMC dispatch record captures: (1) the time when the deployment was initiated; (2) the location to which the deployment was made; (3) the reason for the deployment; (4) what was immediately found at the scene; and (5) the time when the deployment was closed. When the deployment is “closed,” what was found (e.g., evidence, a victim, a perpetrator) is reported and the deployment is likely ended.

A core function of OEMC is to deploy an emergency response to an event. Therefore, deployments that do not require an immediate emergency response and result in Miscellaneous Incident reports, where no evidence of a crime is found at the time, are typically short-duration events, regardless of whether ShotSpotter or 9-1-1 calls reporting gunfire initiated the deployment.  In both cases, the median duration of the deployment is 12 minutes, including the time for police to travel to the location.  Figure 1 below shows the distribution of durations for ShotSpotter-initiated deployments recorded as Miscellaneous Incidents.  The short duration of these deployments suggests that Miscellaneous Incidents in the OEMC data are typically concluded in a relatively short period of time and do not track any subsequent investigations or reports.

As our analysis demonstrates, the MJC study’s analysis is misleading as it relies solely on the OEMC data which, by itself, is insufficient to assess ShotSpotter’s effectiveness.

Figure 1
Duration of Miscellaneous Incidents in Minutes
For OEMC Dispatches Initiated by ShotSpotter
July 1, 2019 – April 14, 2021

Source: Chicago OEMC dispatch data.

Edgeworth Conclusion: The MJC Study Mischaracterizes The Deployment Of ShotSpotter Technology

The MJC study claimed that ShotSpotter’s pattern of deployment in Chicago is in predominantly Black and Latinx neighborhoods and that the “unfounded ShotSpotter alerts…can create a false ‘techwash’ justification for racialized and oppressive patterns of policing in communities of color.”[12] This claim appears to be entirely premised on the MJC study’s improper conclusions addressed above.

ShotSpotter claims that coverage areas are typically determined by law enforcement and elected leadership using objective, historical data that prioritize areas of a city that experience the most gun violence. Edgeworth has confirmed that ShotSpotter deployments are indeed in the Chicago police districts where violent crime is disproportionately high. For example, as shown in Figure 2, CPD homicide data show that the 12 police districts where ShotSpotter is deployed are the 12 police districts with the highest number of homicides between 2012 and 2021.

Similarly, using OEMC data on 9-1-1 emergency calls (excluding ShotSpotter alerts), between July 1, 2019, and April 14, 2021, the 12 police districts with ShotSpotter had 29,317 deployments initiated by 9-1-1 emergency CFS for reports of gunfire, more than 120 percent more than the 13,269 such deployments in the 10 other police districts.
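The "more than 120 percent" figure follows directly from the two deployment counts cited above; the brief check below (with illustrative variable names) confirms the percentage difference.

```python
# Deployments initiated by 9-1-1 CFS for reports of gunfire,
# July 1, 2019 - April 14, 2021 (figures cited above).
with_shotspotter = 29317     # 12 districts with ShotSpotter coverage
without_shotspotter = 13269  # 10 districts without coverage

pct_more = (with_shotspotter - without_shotspotter) / without_shotspotter
print(f"{pct_more:.1%} more deployments")  # prints "120.9% more deployments"
```

The districts with coverage thus saw roughly 2.2 times as many gunfire-related 9-1-1 deployments as those without, consistent with placement in the areas of greatest need.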

Figure 2
Homicides by Police District
Districts with ShotSpotter Coverage Areas Highlighted in Red
January 2012 to April 2021

Note:  Police districts where ShotSpotter is deployed are in red and the remaining police districts are in gray. The shares of crime reports involving guns are proportionally the same as homicides by police district over the same period. Therefore, a graph of crime reports involving guns would be very similar to the above graph showing homicides.

Source: City of Chicago Data Portal,


Edgeworth’s analysis of the OEMC data used by the MJC, and of the conclusions the MJC drew based on those data, demonstrates that the MJC study is severely flawed. The OEMC data simply cannot be used to support the MJC’s conclusions about whether gunfire or a gun-related crime occurred because they are an incomplete source of information. The unsupported assumption that the absence of a police report for a deployment recorded in the OEMC data means no gunshot occurred can lead to incorrect interpretations of actual events and misleading conclusions about police responses to reports of gunfire. Indeed, the MJC’s deeply flawed approach would implicate the 9-1-1 system—a critical, trusted tool for communities and law enforcement across the nation—as generating unnecessary police deployments 51 percent of the time when a person is reported as shot. Finally, the MJC’s assertions regarding the deployment of ShotSpotter in predominantly Black and Latinx neighborhoods fail to consider that the deployment is consistent with an objective, data-based approach of using the ShotSpotter system where homicide and gun crime are most prevalent.


[1] Edgeworth notes that the MJC study focused on a period of time (July 1, 2019 through April 14, 2021) that included frequent and long-term protests, unprecedented gun-related violence in Chicago, and the global pandemic. Notably, the MJC study did not acknowledge that this period is not representative of the typical deployment period, and it did not attempt to demonstrate how this period differs from others. Interestingly, Edgeworth found that, while the raw number of ShotSpotter-initiated dispatches spiked during parts of this period, the rate of dispatches resulting in a crime or gun report remained relatively stable, casting doubt on the MJC’s reliance on raw counts to support conclusions that extend beyond this tumultuous time period.

[2] Based in Washington, D.C., Edgeworth Analytics is a firm of PhD economists who rigorously apply economic principles, hard data, and proven methods for gathering, structuring, analyzing, and applying data to help organizations improve their understanding of the data that drive their businesses.


[4] OEMC website:

[5] Motion for Leave to File Brief as Amici Curiae in Support of Defendant’s Motion for a Frye Hearing, The State of Illinois v. Michael Williams (20 CR 0899601), filed May 3, 2021 (“Amicus Brief”), Exhibit A, p. 2.


[7] Miscellaneous Incidents are identified by final disposition codes beginning with “19.” See, Chicago Police Department, Miscellaneous Incident Reporting Table – CPD-11.484.

[8] Amicus Brief, Exhibit A, p. 8.

[9] Following the MJC’s approach as described in the Amicus Brief, throughout this report, the initial dispatch type coded for an OEMC dispatch record—whether it be an emergency 9-1-1 call or a ShotSpotter alert—is used to determine what initiated the deployment.

[10] Note that the 51% of “unfounded” CFS for a person shot is not comparable to the MJC’s corresponding figure for ShotSpotter because it does not include other reports of gunfire, which constitute over 90% of the relevant CFS.

[11] Amicus Brief, Exhibit A, p. 3.

