Methodology for National Analysis of Clearance Rates
The Trace and BuzzFeed News
Date: January 2019
Authors: Jeremy Singer-Vine and Sarah Ryley
Contact: jeremy.singer-vine@buzzfeed.com and sryley@thetrace.org

Introduction and overall findings 3
Glossary 5
Overview of Data Sources 8
Return A: “Offenses Known and Clearances by Arrest” 8
NIBRS: National Incident-Based Reporting System 8
SHR: Supplementary Homicide Report 9
Relative coverage of the three datasets 9
Standardized Datasets 10
Analysis of “Return A” Data 12
Data Caveats 12
Missing / zero-ed data 12
Inconsistently reporting agencies 12
Data Preparation 13
Data restructuring 13
ORI fixes 13
Removal of mass murders 14
Population group assignment 14
Non-gun assaults 14
Methodology and Findings 15
Reporting consistency 15
Overall clearance rates over time 17
Decade trends, with interquartile ranges 21
Distributions of clearance rates, and median clearance rate, by decade 22
Analysis of SHR Data 24
Data Caveats 24
Inconsistently reporting agencies 24
Lack of explicit clearance/arrest information 25
Missing offender information 26
Lack of reporting of victims’ Hispanic ethnicity 26
Data Preparation 29
Unique incident IDs 29
Exclusions — Types of homicides 29
Removal of mass murders 30
Identifying larger-city agencies 30
Firearm involvement 30
Condensing victim race + ethnicity 31
Offender age reported 32
Methodology and Findings 33
Offender-age-reported outliers 33
Offender-age-reported rates over time 34
Offender-age-reported rates by murder weapon 35
Offender-age-reported rates by weapon and victim race/ethnicity 36
Offender-age-reported rates by weapon, victim race/ethnicity, and victim sex 39
Offender-age-reported rates by victim demographics and circumstances 40
Distribution of rates by agency and decade 41
Analysis of NIBRS Data 45
Data Caveats 45
Lack of coverage / representation 45
Incompleteness of arrest data 46
Data Preparation 46
Firearm involvement 47
Injury types 47
Identifying “larger city” agencies 48
Labeling incidents involving law enforcement officers 48
Methodology and Findings 48
Arrests for gun assaults by injury level, 2000-2016 48
Distribution of injuries and arrest rates by agency 49
Disparities in gun arrest rates, by victim race/ethnicity 50
Days until arrest, by offense type and weapon 51
Comparison of Return A, NIBRS, and SHR data 53
Comparing NIBRS clearance rates to Return A clearance rates 53
Comparing SHR offender-age-reported rates to Return A clearance rates 54
Comparing SHR and Return A rates, by NIBRS participation 55
Comparing racial disparities in the two victim-level datasets 57

Introduction and overall findings

The Trace and BuzzFeed News have published the raw data, data-standardization code, standardized data, and data-analysis code to support the findings below. This document describes the data and methodologies used by The Trace and BuzzFeed News to analyze national crime data, for the purpose of estimating the following:

● Trends in clearance rates, over time, for murder, gun assault, and non-gun aggravated assault
○ We find that the overall clearance rates for both homicides and aggravated assaults have fallen significantly over the past four decades. Clearance rates for aggravated assaults that were committed with firearms have declined far more substantially than for those not committed with firearms. Our analysis suggests the same is true for murders.
● Disparities in offender-age-reported rates for murders — by weapon type, gender, and race/ethnicity ○ Our analysis indicates that offenders have been identified for lower proportions of murders involving a firearm vs. not; for male victims vs. female victims; and for black/Hispanic vs. white victims. We also find that the firearm racial disparities in offender identification have been growing over recent decades. ● Disparities in arrest rates for gun assaults — by injury level and race/ethnicity ○ Our analysis indicates that, for gun assaults, arrests are also made less often for black/Hispanic victims vs. white victims, and that the disparity is larger for victims with major/minor injuries than for no-injury victims. ● The amount of time that elapses between offense and arrest, for murders and aggravated assault, by weapon type ○ We find that non-firearm offenses are much more likely to lead to an arrest by the next day than firearm offenses. We also find that arrests for gun assaults drop off more quickly than for gun murders. To be sure, numerous factors influence the likelihood a murder or assault will get solved. Some factors are at the agency level, such as its management style, technological resources, relationship with the community, detective caseload, and the evidentiary standards of the prosecutors. Others factors are more case-specific — that is, some offenses are simply harder to solve than others, based on the crime’s motivating circumstances, weapons used, availability of witnesses, et cetera. 3 Additionally, there is little agreement over which specific factors, or categories of factors, have the greatest bearing on an agency’s success in solving cases. Certainly, some cases are easier to solve than others. For instance, an internal study by the Washington, D.C. ​Metropolitan Police Department​ (2001), found that, among cleared cases, domestic cases took an average of just 2.5 days to investigate, versus drug and gang cases, which took an average 59 days and 79 days to investigate, respectively. In developing the methodology and analyses below, The Trace and BuzzFeed News consulted with several experts familiar with the FBI’s data, including (in alphabetical order by last name): Roland Chilton, Emeritus Professor at UMass Amherst; Sean Goodison, deputy director at the Police Executive Research Forum; Brian Harris, former Houston Police Department homicide detective; Larry T. Hoover, director of Sam Houston State University’s Police Research Center; Michael Maltz, Emeritus Professor at the University of Illinois at Chicago; Andrew V. Papachristos, Professor of Sociology at Northwestern University, and director of the Northwestern Network and Neighborhood Initiative; Wendy C. Regoeczi, director of Cleveland State University’s Criminology Research Center; Daniel Webster, director of the Johns Hopkins Center for Gun Policy and Research; Charles Wellford, Professor Emeritus at the University of Maryland, and former director of the Maryland Justice Analysis Center; and Christopher Winship, Diker-Tishman Professor of Sociology at Harvard University. We are grateful for their feedback. 4 Glossary This methodology uses a few uncommon (or uncommonly specific) phrases throughout. Here’s what we mean by them: ● Agency.​ The FBI assigns each law enforcement agency an Originating Agency Identification (ORI) number. 
In the analyses below, agencies are identified by (and tracked over time using) these ORI numbers.1 ● Agency-years.​ By this phrase, we mean an agency’s reported data for a single year. For example, the New York Police Department’s reported data for 2015 and 2016 represents, together, two agency-years. Likewise, the 2015 reported data for the New York Police Department and the Chicago Police Department represents, together, two agency-years. ● Clearance by arrest.​ ​Per the FBI​: “In the UCR Program, a law enforcement agency reports that an offense is cleared by arrest, or solved for crime reporting purposes,​ when three specific conditions have been met​. The three conditions are that at least one person has been: Arrested; Charged with the commission of the offense; [and] Turned over to the court for prosecution (whether following arrest, court summons, or police notice).” (emphasis added) ○ The FBI continues: “In its clearance calculations, the UCR Program counts the number of offenses that are ​cleared, n ​ ot the number of​ persons arrested​. The arrest of one person may clear several crimes, and the arrest of many persons may clear only one offense. In addition, some clearances that an agency records in a particular calendar year, such as 2016, ​may pertain to offenses that occurred in previous years.​” (emphasis added) ○ A case that was cleared by arrest does not necessarily mean the case led to a conviction, or that the case was even prosecuted. ○ It’s ​important to note​ that the policies of the local District Attorney’s Office can affect an agency’s “clearances by arrest”. For example, some DAs will only issue arrest warrants if they feel the police have compiled enough evidence that a conviction is near-certain; whereas, on the other end of the spectrum, police in some jurisdictions can make arrests on homicides based on “probable cause”. For this reason, one should be cautious when comparing clearance rates, or 1 In some cases, mergers of law-enforcement agencies complicate the one-to-one correspondence of agency-to-ORI. For instance, in 2003, the Louisville Division of Police (ORI: KY05602) and the Jefferson County Police Department (KY05601) merged to form the Louisville Metro Police Department (KY05680). In other mergers, the merged agency will retain one of its predecessors’ ORIs. We identified as many police mergers as we could and ran alternative analyses in which they all historically shared the same ORI. The effect was negligible; to avoid unnecessary complexity, we have left all ORIs as they are in the FBI’s raw data. 5 even arrest rates, between individual agencies, or even between different time periods for the same agency, as DAs have significant influence on this variable. ● Clearance by exceptional means.​ ​Per the FBI​ (emphasis added): “In certain situations, elements beyond law enforcement’s control prevent the agency from arresting and formally charging the offender​. When this occurs, the agency can clear the offense exceptionally. Law enforcement agencies ​must meet the following four conditions​ in order to clear an offense by exceptional means. 
The agency must have: Identified the offender; Gathered enough evidence to support an arrest, make a charge, and turn over the offender to the court for prosecution; Identified the offender’s exact location so that the suspect could be taken into custody immediately; Encountered a circumstance outside the control of law enforcement that prohibits the agency from arresting, charging, and prosecuting the offender.”
○ Some commonly cited circumstances leading to an exceptional clearance include: death of the offender (for example, in a murder-suicide, double murder, police killing, or retaliatory killing); the offender is located in another country and cannot be extradited; or, in a non-fatal case, the victim refuses prosecution.
● Clearance. If not specified otherwise, when we say “clearance,” we’re counting all incidents reported as cleared by arrest or by exceptional means.
● Clearance rate. An agency’s “clearance rate” is the total number of incidents cleared, divided by the total number of incidents that occurred. In the FBI’s Return A data, an agency reports the total number of incidents it has cleared in a given year, regardless of the year the incident occurred. For example, if an agency had 10 homicides within its jurisdiction in 2016, cleared five of those homicides, and cleared an additional three homicides from prior years, its “clearance rate” for 2016 would be 80%. In our analysis of the FBI’s Return A data, we follow the same convention. However, in our analysis of the FBI’s incident-level NIBRS data, an agency’s “arrest rate” is the share of cases from that year that were marked as cleared by arrest.
● Gun assault / firearm assault / aggravated assault involving a firearm. We use these phrases interchangeably with the FBI’s definition of “Aggravated Assault—Firearm (4a),” which includes “all assaults in which a firearm of any type is used or is threatened to be used.” (As the definition suggests, not all gun assaults involve an injury. In the NIBRS analysis below, we examine injury rates for gun assaults.)
○ Non-gun aggravated assault. By this, we mean all aggravated assaults excluding those committed with a firearm. (Note: This does not include what the FBI categorizes as “simple assaults,” which are assaults that did not involve a weapon and resulted only in minor injuries requiring no more than the usual first-aid treatment.)
● Larger-city agency. In each of the datasets we analyze below, the FBI associates each agency with a population “group.” There are separate population groups for municipal police departments, sheriff’s departments, state police, and so forth.
○ In the analyses below, we use the phrase “larger-city agency” to refer to police departments assigned any of the following four population group codes:
■ 1A = Cities with population of 1,000,000+
■ 1B = Cities w/ pop. 500,000–999,999
■ 1C = Cities w/ pop. 250,000–499,999
■ 2 = Cities w/ pop. 100,000–249,999
○ Unless otherwise specified, the “larger-city agency” designation is based on the agency’s FBI population group for 2016 (or, for a small number of agencies not present in the 2016 data, the next most recent year).
● Urban police department. To avoid jargon in the published article, we occasionally use “urban police department” to refer to the larger-city agencies above.

Overview of Data Sources

Our analysis of national clearance rates draws on three main data sources, all from the U.S. Federal Bureau of Investigation’s Uniform Crime Reporting Program.
This section provides a brief overview of these data sources. More details can be found later in this document. Return A: “Offenses Known and Clearances by Arrest” The FBI’s Return A data (“Monthly Return of Offenses Known to the Police”) is the most commonly cited federal source on crime statistics. It is the most nationally-representative, but least detailed, of the three federal datasets analyzed here. Law enforcement agencies voluntarily send monthly offense and clearance totals to the FBI, either directly or through a state agency. (Although the program is voluntary, the vast majority of larger-city agencies do report Return A data, and most have provided it consistently since the 1960s.) The offenses are classified into 27 categories, including murder and aggravated assaults involving a firearm. The FBI then checks the numbers for consistency and validity — sometimes requesting that agencies fix or explain apparent discrepancies — and then compiles the data into annual files. Although the data is titled “offenses known and clearances by arrest,” the clearance numbers also include clearances by “exceptional means.” The data goes back to 1960, but very few agencies reported clearances in the early 1960s. For our analysis, we used Return A data ​from 1965 to 2017​, the most recent year available. NIBRS: National Incident-Based Reporting System The National Incident-Based Reporting System (NIBRS) is much more detailed than the Return A data, but also less nationally representative and available for many fewer years. Unlike the Return A data, which provides only aggregate offense and clearance counts, NIBRS captures data in individual incidents, offenses, victims, and arrests. The data goes back to 1991, but relatively few large departments used it at the time — or even as recently as the early 2000s. Over the years, more large agencies have joined, but the data still isn’t nationally representative. ​As of 2016​, only 6,849 of the roughly 18,500 law enforcement 8 agencies (37%) participated in NIBRS; and only 94 of the 307 eligible larger-city agencies (31%) did so. Our analysis examines data from ​1991 to 2016​, all years for which full data is currently available, but generally focuses on a more recent timeframe. (In late December 2018, we received partial NIBRS data for 2017 from the FBI; the deadline for agencies to submit their data has not passed, so many agencies are not accounted for in this file.) SHR: Supplementary Homicide Report The Supplementary Homicide Report (SHR) data provides more detail about individual murders than the Return A data, and is submitted by a greater proportion of agencies than the NIBRS data. Crucially, it includes demographic information about victims, as well as the weapon(s) used in each homicide. Although SHR data is available going back to 1962, data from 1962 to 1975 does not contain offender demographics, which are necessary for our analysis. Because the raw data from 1976 to 1979 uses a somewhat different format than later years, we focus our analysis on data from 1980 to 2017​, the most recent year available. Relative coverage of the three datasets The chart below shows the proportion of all larger-city agencies that reported at least 1 murder to the FBI via the three data programs (as a percentage of larger-city agencies that have ​ever reported). 9 Figure 1. Note: The gap in reporting percentages for SHR vs. 
Return A is largely due to Florida (which no longer submits SHR data) and Illinois (where only the Chicago and Rockford police departments have reported for the majority of recent years). Standardized Datasets The raw Return A, NIBRS, and SHR data are formatted entirely differently from one another, use different terminology, different variables, and different data structures. To facilitate the combination and comparison of these three datasets, we created “standardized” versions of them all. The bulk of that work focused on creating a common set of variables for the NIBRS and SHR datasets, and converting those datasets from their original structures into a one-row-per-victim structure. (The Return A data is much less detailed, and required less work to standardize.) For instance, the standardized NIBRS and SHR datasets both contain a “firearm_ind” variable that indicate whether a firearm was used in the offense. 10 Although the standardized datasets are to some degree tailored to our analyses, we believe that they could also be broadly useful to other researchers, so we are ​sharing the following on GitHub​: ● The standardized datasets ● Data dictionaries describing each column of each standardized dataset ● The code we used to convert the raw Return A, NIBRS, and SHR data into these standardized formats 11 Analysis of “Return A” Data Our analysis of the Return A data examines the following: ● ● Clearance rates for murder, gun assault, and non-gun assault, over time. The distribution of per-agency clearance rates by decade. Data Caveats Note: Additional details regarding the limitations of Return A, NIBRS, and SHR data can be found in “​Bridging Gaps in Police Crime Data,​ ” Maltz (1999), Bureau of Justice Statistics, and “​Missing Data and Imputation in the Uniform Crime Reports and the Effects on National Estimates,​ ” Lynch and Jarvis (2008, Journal of Contemporary Criminal Justice). Missing / zero-ed data The Return A data does not distinguish between missing values and zero-counts. That is: If an agency’s Return A data indicates “0” clearances for a given offense for a given month or year, that could mean either (a) that the agency affirmatively reported zero clearances, or (b) that the agency skipped reporting clearances for that month or year. For high-crime agencies, the answer is obvious. For lower-crime agencies, the answer is not as clear. To account for this problem, the analyses below (unless otherwise noted) include an agency’s annual Return A data for a particular type of offense ​only if the agency reported at least 1 clearance for that offense. Inconsistently reporting agencies Not all agencies, unfortunately, have reported offenses and/or clearances consistently over the years. For instance: ● The Chicago Police Department’s data has not included any clearances since the mid-1990s. ● The New York Police Department’s Return A data includes zero gun assaults and zero cleared murders for all years between 2003 and 2012. 12 When examining long-term trends, the appearance/disappearance of an agency’s data — especially for a large agency — could, in theory, have a misleading effect. So, for most of the analyses below, we included only ​consistently-reporting agencies​ — agencies that reported at least one clearance for 34 of the 38 years since 1980. For analyses with longer timeframes, we used a threshold of 48 of the 53 years since 1965​.​. 
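To make that filter concrete, here is a minimal sketch of how such a consistency threshold might be applied, written in Python/pandas with hypothetical column names; it is an illustration, not the published analysis code.

```python
import pandas as pd

def consistent_oris(annual: pd.DataFrame, start_year: int, min_years: int) -> set:
    """Return the ORIs that reported at least one clearance in at least
    `min_years` distinct years from `start_year` onward.

    `annual` is assumed to have one row per agency-year, with columns
    "ori", "year", and "clearances" (hypothetical names)."""
    recent = annual[annual["year"] >= start_year]
    reported = recent[recent["clearances"] > 0]
    years_reported = reported.groupby("ori")["year"].nunique()
    return set(years_reported[years_reported >= min_years].index)

# The thresholds described above:
# consistent_1980 = consistent_oris(murder_annual, 1980, 34)  # 34 of 38 years since 1980
# consistent_1965 = consistent_oris(murder_annual, 1965, 48)  # 48 of 53 years since 1965
```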
In general, we found that, while excluding inconsistently-reporting agencies changed some specific numbers, doing so did not change the overall trends. Data Preparation Data restructuring For this analysis, we began with raw Return A data files from 1960 to 2017, which we obtained directly from FBI. (Note: These are the 7,385-character-wide fixed-width files; their layout differs from the files available from the National Archive of Criminal Justice Data, although they contain the same core data.) We converted the raw data to an analysis-friendlier format, by doing the following: ● Converting the fixed-width “wide” layout (one row per agency-year combination) to a “tidy” CSV files (one row per agency-year-crime-month combination). ● Converting the FBI’s negative-number notation (e.g., -2 is represented as “00000J”) into actual negative numbers. ORI fixes We also fixed a few agency identifiers (ORIs) which appeared to have been mislabeled since they had more than one entry in a given year: ● ● ● VA021SP appears incorrectly labeled as “VA02101” in 1967 and 1970, based on the “Agency Name” field. VA02901 appears incorrectly labeled as “SC02901” in 1972, based on the “Agency Name” and “State Code” fields. MD00210 appears to have a duplicate entry in 1996. We discarded the second entry. (For gun assaults and murders, the reported values are all 0, however, for both entries.) 13 Removal of mass murders We adjusted murder offense and clearance counts to remove the following incidents from the data: ● ● ● 1995 Oklahoma City bombing 2016 Orlando Pulse shooting 2017 Las Vegas shooting Note: Deaths from the September 11, 2001 terrorist attacks are not reflected in the Return A data. Population group assignment The total population that an agency serves may have changed significantly over time, so we created an additional field categorizing each agency based on its ​current​ FBI-defined population “group” — the group reported in their 2016, or next-most-recent, Return A data. We labeled agencies as​ “larger city agencies”​ (and, elsewhere, “urban police departments”) if they were assigned any of the four following group codes: ● ● ● ● 1A = Cities with population of 1,000,000+ 1B = Cities w/ pop. 500,000–999,999 1C = Cities w/ pop. 250,000–499,999 2 = Cities w/ pop. 100,000–249,000 Non-gun assaults For each agency-year, we also calculated the total number of “non-gun assault” offenses and clearances, by summing the aggravated assault counts from the following three categories: ● ● ● “with a knife” “other weapon” “with hands, feet, etc.” 14 Methodology and Findings Reporting consistency The chart below shows the distribution of larger-city agencies by the number of years they reported at least one clearance since 1980, disaggregated by crime type. The red line represents ~90% of the number of available years — i.e., 34 of 38 years since 1980. Murder Gun assault Figure 2. 15 Of the 312 larger-city agencies in the data, 188 (60%) met that threshold for reporting murders, and 234 (75%) for gun assaults. All but 17 of the agencies for murder, and all but 28 for aggravated assault, were also “consistent” during the longer timeframe — 48 of the 53 years since 1965. Consistently-reporting agencies, however, also tend to be larger ones, and so account for a disproportionate number of reported crimes — roughly 80% of both murders and gun assaults among larger-city agencies in 2017: Figure 3. 
16 Notably, the upward trend for both murder and assault, as well as the bump in assault percentages, are largely due to reporting inconsistencies at three agencies: NYPD, Chicago PD, and Baltimore PD — all three of which stopped reporting assault counts for many years. (Other agencies have done the same, but these three appear to have the largest effect.) Without them, the pattern looks like this: Figure 4. Overall clearance rates over time For our first analysis, we calculated the overall clearance rates for murder, all aggravated assault, gun assault, and non-gun assault. We find that overall clearance rates for murders and gun assaults have declined substantially over the past five decades. To test the validity of our findings, we ran the same calculations under six different scenarios: ● Including all agencies, regardless of type, size, or reporting consistency 17 ● Only larger-city agencies (i.e., municipal police departments with jurisdictions containing 100,000+ people) ● Only larger-city agencies that have consistently reported since 1965 (i.e., 1+ clearance for 48 of 53 years) ● Only larger-city agencies that have consistently reported since 1980 (i.e., 1+ clearance for 34 of 38 years) The results: All agencies Only larger-city agencies 18 Only larger-city agencies consistently reporting since 1965 Only larger-city agencies consistently reporting since 1980 Figure 5. 19 To compare gun and non-gun assaults since 1980, we focused on larger-city agencies, limiting both the gun and non-gun trend to agencies that have reported consistently since 1980 for ​gun assault​: Figure 6. 20 Decade trends, with interquartile ranges The charts below show decade-by-decade clearance rates for murder, gun assault, and non-gun assault. The solid lines indicate the aggregate clearance rates (for all consistently-reporting, larger-city agencies), while the lighter areas around them indicates the interquartile range (i.e., the range between the 25th percentile and the 75th percentile). The aggregate rates are much closer to the 25th percentile, because the largest/highest-crime agencies also typically have some of the lowest clearance rates. ​Note: Agencies below restricted to those consistently reporting since 1965.​ Figure 7. 21 Distributions of clearance rates, and median clearance rate, by decade Next, to provide more detail regarding the declining clearance rates for murder and gun assaults, we plotted the distribution of clearance rates for these offenses, on an agency-by-agency basis, for each decade. We restricted the universe of agencies to ​larger-city agencies that have consistently reported since 1965. The chart on the following page shows the results, with the median clearance rates marked by the red lines. It shows that median clearance rates have also declined, though slightly less than the overall clearance rates have. It also appears that, for murder, agencies’ clearance rates have become substantially more varied: The standard deviation in murder clearance rates has nearly doubled, from 12% in the 1980s to 23% in the 2010s. 
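As an illustration of how these per-agency, per-decade rates can be computed, here is a minimal sketch in Python/pandas with hypothetical column names; it is not the published analysis code.

```python
import pandas as pd

def decade_rate_distribution(annual: pd.DataFrame) -> pd.DataFrame:
    """Summarize the distribution of per-agency clearance rates by decade.

    `annual` is assumed to have one row per agency-year, with columns
    "ori", "year", "offenses", and "clearances" (hypothetical names)."""
    annual = annual.assign(decade=(annual["year"] // 10) * 10)
    per_agency = (annual.groupby(["decade", "ori"], as_index=False)
                        [["offenses", "clearances"]].sum())
    per_agency["clearance_rate"] = per_agency["clearances"] / per_agency["offenses"]
    # Median and standard deviation of the per-agency rates within each decade
    return per_agency.groupby("decade")["clearance_rate"].agg(["median", "std"])
```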
Distribution of Agency Clearance Rates by Decade
Only Larger-City Agencies Consistently Reporting Since 1965
(Chart: for each decade, the distribution of per-agency clearance rates — x-axis: Clearance Rate — shown separately for gun assault and murder.)
Figure 8. 1960s data includes only 1965-1969; 2010s data includes only 2010-2017.

Analysis of SHR Data

The FBI’s Supplementary Homicide Report contains incident-level details on homicides from 1962 through 2017, but does not include clearance status. However, the data from 1962 to 1975 does not contain offender demographics, which are necessary for our analysis. In addition, the raw data from 1976 to 1979 uses a slightly different format than later years. For those reasons, we focus our analysis on data from 1980 to 2017.

Our analysis of the SHR data examines the following:

● Offender-age-reported rates, over time, for the following groupings:
○ Firearm vs. non-firearm incidents
○ Firearm vs. non-firearm incidents, for black/Hispanic victims vs. white victims
○ Firearm vs. non-firearm incidents, for male vs. female victims
● Offender-age-reported rates by the proportion of victims, by agency, identified as black/Hispanic.
● Offender-age-reported rates by listed circumstance, and by victim race/ethnicity

Data Caveats

Inconsistently reporting agencies

As with the Return A data, many agencies have not consistently reported incident-level homicide information to SHR. Unlike in the Return A data, however, there are long periods of time where entire states are not included in the data. For instance, between 1980 and 2017, these states’ law enforcement agencies reported no murders to SHR for at least one full year:

● Florida — 25 years
● Washington, D.C. — 10 years
● Kansas — 6 years
● Montana — 4 years
● Maine — 2 years
● Wisconsin, New Hampshire, Kentucky, Iowa, and Alabama — 1 year each

Of the 293 larger-city agencies in the data, 69% (202 agencies) have reported at least 1 murder in at least 34 of the 38 years. We label this group as “consistently reporting.”

Figure 9.

Lack of explicit clearance/arrest information

Unfortunately, the SHR data contains no field indicating whether an offender was arrested and/or whether the offense was considered “cleared.” Instead, we use offender age as a proxy for whether an offender was identified. This approach has been used by other researchers (e.g., Snyder, National Juvenile Violent Crime Trends, 1980-1994, as cited in Maltz, 1999.)
Other analyses, such as ​those from the Murder Accountability Project​, have used the offender’s listed sex as a proxy for identification — if the sex is listed as “U” (“Unknown”), the offender is considered to be unidentified. We chose age instead, for two reasons: ● The SHR data suggests that age is the more-limiting factor. For the years examined: ○ Both​ age and sex are listed for 69% of offenders ○ Sex is listed but age is not​ for 5% of offenders ○ Age is listed but sex is not​ for 0.08% of offenders ○ Neither​ age nor sex are listed for 26% of offenders 25 ● Aside from the data, it seems more likely that police, without knowing the identity of an offender, would be able to determine the offender’s sex than the age. There are two main limitations to this approach: ● Identifying an offender is not the same as arresting the offender, or clearing the offense. ● Some agencies appear to report offender demographics inconsistently or incompletely. See the next subsection for a discussion of this issue. Missing offender information In some years, certain agencies — such as the the Washington, D.C., Metropolitan Police in 1990 and 1994 — report implausibly few offenders’ ages (i.e., in a far smaller percentage of murders than the percentage of murders they reported as cleared in the Return A data those years). This issue does not appear to be particularly widespread, but we discuss it at greater length in the sections below. Lack of reporting of victims’ Hispanic ethnicity In the SHR data, race and ethnicity (Hispanic or Not Hispanic) are two separate fields. The ethnicity field is often blank, stemming from two main issues: The first and primary reason is that many agencies simply do not submit ethnicity data to SHR. A secondary reason is that when a police department uses NIBRS to submit incident data, it no longer submits SHR data separately. Instead, the FBI automatically converts the police department's NIBRS data into SHR data. That conversion process, however, does not carry over any ethnicity data. More than 70% of the larger-city agencies’ SHR data does not include ethnicity information for any victim for at least 5 of the 38 years, and some of the largest police departments in the country have not included ethnicity data for at least 10 of the last 20 years, including: ● ● ● ● Chicago PD Baltimore PD New Orleans PD Memphis PD 26 This chart displays the percentage of larger-city agencies for which there is no ethnicity information in a given year. (The data for 1987 appears to be missing victims’ ethnicity information entirely.) Figure 10. 27 The chart below displays the distribution of larger-city agencies by the number of years, out of 38, for which ethnicity data ​is​ available. (No agency has ethnicity data for all 38 years, because there is no ethnicity information available for any agency in 1987.) Figure 11. In the SHR data where both race and ethnicity are included, the race of Hispanic/Latino victims is almost always white (98%), with the second-most frequently occurring race as black (1.8%). So, in instances where an agency’s data does not contain victim ethnicity information for an entire year, that year’s data most likely includes many white victims who would otherwise be labeled as Hispanic. For this reason, the lack of Hispanic ethnicity reporting could skew the disparity in the offender-age-reported rates between white and black victims. 
For instance, in 2016, when Chicago resumed reporting victim ethnicity data, the offender age was reported for 37% of non-Hispanic white victims, 27% of Hispanic white victims, and 22% of black victims. If no ethnicity data were available, as was the case in Chicago in previous years, the rate for white victims would have appeared to have been 30%. By not reporting Hispanic ethnicity, the black-white disparity would have shrunk from 15 percentage points to 8 percentage points. 28 Agencies that have consistently reported victim ethnicity data, however, also appear to be different in several important ways than agencies that do not. For instance, of the agencies for which victim ethnicity data is consistently available (i.e., at least 34 of 38 years): ● ● Are almost all in California (51 agencies), Texas (21), or Arizona (6). The only other three agencies are in North Carolina, Nebraska, and Oregon. Have a much smaller proportion of victims whose race is listed as black: 39% at these agencies vs. 73% at all others since 2010. For our analysis, we combined the race and ethnicity fields to distinguish between Hispanic white and non-Hispanic white victims whenever possible. Data Preparation For this analysis, we began with raw, fixed-width SHR data files from 1980 to 2016, which we obtained directly from FBI. We loaded each one, and then used the National Archive of Criminal Justice Data’s ​codebook for the 2015 data​ to extract each field and, where applicable, to translate data-codes into their English equivalents. Unique incident IDs The SHR data does not contain any single field that uniquely identifies a homicide incident. Instead, it labels incidents by the agency, year of offense, month of offense, and sequential order of the incident for that month (e.g., “2” for the second homicide of the month). For each incident, we created an unique identifier by combining the following fields: "ORI CODE", "YEAR", "MONTH OF OFFENSE", and "INCIDENT NUMBER" (the aforementioned sequential number). After doing so, 632,884 rows had a truly-unique identifier, while 36 “unique” identifiers appeared in either two or three rows. (None appeared in more than three rows.) We deduplicated those rows by keeping only the row with the most recent date in the “LAST UPDATE” field, leaving us with 632,920 rows — with one row per incident. Exclusions — Types of homicides The SHR incidents are categorized, via the “TYPE OF OFFENSE: HOMICIDE” field, into two main types: 29 ● ● “Murder and non-negligent manslaughter” “Manslaughter by negligence” In the analyses below, we exclude all “Manslaughter by negligence” incidents. For incidents classified as “Murder and non-negligent manslaughter,” we also excluded those in which the “CIRCUMSTANCE” field is any of the following, for ​any​ of the offenders: ● ● ● ● “Felon killed by police” “Felon killed by private citizen” “Institutional killings” “Abortion” Given those constraints, the analyses below examine a total of 598,956 incidents. Removal of mass murders We adjusted murder offense and clearance counts to remove the following incidents from the data: ● ● 1995 Oklahoma City bombing 2017 Las Vegas shooting Note: Deaths from the September 11, 2001 terrorist attacks are not reflected in the SHR data. The SHR data contains no recent data from law enforcement agencies in Florida, so the 2016 Pulse shooting in Orlando is not reflected in it. 
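To make the SHR data-preparation steps described above concrete — the unique incident IDs, the deduplication by last-update date, and the homicide-type exclusions — here is a minimal sketch in Python/pandas. The column names and the list-valued "circumstances" field are simplified assumptions, not the raw SHR layout or the published code.

```python
import pandas as pd

EXCLUDED_CIRCUMSTANCES = {
    "Felon killed by police",
    "Felon killed by private citizen",
    "Institutional killings",
    "Abortion",
}

def prepare_shr_incidents(shr: pd.DataFrame) -> pd.DataFrame:
    """Sketch of the SHR preparation steps described above.

    `shr` is assumed to have one (possibly duplicated) row per incident, with
    simplified columns: "ori", "year", "month", "incident_number",
    "last_update", "offense_type", and "circumstances" (a list of the
    circumstance values reported for the incident's offenders)."""
    # Build a unique incident identifier from the four fields described above
    shr = shr.assign(
        incident_id=shr["ori"].astype(str) + "-" + shr["year"].astype(str) + "-"
        + shr["month"].astype(str) + "-" + shr["incident_number"].astype(str)
    )
    # Deduplicate, keeping the row with the most recent "LAST UPDATE" date
    shr = shr.sort_values("last_update").drop_duplicates("incident_id", keep="last")
    # Keep only murder and non-negligent manslaughter
    shr = shr[shr["offense_type"] == "Murder and non-negligent manslaughter"]
    # Drop incidents in which any offender's circumstance is an excluded type
    keep = ~shr["circumstances"].apply(
        lambda circs: any(c in EXCLUDED_CIRCUMSTANCES for c in circs)
    )
    return shr[keep]
```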
Identifying larger-city agencies Similar to the other datasets, the SHR data includes a “population group” field, which uses the same groupings as the Return A data above. And similar to the other analyses, we identified “larger city” agencies as municipal police departments whose jurisdictions had a population of 100,000 people or more in 2016, or the year most recently reported to SHR before that. Firearm involvement 30 We classified offenses by whether or not they involved a firearm. For each offender, the SHR data indicates the type of weapon used. If the data indicated that any of the offenders used one of the following weapons, we labeled the homicide as involving a firearm: ● ● ● ● ● 11 — ”Firearm, type not stated” 12 — ”Handgun - pistol, revolver, etc” 13 — ”Rifle” 14 — ”Shotgun” 15 — ”Other gun” Condensing victim race + ethnicity For each victim of each incident, the SHR data includes both a “RACE” and an “ETHNIC ORIGIN” field. The options for race are the following: ● ● ● ● ● ● Asian Black or African American American Indian or Alaska Native Native Hawaiian or Other Pacific Islander White Unknown The options for ethnic origin are the following: ● ● ● Hispanic or Latino Not Hispanic or Latino Unknown or not reported We condensed these two variables into a single race+ethnicity variable, with the following possible values: ● ● ● ● ● ● ● H — Hispanic or Latino, any race A — Asian, not Hispanic or Latino, or ethnicity unknown B — Black or African American, not Hispanic or Latino, or ethnicity unknown I — American Indian or Alaska Native, not Hispanic or Latino, or ethnicity unknown P — Native Hawaiian or Other Pacific Islander, not Hispanic or Latino, or ethnicity unknown W — White, not Hispanic or Latino, or ethnicity unknown U — Race unknown, ethnicity either unknown or not Hispanic or Latino 31 For many analyses, we further condensed this variable into one of three values: ● ● ● Black/Hispanic White Other (which includes victims of other races, and victims for which race/ethnicity was listed as unknown) Offender age reported As noted above, we are using offender-age-reported as a proxy for whether the police were able to identify the offender. For each offender listed in the data, there is an “AGE” column. A value of “00” indicates that the age is unknown, according to the SHR codebook. For each incident, we determine whether ​any​ of the offenders’ ages are listed. If they are, we label the incident as “offender reported.” 32 Methodology and Findings Offender-age-reported outliers The chart below shows the number of annual reports — one for each agency, for each year — which X% of offenders’ ages were reported — looking only at large-city agencies with 10+ murders in a given year. There are some agency-year outliers, with especially low or high rates of offender ages reported. Figure 12. These outliers, however, do not account for a large proportion of victims. Removing agency-years where offender ages were reported for less than 10% of murders, or more than 98% of murders reduces the number of murders reported by consistently reporting larger-city agencies from 363,422 to 351,919 — a reduction of only 3%. Because, in many cases, it is difficult to discern whether an outlier represents a reporting error, we did not remove these agency-years from our analysis. A bit more detail: 33 ● Four of the agency-years below 10% correspond to the Washington, D.C. police department (in 1988, 1989, 1990, and 1994). 
The remaining sub-10% agency-years correspond to annual reports with fewer than 50 incidents reported.
● Nine of the agency-years below 20% correspond to the New Orleans Police Department.
● Among agency-years with 100% offender-age-reported rates, all but two correspond to agencies that reported fewer than 50 incidents. (Those two are from the Tampa, Fla. police department in 1992 and 1994 — years in which its Return A data suggests it cleared approximately 61% and 66% of murders, respectively.)
● Other high rates in the SHR data do align with the reported statistics in the Return A data. In 2008, for instance, the Milwaukee Police Department’s Return A data indicated a clearance rate of 93%. In the SHR data for that year, an offender age was listed for 97% of murders.

Offender-age-reported rates over time

Overall, for consistently reporting larger-city agencies (i.e., those reporting at least 1 murder in 34 of 38 years since 1980):

Figure 13.

Offender-age-reported rates by murder weapon

Disaggregating the rate by whether a firearm was used in the murder, it’s clear that there have been very different trends for firearm and non-firearm murders — with firearm murders accounting for the entirety of the decline since the 1980s:

Figure 14.

Offender-age-reported rates by weapon and victim race/ethnicity

Since the 1990s, another divergence has occurred: police have become substantially less likely to identify suspects in firearm murders if the victim was black or Hispanic, versus if the victim was white.

Figure 15.

To be sure, a disparity in clearance rates doesn’t necessarily mean that police are biased against black and Hispanic victims. Police typically say they investigate all cases with equal vigor, and often point to “external” factors, such as differences in the difficulty level of certain types of cases, victims who won’t cooperate, and witnesses who won’t share information.

We tested the trends above under other scenarios, with similar results:
● Excluding domestic/family violence incidents*
● Including all agencies, regardless of reporting consistency
● Including all agencies, regardless of size, type, or consistency

* The SHR data includes a victim-offender relationship field. Many of the possible values for that field — e.g., “husband,” “girlfriend,” “stepdaughter” — indicate intimate partner or family violence, although these relationships are typically only recorded when other details about the offender (such as age) are also reported.

Figure 16.

The disparity is likely greater than it appears, since, as noted above, many agencies don’t report Hispanic ethnicity. For agencies that do report Hispanic ethnicity, the rates for Hispanic victims are closer to those for black victims than for white victims.

Figure 17.

Offender-age-reported rates by weapon, victim race/ethnicity, and victim sex

The trends for male victims are very similar to the overall trends. The trends for women, however, are quite different, and the racial/ethnic disparities are smaller:

Male victims
Female victims
Figure 18.

One possible factor influencing the differences: Women are more likely than men to be victims of domestic violence.

Offender-age-reported rates by victim demographics and circumstances

To examine whether the disparities were driven by particular types of homicides, or perhaps different underlying distributions of homicide types, we examined the SHR’s “circumstance” field, which the data associates with each offender.
For each victim, we identified the first-listed offender’s circumstance as the “main circumstance.” The circumstance field should be interpreted with caution, since a large proportion of offenses have their circumstances listed simply as “other” or “undetermined” (many of which may have, in fact, been motivated by one of the more specific circumstances). Even so, it is notable that a disparity (of varying size) persists across nearly all major “circumstance” types.

Figure 19. These charts include all victims reported by consistently-reporting, larger-city agencies, regardless of whether their ethnic origin information is missing.

The “Circumstances undetermined” category both (a) harbors one of the largest disparities, and (b) is the largest and fastest-growing “circumstance” selection:

Figure 20.

Distribution of rates by agency and decade

We also examined agency-level offender-age-reported rates for firearm murders, to see if the disparities were widespread or confined to a few agencies, and how that changed over time.

In the 1980s, the median offender-age-reported rate for white and black/Hispanic victims was roughly the same — about 75%. In the 2010s (2010–17), however, those two median rates are quite different — 70% for white victims, versus 55% for black and Hispanic victims.

The median disparity between offender-age-reported rates for black/Hispanic and white victims has grown over that time period, too. In the 1980s, the median agency had virtually no disparity. In the present decade, however, the median agency has a 13 percentage-point disparity.

The median disparities are smaller than the national aggregate disparity in part because of an uneven racial distribution of victims; agencies with lower offender-age-reported rates tend to have a higher proportion of black/Hispanic victims:

Figure 21. Includes only agencies with 50+ black/Hispanic and 50+ white victims.

The charts below show the distribution of agencies’ offender-age-reported rates for black/Hispanic victims (first column) and white victims (second column), and the percentage-point spread between those rates (third column), by decade, from the 1980s through the present decade (including years 2010 through 2017). Each decade is a row, and the bars represent the share of agencies that fall in each percentage-point range. The median is represented by the red line. Again, we ran the analysis two different ways — including all consistently reporting, larger-city agencies with at least 10 black/Hispanic and 10 white victims each decade.

Firearm murders:

Figure 22. Note: Includes 105 agencies.

We also repeated the first analysis for non-firearm murders and found that the median disparity has been virtually zero since the 1990s:

Figure 23. Note: Includes 81 agencies.

Analysis of NIBRS Data

As noted above, firearm-involved aggravated assaults do not necessarily involve injury to the victim. The category, according to the FBI, includes “all assaults in which a firearm of any type is used or is threatened to be used.” (Emphasis added.)

In the Return A data, we observed a long-term decline in clearance rates for gun assaults. But what if that decline were driven largely by non-injury assaults, while police departments continued to solve more serious gun assaults at higher rates? To test this, we turned to the FBI’s National Incident-Based Reporting System (NIBRS) dataset.
Although the NIBRS program has the lowest agency participation among the three datasets examined here and only goes back to 1991, it includes detailed incident-level information — including whether the victim of a gun assault (or any other type of incident) was injured. Our analysis of the NIBRS data examines the following: ● ● ● ● The proportion of gun assaults that result in no/minor/major injuries. Arrest rates for gun assaults, by whether they involved no/minor/major injuries.2 Disparities in gun arrest rates, by victim race/ethnicity Days until arrest, by offense type and weapon Data Caveats Lack of coverage / representation The NIBRS data go back to 1991, but relatively few large departments used it at the time. There are very few large agencies whose NIBRS data goes back even to the 2000s, which hinders any analysis of national, long-term trends. Over the years, more large agencies have joined, but the participating agencies still aren’t nationally representative. ​As of 2016​, only 6,849 of the roughly 18,500 law enforcement agencies (37%) participated in NIBRS; and only 94 of 307 larger-city agencies (31%) did so. (Among the largest agencies using NIBRS: Detroit, Milwaukee, Memphis, Kansas City, Cincinnati, Louisville, Denver, Fort Worth, Seattle.) 2 Unlike the Return A data, NIBRS makes it possible to distinguish between arrests and exceptional clearance. The NIBRS analysis focuses on arrests, for two main reasons: (1) for the sake of comparison with our local-agency analysis, which also focuses on arrests, and (2) because some agencies have atypicaly high exceptional clearance rates. 45 Incompleteness of arrest data Each arrest in NIBRS includes an arrest date. A small number of larger-city agencies have no arrests for major offenses (homicide, sexual assault, robbery, aggravated assault) dated 100 days after a crime’s occurrence — suggesting that the agency stops updating the arrest information after a shorter-than-typical period. For instance, the Cambridge (Mass.) Police Department reported more than 4,000 major incidents since 2000, but the longest time elapsed between an incident and an arrest was 50 days. The analyses below exclude four larger-city agencies for which this appears to be a problem. Data Preparation For this analysis, we used annual, fixed-width NIBRS data files provided by the FBI. For each year of data, we took the following steps: 1. Loaded the tables containing the following information: agency metadata characteristics (“Batch Header”), general incident information (“Administrative”), offense details (“Offense”), victim details (“Victim”), and arrestee details (“Arrestee”). 2. Selected all incidents for which the most serious UCR offense listed was as one of the following: ○ 09A — "Murder/Nonnegligent Manslaughter" ○ 09B — "Negligent Manslaughter" ○ 09C — "Justifiable Homicide" ○ 11A — "Rape" ○ 11B — "Sodomy" ○ 11C — "Sexual Assault With An Object" ○ 11D — "Fondling (Indecent Liberties/Child Molesting)" ○ 120 — "Robbery" ○ 13A — "Aggravated Assault" 3. Selected all victims associated with that top offense for those incidents. (In the NIBRS data, victims can be associated with a subset of the incident’s full list of offenses.) The analyses below use only victims for whom the top offense was either 09A or 13A; victims of other offenses were retained in the data, in case of usefulness for future analyses. 46 Firearm involvement Next, we classified offenses by whether or not they involved a firearm. 
Each NIBRS offense can include up to three weapon types (“TYPE WEAPON/FORCE INVOLVED”). If any of the weapon codes were one of the following, we labeled the offense as involving a firearm:

● 11 — “Firearm (type not stated)”
● 11A — “Firearm (type not stated), automatic”
● 12 — “Handgun”
● 12A — “Handgun, automatic”
● 13 — “Rifle”
● 13A — “Rifle, automatic”
● 14 — “Shotgun”
● 14A — “Shotgun, automatic”
● 15 — “Other Firearm”
● 15A — “Other Firearm, automatic”

Some incidents may involve the use of a firearm for only a subset of the incident’s offenses. When the analysis below refers to firearm/gun assault incidents, it refers only to incidents where (a) aggravated assault was the most serious offense, according to UCR’s offense hierarchy, and (b) a firearm was used for that aggravated assault.

Injury types

Next, we grouped the following NIBRS injury types into three categories (“none”, “minor”, and “major”):

● Major:
○ “Unconsciousness”
○ “Apparent Broken Bones”
○ “Severe Laceration”
○ “Loss of Teeth”
○ “Possible Internal Injury”
○ “Other Major Injury”
● Minor:
○ “Apparent Minor Injury”
● None:
○ “None”
○ NULL value (field not populated)

In the NIBRS data, victims can be labeled with up to five injury types. We classified each victim by the most serious category of injury listed, and classified each incident by the most serious category of injury among its victims. Based on those classifications, we identified all incidents in which a victim had suffered a major/minor injury from an aggravated assault involving a firearm (“gun assaults”).

Identifying “larger city” agencies

The NIBRS data includes a “population group” field, which uses the same groupings as the Return A data above. Similar to the Return A analysis, we identified “larger city” agencies as those whose jurisdictions were cities of 100,000 people or more, as most recently reported to NIBRS.

Labeling incidents involving law enforcement officers

The “TYPE OF VICTIM” field in the victim data distinguishes between “individual” victims and “law enforcement officer” victims. We used that field to classify incidents as to whether they involved a law enforcement officer. (Law enforcement officers account for just 0.6% of firearm assault victims in the data since 2000.)

Methodology and Findings

For all of the analyses below, we limited the NIBRS data to incidents that (a) occurred from 2000 through 2016, (b) were reported to a larger-city agency, and (c) did not include a law enforcement officer as a victim. For analyses meant to capture present-day trends, we limited the analyses to the most recent five years, 2012–16. For incident-level analyses, incidents are classified by the most serious category of injury incurred by any victim.

Arrests for gun assaults by injury level, 2000-2016

First, we calculated how often aggravated assaults with firearms were reported to have resulted in an injury. Then, we calculated the arrest rate for each injury level (none/minor/major). We counted an incident as leading to an arrest when the “TOTAL ARRESTEE SEGMENT” field in the “Administrative” table for the incident was greater than zero.

We found that gun assaults resulting in a major injury accounted for only roughly 22% of all gun assaults, but that their arrest rate was roughly the same as for those that resulted in a minor injury or no injury.

Inj. Category   # Agencies   # Incidents   % Incidents   Arrest Rate
Major           98           54,289        22%           22%
Minor           98           39,596        16%           26%
None            98           150,368       62%           25%
Table 1.
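For reference, a minimal sketch of this calculation in Python/pandas, with hypothetical column names standing in for the NIBRS fields (this is an illustration, not the published analysis code): an incident counts as leading to an arrest when its arrestee count is greater than zero, and rates are grouped by the incident’s most serious injury category.

```python
import pandas as pd

def arrest_rates_by_injury(incidents: pd.DataFrame) -> pd.DataFrame:
    """Arrest rates for gun-assault incidents, grouped by injury category.

    `incidents` is assumed to have one row per gun-assault incident, with
    "injury_category" in {"major", "minor", "none"} (the most serious injury
    among the incident's victims) and "total_arrestees" (hypothetical names,
    standing in for the "TOTAL ARRESTEE SEGMENT" count)."""
    incidents = incidents.assign(arrested=incidents["total_arrestees"] > 0)
    summary = incidents.groupby("injury_category").agg(
        n_incidents=("arrested", "size"),
        arrest_rate=("arrested", "mean"),
    )
    summary["pct_of_incidents"] = summary["n_incidents"] / summary["n_incidents"].sum()
    return summary
```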
For about 2% of agency-years with 10+ firearm assault victims, none of the victims were reported to have been injured. (The rate is lower when you raise the victim-count threshold — among larger-city agency-years with at least 100 firearm assault victims, all include at least one injury.) As a precaution, we reran the calculations to exclude agency-years in which no gun assault injury was reported. The effect was minimal:

Inj. Category   # Agencies   # Incidents   % Incidents   Arrest Rate
Major           98           54,289        22%           22%
Minor           98           39,596        16%           26%
None            98           149,813       61%           24%
Table 2.

Distribution of injuries and arrest rates by agency

To get a sense of the agency-level variation in the rates above, we plotted the distribution of incident proportions and arrest rates by injury category. They tell largely the same story as the aggregate metrics:

Figure 24.

Disparities in gun arrest rates, by victim race/ethnicity

Similar to the approach we took with the SHR data, we use the NIBRS data to calculate the disparities in arrest rates for black/Hispanic vs. white victims among participating agencies — for firearm aggravated assaults, which the SHR data cannot address. We analyzed arrest rates and disparities at the aggregate level (i.e., all agencies, combined) and at the agency median.

Aggregate disparities: Among larger-city agencies during 2012–16, the likelihood that there would be an arrest for an injury-causing gun assault was about 13 percentage points lower for black/Hispanic victims than for white victims.

Inj. Category   # Victims (Black/Hisp.)   # Victims (White)   Arrest Rate (Black/Hisp.)   Arrest Rate (White)   Difference
Major           21,053                    3,341               22.3%                       35.7%                 13.4 ppt
Minor           11,957                    3,350               25.6%                       37.9%                 12.3 ppt
None            73,634                    23,615              22.4%                       30.8%                 8.4 ppt
Table 3.

Median disparities: Among larger-city agencies with at least 50 black/Hispanic and 50 white victims for a given injury category, the median disparity in arrest rates was similar for major-injury gun assaults (13 percentage points) but smaller for minor-injury gun assaults (6 percentage points) than the aggregate disparities.

Inj. Category   # Agencies   Median Arrest Rate Difference
Major           23           13.4 ppt
Minor           18           6.1 ppt
None            58           3.5 ppt
Table 4.

Days until arrest, by offense type and weapon

Most of the analyses so far have examined whether or not an arrest is made (or a case cleared, or an offender identified). But the granularity of the NIBRS data also enables us to examine how quickly cases are cleared by arrest.

We categorized each incident into five groups:

● Arrest occurred by the next day
● Arrest occurred within two weeks (2-14 days)
● Arrest occurred within 30 days (15-30 days)
● Arrest occurred after 30 days
● No arrest recorded

Then, for the most recent 10 years of NIBRS data (2007–16), we calculated the percentage of incidents that fell into each group, for four categories:

● Murder - Gun
● Murder - Other weapon (including physical force)
● Aggravated assault - Gun
● Aggravated assault - Other weapon (including physical force)

(All aggravated assaults are included, regardless of injury level.)

The chart below represents the results, with two main findings:

● Roughly the same proportion of gun assaults and gun murders lead to an arrest by the next day, but the arrest rate then declines much more quickly for gun assaults than for gun murders.
● Non-gun offenses led to arrests more quickly than gun offenses.

Figure 25.
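A minimal sketch of the days-until-arrest grouping in Python/pandas follows; the column names are hypothetical, and this is an illustration rather than the published analysis code.

```python
import pandas as pd

def arrest_timing_group(incident_date, arrest_date):
    """Assign an incident to one of the five timing groups described above."""
    if pd.isna(arrest_date):
        return "No arrest recorded"
    days = (arrest_date - incident_date).days
    if days <= 1:
        return "Arrest by the next day"
    if days <= 14:
        return "Arrest within two weeks (2-14 days)"
    if days <= 30:
        return "Arrest within 30 days (15-30 days)"
    return "Arrest after 30 days"

# Usage, assuming `incidents` has datetime columns "incident_date" and
# "first_arrest_date" (NaT when no arrest was recorded), plus an
# "offense_weapon_category" label (all names hypothetical):
# incidents["timing_group"] = incidents.apply(
#     lambda r: arrest_timing_group(r["incident_date"], r["first_arrest_date"]),
#     axis=1)
# shares = (incidents.groupby("offense_weapon_category")["timing_group"]
#                    .value_counts(normalize=True))
```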
Comparison of Return A, NIBRS, and SHR data

To test the validity of our findings — and especially the findings from SHR’s offender-age-reported rates — we compared data from all three federal datasets to one another.

Comparing NIBRS clearance rates to Return A clearance rates

Since the mid-1990s, the percentage of murders cleared, as reported in NIBRS, has been very close to the clearance rate reported via Return A (by these NIBRS agencies):

Figure 26.

The chart above shows annual trends; below, aggregated into five-year periods:

Figure 27. X-axis indicates the first year of the five-year period; “2015,” however, includes only 2015 and 2016, the most recent year available.

Comparing SHR offender-age-reported rates to Return A clearance rates

Although the offender-age-reported rate from SHR is an imperfect proxy for clearance rates, the annual trends track Return A’s clearance rates quite closely. The chart below shows those respective rates for larger-city agencies that, in a given year, reported at least one murder and at least one clearance/offender-age-reported murder through both SHR and Return A.

Figure 28.

And, again, grouped into five-year periods:

Figure 29.

Comparing SHR and Return A rates, by NIBRS participation

As NIBRS participation has increased, so too has the proportion of murders in SHR that were submitted by NIBRS agencies. (When an agency submits NIBRS data, its SHR data is automatically created from it.) The chart below shows that proportion over time.

Figure 30.

This is relevant because the relationship between Return A clearance rates and SHR offender-age-reported rates is different for NIBRS-submitting versus non-NIBRS-submitting agencies. The chart below shows this phenomenon; since 2000, NIBRS-submitting agencies, on the whole, have had an SHR offender-age-reported rate above their Return A clearance rate — the reverse of the pattern among agencies that have never submitted data to NIBRS.

Figure 31.

The chart above splits agencies by whether they have ever submitted data to NIBRS, but does not take into account when agencies began submitting data to NIBRS. To examine the latter, we calculated SHR offender-age-reported rates for the 75 larger-city agencies that have submitted at least 20 SHR murders via NIBRS and 20 SHR murders not via NIBRS. Indeed, when agencies submit their data through NIBRS, offender ages are identified at a higher rate, relative to the agencies’ Return A clearance rate, than in non-NIBRS years.

SHR OAR rate minus Return A clearance rate
Percentile   Non-NIBRS Years   NIBRS Years
25th         -7pt              9pt
50th         -2pt              14pt
75th         1pt               20pt
Table 5.

In the years that these agencies did not submit via NIBRS, their offender-age-reported rates were, at the median, 2 percentage points lower than their Return A clearance rates. In the years that they did submit via NIBRS, their offender-age-reported rates were, instead, 14 percentage points higher at the median than their Return A clearance rates. This phenomenon seems likely to be driven by NIBRS reporting requirements, which are more rigorous than SHR’s and likely lead to more complete information.

Comparing racial disparities in the two victim-level datasets

In this section, we re-examine the racial disparity analysis for NIBRS-submitting and non-NIBRS-submitting agencies, and find that, although NIBRS participation does appear to increase overall offender-age-reported rates, it does not appear to substantially affect the racial disparities observed.
Given the differences in NIBRS and SHR reporting requirements, we inspected racial disparities in NIBRS percent-cleared and SHR offender-age-reported rates for various groupings: ● ● Percent of murders cleared, via NIBRS SHR offender-age-reported rate: ○ For all larger-city agencies ○ For agency-years in which an agency submitted via NIBRS ○ For agency-years in which an agency did ​not​ submit via NIBRS Note:​ Because the universe of NIBRS-submitting agencies changes annually, the composition of these groupings changes over time. Additional note​: Due to how the FBI processes NIBRS data, the SHR data for NIBRS-submitting agencies contains no ethnicity information. For this reason, the analyses in this section compare rates for black and white victims, regardless of ethnicity. The chart below displays the results. As can be seen, the racial disparities observed in present-day SHR data are also present in the NIBRS data — a sign that the offender-age-reported rate from SHR, while not a perfect proxy, reflects real disparities (as opposed simply to quirks in reporting practices). 57 Figure 32. The specific numbers for 2016 — the most recent year for which both NIBRS and SHR data are available: ● Black-white disparity in ​offender-age-reported rates​ in ​SHR ​among​ all larger-city agencies​: 19 percentage points. (44% vs. 63%.) ● Black-white disparity in ​offender-age-reported rates​ in ​SHR ​among non​-NIBRS-submitting, larger-city agencies: ​20 percentage points. (38% vs. 58%.) ● Black-white disparity in ​offender-age-reported rates​ in ​SHR ​among​ NIBRS-submitting, larger-city agencies: ​21 percentage points. (59% vs. 80%.) ● Black-white disparity in ​clearance rates​ in ​NIBRS ​among​ SHR-submitting, larger-city agencies: ​24 percentage points. (40% vs. 63%.) NIBRS-submitting agencies have higher offender-age-reported rates (overall) in SHR than non-NIBRS submitting agencies, but the increase is not substantially disproportionate for black and white victims (+21ppt and +22ppt, respectively), which keeps the overall increase from distorting the disparity. —— END —— 58