Abstract
Law enforcement agencies increasingly use big data analytics in their daily operations. This review outlines how police departments leverage big data and new surveillant technologies in patrol and investigations. It distinguishes between directed surveillance—which involves the surveillance of individuals and places under suspicion—and dragnet surveillance—which involves suspicionless, unparticularized data collection. Law enforcement’s adoption of big data analytics far outpaces legal responses to the new surveillant landscape. Therefore, this review highlights open legal questions about data collection, suspicion requirements, and police discretion. It concludes by offering suggestions for future directions for researchers and practitioners.
Keywords: policing, big data, law, crime, predictive policing
The declining cost of collection, storage, and processing of data, combined with new sources of data like sensors, cameras, and geospatial technologies, mean that we live in a world of near-ubiquitous data collection. All this data is being crunched at a speed that is increasingly approaching real-time, meaning that big data algorithms could soon have immediate effects on decisions being made about our lives.
—Executive Office of the President of the United States (2014)
INTRODUCTION
In the age of big data, individuals are contributing to a growing digital trove of data as they go about their daily lives. If a person uses their smartphone, makes a purchase with a credit card, or likes something on social media, they leave a digital trace that can be stored and linked to other data points. The mass digitization of information combined with the exponential growth of computational power has resulted in an explosion of so-called big data analytics. Big data analytics have been taken up in a wide range of fields, including finance (Pasquale 2015), labor (Ajunwa et al. 2017), social science (Lazer & Radford 2017), social media (Gillespie 2014), journalism (Christin 2018), marketing and credit (Fourcade & Healy 2017), and policing (Brayne 2017, Ferguson 2017).
Law enforcement agencies are starting to use big data in a range of daily operations and surveillance activities, including patrol, investigation, and crime analysis. Police use of big data is the subject of contentious debate in policy, media, legal, regulatory, and academic circles. However, advances in data analytics far outpace social scientific research and legal responses to the new data landscape. This article draws on a growing body of work on law enforcement’s use of big data to examine whether and how the collection, analysis, and deployment of big data are transforming law enforcement activities, and to what legal consequence. It argues that although big data policing represents, in part, a recapitulation of existing police practices, big data analytics are associated with certain fundamental transformations in police activity. Forms of dragnet and directed surveillance represent a migration of law enforcement operations toward intelligence activities (Brayne 2017). Big data policing poses challenges for existing legal frameworks governing police activity. Therefore, the legal implications of law enforcement’s use of big data span criminal, constitutional, administrative, and privacy law.
INSTITUTIONAL ADOPTION OF BIG DATA ANALYTICS
Digital information is being produced at an unprecedented rate. By some estimates, over 90% of the world’s data have been created in the past two years. As data have proliferated, so too have definitions of what constitutes big data. One of the most-cited definitions is Laney’s (2001) description of the three V’s of big data: volume, variety, and velocity. This review takes big data to be
a data environment characterized by four characteristics: It is vast, fast, disparate, and digital. First, big data analytics involve the analysis of large amounts of digital information…Second, big data typically involves high frequency data observations and fast data processing. Third, big data is disparate—it comes from a wide range of institutional sensors and involves the merging of previously separate data sources. Fourth, big data is digital. The mass digitization of records facilitates the merging and sharing of records across institutions, makes storage and processing easier, and makes data more efficient to analyze and search remotely. These four characteristics…enable the use of advanced analytics—such as predictive algorithms or network analysis—and complex data display—such as topical-, temporal-, or geo-analysis. (Brayne 2017, p. 980, emphasis in original)
A key feature of the big data landscape is “function creep” (Innes 2001, p. 8): the ability of data initially collected for one purpose to be used for another often unintended or unanticipated purpose. When digital data can be easily stored and shared, “the value of information no longer resides solely in its primary purpose” (Mayer-Schönberger & Cukier 2013, p. 153). Records initially introduced with one intention are repurposed, refined, and expanded for new problems, institutions, or applications.
Existing social scientific work emphasizes different motivations institutional actors have for using big data. From a technical perspective, big data is a means by which organizational actors may improve efficiency through improving prediction, filling analytic gaps, and more efficiently allocating scarce resources. The institutional perspective (DiMaggio & Powell 1983, Meyer & Rowan 1977), by contrast, does not assume organizational structures stem from technical imperatives (Scott 2004). Instead, it highlights the role of culture, suggesting organizations operate in technically ambiguous fields in which they adopt big data analytics in response to wider beliefs about what organizations should be doing (Willis et al. 2007). Using big data may confer legitimacy; if other institutions are using it for decision making, there may be institutional pressure to conform. Of course, these perspectives are ideal types and are not mutually exclusive. Research suggests law enforcement adopted big data analytics in response to both technical and institutional pressures.
Big data holds appeal for law enforcement as a means of increasing efficiency and accountability. It may improve the prediction and preemption of behaviors by helping law enforcement deploy resources more efficiently, ultimately helping prevent and intercept crimes, and thus reducing crime rates. It also holds potential as an accountability mechanism and response to criticisms law enforcement organizations face over discriminatory practices. For example, data-driven policing is being offered as a partial antidote to allegations and findings of excessive force, unlawful stops and arrests, and civil rights violations [e.g., see Smith & Austin 2015, US Dep. Justice 2015 (2001), US Dep. Justice Civil Rights Div. 2015; Floyd et al. v. City of New York et al. (2013)].
Despite recent public attention on law enforcement’s use of big data, police use of data is not a new phenomenon. Fifty years ago, the 1967 President’s Commission on Law Enforcement and Administration of Justice encouraged the adoption of new technologies to improve efficiency and fairness in the criminal justice system. At that time, policing was characterized by random patrol, rapid response, and reactive investigations (Sherman 2013). Based on emerging research, practitioners and researchers grew increasingly aware that existing police strategies such as random patrol and rapid response had little effect on crime, catalyzing a shift from reactive to more proactive, evidence-based forms of policing, such as hot spots policing (Braga & Weisburd 2010, Sherman et al. 1989). In the 1990s and early 2000s, CompStat—a managerial model for identifying crime patterns, quantifying and incentivizing police activity, and directing police resources—spread from New York City to police departments across the United States and abroad (Weisburd et al. 2003).
The attacks on 9/11—widely viewed as a case of information-sharing failure in the intelligence community—spurred the development of “intelligence-led policing” (Ratcliffe 2008). It catalyzed federal, state, and local law enforcement officials to join forces to improve criminal justice data collection and information sharing. Federal agencies provided considerable funding to local law enforcement agencies to collect a wide range of new data, as they were viewed as on the front lines of the domestic war against terror (Waxman 2009). Federal funds were allocated for the construction of fusion centers—multiagency, multidisciplinary surveillance organizations that aggregate data from public and private sources (Pasquale 2015). Federal and local agencies partnered with technology companies to enhance their data collection and analysis capabilities. For example, Palantir is a software company initially partially funded by In-Q-Tel, the Central Intelligence Agency’s (CIA’s) venture capital firm, that designs analytic software originally used in national defense but now used by commercial customers, such as J.P. Morgan; federal agencies, such as the CIA, the Federal Bureau of Investigation, Immigration and Customs Enforcement, and the Department of Homeland Security; and local law enforcement agencies, such as the Los Angeles Police Department (LAPD). Injections of federal funding are integral to the adoption and development of big data policing initiatives. For example, the Smart Policing Initiative encourages local police departments and researchers to use evidence-based, data-driven tactics. Likewise, in 2011, the US Department of Justice awarded the LAPD a $3 million grant to conduct a multiyear analysis of predictive policing.
BIG DATA SURVEILLANCE
Law enforcement uses big data in activities ranging from patrol to investigation, crime analysis, and risk management for two broad categories of surveillance: directed and dragnet. The key distinction between directed and dragnet surveillance is that whereas directed surveillance is focused on individuals and places under suspicion, dragnet surveillance is unparticularized and gathers information on everyone. Surveillance is defined here as simply “the collection and analysis of information about populations in order to govern their activities” (Haggerty & Ericson 2006, p. 3).
Directed Surveillance
One of the most widespread directed surveillance practices is algorithmic predictive policing. According to a 2014 survey of 200 police departments, 38% of responding departments were using predictive policing, and 70% of departments indicated they planned to use it by 2017 (Police Exec. Res. Forum 2014). In predictive policing, algorithms—broadly defined as “formally specified sequence(s) of logical operations that provides step-by-step instructions for computers to act on data and thus automate decisions” (Barocas et al. 2014, p. 3)—are being used to guide police decision making about whom and where to police. It is informed by a large body of empirical work that demonstrates that crime is not randomly distributed across people or places. Rather, research emphasizes the importance of place-based environmental conditions (Brantingham & Brantingham 1981, Ratcliffe et al. 2011, Sampson et al. 1997), situational decision making (Keizer et al. 2008, Matsueda et al. 2006), chronic offenders (Braga et al. 2001, Uchida & Swatt 2013), and social networks (Papachristos et al. 2015).
Place-based predictive policing.
Place-based predictive policing involves using historical crime data as training data in an algorithm to predict when and where future crime is likely to occur. The largest predictive policing company is PredPol. PredPol’s algorithm uses data from police departments’ record management systems on location, type, and time of crimes as inputs to predict future crime. It is predicated on the near-repeat model, which suggests that once a crime occurs in a location, the immediate surrounding area is at increased risk for subsequent, similar crimes (Mohler et al. 2015). PredPol produces 500 × 500 ft boxes overlaying small areas of division maps. Patrol officers are encouraged to spend time in predictive boxes, a strategy referred to as risk-based deployment. Deployment is based on available time, such as when officers are not responding to calls or “booking a body.” Officers record their self-reported minutes in the predictive boxes on their in-car computers. Although data drive deployment, what the police do once in the predictive box, and how long they stay there, remains within their discretion (Brayne 2017). PredPol is currently used in almost 60 departments, the largest of which is the LAPD.
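The near-repeat logic can be sketched in a few lines of Python. This is a toy illustration, not PredPol’s actual model: the cell size mirrors the 500 × 500 ft boxes, the exponential decay loosely echoes the self-exciting point-process models described by Mohler et al. (2015), and all function names and parameters are illustrative.

```python
from collections import defaultdict
from math import exp

CELL_FT = 500  # grid cells mirroring PredPol-style 500 x 500 ft boxes

def cell_of(x_ft, y_ft):
    """Map a coordinate (in feet) to its grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def near_repeat_scores(crimes, now, decay_days=14.0):
    """Score each cell by summing exponentially decaying weights of past
    crimes in it: recent crimes raise nearby risk, and the boost fades
    over time, per the near-repeat model. Each crime is (x_ft, y_ft, t_days)."""
    scores = defaultdict(float)
    for x_ft, y_ft, t_days in crimes:
        age = now - t_days
        if age >= 0:
            scores[cell_of(x_ft, y_ft)] += exp(-age / decay_days)
    return scores

def top_boxes(scores, k=3):
    """Return the k highest-risk cells -- the 'predictive boxes'
    flagged for risk-based deployment."""
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Each recent crime raises its cell’s score, with influence fading over `decay_days`; the top-scoring cells become the day’s predictive boxes.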
Another major predictive policing software is HunchLab, produced by Azavea. HunchLab uses risk terrain modeling to account for the interaction of social, behavioral, and physical risk factors. In contrast to PredPol’s parsimonious model, HunchLab’s models include a much wider range of variables (Brayne et al. 2015), such as population density; location of bars, churches, and transportation hubs; and census data. As of 2017, HunchLab is being used in the Philadelphia Police Department, Miami Police Department, and New York Police Department (NYPD), among others.
Advocates of predictive algorithms argue that by relying on unbiased—or mechanical—assessments, algorithms may help deploy resources more efficiently and objectively (Daston & Galison 2007). In a study of police use of big data in Los Angeles, Brayne (2017, pp. 989–90) described an LAPD captain’s explanation that relying on data, rather than human interpretation of crime patterns, helps him deploy resources more efficiently:
There’s an emotional element to it, and you think right now with crime being this low, a cluster could be three or four crimes. Clusters used to be 10, 12 crimes. Now three or four and they jump on it, you know. So, there could be overreaction. Because, there’s, you know, I mean it’s a human doing it. And they cannot sort out what’s noise.
However, research on algorithmic fairness suggests that rather than eliminating human bias, bias may be an “unintentional emergent property” of the data collection and analysis process itself (Barocas & Selbst 2016, p. 671). For example, if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will flag areas with historically higher crime rates as high risk for future crime; officers will then be deployed to those areas, where they are more likely to detect crimes, creating a self-fulfilling statistical prophecy in which recorded crime rates increasingly reflect enforcement practices. Moreover, crime data are incomplete—estimates of unreported crime range from 17% to 67%, depending on the crime (Langton et al. 2012)—and are not missing at random. Crimes that take place in public are more likely to be detected, individuals who do not trust the police are less likely to report crimes (Desmond et al. 2016), and police focus their attention and resources on black communities at a disproportionately high rate relative to drug use and crime rates (Beckett et al. 2005). In other words, sampling bias in crime data may lead to a ratchet effect that reinforces discrimination. However, once they are inputted as data, the predictions appear impartial (Brayne 2017). Engineers at predictive policing software companies are increasingly aware of this feedback loop and are making efforts to interrupt it. For example, HunchLab has introduced a degree of randomness to its algorithm, occasionally directing officers to medium-risk locations instead of only high-risk locations.
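The feedback loop, and the way injected randomness can interrupt it, can be illustrated with a toy simulation. All logic and parameters here are illustrative assumptions, not any vendor’s actual system: patrols go wherever recorded crime is highest, and crimes are recorded only where patrols are.

```python
import random

def simulate(true_rates, steps=200, explore=0.0, seed=1):
    """Toy feedback-loop simulation. `true_rates` gives each area's true
    per-step crime probability; `explore` is the chance of patrolling a
    random area instead of the top-recorded one, loosely analogous to
    the randomness HunchLab injects to interrupt the loop."""
    rng = random.Random(seed)
    recorded = [1] * len(true_rates)  # seed counts so the argmax is defined
    for _ in range(steps):
        if rng.random() < explore:
            area = rng.randrange(len(true_rates))
        else:
            # deploy to the area with the most recorded crime
            area = max(range(len(true_rates)), key=lambda a: recorded[a])
        # a crime is recorded only if one occurs where the patrol is
        if rng.random() < true_rates[area]:
            recorded[area] += 1
    return recorded
```

With no exploration, whichever area is patrolled first accumulates nearly all recorded crime despite identical true rates; with exploration, records accumulate in every area, so the enforcement record tracks underlying crime rather than prior deployment.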
To date, few peer-reviewed studies evaluate the efficacy of place-based algorithmic policing (Kennedy et al. 2011, Mohler et al. 2015). Mohler and colleagues (2015) conducted a randomized controlled field trial and found that PredPol’s algorithm outperforms crime analysts in predicting future crime, and that using algorithmic forecasting to direct patrols led to small but statistically significant reductions in crime volume. However, independent future research is needed, as the authors include cofounders of and stockholders in PredPol.
Person-based predictive policing.
Whereas location-based predictive policing is typically used to predict property crime, person-based predictive policing is more commonly used to predict violent crime. Law enforcement uses data to identify individuals or groups most likely to be involved in crimes as victims, offenders, or both. Person-based predictive policing is premised on the idea that violent crime is concentrated among a small percentage of individuals in a population. Therefore, focusing police resources on the highest-risk individuals should efficiently reduce crime. Person-based predictive policing encompasses a wide range of analytic approaches, such as social network analysis and regression analysis of risk factors.
Criminologists have long theorized about the role of social networks in shaping violence [e.g., Thrasher 2013 (1927), Whyte 1969 (1943)]. However, formal network analysis has become incorporated into social scientific research on violence and law enforcement interventions only in recent decades (e.g., Papachristos 2009). One of the first networked interventions was Operation Ceasefire, sponsored by the National Institute of Justice (NIJ). Informed by research that demonstrated violence was concentrated both spatially and within groups of people (Braga 2003, Cook & Laub 2002), researchers and practitioners in Boston took a data-driven approach to reducing gun violence. They used a combination of formal network analysis and qualitative research methods to develop a focused deterrence strategy targeting specific gangs and gang members (Braga et al. 2001, Kennedy 1997, Kennedy et al. 1997). In the past two decades, social network analysis (Wasserman & Faust 1994) has been used to understand the spatial diffusion of homicide (Cohen & Tita 1999), gang leadership and group structure (Papachristos 2009), turf boundaries (Brantingham et al. 2012), and contagion and reciprocity (Papachristos 2009, Papachristos et al. 2015). The network turn in social scientific research on gang violence has informed law enforcement interventions (Papachristos & Kirk 2015). In their recent review of social networks and gang violence reduction, Sierra-Arévalo & Papachristos (2017) suggest that interventions should not be limited to law enforcement. Rather, they argue, social service providers, teachers, and other community experts can play an important role in targeted interventions for those at high risk of victimization. They also suggest analyzing not only coarrest data but also data on nongang ties—such as attachment to family, school, or employment that may inoculate individuals against exposure to violence—to better understand variation in victimization (Sierra-Arévalo & Papachristos 2017).
In one pilot program implemented in Chicago—an NIJ-funded collaboration between the Chicago Police Department and Illinois Institute of Technology—individuals at high risk of being involved in future crime were identified using a proprietary model of coarrest networks (Saunders et al. 2016). As part of a broader predictive policing strategy, individuals with the highest algorithmic risk scores for gun violence were placed on a Strategic Subjects List (SSL), which was disseminated by central command. District commanders would use their discretion and decide on an intervention strategy in their district. The most common strategy was for officers—alongside a social worker and community member—to go to the home addresses of individuals on the SSL to make a custom notification visit. A quasi-experimental evaluation of the strategy found individuals on the SSL were no more or less likely to become a victim of a homicide or shooting than those in the comparison group, but they were more likely to be arrested for a shooting (Saunders et al. 2016).
The LAPD implemented a different kind of point system. Operation LASER (Los Angeles Strategic Extraction and Restoration Program) began in 2011 and was federally funded through the Smart Policing Initiative. The strategy includes both place-based and offender-based models. The offender-based strategy was first implemented in a low-income, historically high-crime division in South Bureau. The strategy involves first plotting crimes in the division then identifying a problem crime—such as armed robbery—and generating a list of “chronic offenders” by gathering intelligence daily from patrols, the Parole Compliance Unit, field interview cards, traffic citations, release-from-custody forms, crime and arrest reports, and criminal histories. A point value is calculated for each individual: five points for a violent criminal history, five points for known gang affiliation, five points for prior arrests with a handgun, five points if they are on parole or probation, and one point for every police contact (Uchida & Swatt 2013). One officer explained,
We said ok, we need to decide who’s the worst of the worst…we need something to pull them apart. So this was the important one, and this is really what gives the importance of FI-ing someone [filling out a field interview card] on a daily basis instead of just saying, okay, I saw that guy hanging out, I’m gonna give him two weeks and I’ll go FI [fill out a field interview card] him again. It’s one point for every police contact. (Brayne 2017, p. 987)
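The point scheme described above reduces to a simple tally. The sketch below uses the point values reported by Uchida & Swatt (2013); the field names are illustrative, not Operation LASER’s actual data schema.

```python
def laser_points(person):
    """Chronic-offender score per the point values reported for
    Operation LASER (Uchida & Swatt 2013). `person` is a dict of
    boolean flags plus a count of police contacts; keys are illustrative."""
    score = 0
    score += 5 if person.get("violent_history") else 0
    score += 5 if person.get("gang_affiliation") else 0
    score += 5 if person.get("handgun_arrest") else 0
    score += 5 if person.get("parole_or_probation") else 0
    score += person.get("police_contacts", 0)  # 1 point per police contact
    return score
```

The one-point-per-contact term is what makes daily field interviews consequential, as the officer quoted above notes: every card filed nudges a person’s score upward.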
Field interview, or police contact, cards are important intelligence tools for law enforcement. Although they vary by department, they generally include information such as names, addresses, physical characteristics, vehicle information, gang affiliations, and criminal history (Brayne 2017). In Los Angeles, there is also an open-ended section titled Additional Info where officers can enter information about “Additional Persons, Booking No., Narrative, etc.” Officers are trained to fill out a card as soon as they interact with someone in the field because, as one supervisor explained, “these things come into play later on in ways you could never imagine” (Brayne 2017, p. 987). Field interview cards were one of the first data sources the LAPD integrated into the Palantir platform. When entered into the system, every field interview is tagged with a date and time stamp and GPS coordinates.
Like location-based predictive policing, person-based predictive policing can lead to a ratchet effect. It can generate a feedback loop in which an individual with a high risk score is more likely to be stopped, and that police contact further increases the individual’s score. Quantified police practices can place individuals already under suspicion under new and deeper forms of surveillance, while appearing to be objective or, in the words of one captain, “just math” (Brayne 2017, p. 997).
Person-based predictive policing strategies can be used in conjunction with one another. For example, risk scores can be used in conjunction with social networks. Figure 1 provides a deidentified mockup of a network diagram in Palantir.
Figure 1.

The individual in the middle, Guy Cross, is a person with direct police contact. Radiating out from him are all the entities he is related to, including people, cars, addresses, and phone numbers. The relationships between entities—such as through cohabitation, employment, or an intimate relationship—are indicated on the connecting lines. Adapted from Brayne 2017.
To be in the “secondary surveillance network” (Brayne 2017, p. 992), individuals do not need to have direct contact with the police; they simply need to have a connection to the central person of interest. Individuals in the surveillance network are subject to collateral data collection, meaning data that were collected on them in non–law enforcement contexts can be integrated into the system and linked to other data points. Once data are inputted into the system, officers can receive real-time alerts on, for instance, whether individuals are stopped by the police or interact with other government agencies whose data are integrated with the system.
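Mechanically, assembling such a network is a graph traversal outward from the person of interest. The sketch below is an illustration of the concept, not Palantir’s implementation; the graph structure and names are assumptions.

```python
from collections import deque

def secondary_network(graph, person_of_interest, max_hops=2):
    """Breadth-first collection of every entity within `max_hops` links
    of a person of interest -- a sketch of how a 'secondary surveillance
    network' (Brayne 2017) can sweep in people with no direct police
    contact. `graph` maps each entity to the entities it is linked to
    (cohabitation, shared vehicles, phone numbers, etc.)."""
    seen = {person_of_interest: 0}
    queue = deque([person_of_interest])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    del seen[person_of_interest]
    return seen  # entity -> number of hops from the person of interest
```

Note that a two-hop sweep reaches entities (e.g., a roommate’s employer) that have no link to the person of interest at all beyond an intermediary, which is precisely the collateral-data-collection concern.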
Dragnet Surveillance
Dragnet surveillance, in contrast to directed surveillance, refers to surveillance technologies that gather information on everyone, rather than merely those under suspicion. One of the most pervasive dragnet surveillance tools is the Automatic License Plate Reader (ALPR). ALPRs can be static (e.g., at an intersection) or mobile (e.g., mounted on police cars). They take two pictures of every vehicle that passes through their line of vision—one of the car and one of the license plate—and record the time, date, and GPS location. Law enforcement can supplement their own ALPR data with privately collected readings, such as those gathered by repossession agents. ALPR data can provide a map of the distribution of vehicles throughout the city and, in some cases, may enable police to track individuals’ routine travel patterns or infer where they live or work based on where their car is repeatedly parked (Brayne 2017). There are several ways for law enforcement to use ALPR data: They can compare scans against heat lists of outstanding warrants or stolen cars (Joh 2016), they can place a “fence” around a location of law enforcement interest and track cars that go near that location, or they can simply store ALPR data for potential use during a future investigation. For example, one sergeant described an incident in which someone disposed of a dead body near a tourist attraction where there was an ALPR. By isolating those ALPR readings within the timeframe during which the body dump could have occurred, law enforcement could narrow their focus to three plates—one from Utah, one from New Mexico, and one from Compton. As the sergeant explained, assuming the Compton car was most likely to be involved, they
ran the plate, saw the name it was registered under, searched the name in CalGang (gang database), saw that the individual was affiliated with a gang currently at war with the victim’s gang, and used that information to establish probable cause to obtain a search warrant, go to the address, find the car, search the car for trace evidence, and arrest the suspect. (Brayne 2017, p. 993)
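At bottom, a fence query of this kind is a filter over stored readings by place and time. The sketch below is illustrative only (planar coordinates for simplicity; the schema and names are assumptions, not any agency’s ALPR system).

```python
from math import hypot

def fence_hits(readings, center, radius_ft, t_start, t_end):
    """Return the plates scanned inside a geographic 'fence' during a
    time window -- a sketch of the ALPR uses described in the text
    (fencing a location of interest, or isolating scans near a crime
    scene). Each reading is (plate, x_ft, y_ft, t)."""
    return sorted({
        plate
        for plate, x, y, t in readings
        if t_start <= t <= t_end
        and hypot(x - center[0], y - center[1]) <= radius_ft
    })
```

Because readings are retained rather than discarded, the same filter can be run retroactively, which is what allowed investigators in the example above to narrow years of ambient scans down to three plates.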
Domain Awareness Systems are another example of dragnet surveillance. In the largest Domain Awareness System, the NYPD partnered with Microsoft to collect information from closed-circuit surveillance cameras, ALPRs, radiation sensors, and other sensors to match with police databases (Joh 2014). Law enforcement employs advanced analytics, such as pattern-recognition algorithms, to detect threats in video footage, such as unattended bags (Ferguson 2017).
Law enforcement supplements police data with data originally collected in external, nonenforcement contexts. Big data companies such as Acxiom, CoreLogic, and Datalogix are part of a multibillion-dollar data broker industry buying, aggregating, and selling arrest records, criminal records, warrants, property records, purchase behavior data, neighborhood data, and social media data (Ferguson 2017, Pasquale 2015). Private data brokers sell personal data to law enforcement (Ferguson 2017, p. 12), as do private companies such as pizza chains and contact lens companies (Brayne 2017). This means that a growing number of individuals with no police contact are included in police-accessed databases. Police in Fresno, for example, piloted a service called Beware that analyzes consumer information compiled by data brokers to provide officers real-time red, yellow, and green threat scores for addresses and individuals.
Investigations and Prosecutions
Police use data proactively to predict crime and increase situational awareness, but they also use it in investigations and prosecutions (Patton et al. 2017). For example, an LAPD detective described an “automatic data grazing” system in the testing phases that seeks out similarities in cases that cross jurisdictional boundaries that one investigator would previously have likely missed due to jurisdictional data silos (Brayne 2017).
According to a survey administered by the International Association of Chiefs of Police, more than 96% of police agencies use social media in some capacity (Int. Assoc. Chiefs Police 2015). Of all the evidence found in New York City indictment documents, 48% involves social media activity or communication (Lane & Ramirez 2016). Operation Crew Cut—the NYPD’s initiative to monitor suspected gang members’ social media, in some cases before any crime was committed—uses social media links and interactions to infer real-world relationships between individuals (Lane 2015, Lane & Ramirez 2016, Patton et al. 2017). That said, although some “social media policing” (Trottier 2012) involves big data analytics—such as mining Twitter data to detect gang activity—there remains considerable investigatory value in small data and manual searches.
After an investigation, big data can be used for intelligence-driven prosecution. In the Manhattan District Attorney’s Office, an experimental prosecution unit, the Crime Strategies Unit, builds cases on the “primary crime drivers” in a neighborhood (Ferguson 2017, p. 42). A “target tracker” on each individual populates the system with photos, criminal histories, and other personal information. The targets cannot be arrested on existing evidence, but if they are ever arrested, an alert automatically triggers a process whereby all prior data points on the individual are sent to the prosecutor’s office so they can leverage more information at an earlier stage than they previously would have been able to, such as for enhanced bail applications and pretrial detention, additional charges, and harsher sentencing recommendations (Ferguson 2017).
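The arrest-triggered alert process can be sketched as a simple lookup-and-bundle step. This is an illustration of the mechanism described by Ferguson (2017), not the Crime Strategies Unit’s actual software; all field names are hypothetical.

```python
def arrest_alert(arrestee, target_tracker):
    """When a tracked individual is arrested, bundle every previously
    accumulated data point for the prosecutor -- a sketch of the
    'target tracker' alert described by Ferguson (2017)."""
    record = target_tracker.get(arrestee)
    if record is None:
        return None  # not a tracked individual; ordinary processing
    # everything accumulated pre-arrest surfaces at the earliest stage,
    # e.g., for enhanced bail applications or additional charges
    return {"arrestee": arrestee,
            "photos": record.get("photos", []),
            "criminal_history": record.get("criminal_history", []),
            "intelligence": record.get("intelligence", [])}
```

The significance is temporal: information that could not support an arrest on its own is held in reserve and deployed in full the moment any arrest occurs.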
IMPLICATIONS FOR LAW
Law cannot keep pace with the new technologies and data sources being introduced into police operations. A growing chorus of legal scholars suggests Fourth Amendment law is inadequate for governing police work in the age of big data. Open legal questions remain about the level of suspicion required for an individual’s data to be entered into police systems, whether the compilation of data points adds up to a traditional search, and how police discretion is informed by algorithmic predictions.
As illustrated by ALPRs and other dragnet surveillance tools that make possible everyday mass surveillance at an unprecedented scale, the threshold for inclusion in police databases is lower in the age of big data. The ongoing nature of license plate readings represents a proliferation of prewarrant surveillance. Rather than law enforcement starting to gather information on individuals only once they come under suspicion, “information is routinely accumulated and files are lying in wait” (Brayne 2017, p. 1000). Individuals’ daily lives, now codified as data, can be marshaled as evidence retroactively once they come under suspicion.
That a suspect, once entered into a database, can be surveilled repeatedly raises the question of whether new surveillance technologies such as ALPRs, which facilitate the constant analyzing and reanalyzing of data, should be treated as searches subject to the Fourth Amendment. Some legal scholars argue that Fourth Amendment protections should be strengthened in response. For example, Kerr (2011) describes a dynamic called equilibrium adjustment, in which courts place new limits on police surveillance to restore the balance of power that existed before a technological change. Others argue that perhaps Fourth Amendment law is ill suited to govern police activity in the big data age. For example, Renan (2016) argues the traditional paradigm of Fourth Amendment law is transactional, focusing on the one-off interaction between law enforcement and a suspect. However, police surveillance in the digital age is increasingly programmatic. It involves ongoing, cumulative, and sometimes suspicionless data collection and use (Renan 2016). Renan (2016, p. 1058) suggests that administrative law may be a more appropriate legal framework for governing cumulative surveillance than criminal procedure. Similarly, Slobogin (2016) suggests that police agencies should be governed by the same administrative law principles as other government agencies. He argues that when police create policies aimed at largely innocent categories of actors (such as routinely collecting and using ALPR readings), they should be required to engage in notice-and-comment procedures to ensure public input and avoid arbitrary rules (p. 135).
Currently, there is no federal legislation regulating ALPRs. Rules regulating the storage of ALPR data vary by state. For example, in California there is legislation creating a 60-day retention limit on ALPR data, unless the data are being used as evidence in a felony case. However, this legislation applies to only one law enforcement agency in the state: the California Highway Patrol.
Law enforcement faces practical constraints when conducting surveillance. However, many of those limits are becoming less relevant in light of big data and new dragnet surveillance tools. Justice Alito’s concurring opinion in the Supreme Court’s decision in United States v. Jones, the case regarding the GPS tracking of a single suspect over 28 days, highlights this challenge. He writes, “The greatest protections of privacy [until now have been]…practical,” because “traditional surveillance for any extended period of time was difficult and costly and therefore rarely undertaken” [see also Justice Sotomayor’s concurring opinion in United States v. Jones (2012)]. With dragnet surveillance tools, law enforcement can at any point build up a case against an individual, whereas they previously needed to meet suspicion requirements. Joh (2014) explains the logic behind the concern that long-term police surveillance of an individual, even in public, might constitute a Fourth Amendment search. She writes, “The premise here, sometimes referred to as the ‘mosaic theory,’ is that the danger to Fourth Amendment privacy lies in the aggregation of discrete bits of data, even if each piece standing alone would not be subjected to constitutional protections” (p. 60, emphasis added). For example, using predictive policing scores in conjunction with ALPR data and network diagrams may grant authorities a level of insight into an individual’s life that might constitute a search and thus require a warrant (Brayne 2017).
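The aggregation logic underlying the mosaic theory can be illustrated with a toy sketch. All plate readings, timestamps, and location names below are invented for illustration; the point is simply that each discrete ALPR reading reveals little, while even a handful of aggregated readings begins to reconstruct a person's routine.

```python
from collections import Counter

# Hypothetical ALPR readings for a single license plate:
# (timestamp, camera location). Each reading, standing alone,
# is an unremarkable observation of a car in public.
readings = [
    ("Mon 07:55", "clinic_parking"),
    ("Mon 17:40", "church_lot"),
    ("Tue 07:58", "clinic_parking"),
    ("Wed 08:01", "clinic_parking"),
    ("Fri 22:15", "bar_district"),
]

# Aggregated, the same readings expose a recurring pattern of life.
visits = Counter(loc for _, loc in readings)
recurring = {loc: n for loc, n in visits.items() if n > 1}
print(recurring)  # {'clinic_parking': 3}
```

The sketch captures the mosaic concern in miniature: the privacy interest attaches not to any single data point but to the pattern that emerges from their combination.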
Riley v. California, the landmark US Supreme Court case regarding the warrantless search and seizure of the digital contents of a cell phone during arrest, brings the legal challenges of technological advances into further relief. The Court unanimously held that the warrantless search and seizure of the digital contents of a cell phone during arrest is unconstitutional because smartphones are qualitatively and quantitatively different from flip phones or other objects in an individual’s pocket. In the opinion, Chief Justice Roberts writes,
Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans “the privacies of life.” The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought. Our answer to the question of what police must do before searching a cell phone seized incident to an arrest is accordingly simple—get a warrant. (p. 32)
Function creep—the repurposing of data—is yet another issue not well addressed by the Fourth Amendment. The Fourth Amendment is primarily interested in the legitimacy of how information is acquired, not how it is repurposed. Joh (2014, p. 63) explains, “If the acquisition is permissible, how the police use that information thereafter is generally not subject to an additional Fourth Amendment challenge.” She suggests courts could shift the focus of the Fourth Amendment from data collection to its intended uses by the government (p. 64). Although particularly salient considering law enforcement’s increasing access to an unprecedentedly wide range of personal data, these are not entirely new concerns. Joh (2014) points out that more than 20 years ago, Krent (1995, p. 81) suggested that the government repurposing of information collected earlier could be deemed unreasonable, and that courts might consider whether “the original seizure…would have been reasonable had the additional governmental objective been known” initially.
Existing privacy laws—such as the Privacy Act of 1974—are increasingly anachronistic. They largely concern custodians of the data, not the data themselves. Privacy laws should now account for function creep, protecting individuals from the potential future secondary uses of their data. Secondary uses of data thus undermine the current patchwork of federal privacy laws that govern the governmental collection of personal identifiable information and challenge conventional consent practices. Moreover, there are statutory exemptions enabling law enforcement to obtain certain data without a court order or subpoena (Ferguson 2017, Murphy 2013). In some circumstances, it is simply easier for law enforcement to buy privately collected data than to collect it directly themselves, because there are few constitutional protections, few reporting requirements, and fewer appellate checks on private sector surveillance and data collection (Pasquale 2015, p. 203).
Relatedly, it may be necessary to revisit the third-party doctrine in light of new data-sharing practices made possible by the mass digitization of records. According to United States v. Miller and Smith v. Maryland, the third-party doctrine maintains that “when an individual voluntarily shares information with third parties, like telephone companies, banks, or even other individuals, the government can acquire that information from the third-party absent a warrant without violating the individual’s Fourth Amendment rights” (Exec. Off. Pres. 2014, p. 33). However, in her concurring opinion in Jones, Justice Sotomayor argues the third-party doctrine is “ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.” This question is at the heart of Carpenter v. United States, the pending case before the US Supreme Court concerning the warrantless seizure and search of historical cell phone records that show the location and movements of a cell phone user.
Questions of data collection aside, police use of data to determine suspicion also challenges the traditional paradigm of Fourth Amendment law. The reasonable suspicion requirement is predicated on specific and articulable observable facts about a suspect—in other words, small data. However, combining small data with big data—such as a predictive policing forecast—may make it easier for law enforcement to meet the reasonable suspicion standard in practice (Brayne 2017, p. 1000). The Supreme Court held that police observation in a “high crime area” can be a factor in deciding whether the officer has reasonable suspicion or probable cause (Ferguson 2017, p. 76). The Court never defined precisely what constitutes a high crime area, but it is reasonable to assume that the area inside a predictive policing box might qualify. Ferguson (2017, p. 77) writes, “If walking through a predicted red box changes my constitutional rights to be free from unreasonable searches and seizures, then a higher level of scrutiny might need to be brought to bear on the use of technology.” Similarly, what if an individual is identified as a chronic offender in Los Angeles or on the SSL in Chicago? That may provide the specific, individualized, and articulable facts needed to meet the reasonable suspicion standard. Indeed, the individual may be more likely to be involved in violence even if they have not done anything suspicious in that moment (Ferguson 2017, p. 56). Can a stop be “predicated on the aggregation of specific and individualized, but otherwise noncriminal, factors” (Ferguson 2015, p. 330), such as ALPR readings or a geocoded field interview card near the scene of a crime? Ferguson (2015, p. 336) argues that if the police use big data to reach the threshold of reasonable suspicion, the “courts should require a higher level of detail and correlation using the insights and capabilities of big data.” That is, if cases even make it to court. As Kohler-Hausmann (2018, p. 258) points out,
Fourth Amendment jurisprudence is built on the premise that substantive rights against unlawful police stops, searches, and seizures are secured by the mechanism of excluding unlawfully seized evidence and arrests. [But] [t]he overwhelming amount of police work is low-level enforcement activity, not serious violent felony arrests, where the exclusion of necessary evidence would seem salient and professionally important.
In other words, the opportunity to exclude inadmissible evidence rarely presents itself because law enforcement use of big data is only infrequently scrutinized in a trial context.
Finally, the proliferation of prewarrant surveillance tools creates new opportunities for parallel construction: the process of law enforcement obtaining evidence through informants or warrantless surveillance, and then creating an alternative explanation for how the evidence was found. Officer “[h]unches that would be insufficient grounds for obtaining a warrant can be retroactively backed up using existing data and queries can be justified in hindsight after data confirm officer suspicions” (Brayne 2017, p. 1001).
FUTURE DIRECTIONS
Despite advances in law enforcement data collection and analysis, we know relatively little about whether these tools have made policing more efficient or more fair in daily operations. Evidence on the efficacy of predictive policing is mixed and depends on the type of predictive policing under evaluation (e.g., Mohler et al. 2015, Saunders et al. 2016), most violent and property crime clearance rates have remained flat (Braga et al. 2011), and homicide clearance rates have declined (Ridgeway 2018). Future research should build the evidence base on these issues to inform police policy and practice. However, social scientific research on law enforcement’s use of big data should not be limited to crime- and enforcement-related outcomes.
A rich body of work demonstrates the social consequences of marking someone in the criminal justice system (Becker 1963, Brayne 2014, Harris et al. 2010, Kohler-Hausmann 2013, Manza & Uggen 2006, Pager 2007, Rios 2011, Western 2006, Western & Pettit 2005). Future research should explore whether and how the marking and subsequent sorting process has changed in light of big data. On the one hand, police use of big data may be a means of reducing bias, inefficiency, and inequality. On the other hand, it may facilitate police overreach and tech-wash discriminatory practices as neutral or objective. Social scientists are particularly well situated to study the relationship between data collection, analysis, and deployment and individual, group, and organizational outcomes. Big data is being used at all stages of the criminal justice system, from policing to pretrial risk assessment, parole eligibility, and community supervision. Questions about how algorithmic decision-making practices may magnify or reduce cumulative disadvantage as individuals move through the system may yield important insights to scholars not just of the criminal justice system but also of stratification, technology, and organizations more broadly.
Challenges to studying the social determinants and consequences of algorithmic decision making in law enforcement include difficulty accessing law enforcement agencies and the fact that many of the big data tools police use are proprietary. However, companies such as HunchLab and PredPol have now published their algorithms, which is a first step toward increasing transparency and opening previously private systems to scrutiny (Robinson 2018). Another potentially fruitful avenue for studying law enforcement’s use of big data may be to analyze technology procurement. For example, public information requests may yield information on contracts between law enforcement agencies and technology firms, thus shedding light on the transactional side of big data surveillance.
Perhaps the most transformative feature of police use of big data is the opportunity these data provide to systematically analyze police practices. High-frequency observations and the proliferation of data-collection sensors associated with algorithmic policing are not only means by which more civilians come under police surveillance but also means by which police themselves come under increased surveillance (Brayne 2017). Aggregating data on past police actions may generate opportunities for oversight and make it possible to audit law enforcement practices (e.g., Goel et al. 2016, Hetey et al. 2016). However, accountability does not flow automatically from transparency. Big data can help city officials, academics, and the public understand police performance only if data on law enforcement activity are made available to parties external to law enforcement.
Finally, it is worth noting that although some of the surveillance modalities made possible by big data are new, many of the questions about the relationship between technology and social structure are not. Questions raised here echo those from decades-old scholarship in science and technology studies, for example. Therefore, future work should focus not only on change associated with big data but also on continuity.
ACKNOWLEDGMENTS
I thank Issa Kohler-Hausmann, Andrew Ferguson, Andrew Selbst, and the participants at the Big Data and Criminal Justice Round Table at the Ohio State University Moritz College of Law for their thoughts and feedback. All errors are my own.
Footnotes
DISCLOSURE STATEMENT
The author was employed as a postdoctoral research associate at Microsoft Research New England from July 2015 to June 2016.
LITERATURE CITED
- Ajunwa I, Crawford K, Schultz J. 2017. Limitless worker surveillance. Calif. Law Rev. 105(3):735–76
- Barocas S, Rosenblat A, boyd d, Gangadharan SP, Yu C. 2014. Data & civil rights: technology primer. Presented at Data & Civil Rights: Why “Big Data” Is a Civil Rights Issue, Washington, DC, Oct. 30. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2536579
- Barocas S, Selbst A. 2016. Big data’s disparate impact. Calif. Law Rev. 104:671–732
- Becker H. 1963. Outsiders: Studies in the Sociology of Deviance. New York: Free Press
- Beckett K, Nyrop K, Pfingst L, Bowen M. 2005. Drug use, drug possession arrests, and the question of race: lessons from Seattle. Soc. Probl. 52(3):419–41
- Braga AA. 2003. Serious youth gun offenders and the epidemic of youth violence in Boston. J. Quant. Criminol. 19(1):33–54
- Braga AA, Flynn EA, Kelling GL, Cole CM. 2011. Moving the Work of Criminal Investigators Towards Crime Control. Washington, DC: Natl. Inst. Justice
- Braga AA, Kennedy DM, Piehl AM, Waring EJ. 2001. Reducing Gun Violence: The Boston Gun Project’s Operation Ceasefire. Washington, DC: Natl. Inst. Justice
- Braga AA, Weisburd DL. 2010. Policing Problem Places: Crime Hotspots and Effective Prevention. New York: Oxford Univ. Press
- Brantingham PJ, Brantingham PL, eds. 1981. Environmental Criminology. New York: Sage
- Brantingham PJ, Tita GE, Short MB, Reid SE. 2012. The ecology of gang territorial boundaries. Criminology 50(3):851–85
- Brayne S. 2014. Surveillance and system avoidance: criminal justice contact and institutional attachment. Am. Sociol. Rev. 79(3):367–91
- Brayne S. 2017. Big data surveillance: the case of policing. Am. Sociol. Rev. 82(5):977–1008
- Brayne S, Rosenblat A, boyd d. 2015. Predictive policing. Presented at Data & Civil Rights: A New Era of Policing and Justice, Washington, DC, Oct. 27. http://www.datacivilrights.org/pubs/2015-1027/Predictive_Policing.pdf
- Christin A. 2018. Counting clicks: quantification and variation in web journalism in the United States and France. Am. J. Sociol. 123(5):1382–415
- Cohen J, Tita G. 1999. Diffusion in homicide: exploring a general method for detecting spatial diffusion processes. J. Quant. Criminol. 15(4):451–93
- Cook PJ, Laub JH. 2002. After the epidemic: recent trends in youth violence in the United States. Crime Justice 29:1–37
- Daston L, Galison P. 2007. Objectivity. New York: Zone Books
- Desmond M, Papachristos AV, Kirk DS. 2016. Police violence and citizen crime reporting in the black community. Am. Sociol. Rev. 81(5):857–76
- DiMaggio PJ, Powell WW. 1983. The iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am. Sociol. Rev. 48(2):147–60
- Exec. Off. Pres. 2014. Big data: seizing opportunities, preserving values. Rep., White House, Washington, DC. https://obamawhitehouse.archives.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf
- Ferguson AG. 2015. Big data and predictive reasonable suspicion. Univ. Pa. Law Rev. 163(2):327–410
- Ferguson AG. 2017. The Rise of Big Data Policing. New York: N.Y. Univ. Press
- Floyd et al. v. City of New York et al., 959 F. Supp. 2d 540 (S.D.N.Y. 2013)
- Fourcade M, Healy K. 2017. Seeing like a market. Socioecon. Rev. 15(1):9–29
- Gillespie T. 2014. The relevance of algorithms. In Media Technologies: Essays on Communication, Materiality, and Society, ed. Gillespie T, Boczkowski P, Foot K, pp. 167–94. Cambridge, MA: MIT Press
- Goel S, Rao JM, Shroff R. 2016. Precinct or prejudice? Understanding racial disparities in New York City’s stop-and-frisk policy. Ann. Appl. Stat. 10(1):365–94
- Haggerty KD, Ericson RV. 2006. The New Politics of Surveillance and Visibility. Toronto: Univ. Toronto Press
- Harris A, Evans H, Beckett K. 2010. Drawing blood from stones: legal debt and social inequality in the contemporary United States. Am. J. Sociol. 115(6):1753–99
- Hetey RC, Monin B, Maitreyi A, Eberhardt JL. 2016. Data for Change: A Statistical Analysis of Police Stops, Searches, Handcuffings, and Arrests in Oakland, Calif., 2013–2014. Stanford, CA: SPARQ Stanford
- Innes M. 2001. Control creep. Sociol. Res. Online 6:1–10
- Int. Assoc. Chiefs Police. 2015. 2015 Social Media Survey Results. Alexandria, VA: Int. Assoc. Chiefs Police. http://www.iacpsocialmedia.org/wp-content/uploads/2017/01/FULL-2015-Social-Media-Survey-Results.compressed.pdf
- Joh E. 2014. Policing by numbers: big data and the Fourth Amendment. Wash. Law Rev. 89:35–68
- Joh E. 2016. The new surveillance discretion: automated suspicion, big data, and policing. Harvard Law Policy Rev. 10(1):15–42
- Keizer K, Lindenberg S, Steg L. 2008. The spreading of disorder. Science 322:1681–85
- Kennedy DM. 1997. Pulling levers: chronic offenders, high-crime settings, and a theory of prevention. Valparaiso Univ. Law Rev. 31:449–84
- Kennedy DM, Braga AA, Piehl AM. 1997. The (un)known universe: mapping gangs and gang violence in Boston. In Crime Mapping and Crime Prevention, ed. Weisburd D, McEwan T, pp. 219–62. New York: Crim. Justice
- Kennedy LW, Caplan JM, Piza E. 2011. Risk clusters, hot spots, and spatial intelligence: risk terrain modeling as an algorithm for police resource allocation strategies. J. Quant. Criminol. 27(3):339–62
- Kerr O. 2011. An equilibrium-adjustment theory of the Fourth Amendment. Harvard Law Rev. 125:476–543
- Kohler-Hausmann I. 2013. Misdemeanor justice: control without conviction. Am. J. Sociol. 119(2):351–93
- Kohler-Hausmann I. 2018. Misdemeanorland: Criminal Courts and Social Control in an Age of Broken Windows Policing. Princeton, NJ: Princeton Univ. Press
- Krent H. 1995. Of diaries and data banks: use restrictions under the Fourth Amendment. Tex. Law Rev. 74:49–100
- Lane J. 2015. The digital street: an ethnographic study of networked street life in Harlem. Am. Behav. Sci. 60:43–58
- Lane J, Ramirez F. 2016. Beyond admissibility: the prosecutorial affordances of social media use. Paper presented at the International Communication Association Convention, Fukuoka, Japan, June 9–13
- Laney D. 2001. 3D data management: controlling data volume, velocity, and variety. Application Delivery Strategies Blog, Feb. 6. https://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf
- Langton L, Berzofsky M, Krebs C, Smiley-McDonald H. 2012. Victimizations Not Reported to the Police, 2006–2010. Washington, DC: Bur. Justice Stat.
- Lazer D, Radford J. 2017. Data ex machina: introduction to big data. Annu. Rev. Sociol. 43:19–39
- Manza J, Uggen C. 2006. Locked Out: Felon Disenfranchisement and American Democracy. New York: Oxford Univ. Press
- Matsueda RL, Kreager DA, Huizinga D. 2006. Deterring delinquents: a rational choice model of theft and violence. Am. Sociol. Rev. 71(1):95–122
- Mayer-Schönberger V, Cukier K. 2013. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt
- Meyer JW, Rowan B. 1977. Institutionalized organizations: formal structure as myth and ceremony. Am. J. Sociol. 83(2):340–63
- Mohler GO, Short MB, Malinowski S, Johnson M, Tita GE, et al. 2015. Randomized controlled field trials of predictive policing. J. Am. Stat. Assoc. 110(512):1399–411
- Murphy E. 2013. The politics of privacy in the criminal justice system: information disclosure, the Fourth Amendment, and statutory law enforcement exemptions. Mich. Law Rev. 111:485–546
- Pager D. 2007. Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration. Chicago: Univ. Chicago Press
- Papachristos AV. 2009. Murder by structure: dominance relations and the social structure of gang homicide. Am. J. Sociol. 115:74–128
- Papachristos AV, Kirk DS. 2015. Changing the street dynamic: evaluating Chicago’s group violence reduction strategy. Criminol. Public Policy 14(3):525–58
- Papachristos AV, Wildeman C, Roberto E. 2015. Tragic, but not random: the social contagion of nonfatal gunshot injuries. Soc. Sci. Med. 125:139–50
- Pasquale F. 2015. Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard Univ. Press
- Patton DU, Brunton DW, Dixon A, Miller RJ, Leonard P, Hackman R. 2017. Stop and frisk online: theorizing everyday racism in digital policing in the use of social media for identification of criminal conduct and associations. Soc. Media Soc. 3:1–10
- Police Exec. Res. Forum. 2014. Future Trends in Policing. Washington, DC: Police Exec. Res. Forum
- Ratcliffe JH. 2008. Intelligence-Led Policing. Cullompton, UK: Willan
- Ratcliffe JH, Taniguchi T, Groff ER, Wood JD. 2011. The Philadelphia foot patrol experiment: a randomized controlled trial of police patrol effectiveness in violent crime hotspots. Criminology 49:795–831
- Renan D. 2016. The Fourth Amendment as administrative governance. Stanford Law Rev. 68:1039–129
- Ridgeway G. 2018. Policing in the era of big data. Annu. Rev. Criminol. 1:401–19
- Riley v. California, 573 U.S. ___ (2014)
- Rios V. 2011. Punished: Policing the Lives of Black and Latino Boys. New York: N.Y. Univ. Press
- Robinson DJ. 2018. The challenges of prediction: lessons from criminal justice. I/S J. Law Policy Inf. Soc. In press. https://ssrn.com/abstract=3054115
- Sampson RJ, Raudenbush SW, Earls F. 1997. Neighborhoods and violent crime: a multilevel study of collective efficacy. Science 277(5328):918–24
- Saunders J, Hunt P, Hollywood JS. 2016. Predictions put into practice: a quasi-experimental evaluation of Chicago’s predictive policing pilot. J. Exp. Criminol. 12:347–71
- Scott R. 2004. Reflections on a half-century of organizational sociology. Annu. Rev. Sociol. 30:1–21
- Sherman L. 2013. The rise of evidence-based policing: targeting, testing, and tracking. Crime Justice 42:377–451
- Sherman LW, Gartin PR, Buerger ME. 1989. Hot spots of predatory crime: routine activities and the criminology of place. Criminology 27:27–56
- Sierra-Arévalo M, Papachristos AV. 2017. Social networks and gang violence reduction. Annu. Rev. Law Soc. Sci. 13:373–93
- Slobogin C. 2016. Policing as administration. Univ. Pa. Law Rev. 165:91–152
- Smith v. Maryland, 442 U.S. 735 (1979)
- Smith M, Austin RL Jr. 2015. Launching the police data initiative. White House Blog, May 18. https://obamawhitehouse.archives.gov/blog/2015/05/18/launching-police-data-initiative
- Thrasher FM. 2013 (1927). The Gang: A Study of 1,313 Gangs in Chicago. Chicago: Univ. Chicago Press
- Trottier D. 2012. Policing social media. In Social Media as Surveillance: Rethinking Visibility in a Converging World, ed. Trottier D, pp. 135–54. Abingdon, UK: Ashgate
- Uchida CD, Swatt ML. 2013. Operation LASER and the effectiveness of hotspot patrol: a panel analysis. Police Q. 16(3):287–304
- United States v. Jones, 565 U.S. 400, 132 S. Ct. 945 (2012)
- United States v. Miller, 425 U.S. 435 (1976)
- US Dep. Justice. 2015 (2001). L.A. Consent Decree. Washington, DC: US Dep. Justice
- US Dep. Justice Civil Rights Div. 2015. Investigation of the Ferguson Police Department. https://www.justice.gov/sites/default/files/opa/press-releases/attachments/2015/03/04/ferguson_police_department_report.pdf
- Wasserman S, Faust K. 1994. Social Network Analysis: Methods and Applications. Cambridge, UK: Cambridge Univ. Press
- Waxman M. 2009. Police and national security: American local law enforcement and counter-terrorism after 9/11. J. Natl. Secur. Law Policy 3:377–407
- Weisburd D, Mastrofski SD, McNally AM, Greenspan R, Willis JJ. 2003. Reforming to preserve: COMPSTAT and strategic problem-solving in American policing. Criminol. Public Policy 2:421–56
- Western B. 2006. Punishment and Inequality in America. New York: Russell Sage Found.
- Western B, Pettit B. 2005. Black-white wage inequality, employment rates, and incarceration. Am. J. Sociol. 111(2):553–78
- Whyte WF. 1969 (1943). Street Corner Society. Chicago: Univ. Chicago Press
- Willis J, Mastrofski SD, Weisburd D. 2007. Making sense of COMPSTAT: a theory-based analysis of organizational change in three police departments. Law Soc. Rev. 41(1):147–88
