The killings of Ahmaud Arbery and Breonna Taylor have reignited the national conversation on police violence and racial discrimination. On February 23, 2020, Arbery, 25, was shot and killed by the son of a former police officer during an afternoon jog. Less than a month later, Taylor, 26, was killed during a police raid on her home; the search warrant named a man who was already in custody by the time she was pronounced dead.
Both events exemplify the hypervisible violence that has forged America’s ongoing legacy of racialized police brutality. In the words of civil rights attorney Benjamin Crump, these deaths are senseless killings.
Yet, as the nation mourns, an invisible network of predictive policing technology, now the backbone of modern law enforcement strategy, continues to direct targeted violence against the country’s most vulnerable populations.
Rather than eliminate the fatal flaws of the criminal justice system, these technical innovations replicate the violent discrimination of the past.
The technological evolution of policing is data-driven and opaque.
Data-based policing is not new. In his book Digitize and Punish, University of Illinois professor Brian J. Jefferson marks the invention of Herman Hollerith’s electromechanical tabulator in the late nineteenth century as the beginning of the bureaucratic data complex. For a country absorbing an influx of European immigrants, categorizing demographic information became the solution to a host of problems.
It would not be long before the fixation on data found its way into the criminal justice system as well. In 1927, the International Association of Chiefs of Police began developing the Uniform Crime Reporting system. For the first time, crimes against persons and property were to be recorded by police departments on a nationwide scale.
In the 1960s, President Johnson dramatically expanded data-based policing to support his ‘War on Crime’. Predictive policing technologies would remain influential through the Clinton and Obama administrations as part of their community policing platforms.
Despite a glaring lack of transparency, police departments across the country now employ a myriad of these data-based strategies. Overly broad gang databases, invasive drone surveillance, smart cameras equipped with facial recognition software, and algorithmic prediction tools like PredPol together form an opaque network of law enforcement mechanisms that has ushered in a new era of police power.
The objectivity of data is a lethal myth.
The computer algorithms used to anticipate crime operate at both the individual and the geographic level. At the individual level, algorithms generate ‘heat lists’ of people deemed more likely to commit a crime in the future. In some cities, citizens who land on the list receive notices warning them of their crime-prone status, with little explanation of how they ended up there or what they can do to be removed.
In the geographic model, computer systems create ‘hot spots’ where crime is anticipated and police departments can allocate patrol units accordingly.
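Commercial systems like PredPol do not publish their models, but the feedback dynamic critics describe can be sketched in a few lines of Python. This is a toy illustration with invented numbers, not any vendor’s actual algorithm: four districts with identical true crime rates, where the district with the most recorded incidents so far is ‘predicted’ as the hot spot and patrolled more heavily.

```python
import random

random.seed(0)

TRUE_RATE = 0.3          # identical underlying crime rate in every district
counts = [5, 4, 4, 4]    # historical recorded incidents; district 0 starts one higher

def observe(district, hot_spot):
    """Heavier patrol in the predicted hot spot records more of its incidents."""
    trials = 10 if district == hot_spot else 2   # hot spot gets 5x the patrol
    return sum(1 for _ in range(trials) if random.random() < TRUE_RATE)

for week in range(20):
    # The 'prediction' is simply the district with the most recorded crime.
    hot_spot = max(range(4), key=lambda d: counts[d])
    for d in range(4):
        counts[d] += observe(d, hot_spot)

print(counts)
```

Because heavier patrol records more of the hot spot’s crime, that district’s count grows fastest and it keeps being selected: the prediction manufactures its own evidence, even though the underlying rates never differed.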
Advocates for the preemptive approach claim the objectivity of data makes the system a bias-free and efficient method for reducing crime.
The RAND Corporation report on predictive policing opens with, “Smart, effective, and proactive policing is clearly preferable to simply reacting to criminal acts.”
But the reality of predictive policing is not so clear-cut.
First, recent research shows that even best-case deployments of preemptive models produce only a short-term decrease in crime. In other instances, these methods have shown no effect, or diminishing returns.
Additionally, in a 2017 article, Professor Jefferson documents the disproportionate representation of people of color within criminal databases. Using Chicago Police Department data, Jefferson reports that 93 percent of narcotics arrests were of Black and Hispanic people, 96 percent of arrests for nonnegligent manslaughter were categorized as Black or Latinx, and 93 percent of all recorded murders occurred in police districts with the same minority designations.
What is touted as a completely scientific process is tainted by non-scientific data entry. But why is this the case?
For one, police departments are under extreme pressure to justify the funding they receive. In a survey of former police chiefs published in the New York Times, respondents overwhelmingly stated that intense pressure to make arrests led to the manipulation of data to produce results.
In a 2019 article, NYU researchers Rashida Richardson and Jason Schultz and Microsoft researcher Kate Crawford detail extreme data manipulation by the Baltimore Police Department, including “falsifying police reports and unconstitutional searches… that sent innocent people to jail.”
Compounding the problem of corrupt data entry is a well-documented history of implicit racial bias. And that is the heart of the problem. By shifting the onus of arrests onto a computer’s algorithm, predictive policing obscures the racialization of police activity. Data-based predictive analytics merely reinscribes race into an objective-looking data point.
The illusory effect of technical objectivity creates police forces that are highly uncritical of the data they receive. This allows officers and city officials alike to belittle or outright ignore discriminatory and racist practices. Rather than predicting the future of crime, predictive policing relies on a series of biased data points, entrenching a violent status quo. Computers can’t be racist, so people trust the results.
The new era of community policing, powered by predictive technologies, is incapable of reconciling the past.
If the appearance of efficiency and objectivity has made predictive policing important in city hall, community policing has made it irreplaceable. Branded as a way to decrease crime through citizen engagement, community policing emphasizes more collaboration between the police and the communities they patrol. Interactions with community members become a part of the data set as police attempt to learn more about community problems from a resident’s perspective.
But if citizens’ arrests can end in the unchecked killing of innocent joggers, a more critical examination of who makes up ‘the community’ is needed.
On the ground, these initiatives look very different from the policy rhetoric. Engagement is often limited to business owners, neighborhood association presidents, and other individuals with social capital. Less economically productive members of the community lack the same access to and trust with the police, and often end up labeled as part of the community’s problems.
Additionally, community policing can’t solve the social issues that underwrite crime. According to the Prison Policy Initiative, people who are arrested multiple times are more likely to be poor, unemployed, and without a high school education. Even if all data were accurate and every officer well-intentioned and properly trained, police can’t solve the societal ills that facilitate crime.
Until attempts are made to improve the overburdened schools and undervalued economies, the communities hit hardest by computer-based over-policing will continue to endure disproportionate levels of police contact and mass incarceration.
In the end, community policing is a nicely packaged police expansion program that turns social problems into public relations campaigns.
We have to pursue new alternatives.
Speaking on decarceration and prison abolition, political activist Angela Y. Davis wrote, “Dangerous limits have been placed on the very possibility of imagining alternatives.” And that sentiment rings true as liberal and conservative reformers alike believe investing in a broken system will eventually bring about change.
What is needed are demands for real transparency and a collective consciousness of the ways police technology impacts marginalized bodies.
Concrete examples of this type of change do occur. In New York City, Communities United for Police Reform (CUPR) devised an informal surveillance network to monitor police brutality and pursued legal avenues to end it as well. CUPR went on to win class-action lawsuits that increased police accountability. Their work culminated in the passage of the 2013 Community Safety Act, which required gender identity, gender expression, immigration status, and sexual orientation to be treated as legally protected categories in court proceedings. These actions helped spread awareness and protect communities particularly vulnerable to abusive police power.
Additionally, the Chicago-based Citizens Police Data Project was launched in 2015 by the guerrilla journalists known as the Invisible Institute. The project made police disciplinary reports public information. As a result of their work — although not without a lengthy legal process — disciplinary data for the Chicago Police Department was made publicly available for the first time, creating the country’s largest public database of police disciplinary information.
Moving beyond the law also means finding ways to turn these technological systems against themselves. Cultural studies scholar Andrea Miller suggests, “Thinking within rather than against the insensible nature of policing can prove a more effective battle.”
One example of thinking within the existing system is the White Collar Crime Early Warning System, which uses predictive policing technology to counter-surveil the crimes committed by those left out of the current crime paradigm: the white and wealthy perpetrators of white-collar crime.
Keeping the predictive framework but changing the populations under surveillance lays bare the problems of the preemptive policing paradigm. Activism like this has the potential to mobilize even the most complicit defenders of the current system.
In less than a month, two more innocent lives were taken as a result of an expansive and punitive police network. Unfortunately, this violence is not new, and absent real change will continue.
But there is some hope. By highlighting the problematic nature of seemingly benevolent technological advancements in law enforcement, individuals and communities have the leverage to expose the insidiousness of modern policing. We can protect and serve the ones currently denied that right. But change has to come from within, and people have to pay more attention to the way marginalized bodies are affected by technological innovations. The alternatives must be limitless or the senseless killings will never end.
Chris Flowers is an independent researcher and freelance creator from Little Rock, Arkansas. He writes about social justice issues, education, and politics. He can be reached at firstname.lastname@example.org