Covid-19 Proves It’s Time to Abolish ‘Predictive’ Policing Algorithms


Technology

As summer comes to a close, local governments are returning to their council chambers and facing massive pressures. Municipal budgets are losing hundreds of millions in revenue in the wake of coronavirus. Meanwhile, a generational uprising is pushing our government to divest from militarized, racist policing, calling instead for the resources that our neighborhoods have been starved of for generations—the resources that actually increase safety.

The broad-based support for these calls sends hopeful signals about where our cities and country are headed. But if we want to get there, we must take care not to repeat the mistakes of the past.

WIRED OPINION

ABOUT

Hannah Sassaman is the policy director at Movement Alliance Project, a movement organization focused on the intersection of race, technology, and inequality in Philadelphia. She is a former Soros Justice Fellow focusing on community organizing around predictive technologies in the criminal legal system.

During the last great economic crisis this country faced, in 2008, local policymakers sought to save money while making their communities “safer” with new tech-based solutions. In the years since, police departments, probation officers, and courts have embedded this technology—like crime-predicting algorithms, facial recognition, and pretrial and sentencing software—deep inside America’s criminal legal system, even as budgets have risen and police forces have grown. But instead of actually predicting and reducing crime and violence, these algorithms promote systems of over-policing and mass incarceration, perpetuating racism and increasing tensions between police and communities.

Designers claim that predictive policing can save money through “smart” targeting of police resources, but algorithms meant to foresee where crime will occur have only justified massive and often violent deployment to neighborhoods already suffering from poverty and disinvestment. Ultimately, these algorithms didn’t reduce the money taxpayers spend on the cops. In fact, as departments across the country installed predictive policing, police budgets continued to grow, especially as a percentage of overall municipal spending. At the same time, the criminal legal system grew more punishing, especially for Black and brown people. Accused people of color caught up in predictive policing were then judged by another set of algorithms when taken to court for arraignment: “pretrial algorithms.” This software sorts accused people into “risky” and “non-risky” categories—keeping those who have yet to be tried or convicted incarcerated for longer, wrecking their chances to mount a defense, and defying the American notion of presumed innocence.

But the justification for all of this so-called “predictive policing” crumbles when you look at the criminal legal data coming out of the first months of Covid.


As the grip of coronavirus tightened in Philadelphia, for example, incarcerated people, families, organizers, and legal system actors pushed the courts to release over a thousand people from jails where social distancing is near-impossible. At the same time, police officers, afraid of catching the coronavirus and of overcrowding jails while the courts were shut down, stopped making low-level arrests. Police forces nationwide took similar approaches.

In city after city where these changes were made, local authorities are seeing many kinds of crime drop. While certain types of violence in many cities—including in Philadelphia, where I am—are slowly rising as unemployment climbs and poverty deepens, there’s no data supporting the belief that emptying jails and limiting arrests causes violence in our communities. The National Center for State Courts reports that the two key failure rates pretrial systems track—how often someone misses a court date, and how often they are arrested again before facing trial—have both plummeted nationally. Research and our lived experience during the Covid-19 outbreak are proving that you can arrest and incarcerate far fewer people in our communities without compromising safety or spending unnecessary money to lock them up.

These promising signs underscore the importance of breaking with algorithmic decision-making, whether through “predictive policing” or other algorithms used in the criminal legal system. As our local governments return to even emptier coffers and major municipal budget pressures, we should quickly abolish these models across all criminal legal system contexts. Some cities have already begun to act: Chicago, after years of pursuing a strategy similar to Philadelphia’s, dumped its notorious algorithmic “hot list” after admitting that the tool hadn’t reduced violence even as it increased racist policing. And Santa Cruz banned predictive policing this summer, with police chief Andy Mills describing the biased data and impacts of these algorithms as “a blind spot I didn’t see.”

Communities impacted by over-policing and mass incarceration, as well as legal scholars and researchers, have raised concerns about these technologies for many years. The Movement Alliance Project, where I serve as policy director, recently worked with MediaJustice to study how risk-assessment tools are actually deployed in hundreds of cities nationwide. After dozens of conversations with governments that deploy these algorithms, we found that the majority don’t even document how these tools impact the number of people incarcerated or the racial breakdowns in their jails.

In the face of a movement equating racial justice with defunding, disarming, and abolishing policing around the country, the Black data scientists who have for years researched the racist impact of facial recognition have finally won major bans on its use in some cities. The rest of the nation needs to follow on surveillance and racist policing technologies across the entire criminal legal system. As our cities and states get ready to reopen courts and jails, we must reject current efforts to revive these failed predictive models. Instead of massively expanding resources for Black communities, cities are once again testing predictive algorithms that target young Black people. As police return to their beats and refill jails, officials are looking back to algorithms to reduce overcrowding. Cities like Pittsburgh are considering repurposing police algorithms to predict child welfare issues, and PredPol is marketing its tools for Covid-19 enforcement—something neither tool was calibrated to do.

We have a narrow window to ban this form of “e-carceration” and to keep new tech out of the hands of police, instead of following the path the country took after the last economic crash. The racist “broken-windows policing,” applauded in the early 2000s as a data-driven best practice, hasn’t disappeared—it has merely evolved into another system based on cherry-picked data and rooted in anti-Blackness. Instead of using bail or algorithms to decide who is worthy of freedom, officials should follow the data, which shows we can safely release the vast majority of people pretrial. And instead of using algorithms to ramp up new iterations of predictive policing, local governments must listen to the movements on the streets, and invest in the resources that will truly improve safety in our neighborhoods.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here. Submit an op-ed at opinion@wired.com.

