
Is artificial intelligence making racial profiling worse?

Source: cbsnews.com

Throughout its history, the LAPD has found itself embroiled in controversy over racially biased policing. In 1992, police violence and the acquittal of four police officers who beat black motorist Rodney King culminated in riots that killed more than 50 people. Many reforms have been instituted in the decades since then, but racial bias in LA law enforcement continues to raise concerns. A 2019 report found that the LAPD pulled over black drivers four times as often as white drivers, and Latino drivers three times as often as whites, despite white drivers being more likely to have weapons, drugs or other contraband.

New technological tools employed by the department could be aggravating the problem. In an effort to further reduce crime, the LAPD has turned to big data.

Traditionally, police have stepped in to enforce the law after a crime has occurred, but advancements in artificial intelligence have helped create what are called “predictive policing” programs. These algorithm-driven systems analyze crime data to find a pattern, aiming to predict where crimes will be committed or even by whom. The idea is to stop crime before it happens by directing police to locations or people to target — following the hard, supposedly unbiased data. In the last decade, some of the largest police departments in the country have turned to predictive policing to reduce crimes in their communities, and the LAPD has helped to pioneer the trend.

In 2011, the LAPD instituted a program they helped develop called PredPol, a location-based program that uses an algorithm to sift through historical crime data and predict where the next vehicle theft or burglary may occur. PredPol can precisely target areas as small as 500 by 500 feet. On the surface, using objective data to predict crime risk seems like a promising way to prevent subjective judgments or implicit bias about where to deploy police. But critics were quick to point out its flaws, asserting that using historical crime data may actually make matters worse.

Although the data itself just amounts to a collection of numbers and locations, the police practices that led to the data’s collection may be fraught with bias. Andrew Ferguson, a law professor and predictive policing expert, says such systems amplify the biases embedded in those historical practices.

“If you unthinkingly develop a data-driven policing system based on past police practices, you’re kind of going to reify past police practices,” he said.

A group called the Stop LAPD Spying Coalition has focused on ending LAPD’s use of predictive policing for almost a decade. In a 2016 letter posted online, the group explained its opposition: 

“It is widely known and well documented that police stop, detain, frisk, and arrest Black and Brown people overwhelmingly; therefore, the Black and Brown community will have a greater appearance in this historic crime data. This fact alone should put the validity of historic crime data into question. Because historic crime data is biased through the practice of racialized enforcement of law, predictive policing will inherently reinforce and perpetuate this structural racism.”
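The feedback loop the coalition describes can be illustrated with a toy simulation. This is not PredPol’s actual algorithm and the numbers are invented for illustration: two areas have identical true offense rates, but one starts with more recorded crime due to heavier historical enforcement. If patrols are allocated in proportion to recorded crime, and detections scale with patrol presence, the initial disparity perpetuates itself indefinitely.

```python
# Toy model of a predictive-policing feedback loop (illustrative only;
# all rates and starting values are invented assumptions).

# Both areas have the SAME true offense rate per period.
true_rate = 10
# Area A starts with 3x the recorded crime of area B, reflecting
# heavier historical enforcement, not more underlying crime.
recorded = {"A": 30, "B": 10}

for period in range(20):
    total = sum(recorded.values())
    for area in recorded:
        # Patrols are allocated in proportion to recorded crime.
        patrol_share = recorded[area] / total
        # Detections scale with patrol presence, not with the (equal)
        # underlying rate, so the biased record feeds itself.
        detected = round(true_rate * 2 * patrol_share)
        recorded[area] += detected

print(recorded)                          # {'A': 330, 'B': 110}
print(recorded["A"] / recorded["B"])     # ratio stays 3.0 forever
```

Even after 20 periods, the recorded-crime ratio between the two areas is exactly the 3:1 that the biased historical data started with, despite identical true rates: the prediction validates the data that produced it.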

Because PredPol targets locations rather than specific individuals, some departments argue that it can’t lead to racial profiling. “With PredPol … we are looking and we’re targeting crime,” said Police Chief David Reynoso of the city of El Monte, located 13 miles east of LA. “We’re not biased towards any certain group of people. Really we’re taking off the human factor. … We’re not profiling people, but rather profiling crime.”

The LAPD declined an interview with CBSN Originals to discuss its use of PredPol, but provided a statement that said in part: 

“PredPol is deployed city-wide and is a place-based tool, not offender based. … The study by Brantingham et al. (2018) showed … that arrests by racial-ethnic group were NO DIFFERENT when using algorithmic predictions (PredPol) compared to existing best practice. This indicates that PredPol does not introduce any new biases and that it does not lead to over (or under) policing of minority communities.” 

Under pressure from the community, the LA Police Commission directed the LAPD Office of the Inspector General to conduct an audit of PredPol and another predictive policing program in March 2019. The audit found that 74% of visits officers made to PredPol “hotspots” occurred for less than a minute, and indicated there was insufficient data to determine PredPol’s effectiveness at reducing crime in LA.

The review of PredPol does not probe or make any mention of potential racial disparities in the program. But during the public comment period at a police commission meeting reviewing the audit, a community member voiced his concerns that location-based predictive policing is a covert way to justify racial profiling.

“Place-based policing is not inherently benign. Place is a proxy for race. Especially in this city which is so segregated,” he said. In a question directed toward the police commissioners’ panel, the man continued, “You said yourself, ‘PredPol is designed to protect property.’ You all value property more than lives?”

Despite continued protests from the community, the LAPD decided to keep PredPol in operation, but rolled out changes in October 2019 emphasizing a “community focus.” At a police commission meeting that month, Los Angeles Police Chief Michel Moore defended the use of location-based predictive policing, while acknowledging that the community and the LAPD won’t always agree. 

“I do believe that the effort of location-based strategies, of identifying places that form concentrations of crime, locations which are suffering from instances of violence and serious crime, are methods we should be aware of,” the chief said.
