The Student News Site of Rock Bridge High School

Bearing News


Biased data exacerbates racial inequalities

Art by Lorelei Dohm.

Technology is widely regarded as objective: programmed to perform traditional human tasks efficiently and without human bias. In this digital age, such impartiality has become invaluable to society. Countless industries have built technology, including artificial intelligence (AI), into day-to-day operations once handled only by humans. One field that has taken advantage of technology’s efficiency and objectivity is policing and the criminal justice system. In recent years, law enforcement has increasingly used big data and AI to ‘predict’ crimes before they occur, according to the National Institute of Justice. This practice is known as predictive policing.

There are two main types of predictive policing algorithms. The first is a location-based algorithm, which draws on crime records and their links to different places and events to determine where and when crimes are most likely to happen, helping police departments decide where to allocate their resources. Certain times and places, such as crowded events, are statistically more likely to see criminal activity. Tools such as PredPol, one of the most commonly used location-based algorithms, use these statistics to create maps that update throughout the day with predictions about which areas are more dangerous, giving law enforcement a better idea of where to patrol more heavily.
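To make the idea concrete, here is a minimal sketch of how a location-based tool could rank areas for patrol from historical incident records. This is not PredPol’s proprietary model; the grid cells, hours and counts are invented purely for illustration.

```python
# A toy illustration of ranking areas for patrol from historical incidents.
# This is NOT PredPol's actual model; the cells, hours and counts are invented.
from collections import Counter

# Hypothetical historical incident records: (grid_cell, hour_of_day)
incident_log = [
    ("cell_12", 22), ("cell_12", 23), ("cell_12", 22),
    ("cell_07", 14), ("cell_31", 2), ("cell_12", 21),
]

def hotspot_scores(incidents, current_hour, hour_window=2):
    """Score each grid cell by how many past incidents fall near the current hour."""
    scores = Counter()
    for cell, hour in incidents:
        if abs(hour - current_hour) <= hour_window:
            scores[cell] += 1
    return scores

# Rank cells for a 10 p.m. shift; the highest-scoring cells get more patrols.
print(hotspot_scores(incident_log, current_hour=22).most_common())
# [('cell_12', 4)]
```

Notice that the ranking is only as good as the incident log: if past records reflect where officers already patrolled most heavily, the ‘hotspots’ inherit that pattern.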

The second type is a pretrial risk assessment algorithm, and it uses data about socioeconomic status, age, gender and more to determine how likely a specific person is to commit a crime, according to The Leadership Conference on Civil and Human Rights. One tool, Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), is used in states such as California, New York and Florida to help judges make decisions about sentencing and pretrial release. 

COMPAS assigns each defendant a statistical score from 1 to 10 estimating how likely they are to reoffend, and therefore how high or low a risk they pose. This means that two defendants accused of the same crime could receive different sentences because one was deemed higher risk by a formula, based on factors outside their control.
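The sketch below shows, in deliberately simplified form, how a score of this kind could be assembled from weighted inputs. COMPAS’s actual formula is proprietary; the features and weights here are invented, chosen only to show how variables correlated with race can drive the score even though race itself never appears.

```python
# A deliberately simplified sketch of a pretrial risk score. COMPAS's actual
# formula is proprietary; these features and weights are invented to show how
# proxies correlated with race can drive the score even though race is never an input.
def risk_score(defendant):
    """Return a 1-10 'risk' score from a weighted sum of inputs."""
    raw = (
        2.0 * defendant["prior_arrests"]              # arrest history, not convictions
        + 1.5 * (1 if defendant["unemployed"] else 0)
        + 1.0 * defendant["neighborhood_crime_rate"]  # tied to zip code
        - 0.5 * defendant["age_over_25"]
    )
    return max(1, min(10, round(raw)))

defendant = {
    "prior_arrests": 2,            # inflated wherever policing is already heavier
    "unemployed": True,
    "neighborhood_crime_rate": 3,  # reflects where past arrests were recorded
    "age_over_25": 1,
}
print(risk_score(defendant))  # 8 -- a judge might read this as "high risk"
```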

The deaths of many Black people at the hands of the police and the subsequent rise of the Black Lives Matter movement earlier this year shed light on the racism ingrained in the system. Activists and researchers have also begun to question the technology and algorithms the policing and criminal legal systems rely on and whether they are actually unbiased.

These algorithms have garnered criticism because some cities use crime records shaped by racially biased practices known as “dirty policing,” including arrests and traffic stops made without just cause. According to a 1999 report from the ACLU, these incidents disproportionately affect Black people. Data from these occurrences are then fed into predictive policing systems, in turn skewing the algorithms and making them more likely to label Black people and communities as dangerous, according to a 2019 study conducted by the AI Now Institute.
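A short simulation makes the feedback loop critics describe easy to see: arrest records decide where patrols go, patrols generate new records, and the gap between neighborhoods grows on its own. The neighborhoods, rates and numbers below are entirely invented.

```python
# A minimal simulation of the feedback loop critics describe: past arrest
# records decide where patrols go, and patrols generate new arrests, so the
# records skew further every cycle. All numbers are invented.
recorded_arrests = {"neighborhood_A": 30, "neighborhood_B": 10}     # historic records
true_offense_rate = {"neighborhood_A": 0.1, "neighborhood_B": 0.1}  # actually equal

for year in range(3):
    total = sum(recorded_arrests.values())
    # Patrols are allocated in proportion to recorded arrests...
    patrols = {n: 100 * count / total for n, count in recorded_arrests.items()}
    # ...and more patrols in a place mean more of its offenses get recorded.
    for n in recorded_arrests:
        recorded_arrests[n] += round(patrols[n] * true_offense_rate[n])
    print(year, recorded_arrests)
# The gap between A and B widens every year even though the underlying
# offense rates are identical.
```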

While risk assessment algorithms do not explicitly use race as a factor when making predictions, they use socioeconomic status, zip code and other variables that are closely correlated with race, often leading to racially biased predictions. A 2016 ProPublica analysis of COMPAS found that Black defendants were almost twice as likely as white defendants to be labeled high risk yet not actually reoffend. Conversely, white defendants who did reoffend were mislabeled as low risk more often than Black defendants. Junior Abbie Sivaraman said she disagrees with the use of predictive policing algorithms, arguing they exacerbate existing racial injustices rather than mitigating bias as intended.
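The kind of comparison ProPublica ran can be sketched as an error-rate calculation: among people who did not go on to reoffend, how often was each group flagged as high risk? The defendant IDs and counts below are made up purely to show the arithmetic; they are not ProPublica’s data.

```python
# A sketch of the error-rate comparison ProPublica ran: among people who did
# NOT reoffend, how often was each group labeled high risk? The IDs and counts
# are made up purely to show the arithmetic.
def false_positive_rate(labeled_high_risk, did_not_reoffend):
    """Share of non-reoffenders who were nonetheless labeled high risk."""
    wrongly_flagged = len(labeled_high_risk & did_not_reoffend)
    return wrongly_flagged / len(did_not_reoffend)

# Hypothetical defendant IDs
black_no_reoffend = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
black_high_risk = {1, 2, 3, 4, 11, 12}
white_no_reoffend = {21, 22, 23, 24, 25, 26, 27, 28, 29, 30}
white_high_risk = {21, 22, 31}

print(false_positive_rate(black_high_risk, black_no_reoffend))  # 0.4
print(false_positive_rate(white_high_risk, white_no_reoffend))  # 0.2
# One group's non-reoffenders are flagged twice as often -- that asymmetry is
# the pattern the 2016 analysis reported.
```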

“There are already many problems with people being wrongly accused of crimes due to their race or their upbringing, and I don’t think [predictive policing] is a fair way to narrow down suspects in a crime,” Sivaraman said. “We should be taking steps to reduce the bias, not implementing a system where racial bias factors into important decisions regarding incarceration and policing. People’s lives shouldn’t be ruined over a poorly thought out system.”

Opponents of predictive policing say that presenting this technology as objective while ignoring the biased data it is built on is not only naive but also exacerbates existing prejudice against marginalized communities. According to an article in the American Sociological Review by Sarah Brayne, a sociology professor at the University of Texas at Austin, the belief that policing algorithms are fully impartial may provide the justice system with a justification for over-policing and mass incarceration.

“Social dynamics inform the historical crime data that are fed into the predictive policing algorithm,” Brayne said. “However, once they are inputted as data, the predictions appear impartial; human judgment is hidden in the black box under a patina of objectivity.”

Vincent Southerland, an adviser to the AI Now Institute and Executive Director of the Center on Race, Inequality and the Law at New York University School of Law, agrees that policing technology is a harmful reflection of human biases. He believes that these algorithms have no place in policing, especially given technology’s roles in oppressing Black people throughout history.

“Technological tools have consistently been used to mark, sort, and surveil Black people and single them out for harmful treatment in the criminal legal system,” Southerland said. “I think that the technology in the criminal system can be deeply harmful, especially because it can reflect back the racism, bias, and inequality that shapes the world around us. Given that, it is hard to imagine having these technologies in place in ways that are going to drive the system to behave in more fair and just ways.”

While modern predictive policing algorithms are still fairly new, data systems have a long history of being weaponized against Black people in different ways. Practices such as redlining, for example, used data to classify Black communities as high financial risks, justifying banks’ systematic exclusion of Black people from financial services. 

“Quantified practices may thus serve to exacerbate inequalities in stop patterns, create arrest statistics needed to justify stereotypes, and ultimately lead to self-fulfilling statistical prophecies,” Brayne said. “Moreover, as police contact is the entry point into the criminal justice system, the digital feedback loops associated with predictive policing may ultimately justify the growth and perpetuation of the carceral state.”

Society is in an era of rapidly advancing technology, and it is more important now than ever to recognize that technology is only as ‘good’ or ‘bad’ as the people behind it. Everyone has an essential role to play in recognizing their own biases and how those biases shape the technology we increasingly rely on. Doing so can help ensure that data and AI are used ethically, especially in the criminal justice system, where racial inequalities are at stake.

“Understanding each step of data collection and analysis is crucial for understanding how data systems—despite being thought of as objective, quantified, and unbiased—may inherit the bias of their creators and users,” Brayne said. “As an institution historically implicated in the reproduction of inequality, understanding the intended and unintended consequences of machine-learned decisions and new surveillant technologies in the criminal justice system is of paramount importance.”

Do you think technology can be racially biased in other ways? Let us know in the comments below.
