Using AI to analyze phone calls from prisons could increase racial bias in policing

A US Congressional committee is pressuring the Justice Department to explore the federal use of artificial intelligence to analyze inmate phone calls.

The panel has called for more research into the technology's potential to prevent suicide and violent crime. The system transcribes telephone conversations, analyzes tone of voice, and detects certain words or phrases preprogrammed by officials.

Algorithmic biases

Bill Partridge, a police chief in Alabama, where the system is already in use, said officers solved previously cold homicide cases after the AI flagged prisoners discussing the crimes.

Advocates argue that the technology can protect inmates and aid police investigations, but critics have voiced alarm: they fear that using AI to interpret conversations will lead to life-changing mistakes, misunderstandings, and racial bias.

A 2020 Stanford University study found that speech-to-text systems used by Amazon, Google, Apple, IBM, and Microsoft had error rates nearly twice as high for Black speakers as for white speakers. Such errors could reinforce existing racial disparities in the criminal justice system: research shows that Black men in the United States are six times more likely to be incarcerated than white men.

Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, warns that the technology could "automate racial profiling" and violate privacy rights.