IBM scraps facial recognition tech over racial profiling concerns

9 Jun 2020


IBM plans to scrap its facial recognition and analysis software, saying it opposes the use of any technology for mass surveillance or racial profiling.

On Monday (8 June), IBM CEO Arvind Krishna said that the company is scrapping its general-purpose facial recognition and analysis software products, and called for a “national dialogue” on whether the technology should be used by law enforcement agencies.

Krishna made the announcement in an open letter to the US Congress, where he raised concerns about discrimination and racial bias, and outlined detailed policy proposals to advance racial equality in the US.

The IBM boss also said that Congress needs to create “more open and equitable pathways” for all Americans to acquire marketable skills and training, with the need being particularly acute in communities of colour.

Opposing the use of facial recognition

In the statement, Krishna wrote: “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency.”

Krishna, who stepped into the role of CEO in April, said that vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when it is used in law enforcement.

He added that national policy should encourage and advance uses of technology that bring greater transparency and accountability to policing, such as body cameras and modern data analytics techniques.

“IBM would like to work with Congress in pursuit of justice and racial equity, focused initially in three key policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities,” Krishna said.

According to the New York Times, several US cities, including San Francisco, have banned law enforcement from using facial recognition tools, citing concerns about privacy and false matches.

However, the New York Times detailed how in other parts of the US, such as Florida, the technology is used as a “part of daily policing”, providing a sheriff’s office in Pinellas County with access to 30m images, including driving licences, mug shots and juvenile booking photos.

In recent years, concerns have been raised about facial recognition technology in terms of surveillance, privacy, consent, accuracy and automation bias.

Use of body cameras

Krishna voiced support for the Justice in Policing Act, which was introduced by US Democrats on the same day that he published his open letter. The police reform bill seeks to expand requirements for body cameras while limiting the use of facial recognition in connection with them.

The American Civil Liberties Union (ACLU) responded to the bill saying: “We need to invest in technologies that can help eliminate the digital divide, not technologies that create a surveillance infrastructure that exacerbates policing abuses and structural racism.”

The large-scale implementation of body cameras in the US began in 2014 in a bid to increase transparency and police accountability. However, a 2017 study suggested that police-worn body cameras neither reduce officers’ use of force nor citizen complaints about excessive force.

When the study was released, Harlan Yu of Upturn, a social justice and civil rights organisation, asked NPR: “If cameras don’t decrease use of force, don’t decrease the number of misconduct complaints and don’t change officer behaviour, then what are we adopting cameras for?”

In an interview with Yale News, Alexander Coppock, a co-author of the study, said: “It is definitely true that there are instances that we can imagine where the body-worn camera is going to make a difference. They might be too rare to be significant.

“We tracked 2,200 officers in this study. It is an enormous experiment. We should be able to measure really small effects, but we just didn’t get anything statistically significant.”

Kelly Earley was a journalist with Silicon Republic

editorial@siliconrepublic.com