Civil rights groups ring alarm bells over Amazon AI that can now see fear

15 Aug 2019

Image: © Mike Zakharov/Stock.adobe.com

Amazon’s Rekognition software is now able to see fear, but some groups are worried about how this will be used in the years to come.

Amazon is the latest company to face the ire of civil rights groups over technology that is finding its way into the hands of governments. The tech giant recently announced that an update to its Rekognition software – designed for facial detection and analysis – has added a new emotion to its repertoire.

“With this release, we have further improved the accuracy of gender identification,” Amazon said in a statement. “In addition, we have improved accuracy for emotion detection (for all seven emotions: happy, sad, angry, surprised, disgusted, calm and confused) and added a new emotion: fear.”
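For developers, the emotion scores Amazon describes are exposed through Rekognition’s DetectFaces API. The following is a minimal sketch using the boto3 SDK for Python; the bucket and image names are hypothetical placeholders. Each detected face returns a list of emotions with confidence scores, with fear now among the possible types.

```python
import boto3

# Minimal sketch of Rekognition emotion detection via boto3.
# "my-bucket" and "face.jpg" are hypothetical placeholders.
client = boto3.client("rekognition")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "face.jpg"}},
    Attributes=["ALL"],  # "ALL" includes the Emotions attribute
)

for face in response["FaceDetails"]:
    # Each face carries per-emotion confidence scores (0-100),
    # now including the FEAR type alongside the original seven.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}")
```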

The software is currently used to analyse large databases of images and videos, with customers ranging from media production companies to police forces. With the addition of fear as a detectable emotion, however, civil rights organisations have sounded the alarm over its use by law enforcement, with some going as far as to call for police to stop using it altogether.

The American Civil Liberties Union (ACLU) recently claimed that Rekognition falsely matched 26 Californian lawmakers to criminal mugshots in a public database of 25,000 photos. The group ran the experiment to highlight errors in the facial recognition software and to encourage the passing of a bill in the US state that would ban its use on cameras worn by police officers.

‘Amazon is going to get someone killed’

Another organisation has also voiced concerns about the technology, with the digital rights advocacy group Fight for the Future claiming Amazon is building “the dystopian surveillance state of our nightmares”. In a statement, the group’s deputy director Evan Greer did not hold back in her criticisms of the new update.

“Amazon is going to get someone killed by recklessly marketing this dangerous and invasive surveillance technology to governments,” she said.

“Facial recognition already automates and exacerbates police abuse, profiling and discrimination. Now Amazon is setting us on a path where armed government agents could make split-second judgements based on a flawed algorithm’s cold testimony.”

However, as reported by GeekWire, an Amazon spokesperson responded to the ACLU’s claims, saying the group is “once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines”.

“As we’ve said many times in the past, when used with the recommended 99pc confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking.”
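In code, the threshold Amazon refers to corresponds to the SimilarityThreshold parameter of Rekognition’s CompareFaces API. A minimal sketch, again assuming boto3 and hypothetical image names, shows how setting it to 99 discards any match the model is less than 99pc confident in:

```python
import boto3

client = boto3.client("rekognition")

# Minimal sketch of face matching at Amazon's recommended 99pc
# confidence threshold; the bucket and image names are hypothetical.
response = client.compare_faces(
    SourceImage={"S3Object": {"Bucket": "my-bucket", "Name": "probe.jpg"}},
    TargetImage={"S3Object": {"Bucket": "my-bucket", "Name": "mugshot.jpg"}},
    SimilarityThreshold=99,  # only return matches at or above 99pc similarity
)

# Matches below the threshold never appear in FaceMatches, which is why
# Amazon argues the 99pc setting, paired with human review, limits the
# kind of false positives the ACLU experiment surfaced.
for match in response["FaceMatches"]:
    print(f"Match similarity: {match['Similarity']:.1f}pc")
```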

Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com