UK watchdog warns against the use of ‘immature’ emotion analysis tech

27 Oct 2022


The ICO said the data use involved in emotion analysis is ‘far more risky’ than traditional biometric tech and could create issues around bias, inaccuracy and discrimination.

The UK’s data privacy watchdog has warned organisations to consider the public risks of emotion analysis technologies before implementing them.

The Information Commissioner’s Office (ICO) said it will investigate firms that fail to act responsibly or that put vulnerable people at risk by improperly deploying these biometric systems.

According to the ICO, emotion analysis systems process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture. These systems rely on storing and processing a range of personal data, such as subconscious behavioural or emotional responses.

The watchdog said this type of data use is “far more risky” than traditional biometric technologies used to identify people, with a greater risk of bias, inaccuracy and discrimination.

ICO deputy commissioner Stephen Bonner said developments in the biometrics and emotion AI market are “immature” and may never work.

“While there are opportunities present, the risks are currently greater,” he added. “At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”

Bonner said the only “sustainable biometric deployments” are those that are fully functional, accountable and backed by science. Speaking to The Guardian this week, he said emotion analysis technologies do not appear to have that scientific backing.

In the ICO’s warning, Bonner added that the watchdog will continue to “scrutinise the market” and identify stakeholders that seek to create or deploy these technologies.

“We are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area,” Bonner said.

The ICO is developing guidance on the wider use of biometric technologies such as facial, fingerprint and voice recognition systems.

The watchdog plans to release this guidance in spring 2023 to help businesses using this technology, while highlighting the importance of data security.

“Biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used,” the ICO said in a statement.

In 2019, a biometric system used by banks, UK police and defence firms was breached, resulting in the personal information of more than 1m people being discovered on a publicly accessible database.

The EU is also developing regulations for AI and biometric systems. But Irish Council for Civil Liberties (ICCL) technology fellow Dr Kris Shrishak recently spoke to SiliconRepublic.com about the challenges of regulating facial recognition technology.

Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com