EU lawmakers are calling for a full ban on biometric surveillance

10 Nov 2022


MEPs and a coalition of 76 NGOs have agreed not to support the upcoming AI Act if it does not include a ban on the use of biometric surveillance.

A group of MEPs and NGOs has spoken out about the threat biometric surveillance technology poses to democracy and the risks of discrimination against certain groups.

Ten members of the European Parliament were joined this week by representatives of the Reclaim Your Face coalition, which consists of 76 NGOs across the EU that are pushing for a strong ban on biometric surveillance.

Biometric technology such as facial recognition scanning is already being used by law enforcement groups, particularly in China and the US. Earlier this year, Ireland's Department of Justice revealed plans to permit An Garda Síochána to use facial recognition tech under forthcoming legislation.

At the Parliament event on 8 November, speakers noted the negative impacts biometric surveillance can cause and called for a full ban to be implemented in the upcoming AI Act.

This act aims to address the risks associated with specific uses of AI through a set of complementary and proportionate rules. Elements of the proposed act have been criticised by European watchdogs for not going far enough with certain technology, such as live facial recognition in public places.

In October, a European Parliament meeting on the draft AI Act was largely focused on the scope of the regulation and its restraints toward the use of biometric recognition, Euractiv reports.

Risks of abuse and discrimination

Moderating the event, MEP Patrick Breyer spoke of the "chilling effect" that biometric surveillance could have on society.

“People who constantly feel watched cannot freely and courageously stand up for their rights and for a just society,” Breyer said. “This is not the diverse society I want to live in, and in which I want my child to grow up.”

Breyer spoke of issues surrounding discrimination when this technology is used and how certain countries around the world are using biometric surveillance for autocratic purposes.

“It is no coincidence that China, the inventor of the social credit system, is heavily relying on this technology, including in Hong Kong when the students were protesting for democracy,” Breyer said.

“It’s no coincidence also that Iran wants to use this technology to enforce laws on mandatory wearing of headscarves. It’s no coincidence that Russia is using this technology to hunt down anti-war protesters and also fleeing conscripts for its war in Ukraine.”

The Reclaim Your Face movement is coordinated by the European Digital Rights association (EDRi). The group's policy advisor, Ella Jakubowska, spoke about how biometric technology is already being used in the EU and causing discrimination issues.

“We’ve seen homeless people and migrants subjected to smart surveillance, which punishes them instead of helping them,” Jakubowska said. “Sports fans treated as if they are criminals, wrongfully identified and even fined for trying to attend a football match. Protesters and journalists tracked and suppressed.

“We also know that there are companies right now in Spain, the Netherlands, the Czech Republic and probably more that are selling biometric technologies which offer to automatically profile people based on their emotions and behaviours, their gender and ethnicity,” Jakubowska said.

A 2019 Vice article claimed police in the Netherlands had a database of 2.2m images, representing 1.3m people suspected of crimes carrying jail sentences of four years or more.

GDPR is not enough

One representative shared details of how his biometric data was used without his consent by a US company.

Matthias Marx is a member of the Chaos Computer Club, a large hacker association that is itself a member of EDRi. In 2020, Marx sent a data subject access request to Clearview AI, a controversial facial recognition company headquartered in the US.

Marx filed the request shortly after a New York Times investigation revealed details of the company’s tracking and surveillance tools. He found out that Clearview AI was processing his biometric data without his consent and lodged a complaint to the Hamburg data protection authority.

“What followed is a very long fight that is not over yet,” Marx said. “What we learned after this two-and-a-half-year journey is that GDPR does not protect against biometric surveillance.”

Clearview AI told Marx his biometric data had been deleted, but Marx claimed that “they could just rescan my face in another photo they find, so this issue is not solved”.

“We need strong regulation, we need a ban of biometric surveillance,” Marx said. “We should act now because even European [companies like Clearview AI] have started to pop up.”

Clearview AI has been facing pressure from organisations and watchdogs around the world. For example, authorities in Australia and Canada have ordered the company to stop collecting images of their citizens.

Dr Kris Shrishak of the Irish Council for Civil Liberties previously told SiliconRepublic.com that it may be difficult for regulators to enforce rulings against Clearview AI, as it is headquartered in the US and does not appear to have offices in other countries.

Outside of the EU, governments and organisations have begun to take more notice of the issues surrounding biometric technology.

Last month, the UK’s Information Commissioner’s Office (ICO) warned organisations against the use of “immature” emotion analysis technology. The watchdog said these systems are “far more risky” than traditional biometric tech and could create issues around bias, inaccuracy and discrimination.

The ICO is also developing guidance on the wider use of biometric technologies such as facial, fingerprint and voice recognition systems, which it plans to release next year.

In June, Microsoft limited access to parts of its facial recognition tech and removed certain capabilities that claim to detect controversial attributes such as a person’s age, gender and emotional state. The company said this decision was part of a broader push for ethical AI use.


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com