Police need legal code of practice to use facial recognition, watchdog says


1 Nov 2019


The Information Commissioner’s Office published its findings from an investigation into the use of live facial recognition by UK police in public places.

A legal code of practice is needed before live facial recognition technology (LFR) can be safely deployed by police forces in public places in the UK, according to the Information Commissioner’s Office (ICO).

The data regulator said it had serious concerns about the use of a technology that relies on large amounts of personal information.

Commissioner Elizabeth Denham said current laws, codes and practices “will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents”.

She said police forces should be compelled to demonstrate that LFR is “strictly necessary, balanced and effective” in each case where it is deployed.

LFR maps faces in a crowd by measuring the distance between facial features, then compares results with a “watch list” of images, which can include suspects, missing people and persons of interest. South Wales Police and the Met Police have been trialling LFR as a possible way to reduce crime, but the move has been divisive.
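To make the comparison step concrete, the sketch below shows how a detected face, once reduced to a numeric feature vector, might be matched against a watch list by nearest-distance comparison. It is a minimal illustration only, not any force’s actual system: the vectors, identifiers and threshold are hypothetical placeholders, and real LFR systems use far richer face embeddings and carefully calibrated thresholds.

```python
from typing import Optional

import numpy as np

# Hypothetical watch list: person identifier -> precomputed feature vector.
# In a real system these would be high-dimensional face embeddings; the
# four-element vectors here are illustrative placeholders only.
WATCH_LIST = {
    "suspect_001": np.array([0.12, 0.48, 0.33, 0.91]),
    "missing_person_007": np.array([0.55, 0.21, 0.78, 0.40]),
}

# Illustrative cut-off; operational systems tune this to trade off
# false positives against missed matches.
MATCH_THRESHOLD = 0.25


def match_against_watch_list(face_vector: np.ndarray) -> Optional[str]:
    """Return the watch-list identifier whose reference vector is nearest
    to the detected face, provided it falls within the match threshold."""
    best_id, best_dist = None, float("inf")
    for person_id, ref_vector in WATCH_LIST.items():
        dist = float(np.linalg.norm(face_vector - ref_vector))
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id if best_dist <= MATCH_THRESHOLD else None


# Example: a face detected in the crowd whose vector sits close to
# suspect_001's reference vector, so it is flagged as a match.
print(match_against_watch_list(np.array([0.10, 0.50, 0.30, 0.90])))
```

The threshold is the crux of the bias concern raised later in this article: if the feature vectors are systematically less discriminative for some ethnic groups, a fixed threshold will yield more false positive matches for those groups.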

“The absence of a statutory code that speaks to the specific challenges posed by LFR will increase the likelihood of legal failures and undermine public confidence in its use,” Denham said.

“As a result, the key recommendation arising from the ICO’s investigation is to call for government to introduce a statutory and binding code of practice on the deployment of LFR.

“This is necessary in order to give the police and the public enough knowledge as to when and how the police can use LFR systems in public spaces. We will therefore be liaising with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress our recommendation for a statutory code of practice.”

‘There is a balance to be struck’

The ICO called for more research to eliminate bias in the algorithms behind LFR, particularly in relation to ethnicity. It has previously warned that such technological bias can produce more false positive matches for people from certain ethnic groups.

In September, the High Court ruled that South Wales Police’s use of the technology was lawful, dismissing a challenge from an activist who argued that having his face scanned caused him “distress” and violated his privacy and data protection rights by processing an image taken of him in public.

Ed Bridges from Cardiff brought the challenge after claiming his face was scanned while he was Christmas shopping in 2017 and at a peaceful anti-arms protest in 2018. After the ruling, Bridges said he would appeal against the decision.

Denham said the judgment “should not be seen as a blanket authorisation for police forces to use LFR systems in all circumstances” because it was a case about a specific deployment.

The ICO also published its first Opinion – a formal view issued to Parliament, government or other relevant bodies and the public on an issue relating to the protection of personal data – on the use of the technology.

“When LFR is used, my opinion should be followed,” Denham said.

“My opinion recognises there is a balance to be struck between the privacy that people rightly expect when going about their daily lives and the surveillance technology that the police need to effectively carry out their role.

“Therefore it makes clear that police forces must provide demonstrably sound evidence to show that LFR technology is strictly necessary, balanced and effective in each specific context in which it is deployed.”

– PA Media