The UK should immediately ban the use of live facial recognition in public spaces until laws are introduced to regulate biometric technologies, according to an independent legal review published on Wednesday.
The two-year review, conducted by barrister and former deputy mayor of London Matthew Ryder, was commissioned by the independent Ada Lovelace Institute in response to increasing police and private use of technologies such as facial recognition in the UK.
It found an “urgent need” for new legislation after an analysis of existing laws across human rights, privacy and equality found them to be inadequate. The report concluded that a moratorium on live facial recognition — the live surveillance of people in public and private areas — should be enacted.
The review comes as politicians, courts, regulators and rights groups around the world have expressed concerns over biometric technologies, which they argue are invasive and often inaccurate. In the UK, police forces, schools, and private retailers such as Co-op and J Sainsbury have all begun to deploy them.
“The current use of live facial recognition is not lawful, and until the legal framework is updated, it will not be. There is an overriding public imperative to make changes,” said Carly Kind, director of the Ada Lovelace Institute and a former human rights lawyer.
Biometric data is any personal data relating to an individual’s body or behaviour, ranging from fingerprints, DNA and facial features, to voice prints, gait patterns and iris scans. This data can be used to identify individuals, but also to categorise and make inferences about the behaviour of groups.
“You don’t want to become a society where you are using biometrics to surveil members of the public who can’t opt out of it,” said Ryder.
Ryder recommended that public-private partnerships, such as the use of facial recognition by the 67-acre King’s Cross development to help police, should be investigated further because the scale of collaboration on deploying these types of tools could “grow dramatically” in coming years.
“Facial recognition is just one example, but by no means the entirety of the issue. We are concerned about the collection of health data, behavioural data . . . about the extent to which [companies and governments] are using . . . biometric data, to make judgments about public safety and public health,” he added.
The report joins proposals to limit the use of biometrics across the US and Europe, including the EU’s draft Artificial Intelligence Act and the US Senate bill for a Facial Recognition and Biometric Technology Moratorium Act, which would prohibit federal use of certain biometric technologies.
Companies including Microsoft and Amazon have restricted the use of facial recognition and emotion-reading technologies by law enforcement. In 2019 San Francisco became the first city in the US to ban the use of facial recognition by local government agencies.
“[The UK government] is in the midst of large-scale reforms of data use and protection more broadly, so this is a chance . . . before the technology becomes widespread,” said Ada Lovelace’s Kind.