Members of the House of Lords have written to the United Kingdom’s home secretary warning that the use of live facial recognition technology by police lacks a legal basis, and calling for legislation on the matter to be voted on by Parliament.
Live facial recognition (LFR) software, which works by comparing the facial images of passers-by against a database of suspects, has been used in England and Wales since at least 2015. Unlike comparable biometrics such as fingerprints and DNA profiles, there is no dedicated legislation controlling how police can deploy the technology.
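As a rough sketch of the comparison step described above, and not a description of any actual police or vendor system, the snippet below matches a hypothetical face embedding against a small watchlist using cosine similarity. The embedding size, threshold and identifiers are illustrative assumptions.

```python
# Illustrative sketch only: each detected face is reduced to a numeric
# "embedding" and compared against embeddings of people on a watchlist.
# Embedding size, threshold and names are assumptions for illustration.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return the watchlist identity most similar to the detected face,
    if the similarity clears the threshold; otherwise None."""
    best_id, best_score = None, -1.0
    for identity, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Example: a detected face is flagged only if it is close enough to someone
# on the watchlist; everyone else passes through unmatched.
rng = np.random.default_rng(0)
watchlist = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
detected = watchlist["suspect_a"] + rng.normal(scale=0.1, size=128)  # noisy sighting
print(match_against_watchlist(detected, watchlist))  # prints "suspect_a"
```

In a deployment, the threshold controls the trade-off between missed suspects and false alerts, which is why the accuracy figures discussed later in this article matter.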
Critics are concerned that it provides a significant surveillance capability, trawling CCTV feeds to identify individuals in public, without stringent safeguards.
In the letter published on Saturday, the Justice and Home Affairs Committee of the House of Lords wrote that it was “deeply concerned” about the use of LFR “being expanded without proper scrutiny and accountability,” and called on the government to bring forward legislation regulating the technology that Parliament could vote on.
Alongside the concerns about civil liberties, questions have been raised about the accuracy of the technology. An independent study funded by the Metropolitan Police found that 81% of the people flagged by the system were incorrectly matched to someone on the comparison database. In response, the Met said it was “extremely disappointed with the negative and unbalanced tone” of the scientific report.
MPs called for a moratorium on the use of LFR in 2019, and a Court of Appeal judgment in 2020 found that South Wales Police had deployed it unlawfully, warning of “fundamental deficiencies” in the legal framework around its use. Police across the United Kingdom nonetheless continue to use the technology.
A spokesperson for the Home Office told Recorded Future News: “Facial recognition, including live facial recognition, is a powerful tool that has a sound legal basis, confirmed by the courts. It has already helped the police to catch a large number of serious criminals, including for murder and sexual offences.
“The police can only use facial recognition for a policing purpose, where necessary, proportionate and fair, in line with data protection and human rights laws,” they added.
The legal shortcomings around the use of facial recognition technology in the United Kingdom stretch back more than a decade. In 2012, the High Court ruled that police were unlawfully holding the mugshot images of hundreds of thousands of innocent citizens who had been arrested but never charged with a crime.
The judgment demanded that the policy be revised “in months, not years,” but it was not until 2017, five years later, that the government formally did so. In its announcement, the government stated its policy was that innocent people’s mugshots could be retained for six years, contrary to the High Court decision.
While the police use of biometric data in England and Wales is regulated — and there are limits to the length of time police can retain fingerprints and DNA profiles — facial recognition technology has never been covered by similar legislation.
The Home Office argues that even without specific legislation, the use of the technology is already covered by laws for data protection, equality and human rights.
It has previously argued that collecting and searching for individuals’ facial images is “generally less intrusive [than collecting and searching for DNA profiles or fingerprints] as many people’s faces are on public display all of the time.”
But in his annual report in 2015, Alastair MacGregor, the independent biometrics commissioner at the time, warned that “a searchable police database of facial images arguably represents a much greater threat to individual privacy than searchable databases of DNA profiles or fingerprints.”
The position was echoed by all of his successors. Unlike with DNA and fingerprints, the commissioner’s role has never had a statutory basis for overseeing the use of facial images, something the commissioners repeatedly expressed concern about.
Last year, Fraser Sampson, one of MacGregor’s successors, accused the British government of “vandalism” over its “shocking” plans to remove the existing ad hoc safeguards around biometrics and public space surveillance.
The role is set to be abolished in its entirety and replaced by a “Forensic Information Database Strategy Board” under the proposed Data Protection and Digital Information Bill, which is currently being scrutinized by Parliament. This bill will also see the role of the Information Commissioner, responsible for data protection, replaced by an Information Commission.
The legislation does not state that these bodies will be independent of government, as the commissioners are. It also allows the secretary of state to change the databases the board is required to oversee using statutory instruments, a form of secondary legislation that bypasses parliamentary votes.
Alexander Martin
Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.