The Metropolitan Police has scanned almost a quarter of a million faces using live facial recognition software in 2023, Byline Times can reveal.
Freedom of Information requests filed to every police force in the UK showed that law enforcement officers in London have been zealous users of the system, which automatically scans the faces of passers-by and matches them against a watchlist.
Civil liberties groups and MPs have said such practices are incompatible with human rights. They have also criticised plans recently announced by the Policing Minister to open up the passport database for use by law enforcement.
The figures come as the EU is in the process of banning police use of live facial recognition (LFR) – while officers in London are pressing ahead with its use.
The Met took several months to respond to the FOI request, which revealed it had scanned an estimated 247,764 faces in 2023 in 13 deployments across London. These efforts resulted in just 12 arrests.
Among the events targeted were the King’s Coronation and his birthday celebrations, raising the question of whether protestors have been targeted by LFR cameras.
Campaigners said LFR would have a chilling effect on freedom of speech.
Conservative MP Charles Walker – one of 65 parliamentarians who have called for an immediate stop to its use – said: “The difference between us and China is that the state is our servant, we are not the servants of the state. The presumption in an open and free society is that most people are going about their business and conducting their lives in a lawful fashion.”
Anna Bacciarelli, associate tech director at Human Rights Watch, said that the police scanning faces in this way is a “disproportionate use of such powerful and invasive technology” which can have “a chilling effect and deter people from exercising their expression and assembly rights in public spaces”.
When contacted for comment, the Met sent Byline Times a previously released statement which said that the technology “enables us to be more focussed in our approach to tackle crime, including robbery and violence against women and girls”.
It continued: “We understand the concerns raised by some groups and individuals about emerging technology and the potential for bias. We have listened to these voices.”
How Does Facial Recognition Work?
The Met currently uses facial recognition in two ways: live and retrospective. The former is the type that has campaigners concerned.
The technology scans the faces of people as they walk by, immediately assessing them against a watchlist and telling officers if there is a match. It has been procured from a company called NEC.
Giving evidence to a parliamentary select committee in May, the Met’s director of intelligence, Lindsey Chiswick, said that if there is no match the image is not only deleted immediately but also pixelated – a function called “privacy by design”.
If there is a match then officers can approach the individual and, if necessary, make an arrest, although Chiswick told MPs that they are not obliged to do so. Positive images can be kept for 31 days or longer if there is a judicial reason.
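Chiswick’s description amounts to a simple match-then-retain pipeline. The sketch below is purely illustrative and not the Met’s or NEC’s actual system: the cosine-similarity comparison, the 0.9 threshold, and the watchlist structure are all assumptions, standing in for whatever proprietary matching NEC’s software performs.

```python
import math

RETENTION_DAYS = 31  # positive matches may be kept this long, per Chiswick's evidence

def cosine_similarity(a, b):
    """Similarity between two face embeddings (an illustrative stand-in
    for NEC's proprietary matching algorithm)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def check_face(embedding, watchlist, threshold=0.9):
    """Compare one scanned face against a deployment-specific watchlist.

    Returns the name of the best match above the threshold, or None.
    A None result models the non-match case: the image is deleted
    and pixelated straight away.
    """
    best_name, best_score = None, threshold
    for name, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

In this toy model, a `None` result would trigger immediate deletion, while a named result could be retained for up to `RETENTION_DAYS` days before review.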
Although there have been concerns in the past about false positives, the Met’s data suggests there was not a single false positive match in 2023.
Chiswick said the technology had led to “significant” arrests for offences including conspiracy to supply class A drugs, assaulting emergency workers, GBH, and having escaped from prison.
The watchlists themselves are “specific and bespoke” to the event at which they were used, Chiswick said. They are deleted after the deployment.
In the case of King Charles’ Coronation, she told MPs: “It was specific and bespoke to that event, and it contained some fixated individuals who we knew were going to go to the coronation and potentially do harm.”
In addition to arrests, Chiswick claimed that LFR can have a significant “deterrence” effect and cited the Coronation as an example.
When Byline Times asked the Met whether it had credible intelligence about a potential offence which was deterred by the deployment of LFR, a spokesman said: “The MPS recognise the fundamental right of people to lawfully protest and, to date, overt LFR has not been deployed for this purpose. Should the MPS have intelligence to indicate there are people attending these events who have previously been engaged in criminal activity then this is a tactic that is available.”
Emmanuelle Andrews, policy and campaigns manager at Liberty, said its use at the Coronation was “extremely worrying”.
She added: “In the context of a broad crackdown on the ways in which people can make their voices heard, increasing use of facial recognition to target protestors is likely to make it even harder for people to stand up for what they believe in.”
Some campaigners have criticised the opaque reporting of deployments. The record provided by the Met lists vague locations like Camden High Street or Wardour Street to refer to operations in which tens of thousands of faces were scanned.
In one case the location is simply given as “Islington” – an entire borough which is home to 240,000 people – making it hard for the public to assess whether use of the technology is proportionate.
Byline Times asked the Met about these deployments. A spokesman said: “The MPS has always been very transparent in its approach to the use of facial recognition technology and we have informed the public of where the technology is to be deployed in advance.”
Chiswick’s evidence to MPs shed some light on their purpose. She said deployments in Camden in the first six months of the year had been related to gang activity and to a shooting in January at a church in Euston.
The targets of other operations remain unclear.
A ‘Blunt Tool’
The majority of police forces have never used live facial recognition.
South Wales Police is among those which have used it in the past two years, most recently at a rugby match in Cardiff in November. Other events it has targeted include a concert by the Chemical Brothers in Cardiff Bay in August, an air show in Swansea, and Cardiff concerts by Harry Styles and Beyoncé.
The EU has unveiled plans to ban the use of live facial recognition by police as part of its AI Act, which is currently making its way through the European Parliament.
Meanwhile, in the UK, Policing Minister Chris Philp last month announced plans to open up the country’s database of 45 million passport holders to help catch shoplifters.
For Anna Bacciarelli, the plan is “extremely concerning” and akin to putting “anyone with a passport on a virtual police line-up”.
MP Charles Walker said: “The blunt tool of facial recognition places us all under suspicion, we are reduced to units of data to be harvested and cross-referenced. Autocracies like to keep an eye on their citizens because they don’t trust them, democracies should always aspire for a healthier, more trusting, relationship with those they serve.”
Madeleine Stone, advocacy officer at Big Brother Watch, said the UK was taking a “reckless approach” because “facial recognition is a dangerously authoritarian technology that turns the public into walking ID cards in a constant police line up.” She believes it must be banned.
A Home Office spokesman said technology can “help the police quickly and accurately identify those wanted for serious crimes”.
They added: “It also frees up police time and resources, meaning more officers can be out on the beat, engaging with communities and carrying out complex investigations.”
One question asked by MPs in the May evidence session was whether the watchlists could eventually be linked to the network of cameras across the country to track down suspects. Chiswick answered that, while this was technically possible, its use in this way is currently restricted by the law.