‘Conditioning an entire society’: the rise of biometric data technology

The use of our bodies to unlock access to services raises concerns about the trade-off between convenience and privacy

In a school canteen in Gateshead, cameras scan the faces of children, taking payment automatically after identifying them with facial recognition. More than 200 miles away in North London, staff at a care home recently took part in a trial that used facial data to verify their Covid-19 vaccine status. And in convenience stores around the country, staff are alerted to potential shoplifters by a smart CCTV system that taps into a database of individuals deemed suspect.

In each case, biometric data has been harnessed to try to save time and money. But the growing use of our bodies to unlock areas of the public and private sphere has raised questions about everything from privacy to data security and racial bias.

CRB Cunninghams, the US-owned company whose facial recognition tech is being deployed in lunch halls, has said its systems speed up payment and could reduce the risk of Covid-19 spread via contact with surfaces. The system was first tested at Kingsmeadow school in Gateshead last year and dozens of schools have signed up to follow suit.

Enthusiasm for the system may be on the wane now, though, after North Ayrshire council suspended use of the technology at nine schools following a backlash. The decision to back out came after parents and data ethics experts expressed concerns that the trade-off between convenience and privacy may not have been fully considered.

“It’s about time-saving,” said Prof Sandra Wachter, a data ethics expert at the Oxford Internet Institute. “Is that worth having a database of children’s faces somewhere?”

Stephanie Hare, author of Technology Ethics, sees the use of children’s biometric data as a “disproportionate” way to make lunch queues quicker. “You’re normalising children understanding their bodies as something they use to transact,” she said. “That’s how you condition an entire society to use facial recognition.”

Experts are concerned that biometric data systems are not only flawed in some cases, but are increasingly entering our lives under the radar, with limited public knowledge or understanding.

There are salutary examples of how troublingly authoritarian such technology can be, and China offers some of the more extreme precedents. After a spate of toilet paper thefts from public conveniences in a park in Beijing, users were asked to submit to a face scan before any paper would be released, and in Shenzhen, pedestrians who crossed the road at a red light had their faces beamed on to a billboard.

In the US, a little-known company called Clearview AI was revealed in 2020 to have scraped social media sites such as Facebook to harvest users’ facial data, collecting more than 3bn pictures that could be shared with police.

Some of the technology set to be rolled out in the UK seems, on the face of it, more benign. Eurostar is testing whether facial data could be used for boarding its cross-Channel trains, using technology built by US-based Entrust.

In Manchester, the city mayor, Andy Burnham, has held talks with FinGo, a startup whose technology analyses the unique pattern of veins in people’s fingers.

Applications under consideration include paying for buses, gaining access to universities and collecting prescription medicine, while the city’s licensing authority has approved it for use at hospitality venues.

FinGo says it stores an encoded version of the finger vein pattern, which cannot be reverse-engineered by thieves, while different segments of the data are stored in different places to heighten security.
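
FinGo has not published the details of its scheme, but the two safeguards it describes (a one-way encoding and storing segments of the data in different places) are standard techniques. The Python sketch below illustrates that general pattern under those assumptions; the function names, parameters and exact-match check are illustrative, not FinGo’s.

```python
import hashlib
import os

def encode_template(vein_pattern: bytes, salt: bytes) -> bytes:
    # One-way encoding: derive a fixed-length code from the raw sample.
    # The raw vein pattern cannot be recovered from the output.
    return hashlib.pbkdf2_hmac("sha256", vein_pattern, salt, 200_000)

def split_into_shards(code: bytes, n_shards: int = 3) -> list[bytes]:
    # XOR secret sharing: each shard is held in a different location,
    # and any subset short of all of them reveals nothing about the code.
    shards = [os.urandom(len(code)) for _ in range(n_shards - 1)]
    final = code
    for shard in shards:
        final = bytes(a ^ b for a, b in zip(final, shard))
    shards.append(final)
    return shards

def recombine(shards: list[bytes]) -> bytes:
    # XOR all shards back together to recover the stored code.
    code = bytes(len(shards[0]))
    for shard in shards:
        code = bytes(a ^ b for a, b in zip(code, shard))
    return code

# Enrolment: encode the sample and scatter the shards across stores.
salt = os.urandom(16)
stored = split_into_shards(encode_template(b"raw-vein-sample", salt))

# Verification: re-encode a fresh sample and compare with the
# recombined code. (Real systems match fuzzily, since two scans of the
# same finger are never byte-identical; exact hashing is a simplification.)
assert recombine(stored) == encode_template(b"raw-vein-sample", salt)
```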

Earlier this year, at three care homes in North London run by Springdene, the London-based facial verification company iProov tested systems that allow staff to verify their Covid status using their faces.

That technology is not in use anywhere at the moment, iProov said, but the company is one of several whose systems are embedded in the NHS app, deployed when users want to use their face to access services such as their Covid status or GP appointment bookings.

Such applications have prompted misgivings among technology experts and civil liberties groups about how long users’ data is stored, how secure that data is, and even whether foreign law enforcement agencies can demand to see it.

Ankur Banerjee, chief technology officer at digital identity startup Cheqd, points out that biometric technology relies on our trust in the people operating it. In Moscow, users of the city’s famous Metro system can now pay using their face, a system that, for now at least, is voluntary.

“That’s convenient for 99% of people, but if someone shows up to an anti-government protest, suddenly they have the ability to track down who went in and out, unlike an Oyster-style card that might not be registered,” said Banerjee.

Some technology that is already in common use in the UK has sparked anxiety about civil liberties. London-based FaceWatch sells security systems that alert shop staff to the presence of a “subject of interest” – typically someone who has behaved antisocially or been caught shoplifting before. It started out as a system for spotting pickpockets at Gordon’s wine bar in central London, of which FaceWatch founder Simon Gordon is the proprietor.

Cameras scan the face of anyone entering a building and compare it with a database of people marked out for special scrutiny.

However, Wachter has concerns about the prospect of such technology becoming more widespread. “Research has shown that facial recognition software is less accurate with people of colour and women.” She also points to the potential for existing human bias to be hardwired into supposedly neutral technology. “How can you trust that they ended up on the watch list accurately? There is bias in selective policing and in the judicial system.”

Nor is it clear in many cases to whom such systems are accountable and how individuals can contest the judgments they make. “What if I’ve been wrongfully accused, or the algorithm incorrectly matched me with somebody else?” Banerjee asks. “It’s private justice where you have zero recourse on being able to correct that.”

FaceWatch said it does not share the facial data it holds with the police, although they can access it if an offence is reported. The company said it minimises the risk of misidentification by ensuring cameras are positioned in good light to enhance accuracy, with any borderline cases referred to a manual checker. People on the watchlist can, it says, challenge the decision.
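
FaceWatch has not published its matching pipeline, but the process it describes, automatic alerts above a confidence level with borderline scores routed to a human checker, is a common pattern in face recognition deployments. A minimal sketch of that decision logic follows; the thresholds, score scale and names are illustrative assumptions, not the company’s.

```python
import math
from dataclasses import dataclass

# Illustrative thresholds on cosine similarity between face embeddings
# (vectors produced by a recognition model); real values are not public.
ALERT_THRESHOLD = 0.80    # confident match: alert shop staff
REVIEW_THRESHOLD = 0.65   # borderline: refer to a manual checker

@dataclass
class Decision:
    action: str   # "alert", "manual_review" or "no_match"
    score: float

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def classify(live: list[float], watchlist_entry: list[float]) -> Decision:
    score = cosine_similarity(live, watchlist_entry)
    if score >= ALERT_THRESHOLD:
        return Decision("alert", score)
    if score >= REVIEW_THRESHOLD:
        return Decision("manual_review", score)
    return Decision("no_match", score)

print(classify([0.9, 0.1, 0.4], [0.8, 0.2, 0.5]))
```

Where the thresholds sit determines the trade-off critics describe: set them lower and more innocent shoppers are flagged; set them higher and more “subjects of interest” walk past unnoticed.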

FaceWatch added that it stores facial data for up to two years and that it is both encrypted and protected by “bank-grade” security.

But Wachter points out that the security systems guarding our biometric data are only state of the art until the day they are breached.

“The idea of a data breach is not a question of if, it’s a question of when,” she said. “Welcome to the internet: everything is hackable.”

We should, she says, be cautious about rolling out technology just because it promises to make our lives easier. “The idea is that as soon as something is developed, it has a place in society,” she said. “But sometimes the price we pay is too high.”
