Biometric Surveillance Systems Are Being Hastily Rolled Out Across the West, With Next to No Public Debate

By embracing biometric surveillance, governments across the West are hurtling down a path that could lead us all to a very dark place. 

The Chinese Communist Party’s widespread use of facial recognition programs and other forms of artificial intelligence (AI) technologies to track, monitor and, where possible, evaluate the behavior of members of the public has received ample attention from Western media in the past few years. Myriad reports, some overblown, have been published on China’s creeping introduction of a Social Credit System. By contrast, far less attention has been paid to the increasing use of many of the same highly intrusive technologies by so-called “liberal democracies” in the West.

In 2019, IHS Markit released a report showing that while China may be leading the way when it comes to using surveillance cameras to monitor its population, it is far from an outlier.  According to the report, the United States had almost as many surveillance cameras as China, with roughly one for every 4.6 people (compared with China’s one for every 4.1). The UK was not far behind with one for every 6.5. The report also forecast that by 2021 there would be more than one billion surveillance cameras on the planet watching us.

EU Plays Catch Up

Now the EU is on the verge of building one of the largest facial recognition systems on the planet, as the recent Wired UK article, “Europe Is Building a Huge, International Facial Recognition System,” reports:

The expansion of facial recognition across Europe is included in wider plans to “modernize” policing across the continent, and it comes under the Prüm II data-sharing proposals. The details were first announced in December, but criticism from European data regulators has gotten louder in recent weeks, as the full impact of the plans has been understood.

“What you are creating is the most extensive biometric surveillance infrastructure that I think we will ever have seen in the world,” says Ella Jakubowska, a policy adviser at the civil rights NGO European Digital Rights (EDRi). Documents obtained by EDRi under freedom of information laws and shared with WIRED reveal how nations pushed for facial recognition to be included in the international policing agreement.

The first iteration of Prüm was signed by seven European countries—Belgium, Germany, Spain, France, Luxembourg, the Netherlands, and Austria—back in 2005 and allows nations to share data to tackle international crime. Since Prüm was introduced, take-up by Europe’s 27 countries has been mixed.

Prüm II plans to significantly expand the amount of information that can be shared, potentially including photos and information from driving licenses. The proposals from the European Commission also say police will have greater “automated” access to information that’s shared. Lawmakers say this means police across Europe will be able to cooperate closely, and the European law enforcement agency Europol will have a “stronger role.”

Targeting Children

The UK, while no longer part of the EU, is hurtling down a similar path. And it is targeting the most vulnerable and impressionable members of society: school children. As I reported in October, nine schools in the Scottish region of North Ayrshire started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US), until a public outcry put paid to the pilot scheme.

But the Tory government is doubling down. According to a new report in the Daily Mail, almost 70 schools have signed up for a system that scans children’s faces to take contactless payments for canteen lunches while others are reportedly planning to use the controversial technology to monitor children in exam rooms. This time round, however, the government didn’t even bother informing the UK Biometrics and Surveillance Camera Commissioner Fraser Sampson of the plans being drafted by the Department for Education (DfE):

Professor Fraser Sampson, the independent Biometrics and Surveillance Camera Commissioner, said his office was unaware the DfE was drafting new surveillance advice in response to the growing trend for cameras in schools.

‘I find out completely by accident a couple of weeks ago by going to a meeting that the Department for Education has drafted a code of practice for surveillance in schools which they are about to put out to the world to consult,’ he told The Mail on Sunday.

‘And they [DfE] said. “What do you think of it?” And I say, “What code?” We had no idea about it. And having seen it, it would have benefited from some earlier sharing.’

Similar facial recognition systems have been used in the US, though usually as a security measure.

Privacy advocates have warned that the growing use of facial recognition systems in school settings could serve a much more insidious goal: to condition children to the widespread use of facial recognition systems and other biometric technologies. In my book Scanned I cite Stephanie Hare, author of Technology Ethics, who argues that it is about normalizing children to understand their bodies “as something they use to transact. That’s how you condition an entire society to use facial recognition.”

The Battle for Privacy

As the EU prepares to pass its own long-awaited AI Act, which will set out the rules for the development, commodification and use of AI-driven products, services and systems across the 27-member bloc, a battle has broken out between proponents of the technologies and privacy advocates. The former, who include intelligence agencies, law enforcement bodies and tech companies, argue that emerging biometric technologies such as facial recognition are necessary to catch criminals. The latter have called for an outright ban on the technologies due to the threat they pose to civil liberties.

They include Wojciech Wiewiórowski, who leads the EU’s in-house data protection agency, the EDPS, which is supposed to ensure the EU complies with its own strict privacy rules. In November 2021 Wiewiórowski told Politico that European society is not ready for facial recognition technology: the use of the technology, he said, would “turn society, turn our citizens, turn the places we live, into places where we are permanently recognizable … I’m not sure if we are really as a society ready for that.”

In a paper published in March, the EDPS raised a number of concerns and objections about Prüm II:

  • The Prüm II framework does not make clear what sort of circumstances will warrant the exchange of biometric data such as DNA or the “scope of subjects affected by the automatic exchange of data”.
  • “The automated searching of DNA profiles and facial images should only be possible in the context of individual investigations of serious crimes and not of any criminal offense, as provided for in the proposal.”
  • The “alignment of the Prüm framework with the interoperability framework of the EU information systems in the area of justice and home affairs” requires careful analysis of its implications for fundamental rights.
  • “The EDPS considers that the necessity of the proposed automated searching and exchange of police records data is not sufficiently demonstrated.”

Flaws in the System

The problem is not just about privacy; it is about the inherent flaws within facial recognition systems. The systems are notoriously inaccurate on women and those with darker skin, and may also be inaccurate on children whose facial features change rapidly. Wired magazine reported in 2019 that “US government tests find even top-performing facial recognition systems misidentify blacks at rates five to ten times higher than they do whites.”

There are also concerns about who the data will end up being shared with or managed by and just how safe it will be in their hands. In December 2021, the civil liberties group Statewatch reported that the Council of the EU, where the senior ministers of the 27 Member States sit, not only intends to extend the purposes for which biometric systems can be used under the EU’s proposed Artificial Intelligence Act but is also seeking to allow private actors to operate mass biometric surveillance systems on behalf of police forces.

In February Statewatch published another report titled “Building the Biometric State: Police Powers and Discrimination.” In it the group warned that the EU’s rapid expansion of biometric profiling at borders and identity checks within the 27-member bloc is likely to lead to increased racial profiling of ethnic minority citizens and non-citizens.

The report comes at a time when the EU’s ‘interoperability’ initiative and border policies are giving renewed impetus to the biometric registration of foreign nationals and to an increase in identity checks within the EU, in the hope of detecting individuals lacking the correct documentation. It argues that the treatment of skin colour as a proxy for immigration status, the existence of a huge database holding data solely on foreign nationals (the Common Identity Repository, currently under construction) and explicit policy instructions to step up identity checks to enact removals are likely to exacerbate the racist policing and ethnic profiling that are endemic across the EU.

Mission Creep

Meanwhile, UK and US police forces have been piloting live facial recognition systems, which allow authorities to compare live images of passersby on a street with those on a specific watch list. Some jurisdictions, including the U.S., the UK and soon the EU if Prüm II becomes reality, are also using retrospective facial recognition, which is arguably even more controversial since it enables police to check images against a far broader range of sources, including CCTV feeds and images from social media.

“Those deploying it can in effect turn back the clock to see who you are, where you’ve been, what you’ve done and with whom, over many months or even years,” Jakubowska told Wired magazine, adding that the technology “can suppress people’s free expression, assembly and ability to live without fear.”

One major problem, as with the application of many surveillance technologies, is mission creep. As Sky News reported last week, police in the UK have been given the green light to employ live facial recognition technology not only to identify suspects but also potential witnesses to and victims of a crime.

Fraser Sampson, the biometrics and surveillance camera commissioner, has described the idea as a “somewhat sinister development”. It “treats everyone like walk-on extras on a police film set rather than as individual citizens free to travel, meet and talk,” he warned.

The commissioner was responding to guidance from the College of Policing which suggested victims of crimes and potential witnesses could be placed on police watchlists.

“How commonplace will it become to be stopped in our cities, transport hubs, outside arenas or school grounds and required to prove our identity?” asked Mr Sampson.

The former officer for West Yorkshire Police and the British Transport Police said: “The ramifications for our constitutional freedoms in that future are profound.

“Is the status of the UK citizen shifting from our jealously guarded presumption of innocence to that of ‘suspected until we have proved our identity to the satisfaction of the examining officer’?”

 

The public backlash against facial recognition software is growing. As the Wired UK article notes, dozens of cities across the US have even banned police forces from using the technology. The European Parliament is debating whether to ban the police use of facial recognition in public places as part of its AI Act, even as the Commission lays the ground for more “automated” access to biometric data. In the UK, the civil liberties group Big Brother Watch warns that police and private companies “have been quietly rolling out facial recognition surveillance cameras, taking ‘faceprints’ of millions of people — often without you knowing about it.”

The U.S. company Clearview, which received seed funding from Palantir co-founder Peter Thiel, is facing several lawsuits in the United States for scraping people’s photos from the Internet without their permission. At least 600 law-enforcement agencies in the U.S. have used Clearview’s services, but the company’s use of people’s photos without their consent has been declared illegal in Britain, Canada, France, Australia and Italy. The company was also recently fined €20 million by Italy’s data protection agency, instructed to delete any data it holds on Italians and banned from any further processing of citizens’ facial biometrics.

Another Avenue of Opportunity: War

But Clearview has found a whole new avenue of opportunity: the war in Ukraine, as the New York Times recently reported:

In the weeks after Russia invaded Ukraine and images of the devastation wrought there flooded the news, Hoan Ton-That, the chief executive of the facial recognition company Clearview AI, began thinking about how he could get involved.

He believed his company’s technology could offer clarity in complex situations in the war.

“I remember seeing videos of captured Russian soldiers and Russia claiming they were actors,” Mr. Ton-That said. “I thought if Ukrainians could use Clearview, they could get more information to verify their identities.”

In early March, he reached out to people who might help him contact the Ukrainian government. One of Clearview’s advisory board members, Lee Wolosky, a lawyer who has worked for the Biden administration, was meeting with Ukrainian officials and offered to deliver a message.

Mr. Ton-That drafted a letter explaining that his app “can instantly identify someone just from a photo” and that the police and federal agencies in the United States used it to solve crimes. That feature has brought Clearview scrutiny over concerns about privacy and questions about racism and other biases within artificial-intelligence systems.

When the use of Clearview by Ukraine’s government was first announced, the company said it had scraped more than 2 billion images from the Russian social media platform VKontakte. According to a Reuters report, Ukrainian soldiers could use the technology not only to identify fallen Russian soldiers but also to weed out Russian operatives at checkpoints. Privacy campaigners are rightly up in arms. One of their biggest concerns is that facial recognition technologies are prone to making mistakes. And in war mistakes can have deadly consequences.

 


22 comments

  1. Appleseed

    Great article, Nick.
    However, under the “Flaws in the System” subhed, the Wired quote is missing two key words:
    “US government tests find even top-performing facial recognition systems misidentify blacks at rates five to 10 times higher than they do whites.” [emphasis added]

    1. Nick Corbishley Post author

      Thanks Appleseed, both for the praise and the correction, which has now been rectified.

    2. The Rev Kev

      I believe that they tried that software on US Senators several years ago and it identified several black Senators as wanted criminals. The white Senators, not so much, so you can imagine how this would play out in any city. And if Clearview has scraped 2 billion images from the Russian social media platform VKontakte so that the Ukrainians can identify Russian dead and ring their families with the news, then you can be sure that they are scraping Facebook and any other online platform to build up a database of people’s faces in the West so that, when allowed, they are ‘good to go.’ A few years ago, people were criticizing China for doing stuff like this and it looks like they want to do the same in the West. In the end, the only privacy that you will have will be the several square inches between your ears.

    1. Brooklin Bridge

      Two weaknesses with my rant should be pointed out. 1) While the wrecking of the very notion of privacy is absolutely true and there seems to be no bottom to its depravity, I’m not at all sure of the baguette fetish; truth or fiction. 2) Even if true, the baguette example would be of something pitiful (the odor, on the other hand, is, or used to be, very real), whereas the degree and intent of privacy invasion on this scale is as malignant and destructive as it is depraved.

      The very fact that any software engineer on planet earth would put fingers to keyboard to support such a thing never ceases to amaze me as a huge weakness of that whole discipline. Talk about getting hoist by your own petard.

      1. Oh

        All these morons who worship at the altar of Hi Tech are totally in denial that each of the glorious apps they are so fond of and the conveniences their gadgets provide are a method by which they give away their privacy – this includes Ring, Facebook, Google, YouTube, Amazon’s products, any store app, etc. etc. While giving away their privacy they’re giving away mine too.

        1. TimH

          While giving away their privacy they’re giving away mine too.

          Exactly. Ring, Facebook where the people are identified in photos, and Ancestry.com

  2. lex

    I’ve tried to explain to the anti-mask types that the real government intrusion on liberties is all the cameras with facial recognition programs, and that likely our best defense at this point is to have a good excuse to wear masks. Hasn’t worked yet. Excellent article.

    1. rob

      I agree,
      I’ve been saying for the last two years that I don’t mind wearing a mask because we have a real problem with facial recognition idiocy being deployed seemingly everywhere.

  3. flora

    2019 was a year of big public protest around the world; protests against neoliberal economic policies, protests against authoritarian govts, protests in France and Spain and Hong Kong and the US and Canada and many other countries.

    There won’t be public debate about increasing surveillance because, imo, this is exactly what western govts want to stop the protests and protestors. (In one respect, I think the C19 lockdowns were welcome by govts for the effect of stopping protests.)

  4. digi_owl

    I do wonder how long before EU tries to expand that into tapping into passport biometrics.

    Because as best I can tell, they are already expanding said biometrics to be included on ID cards (which have to be refreshed every 5 years or so, natch). This is ostensibly so that you can travel within the EU without carrying a passport.

    I swear, losing the Stasi as a low water mark to point to has made European politicians lose their minds (or simply revealed their true nature).

  5. Teejay

    Re: Biometric Surveil.: 8-10 years ago, after a dinner I attended, school age kids played a *game* on their smart phones. The app took their fingerprint and spit out an FBI report alleging they were in fact wanted for something. The kids all laughed at the different responses that popped up when they each tried it out. “Isn’t this fun?” Uh, no. The normalization of giving out private information so cavalierly, treating it like it’s a game and training kids not just to be OK with it but to think how cool it is, horrifies me. I made a couple of attempts to find out about the app but failed. If anybody in the NC community is familiar with this please leave a link or other info.

  6. QuicksilverMessenger

    I have also noticed, while watching a couple of programs on Apple TV (one of them Slow Horses, which was talked about here the other day; the other was Suspicion, maybe? Even the pretty interesting show Severance is rife with surveillance), that so much of the crime, cop and spy dramas are filled with surveillance cameras and these NASA Mission Control-looking monitoring centers buzzing with activity where they are tracking people everywhere they go. Do these actually exist? At any rate, it certainly inures one to the widespread use.

    1. digi_owl

      I think they to some extent do, at least in the UK (supposedly London has the most cameras per km² in Europe).

      And your comment brings to mind Enemy of the State. Interestingly, the critics were almost unanimous in their dismissal of it as being over the top. Wonder if they would have been so quick to do so after Snowden’s info dump.

  7. Hepativore

    It is only a matter of time before biometrics are made a requirement for boarding a flight by the TSA…and I doubt that either the government or various security companies are going to care how much of a public backlash there is, as they have long stopped caring about the opinions of what they perceive as being “rabble”

    1. TimH

      biometrics are made a requirement for boarding a flight by the TSA

      Only for scheduled flight passengers. Those leaving from the general aviation plaza won’t be fussed and mussed.

  8. Dee C. Burns

    1984, V for Vendetta, Brave New World, and now Commies Overtly Vicious Insidiously & Dangerous. These monsters must be stopped. I surely feel sorry for the next generation.

    We can use this against them; get masks of those most hated, so all cameras will see are Turdope, Ford, Tory, Biden, Zelensky, Tam, Hadju, Freeland, Fauci and so many more. Someone could certainly make a good living selling these masks. After all, the point is ultimately to imprison terrorists.

    Quick story of how I used an edict against the perpetrators: I worked for a company as an office employee, and they had me use a timeclock. I arrived early every day, left late every day and saw the extra time add up. They too saw how much overtime I had added to my pay, and within a couple of weeks, no more time clock punching for me. Also, no more early arriving or late leaving. They learned, I earned first extra money, then freedom. Better to make fools of them than a slave of me.

    1. RobertC

      Dee — Bravo! I’ve always said Work-to-Rule is the most effective labor action ever. Kurt Gödel proved that for us. When young engineers were having difficulties with their (government) management I gave them that advice.

  9. RobertC

    Many decades ago when 1MB of CPU memory occupied floor space I was invited to a seminar at UCSD on the future of computing. I asserted that some day each of us would have a computer dedicated to recording our lives. One attendee vociferously and furiously disagreed with me that such computing capacity would ever exist.

    We were both wrong. It’s many computers, one of which we carry in our pockets and others all too eager (and effective) at not only recording but guiding our lives.

    This information is not energy-free. As rapid climate change affects and limits our food, housing, transportation, etc choices at what point will the information collectors and vendors energy usage be rationed? And who will prioritize that usage?

  10. rob

    Just like all the creeping madness that never seems to stop…
    The IRS was planning, then decided not to (for now), to have everyone who files a return electronically register with some form of biometric standard to access their returns.
    The IRS, which since the onset of obomney care, has intertwined our financial/birth identity markers with our healthcare/check-up history/medicines and conditions…

    There really is no resistance. the absence of any of this data, will be in itself a red flag.
    We will have to “deep fake” our lives, so that the overlords “think” they are collecting accurate info.. which for us older people may be something .. but for kids.. there will be no way around the mountain of collected info.

    As an aside, what gets me is that since post-9/11, when the bush admin was caught setting up “splitter” rooms to effectively comb ALL ELECTRONIC COMMUNICATIONS ON THE PLANET, and was building storage for tens of terabytes? in Utah… That 1.4 million sq. ft. facility in Utah, which can hold the info in a library in something the size of a credit card… how many times over? and how many others?… but with all of this surveillance, how is child and sex trafficking and internet abuse still so rampant? About ten years ago there was one child porn site which was found with @70,000 associated credit card numbers… If the internet is rife with all this… and I’m guessing it is…

    That means all these billions spent on this total information awareness.. seems to omit human trafficking ,and other despicable crimes…
