How Facial Recognition Technology Is Bringing Surveillance Capitalism to Our Streets

Yves here. Facial recognition is being adopted rapidly, particularly in commercial settings. And I must confess I didn’t know that Bernie Sanders is the only Presidential candidate to call for a ban on its use by police.

By Freddie Stuart, the producer of ourEconomy’s ourVoices podcast. He is also the co-founder of The Junction and part of the media team at The World Transformed. He tweets at @freddiestuart12. Originally published at openDemocracy

Last month it was revealed by the Financial Times that facial recognition cameras had been used to identify pedestrians in the Granary Square area of the new Kings Cross complex in London between 2016 and 2018.

Argent, the developer and asset manager charged with the design and delivery of the site, admitted to the use of two CCTV cameras equipped with biometric technology to map facial features. These cameras then ran this information through a database supplied by the Metropolitan Police Service to check for matches.

The estate at Kings Cross is a privately owned complex, but one that is used by thousands of members of the public every day. In addition to over 2,000 homes, the area hosts a variety of shops, hotels and music venues, as well as the world-renowned Central Saint Martins School of Art.

Rightly, there has been a public outcry over the covert use of this technology. The Information Commissioner’s Office (ICO) has since launched an investigation, whilst London Mayor Sadiq Khan has written to the development’s CEO seeking an explanation.

Although it has garnered significant press attention, Kings Cross is just one example of how facial recognition technology is being rolled out across “public” spaces in London. Last month the Evening Standard revealed that planning permission has been granted by the City of London Corporation for the implementation of advanced surveillance cameras in The Barbican Centre, where 16 of the new 65 cameras will be capable of recognising faces, and possess an invasive two-way audio feature – “potentially allowing controllers to listen in”.

Meanwhile similar projects have been approved at Liberty’s department store in Soho, and Hay’s Galleria on the South Bank. The Financial Times also exposed proposals for the installation of privately owned facial recognition cameras across the 92 acre estate at Canary Wharf.

The same piece noted: “convenience stores such as Budgens, and supermarkets – including Tesco, Sainsbury’s and Marks and Spencer – all have cameras that are already, or soon will be, capable of facial recognition”.

Just as we have grown accustomed to the 500,000 CCTV cameras operating across London today, facial recognition may soon become a ubiquitous norm of everyday life in the 21st century.

So What’s the Logic?

Despite the claims of private companies that this technology is introduced solely ‘to help ensure public safety’, research from the independent not-for-profit activist group Big Brother Watch shows that, on average, the current biometric cameras identify individuals incorrectly over 90% of the time.

In its research, Big Brother Watch found that the Metropolitan Police’s facial recognition matches were 98% inaccurate, whilst the same technology used by South Wales Police failed to find a correct match on 91% of occasions.
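The gap between a camera that sounds capable and a 98% false-match rate is largely a base-rate effect: when genuine watchlist matches are rare in a crowd, even a small false-positive rate swamps the true alerts. A minimal sketch of the arithmetic, using hypothetical figures (the crowd size, watchlist count and error rates below are illustrative, not drawn from the Big Brother Watch report):

```python
# Base-rate arithmetic: why a matcher that catches most genuine targets
# can still be wrong about nearly every alert it raises.

def alert_precision(crowd, watchlist_hits, tpr, fpr):
    """Fraction of raised alerts that are genuine watchlist matches."""
    true_alerts = watchlist_hits * tpr            # genuine matches flagged
    false_alerts = (crowd - watchlist_hits) * fpr # innocents wrongly flagged
    return true_alerts / (true_alerts + false_alerts)

# 100,000 faces scanned, 50 of whom are actually on the watchlist;
# the matcher catches 90% of them but misfires on 1% of everyone else.
p = alert_precision(crowd=100_000, watchlist_hits=50, tpr=0.9, fpr=0.01)
print(f"{p:.1%} of alerts are correct")  # ≈4% genuine, ≈96% false alarms
```

With these made-up numbers, roughly 96% of alerts are false alarms despite the matcher catching nine out of ten genuine targets — the same territory as the inaccuracy figures reported above.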

So if the current technology isn’t useful for identifying criminals, why are companies rolling it out across the capital? Do private companies currently face such pressing security issues that the testing of facial recognition technology is a necessary breach of individual privacy?

Another explanation for this insistence is the potential monetisation of valuable data inherent in surveillance technology.

In our modern economy, data has become a prized economic resource. Once aggregated and refined, collections of data can create powerful predictive models of human behaviour, which provide valuable information to instruct business decisions.

In her recent book, ‘The Age of Surveillance Capitalism’, Shoshana Zuboff has popularised discussions of how this plays out in our modern digital economy. She explains how the value of this new resource has led to a race to extract data, with tech companies utilising digital platforms as a medium of user engagement, and thus a mechanism of data generation.

This has led to covert surveillance techniques in the online marketplace, where sites used by members of the public are owned and surveyed by private companies. It has now become commonplace to hear of Google using your individual searches to sell targeted ads, Twitter promoting content on your feed based on who you follow, or Facebook data being scraped to enhance political campaigns.

But while the centrality of data to the business models of tech companies is well-documented, the collection of data in privately owned physical space is a relatively unexplored phenomenon.

Just as tech firms control much of the publicly-used internet, ownership of open space is increasingly being taken on by private corporations. In 2017 The Guardian published a map of what it called the growth of ‘pseudo-public’ spaces in London – ‘open and publicly accessible locations that are owned and maintained by private developers or other private companies’.

In the same way that the ownership of online platforms is used as space to collect personal information, these physical spaces could soon become the real-world data mines of private firms.

Here, visual surveillance plays a leading role. If a company is able to identify even the rough demographic of pedestrians in areas such as Kings Cross, it can sell this as valuable information to businesses, enabling informed decisions on issues such as location, opening hours or advertisements.

More advanced facial recognition technology may be able to identify the individual, and advise companies looking to tailor their products in real-time.

Once perfected, the ultimate potential here is for facial recognition to match an individual to their digital online profile, connecting the physical data with its digital counterpart. This would enable fine-grained control for firms, altering displays such as digital advertisements in response to detailed information on their particular audience.

In short, just as big tech companies utilise data to tailor our interaction with the digital world, facial recognition technology presupposes a panoptic physical world of administered perception.

Sounding Like a Far-Fetched Dystopian Nightmare? It Isn’t.

A recent report from the Carnegie Endowment for International Peace found that ‘AI surveillance technology is spreading at a faster rate to a wider range of countries than experts have commonly understood’.

According to the report’s executive summary, ‘at least seventy-five out of 176 countries globally are actively using AI technologies for surveillance purposes. This includes: smart city/safe city platforms (fifty-six countries), facial recognition systems (sixty-four countries), and smart policing (fifty-two countries)’.

The most prominent distribution of this technology is by Chinese companies, which supply AI surveillance technology in sixty-three countries. Thirty-two countries are supplied with similar technology by companies based in the US.

There are many leading examples of how this surveillance technology is used for datafication.

In the US, a company called Cooler Screens has embedded products with sensors and digital screens to individually target advertisements at customers. Walgreens is now rolling out this technology, which seeks to analyse a customer’s age, gender, time spent browsing, and even emotional responses.

Similar technologies were discovered in malls in Canada in 2018, when the media outlet CBC reported that facial recognition had been used to predict the approximate age and gender of customers. The technology was only discovered when ‘a visitor to Chinook Centre in south Calgary spotted a browser window that had seemingly accidentally been left open on one of the mall’s directories’.

Perhaps most relevant for exploring the datafication of facial recognition is the app FindFace. FindFace, which launched in Russia in 2016, made it possible to find an individual’s social media profile simply by capturing an image of their face. In 2016, the Independent reported that ‘FindFace’s creators are working with Moscow police to integrate their software into the city’s CCTV camera network, so authorities will be able to detect wanted suspects as they walk down the street’. FindFace is no longer open for public use, but a similar company SearchFace is now operating in Russia.

Each of these cases demonstrates the capability of surveillance cameras to advance mechanisms of datafication into the real world, analysing, contextualising and commodifying our physical data.

The threat this poses to our right to privacy is obvious, but it also undermines the agency of the individual by denying the notion of truly public space. At its heart, this marketisation threatens to dispossess humans of their independence in the name of ‘convenience’, and by doing so challenges the very notion of individual freedom.

So What Is To Be Done?

In the UK, the use of facial recognition technology to monitor members of the public for commercial purposes is illegal without prior notice. This is in accordance with the EU’s General Data Protection Regulation, which attempts to roll back the datafication of the social commons by requiring private companies to obtain explicit consent from individuals before collecting sensitive personal data.

In cases such as Kings Cross however, the quoted intention is not commodification but securitisation: ‘to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighbourhood’.

According to a High Court ruling issued in Cardiff earlier this month, the use of facial recognition by the police is permissible, despite the judges acknowledging that this technology interferes with fundamental privacy rights.

As it stands, there are no checks and balances on the use of facial recognition by private firms, so long as it is deployed as part of a security strategy.

Yet as the prevalence and sophistication of surveillance technology come to light, many are beginning to challenge this reality.

In July, the Commons Select Committee on Science and Technology published a report by the Biometrics Commissioner and the Forensic Science Regulator which called for ‘a moratorium on the current use of facial recognition technology’ and noted that ‘no further trials should take place until a legislative framework has been introduced and guidance on trial protocols, and an oversight and evaluation system, has been established’.

In a similar vein, the human rights group Liberty has launched a petition calling on the Home Secretary to ban the use of facial recognition technology in public spaces. It also intends to appeal the High Court ruling that found police use of biometric surveillance lawful.

In the United States, Bernie Sanders is the first 2020 Democratic presidential candidate to promise a ban on all use of facial recognition technology by the police, putting pressure on others to take a tougher stance against covert surveillance.

These responses are the start of a much-needed backlash against invasive technology, and the potential datafication of the public sphere. Just as a fierce struggle has retrospectively begun over the human right to autonomy in the digital commons, a battle must now begin over our individual rights in physical space.

We must draw definitive red lines when it comes to the extraction and use of all kinds of personal data, whilst fundamentally reimagining the use and control of privately owned public spaces.

To democratically channel the power of modern technology, we need new models of data and land ownership fit for our age. Failing this, our valuable open spaces like Kings Cross may yet become the real-world data mines of 21st century surveillance capitalists.


34 comments

  1. ambrit

    The suggested remedy for this ‘problem’ is regulation. The recent history of the Western economic and financial sphere has shown that regulations can be ‘rolled back’ with the change of an administration. The rollout of platforms like Uber and Lyft shows that today, breaking the law is considered a legitimate business strategy. (More on that later.) The concept of the “Manufacture of Consent” shows that the ‘public’ cannot be expected to always know what its best interests are, much less pursue them.
    My suggestion for a remedy to this ongoing phenomenon is the idea of “Radical Privacy.” Simply put, ‘Radical Privacy’ encompasses the private utilization of any and all surveillance countermeasures. Things like the purported pattern-defeating head cover (a modern burqa), personal-carry radio frequency scramblers, camera-disabling flash devices, and others will slowly gain acceptance among the public, with or without ‘official’ acceptance.
    History has shown that this modern “enclosure movement” is not ‘officially’ stoppable. It is up to the individual to not only try to opt out of this creeping Panopticonism, but to actively oppose it. This is a definite case where breaking the “law” is a positive social act.

    Reply
    1. xkeyscored

      I agree about how quickly regulations can be rolled back. Just look at how eager the US was to relinquish its precious rights and liberties in the wake of 9/11.
      I’m not at all sure about the effectiveness of your suggested counter-measures. Most of them – pattern-defeating head cover (a modern burqa), personal-carry radio frequency scramblers, camera-disabling flash devices – can be easily detected, making the user a target for more conventional police tactics.

      Reply
      1. ambrit

        The risks are palpable, true. However, this is verging on becoming a revolutionary situation. Otherwise, we will end up as neo-feudal peasants, but without all the rights real feudal peasants had.

        Reply
        1. notabanker

          “Otherwise, we will end up”

          Welcome to tomorrow. It’s already here. We don’t need to wait for dystopia, it is here.

          Reply
          1. ambrit

            Hah! My idea of a neo-liberal Miranda warning:
            “You have the right to remain silent. Anything you do say, write, or think will be taken down and can be used against you. You can have ‘access’ to an attorney. If you cannot afford an attorney, tough luck.”

            Reply
    2. Tom

      Individuals should not have to engage in a technical and operational war with the surveillance state. The more time, resources and psychic energy a person spends on surveillance countermeasures, the more liberty and quality of life is lost. Even a highly advanced and motivated (paranoid?) private citizen will lose that war because the corporations and the state have way better tech and way more resources. Like the 2nd amendment militia enthusiasts in the USA, they are out-gunned at absurd ratios.

      Similarly, I don’t think we should counter the pervasive daily fear of gun violence by arming ourselves and being constantly alert to threats, as the NRA proposes.

      Only a political solution will be acceptable to me.

      Reply
      1. ambrit

        My counterargument is that the groups that control the Panopticon will control the political sphere, and thus, the institutional life of the nation.
        The rationale behind my “Radical Privacy” idea is that the adoption of same by a sufficient percentage of the public will render the maintenance of the Panoptical State counterproductive. “Radical Privacy” is an assertion of rights, and the rejection of the ‘privilege’ theory of citizenship.
        The main ‘evil’ inherent in the Panopticon system of governance is its abrogation of the right to be left alone. This is all about power, whether it is to be centralized or diffuse.

        Reply
        1. Tom

          If the propaganda world is sufficient to keep any democratic political action small, then it can be effective against your proposal as well, since it too must become a mass movement to be effective.

          Reply
          1. ambrit

            True enough, if a majority of the population is needed to effect the ‘disruption.’ Here is where the ‘economy of scale’ comes into question. What percentage of the population is needed to truly ‘disrupt’ this system? I’m guessing that this point is what undergirds the Marxist idea of the ‘Vanguard of the Proletariat.’ A cadre can lead a group to change. Else, how do the present socio-economic elites control the system we inhabit? To paraphrase the Anarchists, how will the clash of the two systems, the Propaganda of the Word and the Propaganda of the Deed, turn out?

            Reply
  2. oaf

    …Recently: A buddy and I had been working at a remote site on a weekend. Exhausted; but satisfied at a challenging task being completed; ( I had promised myself a beer IF we got it done) we stopped to gas the truck at the *convenience* store…Since I wasn’t the driver, I went inside and soon found the refreshment I sought. The counter attendant carded me (!!! WTF!!!)( but; okay. (rules is rules; that’s policy; et al.)
    Since I was really parched; I grumbled; but got out my license…which he took from me ; and scanned with a device at the register!!! Since when do they need to collect personally identifying information; including address; DOB, and more; in order to sell me one beer??? I was not informed that my license would be scanned.(besides my face being on multiple security cameras on the premises) Not much to be done at that point but rant, in front of the other customer…..I would love to include the company name in this post; but that might be problematic. It is owned by one of the several big fuel suppliers in the area (Maine). I won’t be going back to that chain unless to document the seemingly unlawful practice.

    Clue: whir the people of _ _ _…

    Reply
    1. ambrit

      Yes about that normalization of surveillance. I was carded for a beer once a decade ago. That was when I was over fifty and looked it. The checkout woman asked for another form of ID because I had an out of state drivers license. I caved in. I should have done the following:
      “I’ll need to see another ID. This is out of state,” she says.
      “Really?” I reply.
      “Yes” she counters.
      “Well,” I say, “Will my release papers from the State Pen do?”
      “I think you’ll have to buy your beer somewhere else mister,” is her answer to that.
      “That’s all right,” I say. “I’ll have to ask my parole officer about this entire beer business anyway. Things have sure changed while I was away.”
      Nonetheless, I have never been back there even though I drove past it at least once a week for a year.

      Reply
      1. Yves Smith Post author

        A simpler answer would have been: “This is good enough for the TSA, state and city cops, and any liquor store in this state. Do you really want this for ID or for identity theft?”

        Reply
  3. Carolinian

    an invasive two-way audio feature – “potentially allowing controllers to listen in”

    Francis Ford Coppola’s The Conversation – a fable about the perils of surveillance and technological alienation – comes to mind. Gene Hackman deploys an entire team with directional mikes and cameras to spy on two lovers. Now the government will be able to do the same thing with the push of a button. There have been some comments here about Walmart’s new cameras-everywhere revamp (with lots of monitors showing your picture to add to the intimidation factor). It is creepy, but it says it all that Walmart thinks the public will accept and not object (and they seem to be right). We are losing all of our privacy unless one wants to strap on a backpack and head for the woods with no electronic devices – an increasingly appealing option.

    Reply
    1. ambrit

      A sign on the entrance doors of my local bank prohibits the carrying of firearms inside the premises and the wearing of hoodies and sunglasses.
      Deviance has already been defined down. I knew roughly that I was a deviant, just not how much of one.
      Concerning sunglasses; if I see anyone resembling Rowdy Roddy Piper entering my bank, I’m running in the opposite direction.

      Reply
  4. jashley

    Sorry, but you have waived any rights to privacy by the very act of using a cell phone or moving about in a vehicle on public roads.

    The idea that they need a face to pin you down is so far from the reality of the data collection as to be from the horse and buggy days.

    This horse has left the barn and long since over the hill out of your sight.

    Reply
  5. Steve Ardire

    Get the book “Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All” by Arthur Holland Michel https://www.amazon.com/Eyes-Sky-Secret-Gorgon-Stare-ebook/dp/B07FK9567C

    I had the pleasure of hearing him speak, then he was on my panel on Ethics in #AI and Privacy Implications of #facialrecognition technology; see the video recording here https://www.forcemultipliersteveardire.com/big-data-ai-conference-dalla-2019. One of the other panelists was a very bright 20yo UT Dallas student doing her own AI startup

    Reply
  6. Tom

    I’m 100% in favor of transparency and regulation of facial recognition. But what about, for example, the 500,000 cameras in London that don’t have this tech? Are they ok? What I mean is, how do we figure out what tech is acceptable and at what scale?

    I don’t think we can have a properly informed discussion about that without trying to understand what it’s all for. What, ultimately, are we trying to accomplish?

    For one thing, I’m concerned about the emphasis on crime prevention as an ultimate goal. That justification will be pretty persuasive to a frightened public. And it has no logical way to end. If introducing such-and-such new tech or expanding this-or-that existing program might prevent crime, how does a politician oppose it?

    Reply
    1. xkeyscored

      They might try opposing and preventing the massive crimes going on in broad daylight, such as the endless wars on wherever and whatever, and the enrichment of the x% at the expense of everyone else’s impoverishment and the destruction of our environment. Such an approach could be very popular, but I’d expect to see a lot of resistance if it gained traction.

      Reply
  7. noonespecial

    NC community may be interested in this article from Defense One.

    “The Intel Community Wants to ID People from Hundreds of Yards Away”
    https://www.defenseone.com/technology/2019/09/intelligence-community-exploring-long-range-biometric-identification/159971/?oref=d-channelriver

    Some quotes:

    “The intelligence community is working to build biometric identification systems that can single out individuals from hundreds of yards away or more, a feat that’s virtually impossible using the technology that exists today.

    Ultimately, the tech would let spy agencies rapidly identify people using cameras deployed on far off rooftops and unmanned aircraft, according to the Intelligence Advanced Research Projects Activity, the research arm for the CIA and other intelligence agencies. IARPA started looking for researchers to participate in its Biometric Recognition and Identification at Altitude and Range, or BRIAR program, which aims to develop identification tools that work from vantage points high above or far away from their subjects. While the program is still getting off the ground, the tech it seeks to develop could improve the government’s ability to surveil adversaries—and citizens—using biometric data.”

    Reply
  8. Dan

    Do your civic duty: deploy paint, glass-scratching devices and laser pointers to help illuminate despotic darkness.

    The citizens of Hong Kong are leading the way.

    Reply
  9. jfleni

    RE: How Facial Recognition Technology Is Bringing Surveillance Capitalism to Our Streets.

    Read your own posts: yellow and purple face coatings, plus many others, can make you invisible forever. If your whiz-bang phone has a lens, rub a deep black ink over it. Problem solved.

    Reply
  10. lyle

    Actually this is an unexpected consequence of the global village talked about in the early 1990s. Even today in smaller towns clerks recognize you, and if you order the same thing a lot the waiter may ask if you want the usual. In a village everyone recognized you. In principle, if sufficient funds exist, one could do this in smaller towns with folks just sitting on street corners. In a village everyone probably knows everyone’s car also. So what we have gotten is the extension of the village to big cities, which means the lack of privacy in villages is just being extended to cities.

    Reply
    1. xkeyscored

      The idea of privacy is quite alien to some cultures. In Indonesia people assume you’re up to no good if you want any. And it doesn’t mean much when you live ten to a room in a slum.

      Reply
  11. sangell51

    It’s going to be a tough world for criminals as surveillance becomes ubiquitous, that’s for sure. Baltimore is going to fly some Cessnas over that troubled city that will track people and vehicles leaving a crime scene, and facial recognition is no different in function than having a police officer spotting a wanted person from his squad car; it just multiplies the number of eyes looking for that person.

    The dystopian aspect is not so much businesses collecting demographic data for marketing purposes. They’ve been doing that forever. It’s annoying perhaps to look at a car online and then have ads for that car appear on your Facebook page or pop up on whatever website you visit. Where it can get scary is when you have a James Comey-style FBI looking to justify an investigation by selective use of data to imply a nefarious purpose. An anecdote. Many years ago a gay friend of my mother told her she saw me walk into a gay bar in San Francisco. She asked me about it. I racked my brain trying to figure out what she was talking about. I’m not gay, but I needed to know the time and location. Once I had that info I remembered I had gone into a bar there to use their restroom. Didn’t know it was a gay bar, just that I needed to use a restroom. Of course a government out to make trouble for you could use instances like that to create a damning case against you.

    Reply
  12. chuck roast

    I have a dumb phone and do everything dumb, like pay cash. Let me take a leap of faith here and ask why the cops, facing a greater than 90% inaccuracy rate, would support the spread of this dumb technology? Maybe because the cops are dumb?
    Yeah, crapification is everywhere, and the rewards for it appear to be enormous and growing. So, now we add dumbification to crapification?
    I was rowing my dink the other day and my port oarlock almost bit the bag. Fortunately, I nursed the craft downwind to the local shipyard.
    Crapification and dumbification…picture a mariner in a dinghy rowing his starboard oar for all he is worth.

    Reply
    1. ambrit

      I’m presently experiencing another neo-liberal conundrum. My cheapie cell phone provider is “upgrading” their network to 5G in this town. They want all their customers to “upgrade” their phones to something like an older model of iphone, with all the bells and whistles. I complained and discovered that this provider does not offer any “dumb” phones for the replacement. I’m scrambling to find a ‘dumb’ phone compatible with 5G, that costs less than a monthly mortgage payment!
      Welcome to the Machine.

      Reply
  13. ook

    Facial recognition technology has been a thing at restaurants for quite a few years. In 2015 I read about a platform (cannot find the reference) that recognized big spenders in the vicinity and notified the restaurant with pictures of the customer and whatever other data they had. In 2015 that was based more on proximity of cell phones, probably still is, and I don’t think a dumb phone will protect you from that.

    Reply
