This time it’s patients’ biometrics — their most sensitive data of all — that appear to be at risk.
The UK’s National Health Service’s rush to embrace remote care has not only left many patients struggling to see a doctor in person; it has also opened up a rich vein of data mining and data managing opportunities for big tech companies, including Google and Microsoft. Even the controversial US spy-tech firm Palantir, which was founded with support from the CIA in 2003, has shared in the spoils. But then, in late May, a scandal broke, as I reported just over a month ago in Going, Going, Gone: UK Government Speeds Up Privatisation of National Health System:
Managers at NHS Digital [had come] up with an ingenious plan to digitise and share up to 55 million patients’ private health data with just about anyone willing to pay for it. That data includes sensitive information on physical, mental and sexual health, as well as gender, ethnicity, criminal records and history of abuse. It could even include a patient’s drug or alcohol history. The NHS Digital managers kindly allowed patients to opt out of the scheme; they just didn’t bother telling them about it until three weeks before the deadline, presumably because millions of patients opting out of the scheme would have meant less money for the NHS.
When the FT finally broke the story, a scandal erupted. NHS Digital officials have since scrapped the scheme, saying they now want to focus on reaching out to patients and reassuring them their data is safe.
Unfortunately, the scandal doesn’t appear to have spurred much in the way of meaningful change at NHS Digital. This past weekend, just three months after the agency’s last data-related scandal, another one broke. Undisclosed companies, it turns out, are managing facial recognition data collected by the NHS App, which now has 16 million English users (just under 30% of the country’s population). The story, broken by the Guardian, has sparked fresh concerns about the safety of patient data in the hands of (often unidentified) private businesses:
Data security experts have previously criticised the lack of transparency around a contract with the NHS held by iProov, whose facial verification software is used to perform automated ID checks on people signing up for the NHS app.
The Guardian now understands that French company Teleperformance, which has attracted criticism in the UK over working conditions, uses an opaque chain of subcontractors to perform similar work under two contracts worth £35m.
The NHS App, not to be confused with the Covid-19 app, can be used to perform a whole host of functions including booking GP appointments and ordering repeat prescriptions. Its number of users has doubled since May, when it became the easiest means of accessing the NHS certificate proving an individual’s Covid-19 vaccination status (before the pandemic it had fewer than a million users). In the absence of an official vaccine passport in the UK, the NHS App has become the next best thing.
To access the app’s services, users must go through an ID verification process. Some are directed to an automated process powered by iProov’s software. Failing that, the NHS App resorts to manual checks, in which users record a short video of themselves reading out a set of four numbers, as well as uploading an ID document. The video is then sent to a team of identity checkers, who compare the ID photo with the user’s face in the video. But the public has no way of knowing which outsourcing companies are performing those checks, or under what terms and conditions.
The NHS also appears to be sharing the facial recognition data with law enforcement bodies, but apparently only after a special panel has analysed the formal request. An expert in surveillance law cited by the Guardian said such information was also likely to be of interest to UK and foreign intelligence services:
“If GCHQ acquired it and it was of use, the likely position is that they would share that with the [US] National Security Agency.”
As with previous scandals, the lack of transparency appears to be a feature, not a bug, of NHS Digital’s outsourcing practices. Until now, most patients haven’t even been aware that their most sensitive data is up for sale, or that it could end up being managed by deeply conflicted companies like Palantir. This is a company that specialises in online surveillance: its main line of business is providing data-science support to US military operations, mass surveillance and predictive policing. In February, Palantir’s chief operating officer told investors that the company was driving towards being “inside of every missile, inside of every drone.”
It’s easy to understand why Palantir may want to diversify into health sciences in the UK (just as it is in the US): there are huge amounts of money to be made. Last year alone it racked up £22 million in profits on the back of its NHS data deals. But it’s a lot harder to fathom why the UK’s National Health Service — the first health system in any Western society to offer free medical care to the entire population — would partner with a company that deals in death on such a gargantuan scale. “Their background has generally been in contracts where people are harmed, not healed,” said Cori Crider, co-founder of Foxglove, a UK legal non-profit that challenges the misuse of technology by governments and big tech.
The good news is that after strong pressure from Foxglove and openDemocracy, the UK government finally relented and rescinded the NHS’s contract with Palantir earlier this month. But it’s impossible to know what will happen to all the data Palantir has managed once the contract is up. What’s more, Palantir is not completely out of the picture: through its investment in London-based start-up Babylon Health, which provides AI-powered digital check-ups and helps users navigate the NHS system, the spy-tech giant still has its fingers in the NHS pie.
Another Silicon Valley giant that appears to be calling it quits on the NHS while keeping one or two fingers in the pie is Google. The company’s AI arm, London-based DeepMind, began partnering with the NHS in 2016, initially to improve the detection of acute kidney injuries. This gave it access to the sensitive data of over a million NHS patients, in a deal that the UK’s data watchdog later found breached the law, as patients were not adequately informed that their data was being used. The contract was later taken over by Google Health, which now says it is decommissioning Streams, the clinician support app it had rolled out for NHS clinicians. It is also severing its ties with all of the NHS trusts it has partnered with, bar one: the Royal Free London.
One of the most curious aspects of DeepMind’s, and later Google Health’s, partnership with the NHS is that it never actually involved the development of new AI technologies, reports TechCrunch:
[D]espite being developed by Google’s AI division — and despite DeepMind founder Mustafa Suleyman saying the goal for the project was to find ways to integrate AI into Streams [Google’s clinician support app] so the app could generate predictive healthcare alerts — the Streams app doesn’t involve any artificial intelligence.
An algorithm in Streams alerts doctors to the risk of a patient developing acute kidney injury but relies on an existing AKI (acute kidney injury) algorithm developed by the NHS. So Streams essentially digitized and mobilized existing practice.
Shrouded in Secrecy
NHS Digital’s latest scandal involves French-based multinational Teleperformance, one of the world’s biggest providers of phone services, including customer care, technical support, debt collection, cloud services and social media management. The company, whose clients include Apple, Google, Amazon and Facebook, is also no stranger to controversy.
In April 2020, at the onset of the pandemic, Teleperformance was accused by French and international trade unions of violating workers’ rights to a safe workplace in ten countries, including Albania, Colombia, France, Greece, India, the Philippines and the United Kingdom. There were also reported instances of union busting in Colombia and Albania. This prompted an investigation by the OECD, at the end of which France’s National Contact Point (NCP) issued six recommendations, including that the company should strengthen efforts to ensure respect for human rights and worker safety. It also urged Teleperformance to “ensure, as soon as possible, that its Albanian and Colombian subsidiaries respect the right of workers to form or join trade unions and representative organizations of their choice.”
Teleperformance has also drawn criticism in the UK over working conditions. Earlier this year, it courted controversy for using webcams to monitor staff in many countries, checking whether they are looking at their phones, eating or leaving their desks while working from home. Union officials warned that it could mark the beginning of “full-time visual monitoring of people working in the home”. Unions also allege that Teleperformance asked staff based in Colombia to hand over their biometric and medical data and undergo polygraph tests.
Despite these scandals, Teleperformance has been entrusted with managing the facial recognition data of the NHS App’s millions of users. Yet most of the details of its business relationship with the NHS, including the identity of the companies to which Teleperformance outsources some of the ID checks, remain shrouded in secrecy. Once again, from the Guardian:
Both NHS Digital and Teleperformance declined to provide a list naming the subcontractors. The NHS has published a partly redacted version of one of the contracts with Teleperformance, a £7m agreement covering April to June this year, but has not published a larger £28m contract running from June 2021 to March 2022.
It also hasn’t published a data protection impact assessment (DPIA), a document governing how the personal data of people signing up to the NHS app is used, collected and stored.
None of this is confidence-inspiring. Nor is the fact that Teleperformance refused to comment on how it processes and protects the data its manual checkers receive, or the fact that NHS Digital is only willing to publish a heavily redacted version of one of the contracts it has signed with Teleperformance. According to the Guardian, it is considering publishing redacted versions of the second contract, the iProov contract and the DPIA. At a time when public trust in health authorities has never been so important yet so fragile, that does not go nearly far enough, say transparency advocates:
Civil liberties campaign group Big Brother Watch said there was “no reason at all” not to publish contracts and supporting information about the companies involved and their procedures.
“People don’t even know which companies are involved in processing this identification data, where they’re based, or what privacy protections are in place. There is a clear and pressing need for transparency around this curious tech set up,” said director Silkie Carlo.
The concerns echo those expressed earlier this week about iProov’s contract, which also hasn’t been published and is governed by the same DPIA. The government has said the documents have not been published for security reasons.
Clearly, the NHS needs good data for planning, research and healthcare. It also needs external support, since it doesn’t have all the expertise and insights required in-house. But all outsourcing must be done in the most transparent manner possible, so as to preserve public trust. That isn’t happening.
The NHS has one of the largest repositories of public health data on the planet. That’s what makes it such an attractive proposition for global corporations, particularly in the tech, insurance and life sciences industries. And the UK government is desperate to give them what they want, not only to raise short-term funds for the NHS but also to curry favour with many of its own party donors, including the private equity backers of iProov.
Under Conservative management, the NHS has already provided access to UK hospital data and patient records to 40 pharmaceutical, consultancy and data companies worldwide. Those companies include McKinsey & Company, KPMG, Novavax, AstraZeneca, marketing firm Experian and a data company co-founded by the Sackler family, who made billions of dollars selling OxyContin, an opioid painkiller stronger than morphine. In 2013, the Cameron government even granted BUPA, one of the UK’s biggest private health insurers, access to England’s “sensitive or identifiable” patient data.
NHS Digital insists that it has now changed its ways. In August it shelved plans to begin collecting data nationally in September, after more than a million people opted out of NHS data-sharing in just one month. Some GP practices refused to hand over patient data. There is no longer “a specific start date for the collection of data”, and NHS Digital has proposed three “tests” that would need to be met before patient data is collected, including a public awareness campaign and an opt-out option. It has also insisted that the data will not be used for commercial purposes.
But many GPs and patients are not buying it, which you can understand given this government’s short but storied history of corruption, malfeasance and cronyism as well as reneging on previous pledges. That’s not to mention, of course, its reckless, disdainful stewardship of the NHS, which is aptly summed up in the following montage by Cold War Steve (h/t Lambert):