NHS to Use US “Spy-Tech” Firm Palantir’s Platform to Extract Patient Data Without Patient Consent

Palantir, with intimate ties to defense, intelligence and security industries around the world, seems set to play an even larger role in the UK’s crisis-ridden National Health Service (NHS).

Last summer, as readers may recall, executives at NHS England — the non-departmental government body that runs the National Health Service in England — came up with an ingenious plan to digitally scrape the general practice data of up to 55 million patients and share it with any private third parties willing to pay for it. NHS England allowed patients to opt out of the scheme; they just didn’t bother telling them about it until three weeks before the deadline, presumably because if they had, millions of patients would have opted out.

When the FT finally broke the story, a scandal erupted. NHS England officials responded by shelving the scheme, saying they needed to focus on reaching out to patients and reassuring them that their data was safe. That outreach never happened. Instead, they waited for the scandal to die down before embarking on an even more egregious scheme.

This time it is patient data from UK hospitals that is up for grabs. And patients will have no opt-out option. In fact, without even consulting patients, NHS England has instructed NHS Digital — which will soon be merged into NHS England as part of the UK government’s accelerated reforms to the NHS’s “tech agenda” — to gather patient data from NHS hospitals and extract it to its data platform, which is based on Palantir’s Foundry enterprise data management platform.

The pretext for taking such a step is that researching and analyzing patients’ hospital data will help the NHS better understand and tackle the crisis in treatment waiting times resulting from the COVID-19 pandemic. But the result will be yet more private-sector involvement in essential NHS processes. And in this case, the company involved is one of the darkest in the tech universe.

A Highly Coveted Prize

The NHS is the world’s seventh largest employer. And it is home to one of the richest repositories of patient data on the planet. “One of the great requirements for health tech is a single health database,” Damindu Jayaweera, head of technology research at UK investment bank Peel Hunt, told Investors’ Chronicle. “There are only two places as far as I know that digitise the data of the whole population from birth to death… China and the UK.”

As the FT reported earlier this year, Palantir aspires to become the underlying data operating system for the NHS. To that end, it has already lured two senior NHS managers to its executive suites, including the former chief of artificial intelligence. It now has its sights set on the ultimate prize: a five-year, £360 million contract to manage the personal health data of millions of patients.

Palantir’s latest encroachment into NHS operations came to light thanks to the publication of board papers just hours before NHS Digital’s latest board meeting, on November 1. Those papers no longer seem to be accessible, so I am relying on a report published on Friday, November 4, by The Register, a British technology news website, as well as a highly detailed Twitter thread by Phil Booth of MedConfidential, a group campaigning for confidentiality and consent in health and social care.

According to Booth, on page 158 of the board papers NHS England instructs NHS Digital to use Palantir’s Foundry platform to “collect patient-level identifiable [hospital] data pertaining to admission, inpatient, discharge and outpatient activity from acute care settings on a daily basis.”

Following previous data debacles, both the NHS and UK government ministers had pledged that in future any patient data shared for research and analysis purposes would be anonymized. But now they are talking about using “pseudonymized” data, which is a completely different thing. The Information Commissioner’s Office (ICO), the UK’s independent data protection authority, which oversees the Data Protection Act 2018 and the UK General Data Protection Regulation (UK GDPR), says the following about pseudonymized data:

Pseudonymising personal data can reduce the risks to the data subjects and help you meet your data protection obligations.

However, pseudonymisation is effectively only a security measure. It does not change the status of the data as personal data. Recital 26 makes it clear that pseudonymised personal data remains personal data and within the scope of the UK GDPR.

“…Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person…”

In other words, says Booth, “while NHS England may want to ignore people’s opt-outs from Research & Planning uses, and contorts itself to say their data’s not ‘confidential patient information’, the law(s) says otherwise.”

There are also serious questions about who exactly will be doing the pseudonymisation, and who will hold the keys, says Booth: “There’s a world of difference between an independent statutory Safe Haven (i.e. NHS Digital), NHS England which wants ALL the data to use for whatever it wants, and Palantir.”
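To make Booth’s point concrete, here is a minimal sketch, in Python, of how a keyed pseudonymization scheme typically works. Everything in it is invented for illustration (the NHS’s actual pipeline is not public), but it shows why whoever holds the key, or a table of known identifiers, can trivially re-identify “pseudonymized” records:

```python
import hmac
import hashlib

# Illustrative only: the key, function name and NHS number below are
# invented; real pseudonymisation pipelines may differ substantially.
SECRET_KEY = b"held-by-whoever-runs-the-pipeline"

def pseudonymise(nhs_number: str) -> str:
    """Map an NHS number to a stable pseudonym.

    The same input always yields the same output, so records can still
    be linked across datasets -- one reason pseudonymised data remains
    'personal data' under the UK GDPR.
    """
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# Whoever holds SECRET_KEY can re-identify patients simply by
# recomputing the mapping for known identifiers:
reverse_lookup = {pseudonymise(n): n for n in ["943 476 5919"]}
print(reverse_lookup)  # pseudonym -> original NHS number
```

In other words, whatever protection the scheme offers depends entirely on who holds the key, which is exactly the question Booth is raising.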

A Dark Company

Named after the “seeing stones” used in The Lord of the Rings, Palantir was set up in 2003 with seed money from the CIA’s venture capital arm, In-Q-Tel (IQT). While it is making significant inroads in the corporate world, its main line of business is providing data-mining technology to support US military operations, mass surveillance, and predictive policing. Its technology is also used by ICE to identify illegal migrants before detaining and deporting them.

When, in 2018, thousands of Google employees refused to participate in Project Maven, a Pentagon-funded AI program aimed at analyzing drone surveillance footage, the project was taken up by Palantir. Critics warn that such technology could pave the way to autonomous weapons that decide who to target without human input. In February 2021, Palantir’s chief operating officer boasted to investors that the company was driving towards being “inside of every missile, inside of every drone.”

This is a company that deals in death on a daily basis but is also rapidly building a stake in the health and life sciences sector. During the early months of the pandemic it was one of a number of companies chosen to help collect, store, process and share data for the United States Department of Health and Human Services (HHS) — a project that the Electronic Frontier Foundation (EFF) warned poses “a grave threat to the data privacy of all Americans.”

On the other side of the pond, the UK government signed a deal in March 2020 with an assortment of private tech firms, including Palantir, to help run the NHS’s massive COVID-19 “data store”. The company charged a mere £1 for its services, but that was enough to get its foot in the door. What was supposed to be a short-term arrangement blossomed into a two-year contract with the Department of Health and Social Care, worth up to £23 million, to continue running the database.

Palantir’s creeping takeover of NHS data services has met strong resistance. In September 2021, the UK’s Department of Health and Social Care was forced to terminate a contract with Palantir over the management of social care data, following a massive protest campaign involving more than 50 groups. The move was taken as a tentative sign that the UK government might finally be pivoting away from Palantir’s services, at least in the healthcare sector. That is clearly not the case.

But even if the UK government had made that pivot, Palantir had a back-up plan in place, as Bloomberg reported in late September. That plan was laid out by Palantir’s regional head Louis Mosley in a Sept. 24 email entitled “Buying our way in…!”, and it essentially involved “hoovering up” small businesses serving the NHS to “take a lot of ground and take down a lot of political resistance.”

As Cory Doctorow notes in his excellent post last month, How Palantir Will Steal the NHS, Palantir has essentially unfettered access to the capital markets, as well as the deep pockets of its founder, the “cartoon villain” Peter Thiel. While it is clear that good data management has a crucial role to play in the future of health and social care provision, Palantir’s unshakeable commitment to proprietary, secretive software development methodologies makes it woefully ill-suited for NHS service provision:

Compare this to Ben Goldacre’s landmark report, “Better, broader, safer: using health data for research and analysis”:

https://www.gov.uk/government/publications/better-broader-safer-using-health-data-for-research-and-analysis/better-broader-safer-using-health-data-for-research-and-analysis

Goldacre argues that the only way to unlock the medical insights in aggregate NHS patient data is with public software: an open and free “trusted research platform” that anyone can audit and verify.

While the code for this platform would be public, NHS patient data would never leave it. Instead, researchers who wanted to investigate hypotheses about the effectiveness of different interventions would send queries to the platform and get results back — without ever touching the data.

This is a system that only works if it’s hosted by democratically accountable public services — not by private actors accountable to their shareholders, and certainly not secretive companies whose primary expertise is in helping spy agencies conduct mass surveillance.
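In code, the pattern Goldacre describes might look something like the toy sketch below (Python, with entirely invented names and records; it is not the API of any real NHS platform). Researchers submit a query and receive aggregate statistics; row-level records never leave the trusted environment:

```python
import statistics

# Toy illustration only: records and names are invented.
PATIENT_RECORDS = [  # row-level data stays inside the trusted environment
    {"age": 67, "treatment": "drug_a", "readmitted": False},
    {"age": 54, "treatment": "drug_a", "readmitted": True},
    {"age": 71, "treatment": "drug_b", "readmitted": False},
    {"age": 48, "treatment": "drug_a", "readmitted": False},
]

MINIMUM_COHORT = 3  # refuse queries on groups small enough to identify

def run_query(treatment: str) -> dict:
    """Answer a research question with aggregates only."""
    cohort = [r for r in PATIENT_RECORDS if r["treatment"] == treatment]
    if len(cohort) < MINIMUM_COHORT:
        raise ValueError("cohort too small to release safely")
    return {
        "n": len(cohort),
        "mean_age": statistics.mean(r["age"] for r in cohort),
        "readmission_rate": sum(r["readmitted"] for r in cohort) / len(cohort),
    }

# The researcher sees only summary statistics, never a patient record:
print(run_query("drug_a"))  # {'n': 3, 'mean_age': 56.33..., 'readmission_rate': 0.33...}
```

Crucially, because the platform’s code would be public, anyone could verify that such safeguards are actually enforced, in contrast to Palantir’s proprietary black box.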

As Doctorow notes, most people in the UK do not want the NHS to be privatised. For them the NHS, founded in 1948 on the principles of free and equal access to medical treatment, is sacrosanct:

But while the British people oppose privatisation, the British investor class are slavering for it. Oligarchs love to loot public services, which is why the IMF is so adamant that the countries it “helps” sell off their public water, housing, even their roads and schools and museums…

[The NHS] has been subject to the death of a thousand literal cuts, as Tories and Labour alike have starved it of resources. More importantly, both parties have turned ever-larger chunks of the NHS over to private-sector looters who have taken over hospitals, services, record-keeping and more.

An Even Bigger Picture

But this is not just about the NHS. It is about our governments’ role as guardians of our most precious data, including our health and biometric information. As governments, central banks and global corporations trip over one another in their rush to bring digital identity programs and central bank digital currencies into existence, that role is set to grow exponentially (unless, of course, we can stop them in their tracks).

In the new digital age that is rapidly forming around us, we citizens will be custodians of our own data. We will be the ones who get to decide which parts of our data get shared, and with whom. At least that is what we are being told. But these are just words, and words can be hollow.

We have to judge our governments on their actions. And their actions to date — including NHS England’s decision to grant custodianship of NHS patients’ hospital data to Palantir without even informing patients, the US State Department’s decision to give intelligence and law enforcement agencies unfettered access to more than 145 million Americans’ personal data, and the US government’s plans to share the biometric data of its citizens with dozens of other governments — speak of a whole different reality.


8 comments

  1. dandyandy

    Great report, Nick, thank you very much.

    Scary beyond belief. Yet more abuse of public ownership without accountability.

    NHS cost cutting in the last ten to fifteen years has already made the quality of hospital care look like that of a war zone. Imagine how much more money can be saved if vultures are called in to implement their enhanced economising and help those nearing their expiry date, well, expire a little earlier.

  2. Fastball

    “the US State Department’s decision to give intelligence and law enforcement agencies unfettered access to more than 145 Americans’ personal data”

    I assume you mean 145 million Americans’ personal data?

  3. TomDority

    J Edgar Hoover is dancing in his grave, seeing that his paranoid and delusion-driven fight against imagined enemies has finally yielded a path to his unconstitutional and ineffective means of political thought policing and intimidation via leveraged private individual information. It will, without doubt, lead to our fusion centers, police and homeland security departments claiming they need to be in possession of this information to help alleviate the fear and fright our politicians have been driving for decades into the heart of every American. How could anything go wrong when it is in the capable hands of private capital and political oversight dedicated to the financial capitalistic empire?
    A good read on this century-long struggle to capture private info:
    The Federal Bureau of Investigation by Max Lowenthal – 1950

  4. Kouros

    Oh, Pharma companies will salivate over this information. Looking at hospital costs for various treatments, they will be able to adjust (upward) the prices of drugs that purportedly prevent hospitalizations. The miracle drug for HepC was priced the same way: however much it cost to treat HepC for years with traditional treatments (interferon etc.), that is how much the new drug would cost.

    How do I know? I was for years at the forefront of data stewardship in a Canadian province, and I saw things and heard things…

    1. Greg

      That pricing mechanism is exactly the kind of logic I would expect from sociopaths armed with MBAs. Thanks for sharing.

  5. Dave

    If data isn’t processed properly it isn’t our data.
    It’s just stuff that’s come out of a computer.

    Processing data can be challenging. It’s much harder than making up stuff for the media. For example, consider medical ‘data’.

    Richard Horton, editor of the Lancet, in March 2004
    “Journals have devolved into information laundering operations for the pharmaceutical industry”

    Marcia Angell, former editor of the New England Journal of Medicine,
    lambasted the pharmaceutical industry for becoming “primarily a marketing machine” and co-opting “every institution that might stand in its way”

    Statisticians issue warning over misuse of P values

    Scientists’ grasp of confidence intervals doesn’t inspire confidence

    Is Palantir giving the public any guarantees about the quality of its data analysis?

    GIGO can mean Good In Garbage Out.

    The output from this initiative might not be the ‘data’ that is currently being claimed by the government and Palantir.

    Whilst the material that the NHS has is potentially valuable it needs to be processed carefully.

    The ‘tech / surveillance’ sector doesn’t have a great reputation when it comes to doing the hard work that is involved in developing algorithms and software for effective data analysis, e.g.

    Mistaken Identifiers: Gene name errors can be introduced inadvertently when using Excel in bioinformatics

    Spreadsheets in the Cloud – Not Ready Yet

    Effective use of AI comes with responsibilities, hence the need for transparent approaches to AI such as explainable AI.

    Explainable AI (XAI) is a methodology that has been built to deal with the issues that can arise when opaque output from computer programs is treated as useful data. In practice this opacity can be problematic and can lead to output from faulty algorithms and poor software being integrated into decision making processes.

    An important goal of XAI is to provide algorithmic accountability. This requires an open approach to building AI systems, one that goes beyond the closed black-box methodology, in which the inputs and outputs are known but the algorithms used to arrive at a decision are not available, and may not even be understood by the people who developed them.

    In practice, ‘I don’t know how I got the answer’ isn’t a satisfactory response to a question, even if the answer appears to be useful.

    “XAI provides interpretable explanations in natural language or other easy-to-understand representations, allowing doctors, patients, and other stakeholders to understand the reasoning behind a recommendation – and more easily question its validity, if necessary.”

    How Explainable AI (XAI) for Health Care Helps Build User Trust – Even During Life-and-Death Decisions
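    As a minimal illustration of the difference (all feature names and weights below are invented; a real clinical model would be far more complex), an interpretable model can return its reasoning alongside its answer:

    ```python
    # Invented example: a transparent risk model whose every prediction
    # comes with per-feature contributions a clinician can inspect.
    WEIGHTS = {"age": 0.03, "prior_admissions": 0.40, "on_anticoagulants": 0.25}
    BIAS = -2.0

    def predict_with_explanation(patient: dict) -> tuple[float, dict]:
        contributions = {k: WEIGHTS[k] * patient[k] for k in WEIGHTS}
        return BIAS + sum(contributions.values()), contributions

    score, why = predict_with_explanation(
        {"age": 70, "prior_admissions": 2, "on_anticoagulants": 1}
    )
    print(f"risk score {score:.2f}")
    for feature, amount in sorted(why.items(), key=lambda kv: -abs(kv[1])):
        print(f"  {feature}: {amount:+.2f}")  # which inputs drove the answer
    ```

    A black-box system gives you the score without the second half, and it is that second half that lets a doctor or patient push back.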

    A pell-mell introduction of opaque ‘technology’ without any evidence that it works is likely to create a digital toxic wasteland that fails to deliver on any of the promises whilst extracting a hefty rent from the unsuspecting victims.
    In addition, it may lead to a deterioration in the quality of the ‘services’ that the NHS (Neoliberal Hell Service) delivers to the public.
