
Kingsley Hayes discusses the impact of NHS data sharing


Head of Data Breach, Kingsley Hayes, explores the impact of NHS data sharing on privacy rights, in The Barrister.

Kingsley’s article was published, in print, in this quarter’s edition of The Barrister.

Among the many lessons to be learned from the Covid-19 pandemic is the inextricable link between health and data: effective use of the latter has undoubtedly helped save many lives over the past year. However, long-term use of NHS data is more contentious, not least the issue of data sharing with third parties and the protection of that data. In May, the government announced the General Practice Data for Planning and Research scheme, under which GP health data for everyone registered in England would be made available to researchers and companies for healthcare research and planning, with people’s identities partially removed.

But according to privacy campaigners, the process to remove identities could be reversed, which led to a widespread online campaign encouraging people to opt out. In August, the Observer revealed that nearly 1.4 million people had opted out of NHS data-sharing in May and June, following a huge backlash against the plan to make patient data available to private companies. As a result, the plan has now been put on hold with no new implementation date yet fixed.

Privacy campaigners can also point to the NHS having a chequered history in data sharing and protection. In 2016, the UK’s Information Commissioner’s Office (ICO) censured the Royal Free NHS Foundation Trust in relation to data on 1.6 million people, which it handed over to Google’s DeepMind division (an AI company) during the early stages of an app test to enhance its machine learning capability. The ICO ruled that the Royal Free did not do enough to protect the privacy of patients, and that it was “inexcusable” that they had not been told about what had been happening to their data. The information commissioner, Elizabeth Denham, said that attempts to make creative use of data had to be carefully managed. “The price of innovation does not need to be the erosion of fundamental privacy rights,” she added.

Since the GDPR came into force in May 2018, NHS Digital has had further significant issues securing the appropriate consents to data record sharing in an IT project that had glaring failures. Meanwhile, the NHS is working on AI projects via NHSX to use machine learning in research and development projects. Again, questions exist around data transparency and public consent with regards to personal data use within that project.

In May, Big Brother Watch reported that NHS Digital’s management of Covid vaccination status data had failed to deliver even basic safeguards, which could lead to information being exploited by insurers, companies, employers or even scammers looking to defraud individuals. Director of Big Brother Watch, Silkie Carlo, said: “This is a seriously shocking failure to protect patients’ medical confidentiality at a time when it could not be more important. This online system has left the population’s Covid vaccine statuses exposed to absolutely anyone to pry into. Robust protections must be put in place immediately and an urgent investigation should be opened to establish how such basic privacy protections could be missing from one of the most sensitive health databases in the country.” After it was revealed that the system had leaked people’s vaccination status, NHS Digital altered its Covid vaccination booking website.

Potential or actual data misuse is the big issue when the NHS shares confidential patient data with a third-party organisation. If that personal data is provided as part of an overall AI project, what happens to it, where does it go, where does it sit, and how many times does it get processed? Ultimately, the key questions for the people concerned are: what does a data subject, as an individual, know about the consent they have given for the processing of that data, where it is then going to be used and how many times is it going to be used?

The number of external suppliers to the NHS is substantial: 28 million lines of picked goods are delivered to the NHS annually, with consolidated orders from over 930 suppliers. Information on the number of supply chain partners operating with the NHS Digital Commercial team of procurement professionals is not itemised.

The NHS Digital team states: “Our supply chain partners are fundamental to our on-going success, creating significant value through the delivery of new thinking and innovative solutions. Through the deployment of Strategic Supplier Relationship Management (SSRM) we are focused on creating an effective and collaborative relationship with our most important suppliers, creating additional value and innovation that goes beyond our contracts.” It adds the following in relation to the collection and dissemination of data: “We ensure that external organisations can access the information they need to improve outcomes, and the public are confident that their data will be stored safely by NHS Digital.”

What happened with the Royal Free, combined with more recent events, demonstrates that public confidence in NHS Digital’s commercial relationships with external organisations is open to question. The Data Protection Act 2018 and the GDPR are designed to ensure that an individual data subject – the person giving consent – should be fully apprised of all uses of that data, where that data is going to end up, how it is going to be treated and, ultimately, whether it is going to be retained or disposed of.

Even with the GDPR in force, an element of mystery still surrounds AI projects: how often is that data utilised in the machine learning process, and where does it ultimately end up? The overarching problem is that the designers of AI and machine learning programmes closely guard information about how the algorithms underpinning these programmes work. Once data has been provided, individual data subjects do not know what has happened to it; there is very little transparency in the process. Moving forward, the concern for any individual is whether, once they have given consent, it is possible to withdraw it and remove that data from the AI system. If not, then the process does not accord with the principles of the GDPR and the rights of data subjects.

The EU is now looking at AI regulation comparable in many ways to the GDPR. But the UK’s direction of travel appears to be that this is one area where we will not keep alignment in place. The GDPR and the protection of data rights is an area which will probably evolve more by judicial intervention than by additional regulation. Over time, the UK’s divergence from the EU will extend to judicial divergence from laws created by the EU.

When considering the future relationships of NHS Digital with third-party companies, there is cause for concern. Based on its track record, it is reasonable to assume that state-owned entities like the NHS simply do not have the technical capabilities to understand what exactly AI projects can and will do. The NHS is buying an outside resource which, necessarily, sometimes has its own agenda. An objective look at some US tech companies operating in the NHS market reveals a very fixed, well-established agenda around the provision of services and understanding of what services will be required in the future, and how they can monetise them.

The problem lies less with the technical capability of NHS Digital, and rather more with a lack of understanding of the core objectives of some tech companies with which it is doing business. These core objectives do not necessarily align with those of the NHS. Essential process changes need to be made within the NHS, but it is simultaneously floundering over how to achieve them technically: the more the NHS relies on outside agencies, the greater the risk that it will not have the appropriate level of compliance, particularly where interests do not align.

Against this background, significant data misuse seems inevitable and will ultimately lead to litigation. The key driver will be consumer understanding and a demand for greater transparency in how individuals’ information and data is dealt with. At present, most people do not appreciate the value of their personal and medical data. In some instances, it’s probably worth more than gold. Over the next few years, there will be greater investigation into some of these tech and AI products.

Dissemination of such information will enable the public to understand and regain control of their personal data. It is inevitable that this will provoke litigation – not against the NHS, but against some of the organisations with which it has commercial relationships. The motives of, and monetary gain sought by, third-party suppliers will lead to actions against them, and the implementation and processing of data will be key. The public will not sue the NHS for dealing with personal data when it is seeking to improve its services. The tech companies responsible for handling the data will be the ones in the line of sight.

In February 2024, our firm changed its name from Keller Postman UK to KP Law.
