I recently wrote a piece on AI in healthcare for the Journal of the Apothecaries Livery Company, which has among its membership a great many doctors and health specialists. This is what I said.
In our House of Lords AI Select Committee report “AI in the UK: Ready, Willing and Able?”,1 back in 2018, reflecting the views of many of our witnesses about its likely profound impact, we devoted a full chapter to the potential applications of AI in healthcare. Not long afterwards the Royal College of Physicians itself made several far-sighted recommendations on the incentives, scrutiny and regulation needed for AI development and adoption.2
At that time it was already clear that medical imaging and supporting administrative roles were key areas for adoption. Fast forward five years to the current enquiry by the House of Commons Health and Social Care Select Committee, “Future Cancer: exploring innovations in cancer diagnosis and treatment”, and the application of different forms of AI is very much here and now in the NHS.
It is evident that this is a highly versatile technology. The Committee heard in particular from GRAIL Bio UK about its Galleri AI application, which can detect a genetic signal shared by over 50 different types of incipient cancer, particularly more aggressive tumours.3 Over the past year we have heard of other major breakthroughs: tripling stroke recovery rates with Brainomix,4 mental health support through the conversational AI application Wysa,5 Eye2Gene, a decision support system for the genetic diagnosis of inherited retinal disease, and applications for remotely managing conditions at home.6 Mendelian has developed an AI tool, being piloted in the NHS, to interrogate large volumes of electronic patient records to find people with symptoms that could be indicative of a rare disease.6A
We have seen the introduction of Frontier software designed to ease bed blocking by improving the patient discharge process.7 And just a few weeks ago we heard how, using AI, Lausanne researchers have created a digital bridge between the brain and implanted spinal electrodes which allows patients with spinal injuries to regain coordination and movement.8 It is also clear, despite the recent fate of Babylon Health,9 that consumer AI-enabled health apps and devices can have a strong future too in terms of health monitoring and self-care. We now have large language models such as Med-PaLM, developed by Google Research, which are designed to provide high-quality answers to medical questions.9A
We are seeing the promise of the use of AI in training surgeons for more precise keyhole brain surgery.
Now, it seems, just around the corner could be foundation models for generalist medical artificial intelligence, trained on massive, diverse datasets and able to perform a very wide range of tasks based on a broad range of data, such as images, electronic health records, laboratory results, genomics, graphs or medical text, and to communicate directly with patients.10
We even have the promise of smartphones being able to detect the onset of dementia.10A
Encouragingly—whatever one’s view of the current condition more broadly of the Health Service—successive Secretaries of State for Health have been aware of the potential and have responded by investing. Over the past few years, the Department of Health and the NHS have set up a number of mechanisms and structures designed to exploit and incentivize the development of AI technologies. The pandemic, whilst diminishing treatment capacity in many areas, has also demonstrated that the NHS is capable of innovation and agile adoption of new technology.
Its performance has yet to be evaluated, but through the NHS AI Lab, set up in 2019 and designed to accelerate the safe, ethical and effective adoption of AI in health and social care, and its AI in Health and Care Awards, some £123 million has been invested over the past few years in 86 different AI technologies, including stroke diagnosis, cancer screening, cardiovascular monitoring, mental health, osteoporosis detection, early warning and clinician support tools for obstetrics, and applications for remotely managing conditions at home.11
This June the Government announced a new £21 million AI Diagnostic Fund to accelerate deployment of the most promising AI decision support tools in all 5 stroke networks, covering 37 hospitals, by the end of 2023, given results showing more patients being treated, more efficient and faster pathways to treatment, and better patient outcomes.12
In the wider life sciences research field, which our original Lords enquiry dwelt on less, there has been ground-breaking research by DeepMind with its AlphaFold discovery of protein structures13 and by Insilico Medicine, whose use of generative AI for drug discovery in the field of idiopathic pulmonary fibrosis it claims saved two years on the route to market.14 GSK, working with Cerebras, has developed large language models to analyse the data from genetic databases, which means it can take a more predictive approach to drug discovery.15
Despite these developments, as many clinicians and researchers have emphasized, not least recently to the Health and Social Care Committee, the rate of adoption of AI is still too slow.
There are a variety of systemic issues in the NHS which still need to be overcome. We lag far behind health systems such as Israel's, a pioneer of the virtual hospital,16 and Estonia's.17
The recent NHS Long Term Workforce Plan18 rightly acknowledges that AI, in augmenting human clinicians, will be able to greatly relieve pressures on our Health Service, but one of the principal barriers is a lack of skills to exploit and procure the new technologies, especially in working alongside AI systems. As the Plan recognizes, the introduction of AI into potentially so many healthcare settings has huge implications for the healthcare professions, especially in the need to strike a balance between AI assistance/human augmentation and substitution, with all its implications for future deskilling.
Addressing this is in itself a digital challenge that the NHS Digital Academy, a virtual academy set up in 2018,19 was designed to solve. These issues were also tackled by the Topol Review, “Preparing the Healthcare Workforce to Deliver the Digital Future”, instituted by Jeremy Hunt when Health Secretary, which reported in February 2019.20 Above all, it concluded that NHS organisations will need to develop an expansive learning environment and flexible ways of working that encourage a culture of innovation and learning. Similarly, the review by Sir Paul Nurse of the research, development and innovation organisational landscape21 highlighted a skills and training gap across these different areas and siloed working. The current reality on the ground, however, is that the adoption of AI and digital technology still does not feature in workforce planning and is not reflected in medical training, which remains very traditional in its approach.
More specific AI-related skills are being delivered through the AI Centre for Value Based Healthcare, led by King's College London and Guy's and St Thomas' NHS Foundation Trust, alongside a number of NHS Trusts, universities and UK and multinational industry partners.22 Funded by grants from UK Research and Innovation (UKRI), the Department of Health and Social Care (DHSC) and the Office for Life Sciences, its Fellowship in Clinical AI is a year-long programme integrated part-time alongside the clinical training of doctors and dentists approaching consultancy. First piloted in London and the South East in 2022, it is now being rolled out more widely, but although it has been a catalyst for collaboration it has yet to make an impact at any scale. The fact remains that outside the major health centres there is still insufficient financial or administrative support for innovation.
Set against these ambitions, many NHS clinicians complain that the IT in the health service is simply not fit for purpose. This particularly applies in areas such as breast cancer screening.
One of the key areas where developers and adopters also find frustrations is the multiplicity of regulators and regulatory processes. Reports such as the review by Professor Dame Angela McLean, the Government's Chief Scientific Adviser, on the Pro-innovation Regulation of Technologies: Life Sciences23 identified blockages in the regulatory “innovation pathway”. At the same time, however, we need to be mindful of the patient safety findings, shocking it must be said, of the Cumberlege Review, the Independent Medicines and Medical Devices Safety Review.24
To streamline the AI regulatory process, the NHS AI Lab has set up the AI and Digital Regulations Service (formerly the multi-agency advisory service), a collaboration between four regulators: the National Institute for Health and Care Excellence, the Medicines and Healthcare products Regulatory Agency (the MHRA), the Health Research Authority and the Care Quality Commission.25
The MHRA itself, through its Software and AI as a Medical Device Change Programme Roadmap, is a good example of how an individual health regulator is gearing up for the AI regulatory future. The intention is to produce guidance in a variety of areas, including bringing clarity to distinctions such as software as a medical device versus wellbeing and lifestyle software products, and versus medicines and companion diagnostics.26
Specific regulation of AI systems is another factor that healthcare AI developers and adopters will need to take into account going forward. There are currently no specific proposals in the UK, but the EU's AI Act will set up a regulatory regime applying to high-risk AI systems and applications which are made available within the EU, used within the EU, or whose output affects people in the EU.
Whatever regulatory regime applies, the higher the impact an AI application has on the patient, the stronger the need for clear and continuing ethical governance to ensure trust in its use, including preventing potential bias, ensuring explainability, accuracy, privacy, cybersecurity and reliability, and determining how much human oversight should be maintained. This becomes of even greater importance in the long term if AI systems in healthcare become more autonomous.27
In particular, AI in healthcare will not be successfully deployed unless the public is confident that its health data will be used in an ethical manner, is of high quality, assigned its true value, and used for the greater benefit of UK healthcare. The Ada Lovelace Institute, in conjunction with the NHS AI Lab, has developed an algorithmic impact assessment for data access in a healthcare environment which demonstrates the crucial risk and ethical factors that need to be considered in the context of AI development and adoption.28
For consumer self-care products, explainability statements of the kind developed by Best Practice AI for Healthily's AI smart symptom checker, which provide a non-technical explanation of the app to its customers, regulators and the wider public, will need to become the norm.29 We have also just recently seen the introduction of British Standard 30440, designed as a validation framework for the use of AI in healthcare.30 All these are steps in the right direction, but there needs to be regulatory incentivisation of adoption of and compliance with these standards if the technology is to be trustworthy and patients are to be safe.
The adoption of and alignment with global standards is a governance requirement of growing relevance too. The WHO in 2021 produced important guidance on the governance of artificial intelligence for health.31
Issues relating to patient data access, so crucial for the research and training of AI systems, have long bedevilled progress on AI development, in terms both of public trust and of procedures for sharing and access. This was recognized in the Data Saves Lives strategy of 2022,32 which drew on the Goldacre Review.33
As a result, greater interoperability and a Federated Data Platform comprising Secure Research Environments are now emerging, with greater clarity on data governance requirements, which will allow researchers to navigate access to data more easily whilst safeguarding patient confidentiality.
All this links to the barriers researchers and developers face in conducting clinical trials, which have been recognized as a major impediment to innovation; the UK has fallen behind in global research rankings as a result. The O'Shaughnessy review on clinical trials, which reported earlier this year, made a number of key recommendations:34
- A national participatory process on patient consent, to examine how to achieve greater data usage for research in a way that commands public trust. Much greater public communication and engagement on this has long been called for by the National Data Guardian.
- Urgent publication of guidance for NHS bodies on engaging in research with industry. This is particularly welcome. At the time of our Lords report, the Royal Free was criticized by the Information Commissioner for its arrangement with DeepMind, which developed its Streams app to diagnose acute kidney injury, as being in breach of data protection law.35 Given subsequent questionable commercial relationships entered into by NHS bodies, a standard protocol for the use of NHS patient data for commercial purposes, ensuring benefit flows back into the health service, has long been needed and is only now emerging.
- Above all, a much-needed national directory of clinical trials, to give much greater visibility of national trials activity for the benefit of patients, clinicians, researchers and potential trial sponsors.
Following the Health and Care Act of 2022, NHSX and NHS Digital have been merged into the Transformation Directorate of NHS England. The impact of this reorganisation is yet to be seen, given that NHSX was specifically designed, when set up in 2019, to speed up innovation and technology adoption in the NHS, but clearly there is an ambition for the new structure to be more effective in driving innovation.
The pace of drug discovery through AI has undoubtedly quickened over recent years, but the AI healthcare investment environment is a risky and rocky road. Adopting AI techniques for drug discovery does not necessarily shortcut the uncertainties. The experience of drug discovery investor BenevolentAI is a case in point: it recently announced that it would need to shed up to 180 of its 360 staff.36
Pharma companies are adamant, too, that in the UK the NHS branded drug pricing system is a disincentive to drug development, although it remains to be seen what the newly negotiated voluntary and statutory agreements will deliver.
To conclude, expectations of AI in healthcare are high but where is the next real frontier and where should we be focusing our research and development efforts for maximum impact? Where are the gaps in adoption?
It is still the case that much unrealized AI potential lies in non-clinical aspects of healthcare such as workforce planning and mapping demand to future needs. This needs to be allied with day-to-day clinical AI predictive tools for patient care which can link datasets and combine analysis from imaging and patient records.
In my view, of even greater significance than improvements in diagnosis and treatment is a new emphasis on a preventative philosophy through the application of predictive AI systems to genetic data. Long-term risks can then be identified and appropriate action taken to inform patients and clinicians about the likelihood of their getting a particular disease, including common cancer types.
Our Future Health is one such new project, working in association with the NHS.37 The plan (probably overambitious, and not without controversy in terms of its intention to link into government non-health data) is to collect the vital health statistics of 5 million adult volunteers.
With positive results from this kind of genetic work, however, AI in primary care could come into its own, capturing the potential for illness and disease at a much earlier stage. This, I believe, is where the greatest potential for impact on our national health, and the opportunity for greater equity in life expectancy across our population, ultimately lies. Alongside this, however, the greatest care needs to be taken in retaining public trust about personal data access and use and the ethics of the AI systems used.
Footnotes
1. AI in the UK: Ready, Willing and Able? House of Lords Select Committee on Artificial Intelligence, 2018.
2. Royal College of Physicians, Artificial Intelligence in Healthcare: Report of a Working Party, https://www.rcplondon.ac.uk/projects/outputs/artificial-intelligence-ai-health
3. House of Commons Health and Social Care Committee, Future Cancer inquiry.
4. Brainomix's e-Stroke Software Triples Stroke Recovery Rates, https://www.brainomix.com/news/oahsn-interim-report/
5. Evidence-based Conversational AI for Mental Health, Wysa.
6A. https://www.newstatesman.com/spotlight/healthcare/innovation/2023/10/ai-diagnosis-technology-artificial-intelligence-healthcare
7. AI cure for bed blocking can predict hospital stay.
8. Clinical trial evaluates implanted tech that wirelessly stimulates spinal cord to restore movement after paralysis; Walking naturally after spinal cord injury using a brain–spine interface.
9. Babylon, “the future of the NHS”, goes into administration.
9A. https://sites.research.google/med-palm/
10. Foundation models for generalist medical artificial intelligence, Nature.
10A. https://www.thetimes.co.uk/article/ai-could-detect-dementia-long-before-doctors-claims-oxford-professor-slbdz70v3#:~:text=Michael%20Wooldridge%2C%20a%20professor%20of,possible%20sign%20of%20the%20condition.
11. The Artificial Intelligence in Health and Care Award, NHS AI Lab programmes.
12. NHS invests £21 million to expand life-saving stroke care app, https://www.gov.uk/government/news/21-million-to-roll-out-artificial-intelligence-across-the-nhs
13. DeepMind, AlphaFold: a solution to a 50-year-old grand challenge in biology, 2020.
14. From Start to Phase 1 in 30 Months, Insilico Medicine.
15. GlaxoSmithKline and Cerebras are Advancing the State of the Art in AI for Drug Discovery.
16.Israeli virtual hospital is caring for Ukrainian refugees – ISRAEL21c
17. Estonia embraces new AI-based services in healthcare.
18. NHS Long Term Workforce Plan
20. The Topol Review: Preparing the Healthcare Workforce to Deliver the Digital Future
21.The Nurse Review: Research, development and innovation (RDI) organisational landscape: an independent review – GOV.UK
22. The AI Centre for Value Based Healthcare
23. Pro-innovation Regulation of Technologies Review: Life Sciences, GOV.UK.
24. Department of Health and Social Care, First Do No Harm: The report of the Independent Medicines and Medical Devices Safety Review, 2020.
25. AI and digital regulations service – AI Regulation – NHS Transformation Directorate
26. Software and AI as a Medical Device Change Programme – Roadmap – GOV.UK
27. AI Act: a step closer to the first rules on Artificial Intelligence | News | European Parliament
28. Algorithmic impact assessment in healthcare | Ada Lovelace Institute
31. Ethics and governance of artificial intelligence for health
32. Data saves lives: reshaping health and social care with data – GOV.UK
33. Goldacre Review
34. Commercial clinical trials in the UK: the Lord O’Shaughnessy review – final report – GOV.UK
35. Royal Free breached UK data law in 1.6m patient deal with Google's DeepMind.
36. BenevolentAI cuts half of its staff after drug trial flop