How the UK can accelerate radiology AI adoption
It’s no secret that the deployment of artificial intelligence in radiology is lagging behind expectations. Hundreds of startups, SMEs and vendors have spent the past few years grinding away at making a dent in the NHS, gaining regulatory clearances and conducting never-ending pilot studies. However, with venture funding running low, and little hope on the horizon for mass uptake of this technology, many companies are starting to wonder if the NHS is ever going to open its doors and embrace the AI revolution.
I’ve worked with many radiology AI companies from across the world, and they are all asking the same thing - how do we get into the NHS? On the flip side, I know firsthand that both the UK Government and NHS leadership are wondering the reverse - how can we adopt radiology AI more quickly? Through dozens of conversations, and hands-on experience consulting on both sides of the chasm, I’ve got a good idea of what the main blockers and barriers are - so here goes…
Scrap the DTAC
If there was ever a common emotion amongst AI radiology vendors engaging with the NHS, it’s visceral resentment of the Digital Technology Assessment Criteria (DTAC). The DTAC was designed as a framework for NHS trusts to gather relevant product information on evidence of safety, risk management and cybersecurity. It is aligned with the NHS-specific standards DCB0129 and DCB0160 for clinical risk management during the development and deployment of health IT systems.
According to the DTAC website: “all new digital technology should be assessed using the DTAC, even if you are piloting or trialling it”. What this means in practice is that every time a vendor wants to engage with an NHS Trust in any capacity, they must complete a DTAC assessment. Since there is no central ‘DTAC certification’, and each trust can add its own spin to the assessment criteria, this creates a layer of bureaucracy that multiplies with every engagement: each trust requires a fresh DTAC assessment from each vendor for each deployment. This wastefulness has enabled a predatory market of non-NHS-affiliated ‘certification’ companies, offering paid-for DTAC certificates with no accreditation or legal meaning.
To rub salt in the wound, every single vendor has already undergone mandatory regulatory clearance via UKCA or CE marking to prove their device is safe, effective and cybersecure. These regulatory processes are centrally certified (so vendors only need to do them once), and they incorporate pretty much everything covered in the DTAC. Yes, you heard that right - the DTAC adds very little to a purchaser’s knowledge of a product beyond a CE mark. The only additional requirements are that risk management documentation must be surfaced as a shared risk management log (which could easily be lifted from the vendor’s regulatory technical file, produced under the internationally recognised risk management standard ISO 14971 within a validated ISO 13485 Quality Management System), and that the vendor must have appointed a Clinical Safety Officer (CSO). I’m yet to see any concrete evidence that having a CSO meaningfully makes a product safer or less risky - but nevertheless, this requirement could easily be met without a full DTAC assessment for each and every deployment site.
Vendors have become so allergic to the DTAC that AXREM - the industry trade association for radiology - has publicly called for it to be scrapped. They rightly argue that the DTAC is wasteful and duplicative, and point out that it was never even intended for medical devices, yet has somehow been stretched to cover every single AI product. They also ask why, given that the standards underpinning the DTAC are not internationally recognised and perfectly adequate ISO and IEC standards already exist with which vendors comply, the NHS doesn’t simply use those instead. This would align the UK with international efforts, and remove an utterly fruitless layer of DTAC paperwork.
Change IR(ME)R
One of the most tantalising promises of radiology AI is that of autonomous systems working without a human-in-the-loop. In keeping with the ideology of the great Henry Ford, the NHS must understand that by deliberately keeping a human within the value chain, no costs are saved, and very little added value is generated. If we are ever to unlock the true value of AI, it can only be done by automation, not augmentation. For those radiology use cases that can safely and robustly be shown to be fully automatable, we should be putting every effort into doing so.
One such use case is that of triaging normal chest X-rays from abnormal ones, so that the normals never have to be reviewed by a radiologist (or radiographer). This is one of the holy grails of radiology AI, but it is currently being held back by an outdated law: The Ionising Radiation (Medical Exposure) Regulations 2017, better known as IR(ME)R.
I’ve written about this extensively before, but in a nutshell, IR(ME)R lays out in law that responsibility for the clinical evaluation of a medical exposure to ionising radiation (such as a CXR) lies with an appropriately qualified and trained human operator.
Now is the time to consider changing this law - removing the explicit need for a human operator to conduct the clinical evaluation of an ionising radiation exposure for medical purposes, and allowing an appropriately evidenced AI system to conduct the evaluation instead. The UK is, by a quirk of serendipity, uniquely positioned to change this law, since we are no longer bound by the equivalent European EURATOM legislation. I think I just found the elusive Brexit bonus…
Model the economics at national scale
One of the pillars of the NHS is its dedication to delivering evidence-based medicine, and for this it is rightly held in high esteem globally. Everything the NHS does is backed by evidence (in theory…). So why on earth are we not actively and robustly generating the health economic evidence to showcase the potential value of AI in radiology?
Currently, the only motivation vendors have to generate health economic evidence is to cover their own product for the purposes of developing a business case, or for a NICE technology evaluation. This means that very few have actually done this kind of work, and even fewer have actually been approved by NICE (I count only two radiology AI products with full NICE approval at the time of writing - a woefully small number). NICE have the nasty habit of questioning any evidence they receive - so even if a vendor does the hard work and generates an economic model, it gets pulled apart in a 62-week technology assessment process. Not exactly an inspiring proposition…
I think we have this all backwards.
Instead of passing the burden of health economic evidence onto cash-strapped vendors knocking on the NHS’s door, we should be conducting national economic modelling on the use cases of AI that we want to see adopted in practice. For example, we know that AI can be used in mammography screening as a second reader (replacing the human second reader), with a powerful economic argument of an almost 50% reduction in workforce resource. Why is the NHS not modelling the economic impact of this centrally? The same goes for the above example of autonomous CXR AI for normals, or the use of AI in time-critical situations such as stroke detection.
If the NHS did its own modelling, it would not have to question the evidence presented by vendors - and it would have a better idea of how to approach the government for the relevant funding, equipped with an economic model that projects millions, if not billions, in savings at a national scale.
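To make the mammography second-reader argument concrete, here is a minimal back-of-envelope sketch of the kind of model that could be run centrally. Every figure below is a hypothetical placeholder (screening volumes, reading times, costs and licence fees are illustrative assumptions, not NHS data) - the point is the structure of the calculation, not the numbers.

```python
# Hypothetical national-scale model: AI replacing the human second reader
# in breast screening. ALL inputs are illustrative placeholders.

screens_per_year = 2_000_000        # assumed annual screening mammograms
second_read_minutes = 2.0           # assumed radiologist time per second read
radiologist_cost_per_minute = 2.5   # assumed fully-loaded cost (GBP/min)
ai_replacement_rate = 0.5           # ~50% of second reads replaced (per text)
ai_cost_per_exam = 1.0              # assumed per-exam AI licence fee (GBP)

# Reads handed over to AI, and the workforce time they free up
reads_replaced = screens_per_year * ai_replacement_rate
workforce_saving = reads_replaced * second_read_minutes * radiologist_cost_per_minute

# Net position after paying for the AI itself
ai_cost = reads_replaced * ai_cost_per_exam
net_saving = workforce_saving - ai_cost

print(f"Second reads replaced:  {reads_replaced:,.0f}")
print(f"Gross workforce saving: £{workforce_saving:,.0f}")
print(f"Net annual saving:      £{net_saving:,.0f}")
```

Even with deliberately conservative placeholder inputs, a model of this shape lets the NHS vary each assumption centrally and see how the national picture shifts, rather than re-litigating each vendor’s figures one at a time.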
Develop a dedicated AI procurement framework
There is a running joke in the healthtech community that if you ask twenty people how to get a product into the NHS, they will all give you a different answer. Of course, to vendors, this isn’t funny. It’s literally life or death for their company.
Hospitals have the luxury of buying their rubber gloves, drugs, and other equipment via centrally managed procurement frameworks. All that a vendor needs to do is apply to the relevant procurement framework, et voilà, they are in the NHS.
Are we doing this for radiology AI? Sort of…
The NHS has a whopping 34 procurement frameworks available, including one specifically for Artificial Intelligence (AI), Imaging and Radiotherapy Equipment, Associated Products and Diagnostic Imaging. However, AI systems are lumped in with everything else a radiology department can buy, there are only 10 vendors listed (out of more than 100 individual products on the market), and the framework appears to expire in Sept 2024. Surely it wouldn’t be too difficult to stand up a new dedicated framework explicitly for AI products and their integration into PACS/RIS systems?
I’m fairly sure that a dedicated radiology AI framework would be eminently more suitable - and vendors would flock to it like moths to a flame.
Incentivise adoption
Finally, let’s talk about the elephant in the room - the AIDF.
The AI Diagnostic Fund was a desperate last gasp, aiming to finally get radiology AI into the NHS with a mere £21 million - less than many AI companies have individually raised in VC funding. Announced with fanfare and fervour, and the idealistic aim of getting AI into hospitals in time for Christmas, the reality has been an utterly chaotic and apparently ‘legally challenging’ lesson in how not to fund radiology AI.
Vendors have been attacking the fund like piranhas, desperate for a bite. Christmas was ruined for many in the rush to fill in as many forms as possible. NHS Trust AI champions have been gagged and silenced from talking to their collaborating vendors, there are even rumours of lawsuits, and now it’s March and we still don’t know which AI is going to be deployed where. Fun times.
What we should learn from this debacle is that simply chucking a wodge of cash at a problem doesn’t really work. Instead, if the government really is so keen to get AI deployed quickly enough to give its election chances a glimmer of hope, here’s what should happen: incentivise adoption by increasing the annual budget of NHS trusts that deploy AI. It’s that simple.
Use AI, get more funding.
The genius of this is that it costs nothing initially. You can simply pass on the promised budget increases to the next government. And if that turns out to be you, then hopefully you’ll have done the economic modelling I described above, and will know how much money you can expect to get back in return.
That’s all folks!
Hardian Health is a clinical digital consultancy focused on leveraging technology into healthcare markets through clinical strategy, scientific validation, regulation, health economics and intellectual property.