How AI is revolutionising chest X-ray at the Royal Surrey

During a recent trip to the Royal Surrey County Hospital, Synergy joined Charlotte Beardmore, executive director of professional policy, to find out more about the trust’s experience with AI integration

By Will Phillips

It’s been just over a year since the Royal Surrey NHS Foundation Trust implemented global health tech company Harrison.ai’s (formerly Annalise.ai) AI solution for chest X-ray reporting – and the results can only be called ‘phenomenal’.

That’s according to radiographers and members of the AI deployment team itself, who helped demonstrate how the triage and prioritisation system has been aiding the department in slashing scan waiting times.

Following its successful rollout at the Royal Surrey County Hospital (RSCH) for GP-referred patients in September 2024, the technology was introduced to the emergency department on 11 March 2025, marking the second phase of its implementation.

Synergy was invited by Charlotte Beardmore, executive director of professional policy at the SoR, to join a recent visit to the RSCH, where she was given a first-hand look at how the system works. Mike Jones, digital manager, Chris Whitton, radiographer and IT specialist, and Mary Tuke, reporting radiographer, introduced Charlotte to its advantages and costs alike, as well as explaining the importance of ensuring trusts across the country are supported to integrate AI into their workflows.

Synergy listened in to find out more about how AI can help reduce waiting times, what more can be done by the government to support it and why everyone ‘just needs to give it a go’.

Early days

Chris Whitton has worked at the RSCH since July this year but, before that, he was a member of the Harrison team itself, working to deploy the company’s AI solutions across 43 NHS trusts and six imaging networks.

Having studied radiography at University College London Hospitals from 1984 to 1985, Chris initially moved out of the profession to work as a sales consultant in the packaging industry. But the draw of imaging was inescapable – Chris retrained as a radiographer at London South Bank University from 1994 to 1997, gaining his bachelor’s degree, and then worked clinically for two years at Frimley Park Hospital in Surrey.

Chris’s sales experience and his own interest in information technology, however, soon drew him back to business management and the technological aspects of radiography. Over the next 25 years, Chris worked across various healthcare roles, as an adviser, a sales manager, a business director and a technical consultant, until, in March 2024, he was employed by Harrison to support the rollout of its solution under the government’s AI Diagnostic Fund (AIDF).

Over the course of the next year, Chris helped deploy Harrison's musculoskeletal AI solution for fractures on projectional radiographs and a CT brain solution. Where it would normally take around an hour for a CT brain image to be processed and reported upon, Harrison's solution could do so in just three minutes.

Now, Chris works as a radiographer and an IT specialist at the RSCH. “There’s a reason you become a radiographer in the first place,” he explains. “I get to be arms-deep with patients. I really feel like I’m achieving something.”

The AIDF is a £21m fund to expedite artificial intelligence integration into the NHS, introduced in June 2023 by Rishi Sunak’s Conservative government. Specifically intended for procuring and deploying AI imaging and decision-support tools for radiology, the fund has helped trusts to “just try out” AI solutions, says Chris, safely within a pilot setting and with radiologists’ or reporting radiographers’ reports of the images as the gold standard.

He explains that radiology has always been positive about AI’s potential for improving patient outcomes, but getting access to it has required time, evidence and funding. Central support from the AIDF helped incentivise trusts to push it through – and now they know the steps they need to follow in future.

With the funding allocation stage having finished last year, trusts are now in the process of rolling out solutions in record time. Barriers to deployment were quickly knocked down, Chris says, because funding was only available for a brief period.

Over the past year and a half of funding, the RSCH has built its case for the return on investment, and that has enabled the team to make sure they can keep the technology running.

AI in practice

Mary Tuke has been a reporting radiographer at the RSCH for the last six years. After completing her undergraduate degree at Portsmouth University, Mary became the first chest reporting radiographer at the hospital. Now, she is one of four chest reporters helping set up AI projects at the trust. That role gives her day-to-day experience of AI at the RSCH, particularly its use in MSK projectional radiography.

During Synergy’s visit, Mary demonstrated the program on a dummy scan. She explains that the technology doesn’t pinpoint any one pathology – it checks for everything. The system also marks confidence levels for its findings on the scan itself.

For Mary, the main benefits are how easily it slots into her workflow, its high level of usability and its excellent user interface. “If it just slows me down, I’m not interested,” she says.

The AI works through the backlog of images and, within 90 seconds of acquisition, assesses and triages them, prioritising scans based on its suspicion of severity. Its speed allows the team to keep patients in the department while their images are assessed – so patients with a serious diagnosis can be given more attention on the same day.
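
The exact logic behind the triage step is Harrison.ai’s own, but for readers curious how confidence-scored findings could feed a prioritised worklist in principle, the short sketch below is purely illustrative: the finding labels, severity weights and confidence threshold are assumptions made for the example, not a description of the vendor’s actual output or interface.

```python
"""Illustrative triage sketch (not Harrison.ai's actual output or API).

Assumes each chest X-ray returns a list of (finding, confidence) pairs;
the most concerning studies are pushed to the top of the reporting worklist.
"""
from dataclasses import dataclass, field
from typing import List, Tuple
import heapq

# Assumed severity weighting per finding label (illustrative only).
SEVERITY = {
    "pneumothorax": 3,
    "suspicious_opacity": 3,
    "consolidation": 2,
    "pleural_effusion": 2,
    "cardiomegaly": 1,
}

@dataclass(order=True)
class Study:
    priority: int  # the only field used for ordering
    accession: str = field(compare=False)
    findings: List[Tuple[str, float]] = field(compare=False, default_factory=list)

def triage(accession: str, findings: List[Tuple[str, float]],
           threshold: float = 0.5) -> Study:
    """Map confidence-scored findings to a single priority band."""
    score = max(
        (SEVERITY.get(name, 0) for name, conf in findings if conf >= threshold),
        default=0,
    )
    # Negate so the most severe study rises to the top of the min-heap.
    return Study(priority=-score, accession=accession, findings=findings)

worklist: List[Study] = []
heapq.heappush(worklist, triage("RSCH0001", [("cardiomegaly", 0.62)]))
heapq.heappush(worklist, triage("RSCH0002", [("suspicious_opacity", 0.91)]))
heapq.heappush(worklist, triage("RSCH0003", []))

while worklist:
    study = heapq.heappop(worklist)
    print(study.accession, -study.priority)  # most urgent studies print first
```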

The RSCH completes around 40,000 chest X-rays a year, with almost all (roughly 95 per cent) of these reported by just four advanced radiography practitioners. At any one time there can be a substantial number of scans waiting to be reported on – even if each reporter gets through 100 X-rays a day, the backlog can quickly climb when someone is on leave or off sick. “It’s not a huge wait,” Mike says. “But if you’ve come in and discovered you’ve got an early lung cancer detection, you don’t want to wait eight days for your interpretation – you want it done there and then. That’s what AI allows us to do. It pre-reads the X-ray and says ‘this one could represent an acute, urgent finding’; it recommends you interpret this one first.”

If an incidental lung cancer is picked up following a GP chest X-ray referral, the AI integration means the patient can immediately be entered into the National Optimal Lung Cancer Pathway. This pathway suggests all patients coming through from primary care should have a CT on the same day as the chest X-ray, or at least within three days. Very few trusts across the country are meeting this standard, Mike adds, primarily because of capacity issues – but also because early cancers are so hard to identify that detection relies purely on radiographers flagging them during routine scans. If a finding is very subtle, such as the earliest signs of a stage one cancer, it’s unlikely to be spotted at the point of acquisition, as radiographers aren’t trained to interpret images to the same level as advanced practitioners. AI can triage those X-rays and flag potential abnormalities, with a prioritisation alert appearing in PACS within a minute. As a result, 85 per cent of GP-referred lung cancer patients now receive same-day results from their CT scans, a process that would normally take around four weeks. “That is a phenomenal turnaround,” says Mike.

An internal retrospective study carried out within the trust has shown encouraging performance, with the AI identifying all cancers found by radiologists over the past five years – and, in some instances, flagging them even earlier.

AI systems aren’t static. As scanners, clinical practice, and patient populations evolve, models can drift and gradually lose accuracy. Keeping them reliable requires periodic retraining and controlled updates using refreshed, high-quality data.
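
As a rough illustration of the kind of monitoring this involves, the sketch below tracks the proportion of studies a model flags as abnormal over a rolling window and warns when that rate shifts away from a historical baseline; the baseline rate, window size and alert threshold here are assumptions for the example, not figures from the RSCH or Harrison.ai.

```python
"""Illustrative drift check (simplified; all numbers are assumptions).

A sustained shift in how often the model flags studies as abnormal can hint
at scanner, protocol or population changes that warrant a performance review.
"""
from collections import deque

BASELINE_ABNORMAL_RATE = 0.30   # assumed historical flag rate
WINDOW_SIZE = 500               # number of recent studies to consider
ALERT_DELTA = 0.10              # absolute shift that triggers a review

recent_flags = deque(maxlen=WINDOW_SIZE)

def record_result(flagged_abnormal: bool) -> None:
    """Store whether the model flagged the latest study as abnormal."""
    recent_flags.append(flagged_abnormal)

def drift_suspected() -> bool:
    """Compare the rolling flag rate against the historical baseline."""
    if len(recent_flags) < WINDOW_SIZE:
        return False  # not enough recent data to judge
    rate = sum(recent_flags) / len(recent_flags)
    return abs(rate - BASELINE_ABNORMAL_RATE) > ALERT_DELTA

# Example: feed in results as studies are processed, then check periodically.
for flagged in [True, False, False, True] * 200:
    record_result(flagged)
if drift_suspected():
    print("Flag rate has shifted - review model performance and recent data.")
```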

Mary explains that updating models is really difficult. The system has been tested and verified on specific datasets and, while Harrison itself can issue updates, self-updating models remain a legislative question.

Fortunately, radiographers are still able to question the AI’s diagnosis and tweak its sensitivities. Limitations certainly exist that make this necessary; the solution struggles to differentiate between more and less severe cases.

Maintaining interoperability between trusts in an imaging network also means they must agree on their specific configurations. Rolling out the technology to other departments is difficult – teams aren’t allowed to refer a patient for a CT scan based solely on the AI’s opinion.

Currently, the system cannot compare a patient’s most recent scan to previous ones to assess progression. Measuring change, Mary says, is incredibly useful, and she is hopeful that future updates will fix this. “It’s got to be responsive,” she adds. “Autonomous AI reporting is not legal, and not safe. You can only add value when you combine the two [AI and radiographers].”

Example results from use of AI decision support – the system outlines potential abnormalities in purple and provides suggestions for what they could be, prioritised according to significance

‘We shouldn’t be reinventing the wheel’

Mike Jones was a radiographer for 14 years before becoming the digital manager at the RSCH. His role was initially focused on AI in radiology, but he has now had the PACS placed in his lap as well.

His role in helping to integrate the AI solution into the trust has been instrumental in building the business case and ensuring the transition runs smoothly. “Procurement and integration are recurring challenges,” he explains. “Trusts must follow strict rules and can’t simply agree on using the same system, even when that would improve interoperability and deliver economies of scale. If AI is to be scaled effectively across the NHS, we need clearer procurement pathways. We shouldn’t be reinventing the wheel every time we onboard a new product.”

The whole point of the AIDF was to allow trusts to try before they buy, Mike adds. Central funding for deployment meant teams could prove its worth, and then turn to internal funding to keep it going. Interoperability of AI solutions, however, would allow for economies of scale and more regional thinking.

Still, Mike is keen to emphasise that the benefits of integrating an AI system far outweigh the risks. In fact, risk aversion is holding trusts back from realising the clear benefits AI can provide. “There’s more clinical risk to not deploying this system,” he continues. “The local metrics [productivity and scan turnaround time] are remarkable.”

In an ideal world, AI would help filter the large volume of routine scans that are ultimately reported as normal, allowing radiographers to devote more time to the complex cases where their expertise is most needed. But, Mike emphasises, this doesn’t mean ‘deskilling’ radiographers or radiologists. “You still have to understand what normal looks like in order to recognise the abnormal – AI should enhance, not replace, clinical judgement.”

The next step is to start expanding the evidence base. “Everyone’s in the same boat, waiting for someone else to take the leap,” he explains. “You need a clinical director who is pro-radiographer. The right person to do it [take the lead on AI integration] is someone who wants to do it.”

Over the next few months, Charlotte will be joining AIDF meetings. Her interest is in taking forward the momentum and real-world evidence that the initial funding rounds have created, especially because the longer it takes to establish central funding, the more likely it is that people with relevant experience will move over to industry.

“To put it simply, AI is good,” Mike concludes. “There will be patients whose lives have been saved because of it. Even when everything is lined up in the normal course of the patient pathway, there are so many points of failure. Every department is siloed when it comes to budget, structure and so on. Radiology can play a central role, and AI can be the icing on the cake.”

More about Harrison.ai

The Harrison.ai chest X-ray tool uses AI to examine chest X-rays and identify up to 124 possible health issues in patients’ lungs, including serious conditions such as cancer and collapsed lungs. 

Harrison.ai’s technology is being used across five NHS trusts that are part of the Surrey, Sussex and Frimley Imaging Network: Royal Surrey, Frimley Health NHS Foundation Trust, Ashford and St Peter’s Hospitals NHS Foundation Trust, University Hospitals Sussex NHS Foundation Trust and East Sussex Healthcare NHS Trust. It is also in use at more than 40 trusts outside the network.
