Posted on 11/25/2025 2:40:02 PM PST by nickcarraway
Findings are "important milestone" in a field where AI has yet to prove clinical use, expert says
Key Takeaways
- In a confirmatory study, AI achieved statistically non-inferior and superior performance in detecting pancreatic cancer compared with a pool of participating radiologists.
- At matched sensitivity levels, the AI system reduced the number of false positives compared with radiologists.
- AI could help diagnose pancreatic cancer sooner and more accurately, an expert suggested.

An artificial intelligence (AI) system outperformed radiologists in detecting pancreatic ductal adenocarcinoma (PDAC) on routine CT scans, according to a non-inferiority, confirmatory, observational study.
In a testing cohort of 1,130 patients, the AI system achieved an area under the receiver operating characteristic curve (AUROC) of 0.92 (95% CI 0.90-0.93), with 85.7% sensitivity and 83.5% specificity at the optimal ROC threshold, reported Natalia Alves, MSc, of Radboud University Medical Center in Nijmegen, the Netherlands, and colleagues.
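For readers who want the mechanics: the reported operating point is the kind of figure that falls out of a standard ROC analysis, where one threshold on the AI's per-patient score is chosen to maximize Youden's J (sensitivity + specificity − 1). Here is a minimal sketch using synthetic scores and labels; the data and variable names are illustrative, not the study's actual pipeline.

```python
# Minimal sketch: AUROC plus the Youden-optimal operating point.
# Labels and scores are synthetic stand-ins, not the PANORAMA data.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1130)         # 1 = PDAC, 0 = no PDAC
scores = labels * 1.5 + rng.normal(size=1130)  # synthetic AI scores

auroc = roc_auc_score(labels, scores)
fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                    # Youden's J optimum
sensitivity, specificity = tpr[best], 1 - fpr[best]
print(f"AUROC={auroc:.2f}, sens={sensitivity:.1%}, spec={specificity:.1%} "
      f"at threshold {thresholds[best]:.2f}")
```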
Additionally, in a subset of 391 patients, AI achieved statistically non-inferior (P<0.0001) and superior (P=0.001) performance in detecting PDAC, with an AUROC of 0.92 (95% CI 0.89-0.94), compared with a pool of participating radiologists, who had an AUROC of 0.88 (95% CI 0.85-0.91), they noted in Lancet Oncology.
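The article doesn't spell out the statistical procedure behind those non-inferiority and superiority tests. One generic way to compare two paired AUROCs is to bootstrap the AUROC difference over the shared cases and check the lower confidence bound against a prespecified margin; the margin value and function below are illustrative assumptions, not the study's prespecified analysis plan.

```python
# Generic paired-bootstrap comparison of two AUROCs on the same cases.
import numpy as np
from sklearn.metrics import roc_auc_score

def bootstrap_auroc_diff(labels, ai_scores, reader_scores,
                         margin=-0.05, n_boot=2000, seed=0):
    """95% bootstrap CI for AUROC(AI) - AUROC(readers) on paired cases."""
    labels, ai_scores, reader_scores = map(
        np.asarray, (labels, ai_scores, reader_scores))
    rng = np.random.default_rng(seed)
    n, diffs = len(labels), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)       # resample cases with replacement
        if labels[idx].min() == labels[idx].max():
            continue                           # skip draws missing a class
        diffs.append(roc_auc_score(labels[idx], ai_scores[idx])
                     - roc_auc_score(labels[idx], reader_scores[idx]))
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return lo, hi, lo > margin  # non-inferior if lower bound clears the margin
```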
At matched sensitivity levels, the AI system reduced the number of false positives by 38% (85 with AI vs 138 with readers).
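The matched-sensitivity comparison can be sketched the same way: find the AI threshold whose sensitivity matches the readers' pooled sensitivity, then count false positives among the non-PDAC cases at that threshold. Again, the inputs and helper below are illustrative; the study's exact matching procedure may differ in detail.

```python
# Count false positives at the lowest threshold reaching a target sensitivity.
import numpy as np
from sklearn.metrics import roc_curve

def false_positives_at_sensitivity(labels, scores, target_sens):
    labels, scores = np.asarray(labels), np.asarray(scores)
    fpr, tpr, thresholds = roc_curve(labels, scores)
    i = int(np.searchsorted(tpr, target_sens))  # tpr is non-decreasing
    i = min(i, len(thresholds) - 1)
    # sklearn's convention: a case is called positive when score >= threshold
    return int((scores[labels == 0] >= thresholds[i]).sum())
```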
"The PANORAMA study is, to the best of our knowledge, the first paired, international, confirmatory diagnostic accuracy study to evaluate the performance of radiologists and a stand-alone AI system for PDAC detection on contrast-enhanced CT, using a FAIR (findable, accessible, interoperable, and reusable) benchmark," Alves and team wrote. "Our findings show that AI trained on large, diverse datasets can exceed average radiologist performance in PDAC detection on routine contrast-enhanced CT, providing a foundation for regulatory dialogue and prospective validation."
In an accompanying commentary, Misha Luyer, MD, PhD, of Catharina Hospital Eindhoven in the Netherlands, noted that the finding that AI outperformed radiologists is "an important milestone in a field where many AI tools still have to prove their clinical use."
AI could help diagnose pancreatic cancer earlier, and more accurately, he added, which is particularly important "given how difficult early diagnosis can be, how harmful delays are, and how much radiology expertise varies between hospitals."
Luyer also pointed out that AI's success at reducing false positives should alleviate concerns about unnecessary follow-up testing and increased patient anxiety.
"Translating these promising results into daily clinical practice will be a next key step," he wrote.
The PANORAMA study consisted of two parallel parts -- an AI grand challenge and the multi-reader, multi-case observer study.
In the AI grand challenge, international teams of AI developers created algorithms for PDAC detection on CT using a multicenter, public dataset for model training and tuning. Those algorithms were submitted for masked evaluation with standardized, prespecified performance metrics.
The multi-reader, multi-case observer study included 68 radiologists from 40 centers in 12 countries, all of whom had previous experience in reading abdominal CT scans.
In total, the study involved 3,440 patients (56% men, median age 67 years) across five participating centers and two publicly available datasets, including a cohort of 2,310 patients from four tertiary care centers in the Netherlands and the U.S. for training (n=2,224) and tuning (n=86), and a sequestered cohort of 1,130 patients (406 with histologically confirmed PDAC) from five tertiary care centers in the Netherlands, Sweden, and Norway for testing.
The radiologists read images from a subset of 391 cases (144 with PDAC) from the sequestered cohort.
Alves and colleagues acknowledged the study had limitations, notably that it was conducted in a controlled online environment, "which does not fully replicate the nuanced decision-making processes in a clinical setting, where access to previous imaging, clinical notes, and laboratory data informs interpretation."
They observed that the focus in this study was on distinguishing PDAC from other pancreatic abnormalities and that future research should also address the differential diagnosis of malignant pancreato-biliary neoplasms, "reflecting the broader scope of routine clinical practice."
Good, we can stop paying radiologists in India to read our diagnostic imaging.
I haven’t heard of that.
There’s a lot of hype about “AI”, which usually means some sort of LLM, without much seeming to come from it.
But here and there, on the margins, there are news stories like this showing versions of AI with actual, real-world application.
And then there are the authors of AlphaFold getting the Nobel Prize last year, after accurately predicting the structures of ~200 million different proteins.
And I recently learned that various AI systems have “solved” ~10 of Paul Erdős's famous math problems, though some say they merely found previously published solutions that had been overlooked. That’s impressive enough.
When these things are doing math, and science, and making actually useful interpretations of medical imaging, it’s going way past hype into something that seems much more significant.
And it’s happening really fast.
However, patients will still consult physicians to interpret the results and discuss treatment.
Yet it surmises that a 1-3/16” socket is actually a 1-1/16” (27 mm) size? AI fails to impress; it lacks any basis of thought.
Narrow-focus expert systems like this are what's beneficial. Expert systems enhance human experts and are regarded as tools, but only within a narrow, targeted scope or field.
My BIL is a retired Radiologist.
He worked 15 hours a day for 3 weeks and then took a week off. The long hours and volume of work were affecting his health, so he retired at age 62.
You’re not serious, are you?
If you want to sum up what AI is exceptionally good at in a few words, it is “detecting patterns”.
Making predictions based on complex patterns.
I was told by a doc at Kaiser that they have been electronically transferring radiological images to India for reading.
BIG news ping!
Is it cheaper?
AI voice: Turn your head and cough.
Depends on whether your life is worth a potentially life-saving early diagnosis of one of the deadliest cancers, one that kills largely because it so often goes undetected in its early stages.
Hang tight to your money. You can always take it with you into the afterlife.
Is this news? I was under the impression that for decades it was routine for a CT scan or MRI scan to have the machine's image-analyzing software review the image to try to find things to point out to the tech worker/doctor. Quite often the image analyzer found something the tech worker would have overlooked.
....and who loses?
Pretty soon there will be no good reason for careers in radiology, much less med school.
You can upload a picture of your moles to ChatGPT and it will read them for you.
Of course I went to a dermatologist to make sure.
Both ChatGPT and dermatologist told me the exact same thing.
“And it’s happening really fast.”
Yep, AI is going to dwarf the industrial and internet revolutions.
As a result it will cause major cultural and societal upheavals.
Buckle up!