
Both radiologists and AI models struggle to differentiate between authentic and AI-generated ('deepfake') radiographic images, raising major security and clinical concerns.
Key Details
- Research published in RSNA's Radiology shows deepfake X-rays are highly convincing, deceiving even expert radiologists.
- 17 radiologists from 12 centers across 6 countries participated in the study.
- Images included AI-generated X-rays (from GPT-4o or RoentGen) mixed with real exams.
- Even when aware that fakes were present, radiologists could not reliably distinguish authentic from synthetic images.
- Risks include fraudulent litigation and potential clinical harm if synthetic images are injected into hospital records.
Why It Matters

Source
Radiology Business
Related News

Radiology Receives Declining Share of Industry Research Funding
Radiologists received only 1.1% of industry-funded research payments in 2024, with a continuing downward trend.

GPT-4o AI Matches Radiologists in Follow-Up Imaging Recommendations
GPT-4o matched the performance of experienced radiologists and surpassed residents in recommending follow-up imaging from routine radiology reports.

AI Leverages Head CTs for Automated Heart Risk Assessments
AI models can turn routine head CT scans into automated cardiovascular risk assessments, expanding the utility of radiology studies.