AI-Generated Media Often Misrepresents Radiologists’ Roles and Diversity
A Canadian study finds that AI-generated patient media often misrepresents radiologists' roles and underrepresents diversity.
Key Details
- Study analyzed 1,380 images and videos generated by 8 text-to-image/video AI models.
- Technologists were depicted accurately in 82% of cases, but only 56.2% of radiologist depictions were role-appropriate.
- AI portrayals of radiologists were predominantly male (73.8%) and white (79.7%), with technologist portrayals being more diverse.
- Stethoscopes were incorrectly depicted in 45.4% of radiologist and 19.7% of technologist images.
- Bias in attire and environment, such as radiologists in business dress and dimly lit rooms, reinforced outdated stereotypes.
Why It Matters

Inaccurate and homogeneous AI portrayals risk reinforcing outdated stereotypes about who radiologists are and what they do, potentially shaping public and patient perceptions of the specialty.
Source
AuntMinnie
Related News

Toronto Study: LLMs Must Cite Sources for Radiology Decision Support
University of Toronto researchers found that large language models (LLMs) such as DeepSeek V3 and GPT-4o offer promising support for radiology decision-making in pancreatic cancer when their recommendations cite guideline sources.

AI Model Using Mammograms Enhances Five-Year Breast Cancer Risk Assessment
A new image-only AI model more accurately predicts five-year breast cancer risk than breast density alone, according to multinational research presented at RSNA 2025.

AI Model Uses CT Scans to Reveal Biomarker for Chronic Stress
Researchers developed an AI model to measure chronic stress using adrenal gland volume on routine CT scans.