
The American College of Radiology is calling for greater transparency and explainability in AI algorithms for medical imaging.
Key Details
- ACR CEO Dana Smetherman emphasizes the need for AI explainability to build public trust.
- ACR supported Resolution 519 at the AMA House of Delegates 2025 meeting, requesting a framework for evidence-based AI transparency.
- The resolution was not adopted due to existing AMA policies, but it highlighted ongoing concerns about AI 'black box' decision-making.
- ACR wants AI decisions to be explainable by qualified medical experts.
Why It Matters
As AI becomes more prevalent in radiology, transparent decision-making is crucial for safety, trust, and regulatory approval. ACR's stance raises the profile of explainability and could shape how future imaging AI systems are developed and deployed.

Source
Radiology Business
Related News

• AuntMinnie
Experts Call for Stricter FDA Standards in Radiology AI Validation
Dana-Farber experts recommend actionable steps to enhance the rigor and transparency of FDA validation standards for radiology AI software.

• AI in Healthcare
FDA Seeks Real-World Performance Insights on AI Medical Devices
FDA calls for healthcare worker feedback to enhance monitoring of AI-enabled medical devices in real-world settings.

• Radiology Business
Cigna Expands Nationwide Coverage for CT Imaging AI Tools
Cigna will reimburse for CT-based plaque analysis AI software for millions of members nationwide.