
The American College of Radiology is calling for greater transparency and explainability in AI algorithms for medical imaging.
Key Details
- ACR CEO Dana Smetherman emphasizes the need for AI explainability to build public trust.
- ACR supported Resolution 519 at the AMA House of Delegates 2025 meeting, requesting a framework for evidence-based AI transparency.
- The resolution was not adopted due to existing AMA policies, but it highlighted ongoing concerns about "black box" AI decision-making.
- ACR wants AI decisions to be explainable by qualified medical experts.
Why It Matters
As AI becomes more prevalent in radiology, transparent decision-making is crucial for safety, trust, and regulatory approval. ACR's stance elevates the conversation around explainability, directly impacting how future imaging AI systems may be developed and implemented.

Source
Radiology Business
Related News

• Radiology Business
Cigna Expands Nationwide Coverage for CT Imaging AI Tools
Cigna will reimburse for CT-based plaque analysis AI software for millions of members nationwide.

• AI in Healthcare
Joint Commission Issues AI Guidance; Radiology AI Advancements Highlighted
Joint Commission releases AI safety guidance while major advances surface in predictive and radiology AI models.

• AI in Healthcare
Physician Leadership Is Crucial for Ethical AI Integration in Healthcare
Researchers emphasize that doctors must direct the integration of AI into clinical medicine to prioritize patient welfare.