
Transparent reporting is central to reproducible radiological AI research: A call to action.

March 6, 2026

Authors

White SJ, Yaxley KL, To MS, Chau M

Affiliations (4)

  • Adelaide Medical School, Faculty of Health and Medical Sciences, University of Adelaide, Adelaide, South Australia, Australia; School of Dentistry and Medical Sciences, Faculty of Science and Health, Charles Sturt University, Wagga Wagga, New South Wales, Australia. Electronic address: [email protected].
  • Department of Radiology, Fiona Stanley Hospital, Murdoch, Western Australia, Australia.
  • Flinders Health and Medical Research Institute, Flinders University, Bedford Park, South Australia, Australia; South Australia Medical Imaging, Flinders Medical Centre, Bedford Park, South Australia, Australia.
  • School of Dentistry and Medical Sciences, Faculty of Science and Health, Charles Sturt University, Wagga Wagga, New South Wales, Australia.

Abstract

Artificial intelligence (AI) is increasingly embedded in radiology research and practice, yet concerns about the reproducibility of AI studies remain a key barrier to regulatory acceptance and clinical translation. Transparent reporting across the analytic pipeline is essential for independent verification, evidence synthesis, and safe implementation. This article examines major barriers to transparent reporting in radiological AI research and proposes practical solutions to strengthen reproducibility. We identify four interrelated challenges undermining reporting quality. First, reporting is often treated as a final-stage manuscript requirement rather than a prospectively embedded component of study design, leading to loss of methodological detail. Second, inconsistent operationalization of reporting standards by journals and funders makes it difficult to assess reporting completeness. Third, fragmentation and overlapping complexity of AI-specific reporting guidelines increase interpretive burden for authors and reviewers and contribute to heterogeneity in reporting. Fourth, limited availability and usability of data and code restrict independent verification, even when resources are nominally shared. These issues are amplified by the methodological complexity and heterogeneity of radiological AI pipelines. For each challenge, we outline actionable strategies, including prospective documentation of methods, mandatory and consistently applied reporting standards, harmonized reporting principles, and clearer requirements for data and code availability with structured alternatives when sharing is constrained. Strengthening transparent reporting is a prerequisite for reproducible and clinically translatable radiological AI research. Embedding reporting within study design, aligning stakeholder expectations, and improving access to reproducibility-enabling resources are essential to enhance credibility, facilitate independent evaluation, and support safe clinical implementation of AI in radiology.

Topics

Journal Article, Review
