Artificial Intelligence in Dental and Orthopedic Skeletal Imaging: A Scoping Review of Methodological Convergence and Translational Gaps.

April 27, 2026

Authors

Hung M, Strickler I, Jensen A, Gunnell B, Parry A, Dyal S, Lipsky MS

Affiliations (8)

  • College of Dental Medicine, Roseman University of Health Sciences, South Jordan, UT, USA. [email protected].
  • Department of Orthopaedic Surgery Operations, University of Utah, Salt Lake City, UT, USA. [email protected].
  • Department of Family and Preventive Medicine, University of Utah, Salt Lake City, UT, USA. [email protected].
  • Huntsman Cancer Institute, Salt Lake City, UT, USA. [email protected].
  • School of Medicine, Oakland University, Rochester, MI, USA.
  • College of Dental Medicine, Roseman University of Health Sciences, South Jordan, UT, USA.
  • Library, Roseman University of Health Sciences, South Jordan, UT, USA.
  • Institute on Aging, Portland State University, Portland, OR, USA.

Abstract

This scoping review aimed to answer the question: to what extent do artificial intelligence (AI) applications in dental and orthopedic skeletal imaging demonstrate true cross-disciplinary methodological convergence versus parallel development with shared translational barriers? Rather than assuming convergence, the review synthesizes AI applications across both fields to characterize methodological overlap, developmental asymmetries, and translational gaps. Following PRISMA-ScR reporting standards, we searched PubMed, Scopus, Web of Science, IEEE Xplore, and EMBASE for peer-reviewed, English-language human studies published between January 2015 and May 2025. Eligible studies applied AI, machine learning, or deep learning to diagnostic, segmentation, or preoperative planning tasks in dental or orthopedic imaging. Three reviewers independently extracted data on imaging modality, task, model architecture, dataset characteristics, validation strategy, performance metrics, and translational considerations, with random auditing for consistency. Fifty-nine studies met the inclusion criteria, comprising 48 dental (81.36%) and 11 orthopedic (18.64%) investigations, with no study spanning both domains. Most applications focused on foundational tasks such as segmentation and detection/classification using two-dimensional radiographs and cone-beam computed tomography. Computed tomography primarily supported assessment of bony anatomy and preoperative planning, while magnetic resonance imaging, the EOS system, and intraoral scanners were used in specialized workflows. Convolutional neural networks dominated, particularly U-Net/nnU-Net variants and EfficientNet/ResNet backbones with YOLO-based detectors, alongside emerging transformer-based and hybrid physics-informed approaches. Internal validation performance was frequently high for segmentation (typical Dice 0.90-0.99), while more complex or anatomically challenging targets showed lower and more variable performance. External validation, prospective evaluation, and standardized reporting of calibration, expert comparators, and demographic performance were uncommon. The current AI skeletal imaging literature demonstrates strong technical feasibility but uneven clinical maturity: dental imaging dominates in volume and in automation of foundational tasks, while orthopedic applications remain fewer, more heterogeneous, and less mature. Rather than evidencing true cross-disciplinary convergence, the findings highlight asymmetrical development and shared translational barriers, particularly in validation rigor and real-world integration. By explicitly identifying these asymmetries, this review provides a realistic foundation for future cross-disciplinary collaboration focused on harmonized validation standards, clinically meaningful benchmarks, and equitable, workflow-native deployment.

Topics

Journal Article, Review
