
Development of an explainable machine learning model to reproduce and interpret expert pharmacological decisions in osteoporosis treatment.

December 1, 2025

Authors

Sugawara Y, Shimizu T, Ishizu H, Arita K, Ohashi Y, Yamazaki S, Kokabu T, Yamada K, Iwasaki N

Affiliations (4)

  • Department of Orthopedic Surgery, Faculty of Medicine and Graduate School of Medicine, Hokkaido University, Sapporo, Japan; Department of Orthopedic Surgery, Central Hospital, Hakodate, Hokkaido, Japan.
  • Department of Orthopedic Surgery, Faculty of Medicine and Graduate School of Medicine, Hokkaido University, Sapporo, Japan. Electronic address: [email protected].
  • Department of Orthopedic Surgery, Faculty of Medicine and Graduate School of Medicine, Hokkaido University, Sapporo, Japan.
  • Department of Orthopedic Surgery, Eniwa Hospital, Eniwa, Hokkaido, Japan.

Abstract

The management of osteoporosis in real-world clinical practice is highly heterogeneous, reflecting the complexity and variability inherent in therapeutic decision-making. Although artificial intelligence (AI)-based tools have been developed to support diagnosis, limited research has investigated their potential to elucidate the rationale underlying treatment choices. This study applied explainable machine learning to replicate and interpret pharmacological treatment decisions made by two board-certified osteoporosis specialists. We retrospectively analyzed 1481 patients who underwent dual-energy X-ray absorptiometry (DXA) and lateral spine radiography between 2020 and 2023 at Hokkaido University Hospital and two affiliated institutions. Two specialists independently assigned patients to one of five non-overlapping treatment categories. External validation was performed in 372 outpatients from three independent hospitals in 2024. The LightGBM model demonstrated the highest predictive performance. To interpret this model, we analyzed feature importance and applied SHapley Additive exPlanations (SHAP) to identify the most influential clinical factors driving treatment decisions. The LightGBM model achieved an accuracy of 0.90 and an F1-score of 0.90 in external validation. SHAP analysis revealed that femoral neck bone mineral density (BMD) and the severity of vertebral fractures (especially grade 3) were the most influential factors in treatment selection. These patterns mirrored the experts' reasoning, highlighting the prioritization of objective imaging data in therapeutic decisions. This study demonstrated that explainable AI can clarify the clinical reasoning behind osteoporosis treatment decisions. Bone mineral density and vertebral fracture severity are key determinants, supporting a transparent and reproducible framework for future decision support tools that assist clinicians in making consistent therapeutic decisions.

Topics

Journal Article
