
BCECNN: an explainable deep ensemble architecture for accurate diagnosis of breast cancer.

Authors

Ergün U, Çoban T, Kayadibi İ

Affiliations (3)

  • Department of Biomedical Engineering, Faculty of Engineering, Afyon Kocatepe University, Afyonkarahisar, Turkey. [email protected].
  • Department of Biomedical Engineering, Faculty of Engineering, Afyon Kocatepe University, Afyonkarahisar, Turkey.
  • Department of Management Information Systems, Faculty of Economics and Administrative Sciences, Afyon Kocatepe University, Afyonkarahisar, Turkey.

Abstract

Breast cancer remains one of the leading causes of cancer-related deaths globally, affecting both women and men. This study aims to develop a novel deep learning (DL)-based architecture, the Breast Cancer Ensemble Convolutional Neural Network (BCECNN), to enhance the diagnostic accuracy and interpretability of breast cancer detection systems. The BCECNN architecture incorporates two ensemble learning (EL) structures: Triple Ensemble CNN (TECNN) and Quintuple Ensemble CNN (QECNN). These ensemble models integrate the predictions of multiple CNN architectures (AlexNet, VGG16, ResNet-18, EfficientNetB0, and XceptionNet) using a majority voting mechanism. These models were trained using transfer learning (TL) and evaluated on five distinct sub-datasets generated from the Artificial Intelligence Smart Solution Laboratory (AISSLab) dataset, which consists of 266 mammography images labeled and validated by radiologists. To improve transparency and interpretability, Explainable Artificial Intelligence (XAI) techniques, including Gradient-weighted Class Activation Mapping (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME), were applied. Additionally, explainability was assessed through clinical evaluation by an experienced radiologist. Experimental results demonstrated that the TECNN model (comprising AlexNet, VGG16, and EfficientNetB0) achieved the highest accuracy of 98.75% on the AISSLab-v2 dataset. The integration of XAI methods substantially enhanced the interpretability of the model, enabling clinicians to better understand and validate the model's decision-making process. Clinical evaluation confirmed that the XAI outputs aligned well with expert assessments, underscoring the practical utility of the model in a diagnostic setting. The BCECNN model presents a promising solution for improving both the accuracy and interpretability of breast cancer diagnostic systems.
Unlike many previous studies that rely on single architectures or large datasets, BCECNN leverages the strengths of an ensemble of CNN models and performs robustly even with limited data. It integrates advanced XAI techniques, such as Grad-CAM and LIME, to provide visual justifications for model decisions, enhancing clinical interpretability. Moreover, the model was validated using the AISSLab dataset, designed to reflect real-world diagnostic challenges. This combination of EL, interpretability, and robust performance on small yet clinically relevant data positions BCECNN as a novel and reliable decision support tool for AI-assisted breast cancer diagnostics.
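The majority voting mechanism described in the abstract can be sketched in a few lines of Python. This is an illustrative example only, not the authors' implementation: the member labels and the `majority_vote` helper are hypothetical, and it assumes each ensemble member (e.g. the three TECNN models) emits a hard class label per image.

```python
from collections import Counter

def majority_vote(member_predictions):
    """Return the class label predicted by the most ensemble members.

    Ties are broken by insertion order (the first label to reach the
    top count wins), which is well-defined in Python 3.7+ Counters.
    """
    return Counter(member_predictions).most_common(1)[0][0]

# Hypothetical hard labels from a three-member (TECNN-style) ensemble
# for a single mammogram: two members vote "malignant", one "benign".
labels = ["malignant", "benign", "malignant"]
print(majority_vote(labels))  # -> malignant
```

With an odd number of members, as in the three- and five-model ensembles described here, a two-class vote can never tie, which is one practical reason triple and quintuple ensembles are a common design choice.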

Topics

  • Breast Neoplasms
  • Deep Learning
  • Neural Networks, Computer
  • Mammography
  • Journal Article
