An adaptive deep learning approach based on InBNFus and CNNDen-GRU networks for breast cancer and maternal fetal classification using ultrasound images.

Authors

Fatima M, Khan MA, Mirza AM, Shin J, Alasiry A, Marzougui M, Cha J, Chang B

Affiliations (7)

  • Department of Computer Science, COMSATS University Islamabad, Wah Campus, Wah Cantt, 47040, Pakistan.
  • Department of AI, College of Computer Engineering and Science, Prince Mohammad Bin Fahd University, Al-Khobar, Saudi Arabia. [email protected].
  • Department of AI, College of Computer Engineering and Science, Prince Mohammad Bin Fahd University, Al-Khobar, Saudi Arabia.
  • School of Computer Science and Engineering, The University of Aizu, Aizuwakamatsu, 965-8580, Japan.
  • College of Computer Science, King Khalid University, Abha, 61413, Saudi Arabia.
  • Hanyang University, Seoul, South Korea.
  • Hanyang University, Seoul, South Korea. [email protected].

Abstract

Convolutional Neural Networks (CNNs), a sophisticated deep learning technique, have proven highly effective in identifying and classifying abnormalities related to various diseases. Manual classification of these abnormalities is a tedious and time-consuming process; therefore, it is essential to develop a computerized technique. Most existing methods are designed to address a single specific problem, limiting their adaptability. In this work, we propose a novel adaptive deep-learning framework for simultaneously classifying breast cancer and maternal-fetal ultrasound datasets. Data augmentation was applied in the preprocessing phase to address the data imbalance problem. Subsequently, two novel architectures are proposed: InBnFUS and CNNDen-GRU. The InBnFUS network combines a 5-block inception-based architecture (Model 1) and a 5-block inverted bottleneck-based architecture (Model 2) through a depth-wise concatenation layer, while CNNDen-GRU incorporates a 5-block dense architecture with an integrated GRU layer. After training, features were extracted from the global average pooling and GRU layers and classified using neural network classifiers. The experimental evaluation achieved enhanced accuracy rates of 99.0% for the breast cancer dataset, 96.6% for the maternal-fetal (common planes) dataset, and 94.6% for the maternal-fetal (brain) dataset. Additionally, the models consistently achieve high precision, recall, and F1 scores across all datasets. A comprehensive ablation study has been performed, and the results show the superior performance of the proposed models.
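The depth-wise concatenation step in InBnFUS can be illustrated with a minimal sketch. This is not the authors' implementation; the shapes, branch names, and channel counts below are assumptions chosen for illustration, using plain NumPy so the fusion operation itself is explicit: the two branch feature maps are stacked along the channel axis, then global average pooling collapses the spatial dimensions into a single feature vector per image.

```python
import numpy as np

# Hypothetical feature maps from the two branches of InBnFUS:
# Model 1 (inception-based) and Model 2 (inverted-bottleneck-based).
# A 7x7 spatial grid with 256 channels each is an illustrative assumption.
h, w, c = 7, 7, 256
model1_features = np.random.rand(h, w, c)  # inception branch output
model2_features = np.random.rand(h, w, c)  # inverted-bottleneck branch output

# Depth-wise concatenation: stack along the channel (last) axis,
# so the fused map carries the channels of both branches.
fused = np.concatenate([model1_features, model2_features], axis=-1)

# Global average pooling: average over the spatial axes,
# yielding one feature vector per image for the downstream classifier.
feature_vector = fused.mean(axis=(0, 1))

print(fused.shape)           # (7, 7, 512)
print(feature_vector.shape)  # (512,)
```

In a deep learning framework this corresponds to a concatenation layer followed by global average pooling (e.g. `Concatenate` and `GlobalAveragePooling2D` in Keras, or `torch.cat` with a spatial mean in PyTorch).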

Topics

Breast Neoplasms; Deep Learning; Neural Networks, Computer; Ultrasonography, Prenatal; Fetus; Journal Article
