
SMDRnet: Saliency multiscale dense residual network for multimodal medical image fusion.

March 7, 2026

Authors

Fu J,Yang J,Ge G

Affiliations (2)

  • School of Information Engineering, Zunyi Normal University, Zunyi, Guizhou, 563006, China. Electronic address: [email protected].
  • School of Information Engineering, Zunyi Normal University, Zunyi, Guizhou, 563006, China.

Abstract

Medical image fusion methods can provide more comprehensive and precise details about the internal structures of the human body by integrating data from multiple medical images, thereby helping doctors diagnose diseases more accurately. Fused medical images also help doctors better observe and treat diseased tissue, reduce misdiagnosis and missed diagnoses, and lower medical risks and costs. Furthermore, medical image fusion can serve as a teaching tool that helps medical students better understand and master relevant knowledge. However, current medical image fusion algorithms suffer from drawbacks such as information loss, edge blurring, algorithmic complexity, and poor real-time performance, which limit their applications. To address these deficiencies, this article proposes a saliency multiscale dense residual network for multimodal medical image fusion. First, the two original medical images are summed, and the result is fed into a deep convolutional neural network to extract saliency feature maps. A multiscale dense residual network then reconstructs the image, and a color transformation produces the final fusion result. Experimental results show that the fused images produced by the proposed algorithm have richer details, higher color fidelity, and better objective performance than those of the reference algorithms.
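The fusion pipeline described in the abstract (sum the two inputs, derive a saliency map, then reconstruct a weighted result) can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' method: the `saliency_map` stand-in uses gradient magnitude where the paper uses a deep CNN, and the per-pixel weighted blend stands in for the multiscale dense residual reconstruction network.

```python
import numpy as np

def saliency_map(img):
    """Hypothetical saliency stand-in: gradient magnitude normalized
    to [0, 1]. The paper extracts saliency features with a deep CNN."""
    gy, gx = np.gradient(img.astype(np.float64))
    mag = np.hypot(gx, gy)
    rng = mag.max() - mag.min()
    return (mag - mag.min()) / rng if rng > 0 else np.zeros_like(mag)

def fuse(img_a, img_b):
    """Sketch of the pipeline: sum the two source images, compute a
    saliency weight map from the sum, then blend the sources per pixel.
    The paper instead reconstructs the fused image with a multiscale
    dense residual network before a final color transformation."""
    combined = img_a.astype(np.float64) + img_b.astype(np.float64)
    w = saliency_map(combined)                  # weights in [0, 1]
    fused = w * img_a + (1.0 - w) * img_b      # per-pixel weighted blend
    return np.clip(fused, 0, 255).astype(np.uint8)
```

For color inputs (e.g., PET/SPECT pseudo-color), the final color transformation mentioned in the abstract would typically be applied after fusing in a luminance-chrominance space; the sketch above operates on single-channel images only.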

Topics

Journal Article
