
Domain adaptation for low-dose CT denoising via pretraining and self-supervised fine-tuning.

March 2, 2026

Authors

Yuan S, Lv H, Zhou Z, Wu Z, Wang J, Li M, Zheng J, Du Q

Affiliations (3)

  • School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China.
  • Medical Imaging Department, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China.
  • Shandong Lab of Advanced Biomaterials and Medical Devices in Weihai, Weihai, China.

Abstract

Deep learning-based methods have become the dominant approach for low-dose CT (LDCT) denoising. However, their performance often degrades on cross-domain datasets due to domain gaps, highlighting the need for effective domain adaptation techniques. While domain adaptation methods based on the pretraining and fine-tuning paradigm show great potential, they typically require additional labeled data from the target domain, which limits their practicality. This work therefore aims to develop a self-supervised fine-tuning method for LDCT denoising. We propose to fine-tune pretrained models using a self-supervised loss based on pixel-shuffle image preprocessing. Additionally, we design a two-stage fine-tuning strategy to mitigate the input misalignment between the pretraining and fine-tuning stages. Furthermore, to effectively capture prior knowledge from the source domain, we design a dual-scale SwinIR model as the pretrained backbone. We evaluate our method on two public datasets, and the results demonstrate that it bridges the domain gap without requiring target-domain labels, achieving effective denoising performance and strong cross-domain generalization. Code and models for our proposed approach are publicly available at https://github.com/Wasserdawn/TSFDAN.
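The core idea of a pixel-shuffle-based self-supervised loss can be sketched as follows. The snippet below is a minimal NumPy illustration, not the authors' implementation: it assumes the common pixel-shuffle downsampling trick in which an LDCT slice is split into spatially subsampled sub-images that share anatomy but carry largely independent noise, so one sub-image can supervise a denoiser applied to a neighboring one. All function names (`pixel_unshuffle`, `self_supervised_loss`) are hypothetical; the paper's exact loss, preprocessing, and two-stage schedule may differ.

```python
import numpy as np

def pixel_unshuffle(img, s=2):
    """Split an HxW image into s*s sub-images by strided sampling.

    Neighboring sub-images see nearly the same anatomy but different
    noise realizations, which enables Noise2Noise-style supervision
    without target-domain labels.
    """
    return [img[i::s, j::s] for i in range(s) for j in range(s)]

def self_supervised_loss(denoiser, ldct, s=2):
    """MSE between the denoised first sub-image and a neighboring one.

    `denoiser` is any callable mapping an image to an image (e.g. a
    pretrained network being fine-tuned on the target domain).
    """
    subs = pixel_unshuffle(ldct, s)
    inp, tgt = subs[0], subs[1]        # two adjacent sub-images
    pred = denoiser(inp)
    return float(np.mean((pred - tgt) ** 2))

# Toy usage: a noisy constant image and an identity "denoiser".
rng = np.random.default_rng(0)
ldct = 0.5 + 0.1 * rng.standard_normal((64, 64))
loss = self_supervised_loss(lambda x: x, ldct)
```

In a real fine-tuning loop this scalar would be backpropagated through the denoiser; here the identity denoiser just measures the residual noise difference between the two sub-images.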

Topics

Journal Article
