
Interpretable Auto Window setting for deep-learning-based CT analysis.

Authors

Zhang Y, Chen M, Zhang Z

Affiliations (3)

  • University of Shanghai for Science and Technology, Shanghai, China. Electronic address: [email protected].
  • University of Shanghai for Science and Technology, Shanghai, China. Electronic address: [email protected].
  • Huashan Hospital, Fudan University, Shanghai, China. Electronic address: [email protected].

Abstract

From the early days of its popularization to the present, the window setting in Computed Tomography (CT) has been an indispensable part of the CT analysis process. Although research has investigated the capability of CT multi-window fusion to enhance neural networks, there remains a paucity of domain-invariant, intuitively interpretable methodologies for Auto Window Setting. In this work, we propose a plug-and-play module derived from the Tanh activation function. This module enables the deployment of medical imaging neural network backbones without requiring manual CT window configuration. The domain-invariant design facilitates observation of the preference decisions rendered by the adaptive mechanism from a clinically intuitive perspective. We confirm the effectiveness of the proposed method on multiple open-source datasets, allowing for direct training without manual window setting and yielding improvements of 54%∼127% in Dice, 14%∼32% in Recall, and 94%∼200% in Precision on hard segmentation targets. Experimental results obtained in the NVIDIA NGC environment demonstrate that the module facilitates efficient deployment of AI-powered medical imaging tasks. The proposed method enables automatic determination of CT window settings for specific downstream tasks in the development and deployment of mainstream medical imaging neural networks, demonstrating the potential to reduce associated deployment costs.
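The abstract does not give the module's exact formulation, but the core idea it describes, replacing a hand-tuned hard CT window clip with a smooth Tanh-based mapping whose parameters a network could learn end-to-end, can be illustrated with a minimal sketch. The `center` and `width` parameters below stand in for the conventional window level and window width and are hypothetical names, not the paper's API:

```python
import numpy as np

def tanh_window(hu, center=50.0, width=400.0):
    """Softly map raw Hounsfield units into (-1, 1).

    A conventional CT window hard-clips HU values to [center - width/2,
    center + width/2] before rescaling. Using tanh instead keeps the
    mapping differentiable everywhere, so `center` and `width` could be
    exposed as learnable parameters and tuned by backpropagation for a
    specific downstream task. This is an illustrative sketch, not the
    paper's actual module.
    """
    return np.tanh((hu - center) / (width / 2.0))

# Example: a soft-tissue-like window applied to a spread of HU values
# (air, fat, soft tissue, contrast-enhanced vessel, dense bone).
hu = np.array([-1000.0, -100.0, 50.0, 300.0, 1000.0])
mapped = tanh_window(hu)
```

Values far below the window saturate toward -1 and values far above saturate toward +1, mimicking the clipping behavior of a manual window while remaining smooth, which is what makes the parameters trainable.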

Topics

Journal Article
