
Learning from Prototypes: Contrastive Learning with Prior-Aware Multi-Label Chest X-ray Classification.

April 23, 2026

Authors

Zeng X, Ye H, Yu N, Ding F, Yu K, Li H, Wan Z

Abstract

Multi-label Chest X-ray (CXR) classification faces significant challenges from the inherently imperfect nature of clinical data, particularly the complex interplay of co-occurring pathologies, training data with a long-tailed distribution, and high visual similarity between distinct diseases. To address these challenges, we propose a novel framework that synergizes medical prior knowledge with prototype-driven contrastive learning, enabling disentangled and discriminative per-pathology representation learning. In particular, our approach integrates a co-occurrence-modulated Label Graph Attention (LGA) module, which leverages semantic prior knowledge from a pre-trained large language model (LLM) and statistical co-occurrence patterns from training data to model inter-pathology relationships. Subsequently, a Label-Aware Decoupling (LAD) decoder is proposed to isolate pathology-specific visual features and mitigate feature suppression by dominant classes. Furthermore, we introduce an Adaptive Prototype Contrastive Learning (APCL) mechanism to enhance the discriminability of visually similar pathologies. Extensive experiments on the NIH ChestX-ray14 and CheXpert datasets demonstrate the framework's superiority, achieving state-of-the-art mean AUCs of 0.834 and 0.840, respectively. In addition, cross-dataset evaluations on the external MIMIC-CXR dataset validate the framework's exceptional zero-shot and few-shot generalization capabilities, highlighting its strong robustness and potential for real-world clinical deployment. The implementation is available at https://github.com/ZengXHYX/Learning-from-Prototypes.
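To make the prototype-contrastive idea concrete, the following is a minimal sketch of a per-class prototype contrastive loss: each embedding is pulled toward the prototype of its own pathology and pushed away from the others via an InfoNCE-style softmax over prototype similarities. This is an illustrative simplification, not the authors' APCL mechanism (which is adaptive and operates in a multi-label setting); the function name, single-label interface, and fixed temperature are all assumptions made for clarity.

```python
import numpy as np

def prototype_contrastive_loss(embeddings, labels, prototypes, temperature=0.1):
    """InfoNCE-style loss over class prototypes (illustrative sketch).

    embeddings: (N, D) L2-normalised feature vectors
    labels:     (N,) integer class ids
    prototypes: (C, D) L2-normalised per-class prototype vectors
    """
    # Cosine similarity of every embedding to every prototype, scaled by temperature.
    logits = embeddings @ prototypes.T / temperature            # (N, C)
    # Subtract the row-wise max for numerical stability before the softmax.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Negative log-probability of each sample's own class prototype.
    return float(-log_probs[np.arange(len(labels)), labels].mean())
```

With embeddings that align with their class prototypes, the loss is near zero; with shuffled labels it is much larger, which is the push-pull behaviour the contrastive objective relies on.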

Topics

Journal Article
