TET Loss: A Temperature-Entropy Calibrated Transfer Loss for Reliable Medical Image Classification.

January 5, 2026

Authors

Pan W

Affiliations (1)

  • School of Computer and Artificial Intelligence, Shandong Jianzhu University, No.1000 Fengming Road, Ganggou Subdistrict, Jinan, Shandong, 250101, People's Republic of China. [email protected].

Abstract

Deep learning models for medical image classification often exhibit overconfident predictions and domain mismatch when transferred from natural-image pretraining, which undermines their generalization and clinical reliability. This study proposes TET Loss (Temperature-Entropy calibrated Transfer Loss), a plug-and-play objective function that combines temperature scaling, to moderate logit sharpness, with entropy regularization, to promote uncertainty-aware learning. TET Loss is model-agnostic and introduces zero inference-time overhead. Across four public benchmarks (BreastMNIST, DermaMNIST, PneumoniaMNIST, and RetinaMNIST), TET Loss consistently enhances CNN, transformer, and hybrid backbones under short 10-epoch fine-tuning. For example, EfficientViT-M2 improves its F1 score from 53.9% to 66.7% on BreastMNIST, and BiFormer-Tiny increases its F1 from 73.1% to 86.1% with an AUC gain to 94.1%. On PneumoniaMNIST, RMT-T3 with TET Loss reaches an F1 of 96.4% and an AUC of 99.1%, surpassing several medical-specific architectures trained for 50-150 epochs. Grad-CAM visualizations demonstrate tighter lesion localization and fewer spurious activations, reflecting improved interpretability. By calibrating confidence while preserving discriminative learning, TET Loss provides a lightweight and effective pathway toward more reliable and robust medical imaging systems. Our code will be available at https://github.com/JEFfersusu/TET_loss.
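The abstract describes TET Loss as temperature-scaled cross-entropy combined with an entropy regularizer that discourages overconfident predictions. A minimal pure-Python sketch of that idea is below; the function name `tet_loss`, the default `temperature` and `lam` values, and the exact sign convention (subtracting a weighted entropy term, as in a standard confidence penalty) are illustrative assumptions, not the paper's published formulation.

```python
import math


def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]


def tet_loss(logits, target, temperature=2.0, lam=0.1):
    """Hypothetical sketch of a temperature-entropy calibrated loss.

    - `temperature` > 1 softens the logits before the softmax,
      moderating logit sharpness (temperature scaling).
    - `lam` weights an entropy bonus: subtracting lam * H(p) from the
      cross-entropy rewards higher-entropy (less overconfident)
      predictive distributions.
    Both hyperparameter values here are placeholders, not the paper's.
    """
    probs = softmax([z / temperature for z in logits])
    cross_entropy = -math.log(probs[target])
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    return cross_entropy - lam * entropy
```

With `lam=0` and `temperature=1` this reduces to ordinary cross-entropy, which is consistent with the "plug-and-play" claim: the regularizer modifies only the training objective, so inference-time cost is unchanged.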

Topics

Journal Article
