
A multi-paradigm evaluation spanning pixels to voxels for deep learning-based kidney tumor segmentation.

April 15, 2026

Authors

Lalwani R, Telang A, Tiwari V

Affiliations (1)

  • Center for Artificial Intelligence, Madhav Institute of Technology & Science, Deemed University, Gwalior, India.

Abstract

Automated segmentation of kidney tumors from computed tomography (CT) scans is critical for diagnosis, treatment planning, and monitoring of renal cell carcinoma (RCC). While recent deep learning models report high Dice scores (>0.97), their clinical utility remains questionable due to false positive predictions that misclassify healthy tissue as tumors and computational constraints limiting real-world deployment. Unlike existing studies that emphasise quantitative metrics, this work investigates the critical gap between high segmentation accuracy and clinical applicability. We systematically evaluate six diverse architectures spanning 2D CNNs (U-Net, MedSAM) to 3D volumetric models (nnU-Net, UNETR, TotalSegmentator, MIScnn) on the KiTS19 dataset, emphasising false positive analysis, boundary delineation accuracy, and computational feasibility. Key findings: (1) MONAI U-Net achieves a Dice score of 0.98 but exhibits excessive false positives, undermining clinical trust; (2) nnU-Net provides balanced performance (Dice: 0.82) with consistent results but demands 16 GB of VRAM; (3) MedSAM achieves state-of-the-art accuracy (Dice: 0.99) with minimal false positives but requires high-end GPUs; (4) computational constraints prevented full training of UNETR. This study identifies that high Dice scores do not guarantee clinical utility and provides actionable insights for developing clinically feasible segmentation tools for renal oncology applications including treatment planning, longitudinal monitoring, and risk assessment.
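The abstract's central argument turns on the Dice coefficient, defined as twice the overlap between predicted and ground-truth masks divided by their combined size. As a minimal illustration (not code from the paper), Dice over binary segmentation masks can be computed as:

```python
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        # Both masks empty: conventionally a perfect match.
        return 1.0
    return 2.0 * np.logical_and(pred, gt).sum() / denom

# Toy 2x3 masks: 2 overlapping voxels, 3 positives in each mask.
pred = np.array([[0, 1, 1], [0, 1, 0]])
gt   = np.array([[0, 1, 0], [0, 1, 1]])
print(dice_score(pred, gt))  # 2*2 / (3+3) = 0.666...
```

Note how a model can score highly on large tumors while still producing small spurious detections elsewhere; because Dice is dominated by the overlap term, such false positives barely move the score, which is precisely the accuracy-versus-clinical-trust gap the paper examines.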

Topics

Journal Article
