RadAI Slice is your weekly intelligence briefing on the most critical developments at the intersection of radiology and artificial intelligence. Stop searching. Start leading.
The latest developments in Radiology & AI.
Each issue is precisely structured to give you exactly what you need. No fluff, just facts and forward-looking insights.
Researchers developed MoBluRF for creating sharp, dynamic 3D neural radiance fields from blurry monocular videos.
Joint Commission releases AI safety guidance while major advances surface in predictive and radiology AI models.
A study finds mammography acquisition settings influence both AI and radiologist performance in breast cancer detection.
This study aimed to develop a deep learning (DL) model for automatic detection and diagnosis of gouty arthritis (GA) in the first metatarsophalangeal joint (MTPJ) using ultrasound (US) images. A retrospective study included individuals who underwent first-MTPJ ultrasonography between February and July 2023. A five-fold cross-validation method (4:1 training:testing split) was employed. A deep residual convolutional neural network (CNN) was trained, and Gradient-weighted Class Activation Mapping (Grad-CAM) was used for visualization. ResNet18 variants with different numbers of residual blocks (2, 3, 4, and 6) were compared to select the optimal model for image classification. Diagnostic decisions were based on a threshold proportion of abnormal images, determined from the training set. A total of 2401 US images from 260 patients (149 gout, 111 control) were analyzed. The model with 3 residual blocks performed best, achieving an AUC of 0.904 (95% CI: 0.887–0.927). Grad-CAM visualizations aligned with radiologist assessments in 2000 of the images. With a diagnostic threshold of 0.328, the model attained an accuracy of 91.1% (95% CI: 90.4%–91.8%) on the testing set. The DL model demonstrated excellent performance in automatically detecting and diagnosing GA in the first MTPJ.
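The patient-level decision rule described above (call a joint gouty when the proportion of abnormal images exceeds a threshold learned on the training set) can be sketched in a few lines. This is a minimal pure-Python illustration, not the authors' code; the function and parameter names are hypothetical, and only the 0.328 threshold comes from the abstract.

```python
def diagnose_patient(image_probs, image_threshold=0.5, patient_threshold=0.328):
    """Aggregate per-image CNN abnormality probabilities into a patient-level call.

    image_probs: per-image model outputs in [0, 1] (hypothetical interface).
    image_threshold: cutoff above which a single image counts as abnormal
        (0.5 is an illustrative assumption, not stated in the abstract).
    patient_threshold: proportion of abnormal images needed to diagnose GA
        (0.328 is the diagnostic threshold reported in the study).
    Returns (is_gout, proportion_abnormal).
    """
    abnormal = sum(p >= image_threshold for p in image_probs)
    proportion = abnormal / len(image_probs)
    return proportion >= patient_threshold, proportion

# Example: 3 of 5 images flagged abnormal -> proportion 0.6 >= 0.328 -> gout.
print(diagnose_patient([0.9, 0.8, 0.2, 0.1, 0.7]))  # (True, 0.6)
```

Thresholding at the patient level rather than per image makes the decision robust to a few misclassified frames within one examination.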
This study aims to explore the potential of machine learning as a non-invasive automated tool for skin tumor differentiation. Data from 156 lesions, collected retrospectively from September 2021 to February 2024, were included. Univariate and multivariate analyses of traditional clinical features were performed to establish a logistic regression model. Ultrasound-based radiomics features were extracted from grayscale images after delineating regions of interest (ROIs). Independent-samples t-tests, Mann-Whitney U tests, and Least Absolute Shrinkage and Selection Operator (LASSO) regression were employed to select ultrasound-based radiomics features. Five machine learning methods were then used to construct radiomics models based on the selected features. Model performance was evaluated using receiver operating characteristic (ROC) curves and the DeLong test. Age, poorly defined margins, and irregular shape were identified as independent risk factors for malignant skin tumors. The multilayer perceptron (MLP) model achieved the best performance, with area under the curve (AUC) values of 0.963 and 0.912. The DeLong test revealed a statistically significant difference in efficacy between the MLP and clinical models (Z = 2.611, p = 0.009). Machine learning-based skin tumor models may serve as a potential non-invasive method to improve diagnostic efficiency.
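One of the feature-selection steps named above, the Mann-Whitney U test, can be illustrated with a short pure-Python sketch. This is not the study's pipeline (which also used t-tests and LASSO): the pairwise form of the U statistic is standard, but `select_features` and its `min_separation` cutoff are illustrative assumptions for filtering radiomics features by group separation.

```python
def mann_whitney_u(group_a, group_b):
    """Pairwise-comparison form of the Mann-Whitney U statistic for group_a.

    Counts pairs (a, b) with a > b, plus half credit for ties.
    Under the null (no separation), U is near len(a) * len(b) / 2.
    """
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

def select_features(benign, malignant, min_separation=0.8):
    """Keep features whose U deviates strongly from the null expectation.

    benign/malignant: dicts mapping feature name -> per-lesion values
    (a hypothetical interface). min_separation is an illustrative cutoff,
    not a value from the paper; a real pipeline would use p-values.
    """
    kept = []
    for name in benign:
        n, m = len(benign[name]), len(malignant[name])
        u = mann_whitney_u(benign[name], malignant[name])
        # Normalized deviation in [0, 1]: 0 = no separation, 1 = perfect.
        deviation = abs(u - n * m / 2) / (n * m / 2)
        if deviation >= min_separation:
            kept.append(name)
    return kept

# "entropy" separates the groups perfectly; "energy" is identical in both.
benign = {"entropy": [1, 2, 3], "energy": [5, 5, 5]}
malignant = {"entropy": [7, 8, 9], "energy": [5, 5, 5]}
print(select_features(benign, malignant))  # ['entropy']
```

In practice, features passing such univariate screens are then fed to LASSO, which prunes correlated survivors before model fitting.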
Objective: This study aims to enhance breast cancer diagnosis by developing an automated deep learning framework for real-time, quantitative ultrasound imaging. Breast cancer is the second leading cause of cancer-related deaths among women, and early detection is crucial for improving survival rates. Conventional ultrasound, valued for its non-invasive nature and real-time capability, is limited by qualitative assessments and inter-observer variability. Quantitative ultrasound (QUS) methods, including Nakagami imaging (which models the statistical distribution of backscattered signals and lesion morphology), present an opportunity for more objective analysis. Methods: The proposed framework integrates three convolutional neural networks (CNNs): (1) NakaSynthNet, synthesizing quantitative Nakagami parameter images from B-mode ultrasound; (2) SegmentNet, enabling automated lesion segmentation; and (3) FeatureNet, which combines anatomical and statistical features for classifying lesions as benign or malignant. Training utilized a diverse dataset of 110,247 images, comprising clinical B-mode scans and various simulated examples (fruit, mammographic lesions, digital phantoms). Quantitative performance was evaluated using mean squared error (MSE), structural similarity index (SSIM), segmentation accuracy, sensitivity, specificity, and area under the curve (AUC). Results: NakaSynthNet achieved real-time synthesis at 21 frames/s, with an MSE of 0.09% and SSIM of 98%. SegmentNet reached 98.4% accuracy, and FeatureNet delivered 96.7% overall classification accuracy, 93% sensitivity, 98% specificity, and an AUC of 98%. Conclusion: The proposed multi-parametric deep learning pipeline enables accurate, real-time breast cancer diagnosis from ultrasound data using objective quantitative imaging. Significance: This framework advances the clinical utility of ultrasound by reducing subjectivity and providing robust, multi-parametric information for improved breast cancer detection.
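The three-stage data flow described in the Methods (B-mode frame to Nakagami map, to lesion mask, to benign/malignant call) can be sketched as a simple pipeline object. This is an illustration of the wiring only, under assumed interfaces; the class name, signatures, and the stand-in callables are hypothetical, not the authors' implementation.

```python
class BreastUSPipeline:
    """Chains the three CNN stages described in the abstract:
    Nakagami synthesis -> lesion segmentation -> lesion classification.
    Each stage is injected as a callable so trained models can be swapped in.
    """
    def __init__(self, synth_net, segment_net, feature_net):
        self.synth_net = synth_net      # role of NakaSynthNet: B-mode -> Nakagami map
        self.segment_net = segment_net  # role of SegmentNet: images -> lesion mask
        self.feature_net = feature_net  # role of FeatureNet: images + mask -> label

    def __call__(self, bmode_frame):
        nakagami = self.synth_net(bmode_frame)
        mask = self.segment_net(bmode_frame, nakagami)
        return self.feature_net(bmode_frame, nakagami, mask)

# Stand-in callables so the wiring can be exercised without trained models:
pipeline = BreastUSPipeline(
    synth_net=lambda b: [v * 0.5 for v in b],        # placeholder "Nakagami" map
    segment_net=lambda b, n: [v > 0.2 for v in n],   # placeholder binary mask
    feature_net=lambda b, n, m: "malignant" if sum(m) > len(m) / 2 else "benign",
)
label = pipeline([0.9, 0.8, 0.1, 0.7])
print(label)  # malignant
```

Passing both the B-mode frame and the synthesized Nakagami map downstream mirrors the abstract's point that classification combines anatomical and statistical information.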
BONTECH Co., Ltd.
The BONX805 is a mobile X-ray system designed for radiology applications. It helps clinicians by providing portable X-ray imaging capabilities, facilitating diagnostic imaging in various clinical settings where mobility is essential.
Shenzhen Mindray Bio-medical Electronics Co., Ltd.
The ViewMate Ultrasound System by Shenzhen Mindray is a pulsed Doppler ultrasound imaging system. It provides clinicians with ultrasound imaging capabilities to assist in diagnostic examinations, helping to visualize blood flow and other physiological data non-invasively.
Shanghai United Imaging Healthcare Co., Ltd.
The uWS-Angio is an image processing system used in radiology to assist clinicians by enhancing and analyzing medical images, improving diagnosis and patient care.
We scour dozens of sources so you don't have to. Get all the essential information in a 5-minute read.
Never miss a critical update. Understand the trends shaping the future of your practice and research.
Be the first to know about the tools and technologies that matter, from clinical practice to academic research.
Join hundreds of your peers who rely on RadAI Slice. Get the essential weekly briefing that empowers you to navigate the future of radiology.
We respect your privacy. Unsubscribe at any time.