Page 3 of 325 results

The added value of artificial intelligence using Quantib Prostate for the detection of prostate cancer at multiparametric magnetic resonance imaging.

Russo T, Quarta L, Pellegrino F, Cosenza M, Camisassa E, Lavalle S, Apostolo G, Zaurito P, Scuderi S, Barletta F, Marzorati C, Stabile A, Montorsi F, De Cobelli F, Brembilla G, Gandaglia G, Briganti A

PubMed · May 7, 2025
Artificial intelligence (AI) has been proposed to assist radiologists in reporting multiparametric magnetic resonance imaging (mpMRI) of the prostate. We evaluated the diagnostic performance of radiologists with different levels of experience when reporting mpMRI with the support of commercially available AI-based software (Quantib Prostate). This single-center study (NCT06298305) involved 110 patients. Those with a positive mpMRI (PI-RADS ≥ 3) underwent targeted plus systematic biopsy (TBx plus SBx), while those with a negative mpMRI but a high clinical suspicion of prostate cancer (PCa) underwent SBx alone. Three readers with different levels of experience, identified as R1, R2, and R3, reviewed all mpMRI scans. Inter-reader agreement among the three readers with and without the assistance of Quantib Prostate, as well as sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for the detection of clinically significant PCa (csPCa), were assessed. Overall, 102 patients underwent prostate biopsy, and the csPCa detection rate was 47%. Using Quantib Prostate increased the number of lesions identified by R3 (101 vs. 127). Inter-reader agreement increased slightly with Quantib Prostate (0.37 without vs. 0.41 with). The PPV, NPV, and diagnostic accuracy (measured by the area under the curve [AUC]) of R3 improved (0.51 vs. 0.55, 0.65 vs. 0.82, and 0.56 vs. 0.62, respectively). Conversely, no changes were observed for R1 and R2. Using Quantib Prostate did not enhance the csPCa detection rate for readers with some experience in prostate imaging; for an inexperienced reader, however, the AI-based software improved performance. Name of registry: clinicaltrials.gov. NCT06298305. Date of registration: 2022-09.
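The reader-level metrics above (sensitivity, specificity, PPV, NPV) all derive from the same confusion-matrix counts. A minimal sketch of the calculation, using illustrative counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts only -- not taken from the study.
sens, spec, ppv, npv = diagnostic_metrics(tp=40, fp=30, tn=24, fn=8)
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with disease prevalence in the biopsied cohort (47% csPCa here).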

Early budget impact analysis of AI to support the review of radiographic examinations for suspected fractures in NHS emergency departments (ED).

Gregory L, Boodhna T, Storey M, Shelmerdine S, Novak A, Lowe D, Harvey H

PubMed · May 7, 2025
To develop an early budget impact analysis of, and inform future research on, the national adoption of a commercially available AI application to support clinicians reviewing radiographs for suspected fractures across NHS emergency departments in England. A decision tree framework was coded to assess the change in outcomes for suspected fractures in adults when AI fracture detection was integrated into the clinical workflow over a 1-year time horizon. Standard of care was the comparator scenario, and the ground truth reference cases were characterised by radiology report findings. The effect of AI on assisting ED clinicians in detecting fractures was sourced from US literature. Data on resource use conditional on the correct identification of a fracture in the ED were extracted from a London NHS trust. Sensitivity analysis was conducted to account for the influence of parameter uncertainty on the results. In one year, an estimated 658,564 radiographs were performed in emergency departments across England for suspected wrist, ankle, or hip fractures. The number of patients returning to the ED with a missed fracture was reduced by 21,674, and unnecessary referrals to fracture clinics fell by 20,916. The cost of current practice was estimated at £66,646,542, compared with £63,012,150 with the integration of AI, generating a return on investment of £3,634,392 to the NHS. The adoption of AI in EDs across England has the potential to generate cost savings. However, additional evidence on radiograph review accuracy and subsequent resource use is required to further demonstrate this.
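The headline saving follows directly from the two reported annual costs. The abstract does not detail the decision tree's branch structure, so the helper below only illustrates the mechanics of an expected-cost calculation per arm:

```python
# Figures reported in the abstract (GBP, 1-year horizon, NHS England EDs).
cost_standard_care = 66_646_542
cost_with_ai = 63_012_150
savings = cost_standard_care - cost_with_ai  # return on investment to the NHS

def expected_cost(branches):
    """Expected cost of one decision-tree arm: sum of probability * cost
    over its terminal branches. Branch probabilities and costs here are
    placeholders -- the real model's parameters are not in the abstract."""
    return sum(p * c for p, c in branches)
```

In a full model, each arm (with AI vs. standard of care) would enumerate branches such as fracture correctly identified, missed fracture with ED return, and unnecessary fracture-clinic referral, each weighted by its probability.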

Accelerated inference for thyroid nodule recognition in ultrasound imaging using FPGA.

Ma W, Wu X, Zhang Q, Li X, Wu X, Wang J

PubMed · May 7, 2025
Thyroid cancer is the most prevalent malignant tumour of the endocrine system, with its incidence steadily rising in recent years. Current central processing units (CPUs) and graphics processing units (GPUs) face significant challenges in processing speed, energy consumption, cost, and scalability when identifying thyroid nodules, making them inadequate for the demands of future green, efficient, and accessible healthcare. To overcome these limitations, this study proposes an efficient quantized inference method using a field-programmable gate array (FPGA). We employ the YOLOv4-tiny neural network model, enhancing software performance with the K-means++ optimization algorithm and improving hardware performance through techniques such as 8-bit weight quantization, batch normalization, and convolutional layer fusion. The study is based on the ZYNQ7020 FPGA platform. Experimental results demonstrate an average accuracy of 81.44% on the TN3K dataset and 81.20% on an internal test set from a Chinese tertiary hospital. The power consumption of the FPGA platform, CPU (Intel Core i5-10200H), and GPU (NVIDIA RTX 4090) was 3.119 watts, 45 watts, and 68 watts, respectively, with energy efficiency ratios of 5.45, 0.31, and 5.56. This indicates that the FPGA's energy efficiency is 17.6 times that of the CPU and 0.98 times that of the GPU. These results show that the FPGA not only significantly outperforms the CPU in speed but also consumes far less power than the GPU. Moreover, mid-to-low-end FPGAs yield performance comparable to that of commercial-grade GPUs. This technology presents a novel solution for medical imaging diagnostics, with the potential to significantly enhance the speed, accuracy, and environmental sustainability of ultrasound image analysis, thereby supporting the future development of medical care.
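The 17.6x and 0.98x figures are simple ratios of the stated energy-efficiency values, which can be checked directly from the abstract's numbers:

```python
# Power draw and energy-efficiency ratios as reported in the abstract.
power_w = {"FPGA": 3.119, "CPU": 45.0, "GPU": 68.0}
efficiency = {"FPGA": 5.45, "CPU": 0.31, "GPU": 5.56}

# Relative efficiency of the FPGA platform against each baseline.
fpga_vs_cpu = efficiency["FPGA"] / efficiency["CPU"]  # ~17.6x the CPU
fpga_vs_gpu = efficiency["FPGA"] / efficiency["GPU"]  # ~0.98x the GPU
```

Energy efficiency here is performance per watt, so near-parity with the GPU at roughly 1/22 of its power draw is the substantive result.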

Real-time brain tumour diagnoses using a novel lightweight deep learning model.

Alnageeb MHO, M H S

PubMed · May 6, 2025
Brain tumours continue to be a primary cause of death worldwide, highlighting the critical need for effective and accurate diagnostic tools. This article presents MK-YOLOv8, an innovative lightweight deep learning framework developed for the real-time detection and categorization of brain tumours from MRI images. Based on the YOLOv8 architecture, the proposed model incorporates Ghost Convolution, the C3Ghost module, and the SPPELAN module to improve feature extraction and substantially decrease computational complexity. An x-small object detection layer has been added, supporting precise detection of small and x-small tumours, which is crucial for early diagnosis. Trained on the Figshare Brain Tumour (FBT) dataset comprising 3,064 MRI images, MK-YOLOv8 achieved a mean Average Precision (mAP) of 99.1% at IoU 0.50 and 88.4% at IoU 0.50-0.95, outperforming YOLOv8 (98% and 78.8%, respectively). Glioma recall improved by 26%, underscoring the enhanced sensitivity to challenging tumour types. With a computational footprint of only 96.9 GFLOPs (37.5% of YOLOv8x's FLOPs) and 12.6 million parameters (a mere 18.5% of YOLOv8x's), MK-YOLOv8 delivers high efficiency with reduced resource demands. It was also trained on the Br35H dataset (801 images) to verify robustness and generalization, achieving a mAP of 98.6% at IoU 0.50. The proposed model operates at 62 frames per second (FPS) and is suited to real-time clinical workflows. These developments establish MK-YOLOv8 as an innovative framework, overcoming challenges in tiny tumour identification and providing a generalizable, adaptable, and precise detection approach for brain tumour diagnostics in clinical settings.
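The mAP figures at IoU 0.50 and 0.50-0.95 rest on the intersection-over-union criterion for matching predicted boxes to ground truth. A minimal IoU implementation for axis-aligned boxes (not from the paper, just the standard definition):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; width/height clamp to 0 when boxes are disjoint.
    inter = max(0, min(ax2, bx2) - max(ax1, bx1)) * \
            max(0, min(ay2, by2) - max(ay1, by1))
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0
```

mAP@0.50 counts a detection as correct when IoU ≥ 0.50; mAP@0.50-0.95 averages AP over thresholds from 0.50 to 0.95 in steps of 0.05, which is why it is the stricter (lower) number.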

Artificial intelligence-based echocardiography assessment to detect pulmonary hypertension.

Salehi M, Alabed S, Sharkey M, Maiter A, Dwivedi K, Yardibi T, Selej M, Hameed A, Charalampopoulos A, Kiely DG, Swift AJ

PubMed · May 1, 2025
Tricuspid regurgitation jet velocity (TRJV) on echocardiography is used for screening patients with suspected pulmonary hypertension (PH). Artificial intelligence (AI) tools, such as the US2.AI, have been developed for automated evaluation of echocardiograms and can yield measurements that aid PH detection. This study evaluated the performance and utility of the US2.AI in a consecutive cohort of patients with suspected PH. 1031 patients who had been investigated for suspected PH between 2009 and 2021 were retrospectively identified from the ASPIRE registry. All patients had undergone echocardiography and right heart catheterisation (RHC). Based on RHC results, 771 (75%) patients with a mean pulmonary arterial pressure >20 mmHg were classified as having a diagnosis of PH (as per the 2022 European guidelines). Echocardiograms were evaluated manually and by the US2.AI tool to yield TRJV measurements. The AI tool demonstrated a high interpretation yield, successfully measuring TRJV in 87% of echocardiograms. Manually and automatically derived TRJV values showed excellent agreement (intraclass correlation coefficient 0.94, 95% CI 0.94-0.95) with minimal bias (Bland-Altman analysis). Automated TRJV measurements showed equally high diagnostic accuracy for PH as manual measurements (area under the curve 0.88, 95% CI 0.84-0.90 versus 0.88, 95% CI 0.86-0.91). Automated TRJV measurements on echocardiography were similar to manual measurements, with similarly high and noninferior diagnostic accuracy for PH. These findings demonstrate that automated measurement of TRJV on echocardiography is feasible, accurate, and reliable, and they support the implementation of AI-based approaches to echocardiogram evaluation and diagnostic imaging for PH.
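The Bland-Altman analysis used above to confirm minimal bias between manual and automated TRJV reduces to the mean and spread of paired differences. A minimal sketch with illustrative values (m/s), not the study's data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement for paired
    measurements. Limits are bias +/- 1.96 * SD of the differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample SD (n - 1 denominator)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired TRJV readings: manual vs. automated (m/s).
bias, (lo, hi) = bland_altman([3.0, 3.2, 2.8, 3.5], [3.1, 3.0, 2.9, 3.4])
```

A bias near zero with narrow limits of agreement is what "minimal bias" means in the abstract; the intraclass correlation coefficient (0.94 here) captures agreement from a complementary angle.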