Contextual structured annotations on PACS: a futuristic vision for reporting routine oncologic imaging studies and its potential to transform clinical work and research.
Authors
Affiliations (5)
- Abdominal Imaging Department, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd., Unit 1473, Houston, TX, 77030, USA.
- Abdominal Imaging Department, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd., Unit 1473, Houston, TX, 77030, USA.
- College of Arts and Sciences, Community Health and Biology Major, Tufts University, 419 Boston Avenue, Medford, MA, 02155, USA.
- Honors Program, 2025, College of Liberal Arts, Neuroscience Major, Temple University, 1801 N. Broad St., Philadelphia, PA, 19122, USA.
- Abdominal Imaging Department, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd., Unit 1473, Houston, TX, 77030, USA.
Abstract
Radiologists currently have limited and time-consuming options for annotating findings on images: on most PACS systems they are restricted largely to arrows, calipers, and lines, regardless of the type of finding. We propose a framework that places encoded, transferable, highly contextual structured text annotations directly on PACS images, indicating the type of lesion, level of suspicion, location, lesion measurement, and TNM status for malignant lesions, with automated integration of this information into the radiology report. This approach offers a one-stop solution for generating radiology reports that are easily understood by other radiologists, patient care providers, patients, and machines, while reducing the effort needed to dictate a detailed report and minimizing speech recognition errors. It also provides a framework for the automated generation of large-volume, high-quality annotated data sets for machine learning algorithms from radiologists' daily work. Enabling voice dictation of these contextual annotations directly into PACS, similar to voice-enabled Google search, would further enhance the user experience. Wider adoption of contextualized structured annotations in the future could facilitate studies of the temporal evolution of tumor lesions across multiple lines of treatment and the early detection of asynchronous response or areas of treatment failure. We present a futuristic vision and a solution with the potential to transform clinical work and research in oncologic imaging.
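To make the proposal concrete, the sketch below shows one possible machine-readable form for a single contextual structured annotation and how it could be rendered both as training data and as a report sentence. This is an illustrative assumption only; the class name, field names, and values (e.g., StructuredAnnotation, suspicion_level, tnm_descriptor) are not taken from the article or from any PACS vendor API.

```python
# Hypothetical sketch of one contextual structured annotation, assuming the
# fields named in the abstract (lesion type, level of suspicion, location,
# measurement, TNM status). Not the authors' specification.
from dataclasses import dataclass, asdict
from typing import Optional
import json


@dataclass
class StructuredAnnotation:
    lesion_type: str                      # e.g., "metastasis", "lymph node"
    suspicion_level: str                  # e.g., "benign", "indeterminate", "highly suspicious"
    location: str                         # anatomic location, e.g., "hepatic segment VII"
    measurement_mm: float                 # longest axial diameter in millimeters
    series: int                           # image series the annotation was placed on
    image_number: int                     # slice number within that series
    tnm_descriptor: Optional[str] = None  # e.g., "M1" for a malignant lesion

    def to_report_sentence(self) -> str:
        """Render the annotation as a sentence for automated report integration."""
        sentence = (
            f"{self.suspicion_level.capitalize()} {self.lesion_type} in the "
            f"{self.location}, measuring {self.measurement_mm:.0f} mm "
            f"(series {self.series}, image {self.image_number})"
        )
        if self.tnm_descriptor:
            sentence += f", consistent with {self.tnm_descriptor} disease"
        return sentence + "."


if __name__ == "__main__":
    annotation = StructuredAnnotation(
        lesion_type="metastasis",
        suspicion_level="highly suspicious",
        location="hepatic segment VII",
        measurement_mm=14,
        series=3,
        image_number=52,
        tnm_descriptor="M1",
    )
    # Machine-readable form: one record toward an annotated data set for machine learning.
    print(json.dumps(asdict(annotation), indent=2))
    # Human-readable form: the sentence that could be auto-inserted into the radiology report.
    print(annotation.to_report_sentence())
```

In such a scheme, the same annotation object could serve both consumers described in the abstract: the JSON form feeds large-volume annotated data sets, while the rendered sentence populates the structured radiology report without additional dictation.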