
Application of a virtual imaging framework for investigating a deep learning-based reconstruction method for 3D quantitative photoacoustic computed tomography.

December 29, 2025

Authors

Cam RM, Park S, Villa U, Anastasio MA

Affiliations (4)

  • Department of Electrical & Computer Engineering, University of Illinois Urbana-Champaign, IL 61801, USA.
  • Department of Bioengineering, University of Illinois Urbana-Champaign, IL 61801, USA.
  • Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin, TX 78712, USA.
  • Department of Biomedical Engineering, The University of Texas at Austin, TX 78712, USA.

Abstract

Quantitative photoacoustic computed tomography (qPACT) is a promising imaging modality for estimating physiological parameters such as blood oxygen saturation. However, developing robust qPACT reconstruction methods remains challenging due to computational demands, modeling difficulties, and experimental uncertainties. Learning-based methods have been proposed to address these issues but remain largely unvalidated. Virtual imaging (VI) studies are essential for validating such methods early in development, before proceeding to less-controlled phantom or in vivo studies. Effective VI studies must employ ensembles of stochastically generated numerical phantoms that accurately reflect relevant anatomy and physiology. Yet, most prior VI studies for qPACT relied on overly simplified phantoms. In this work, a realistic VI testbed is employed for the first time to assess a representative 3D learning-based qPACT reconstruction method for breast imaging. The method is evaluated across subject variability and physical factors such as measurement noise and acoustic aberrations, offering insights into its strengths and limitations.
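The abstract's evaluation pipeline — stochastically generate a numerical phantom, simulate a noisy photoacoustic measurement, reconstruct, and score the estimate — can be sketched in miniature. This is an illustrative toy, not the authors' method: the phantom generator, the linear forward model (initial pressure p0 = Grüneisen × absorption × fluence with additive Gaussian noise), and the placeholder inversion standing in for the learned 3D reconstructor are all assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_phantom(shape=(32, 32, 32)):
    """Stochastic numerical phantom: uniform background absorption plus
    a few random spherical inclusions (illustrative only)."""
    mu_a = np.full(shape, 0.01)             # background absorption [mm^-1]
    zz, yy, xx = np.indices(shape)
    for _ in range(3):
        c = rng.integers(6, 26, size=3)     # random inclusion center
        r = rng.integers(2, 5)              # random inclusion radius [voxels]
        mask = (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= r**2
        mu_a[mask] = rng.uniform(0.1, 0.3)  # elevated absorption
    return mu_a

def simulate_pressure(mu_a, grueneisen=0.2, fluence=1.0, noise_std=0.0):
    """Toy forward model: p0 = Gamma * mu_a * Phi, plus measurement noise."""
    p0 = grueneisen * mu_a * fluence
    return p0 + noise_std * rng.standard_normal(mu_a.shape)

def naive_reconstruct(p0, grueneisen=0.2, fluence=1.0):
    """Placeholder inversion, standing in for a learned reconstructor."""
    return p0 / (grueneisen * fluence)

# One trial of the ensemble study: phantom -> measurement -> estimate -> error.
# A full VI study would repeat this over many phantoms and noise levels.
mu_a_true = make_phantom()
p0 = simulate_pressure(mu_a_true, noise_std=0.001)
mu_a_hat = naive_reconstruct(p0)
rmse = np.sqrt(np.mean((mu_a_hat - mu_a_true) ** 2))
print(f"RMSE: {rmse:.4f} mm^-1")
```

Sweeping `noise_std` (and, in a fuller model, acoustic aberration parameters) over an ensemble of phantoms is the basic pattern by which such a testbed probes a reconstruction method's robustness to the physical factors the abstract names.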

Topics

Journal Article
