A multi-modal graph-based framework for Alzheimer's disease detection.

Authors

Mashhadi N, Marinescu R

Affiliations (2)

  • Department of Computer Science and Engineering, University of California, Santa Cruz, CA, USA.
  • Department of Computer Science and Engineering, University of California, Santa Cruz, CA, USA. [email protected].

Abstract

We propose a compositional graph-based machine learning (ML) framework for Alzheimer's disease (AD) detection that constructs complex ML predictors from modular components. In our directed computational graph, datasets are represented as nodes v ∈ V and deep learning (DL) models as directed edges e ∈ E, allowing us to model complex image-processing pipelines as directed paths p = (e_1, ..., e_k) that act as end-to-end DL predictors. Each directed path in the graph functions as a DL predictor, supporting both forward propagation for transforming data representations and backpropagation for model fine-tuning, saliency-map computation, and input-data optimization. We demonstrate our framework on Alzheimer's disease prediction, a complex problem that requires integrating multimodal data: scans of different modalities and contrasts, genetic data, and cognitive tests. We built a graph of 11 nodes (data) and 14 edges (ML models), where each model has been trained for a specific task (e.g., skull-stripping MRI scans, AD detection, image-to-image translation). By using a modular and adaptive approach, our framework effectively integrates diverse data types, handles distribution shifts, and scales to arbitrary complexity. It remains accurate even when modalities are missing, offering a practical tool for advancing Alzheimer's disease diagnosis and potentially other complex medical prediction tasks.
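To make the graph construction concrete, the sketch below shows one minimal way such a compositional graph could be expressed in PyTorch. This is an illustrative assumption, not the authors' implementation: the ModelGraph class, node names, and toy stand-in models are hypothetical, and the real system composes trained task-specific networks rather than the placeholders used here.

  import torch
  import torch.nn as nn

  class ModelGraph:
      """Directed computational graph: nodes name data representations,
      edges hold trained nn.Modules mapping one representation to another."""

      def __init__(self):
          self.edges = {}  # (src_node, dst_node) -> nn.Module

      def add_edge(self, src, dst, model):
          self.edges[(src, dst)] = model

      def path_predictor(self, *nodes):
          # Compose the models along a directed path into one end-to-end
          # predictor; as an nn.Sequential it supports forward propagation,
          # backpropagation for fine-tuning, and gradient-based saliency maps.
          return nn.Sequential(*(self.edges[(a, b)] for a, b in zip(nodes, nodes[1:])))

  # Toy two-edge path: raw MRI -> skull-stripped MRI -> AD score.
  g = ModelGraph()
  g.add_edge("raw_mri", "stripped_mri",
             nn.Conv2d(1, 1, 3, padding=1))  # stand-in for a skull-stripping net
  g.add_edge("stripped_mri", "ad_score",
             nn.Sequential(nn.Flatten(), nn.LazyLinear(1), nn.Sigmoid()))

  predictor = g.path_predictor("raw_mri", "stripped_mri", "ad_score")

  scan = torch.randn(1, 1, 64, 64, requires_grad=True)
  prob = predictor(scan)        # forward propagation along the path
  prob.sum().backward()         # backpropagation through the whole path
  saliency = scan.grad.abs()    # simple input-gradient saliency map

Because a path compiles down to an ordinary differentiable module, the same mechanism supports all three uses named in the abstract: inference (forward pass), fine-tuning (gradients with respect to model weights), and saliency or input optimization (gradients with respect to the input data).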

Topics

  • Alzheimer Disease
  • Image Processing, Computer-Assisted
  • Journal Article
