Learning diverse and generic representations of the brain with large-scale multi-task pretraining

December 22, 2025 · medRxiv preprint

Authors

Leonardsen, E. H., Dahl, A., Holm, M. C., de Lange, A.-M., Grodem, E. O. S., Marquand, A. F., Sorensen, O., Schwarz, E., Andreassen, O. A., Wolfers, T., Wang, Y., Westlye, L. T.

Affiliations (1)

  • University of Oslo

Abstract

Large pretrained models developed and shared by actors with privileged access to data and compute have played a central role in the democratisation of deep learning across a range of domains. Here, we contribute to this endeavour in the field of neuroimaging by compiling a large dataset of structural magnetic resonance imaging scans (n=114,257) and using them to pretrain a multi-task convolutional neural network to predict age, sex, handedness, BMI, fluid intelligence and neuroticism. Subsequent analyses show that our pretraining approach results in a rich and diverse feature space, implying the model is sensitive to a broad spectrum of nuanced neuroanatomical variation. We also demonstrate the usefulness of the pretrained model by employing it to predict brain age and perform clinical classifications via transfer learning, showing that it consistently outperforms models trained from scratch.
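The abstract describes pretraining on six targets that mix regression (age, BMI, fluid intelligence, neuroticism) and binary classification (sex, handedness). A common way to train such a model is to share one backbone and sum a per-task loss over the heads. Below is a minimal sketch of that loss combination; the task names come from the abstract, but the loss functions, equal task weighting, and function names are assumptions for illustration, not the authors' implementation.

```python
import math

# Hypothetical task inventory, based on the six targets named in the abstract.
# Which loss each task uses is an assumption (MSE for continuous targets,
# binary cross-entropy for binary ones), not taken from the paper.
TASKS = {
    "age": "regression",
    "bmi": "regression",
    "fluid_intelligence": "regression",
    "neuroticism": "regression",
    "sex": "classification",
    "handedness": "classification",
}

def mse(pred: float, target: float) -> float:
    """Squared error for a single regression prediction."""
    return (pred - target) ** 2

def bce(prob: float, label: float) -> float:
    """Binary cross-entropy for a single predicted probability."""
    eps = 1e-7  # clamp to avoid log(0)
    prob = min(max(prob, eps), 1.0 - eps)
    return -(label * math.log(prob) + (1.0 - label) * math.log(1.0 - prob))

def multitask_loss(preds: dict, targets: dict) -> float:
    """Unweighted sum of per-task losses over all heads.

    In practice the tasks would likely be weighted (e.g. by uncertainty
    or scale), but equal weighting keeps the sketch minimal.
    """
    total = 0.0
    for task, kind in TASKS.items():
        if kind == "regression":
            total += mse(preds[task], targets[task])
        else:
            total += bce(preds[task], targets[task])
    return total
```

The point of summing across heads is that gradients from every task flow into the same backbone, which is what encourages the "rich and diverse feature space" the abstract reports.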

Topics

neurology
