Learning diverse and generic representations of the brain with large-scale multi-task pretraining
Authors
Affiliations (1)
- University of Oslo
Abstract
Large pretrained models developed and shared by actors with privileged access to data and compute have played a central role in the democratisation of deep learning across a range of domains. Here, we contribute to this endeavour in the field of neuroimaging by compiling a large dataset of structural magnetic resonance imaging scans (n=114,257) and using it to pretrain a multi-task convolutional neural network to predict age, sex, handedness, BMI, fluid intelligence and neuroticism. Subsequent analyses show that our pretraining approach yields a rich and diverse feature space, indicating that the model is sensitive to a broad spectrum of nuanced neuroanatomical variation. We further demonstrate the usefulness of the pretrained model by employing it to predict brain age and to perform clinical classifications via transfer learning, showing that it consistently outperforms models trained from scratch.