Duplums / yAwareContrastiveLearning

Official Pytorch Implementation for y-Aware Contrastive Learning

BHB10K dataset for pretraining. #1

Closed aneesh-aparajit closed 2 years ago

aneesh-aparajit commented 2 years ago

I am currently working on contrastive learning with SimCLR on an Alzheimer's dataset. I have tried the pre-trained weights, but the validation loss doesn't converge well. Would it be possible to access the BHB-10K data for further pre-training?

Duplums commented 2 years ago

Thanks for your interest! The BHB-10K data cannot be shared in full, but you can find an open version here: https://ieee-dataport.org/open-access/openbhb-multi-site-brain-mri-dataset-age-prediction-and-debiasing. I hope this helps.

aneesh-aparajit commented 2 years ago

Thanks a lot!

aneesh-aparajit commented 2 years ago

I also noticed that your code fine-tunes the encoder and not just the classifier, which I believe is not the conventional way of evaluating contrastive learning. Is there a specific reason for doing so?

Duplums commented 2 years ago

This code allows you to fine-tune the whole encoder to perform transfer learning on a downstream task (it is standard to fine-tune all weights in that case). This is conventional for practitioners working on specific pathologies (e.g. Alzheimer's disease), as it usually gives better results than simple linear evaluation (where only a linear layer is trained on top of the frozen pre-trained encoder). Code for linear evaluation can be found in several standard CL repos (e.g. [SimCLR](https://github.com/sthalles/SimCLR)).
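The distinction between the two evaluation protocols comes down to which parameters the optimizer updates. A minimal sketch in PyTorch, using a tiny stand-in encoder (the module names and sizes here are illustrative, not the repo's actual architecture):

```python
import torch
import torch.nn as nn

# Hypothetical tiny encoder standing in for the pretrained backbone.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(16, 8), nn.ReLU())
classifier = nn.Linear(8, 2)  # downstream head (e.g. AD vs. control)

# Linear evaluation: freeze the encoder, train only the classifier head.
for p in encoder.parameters():
    p.requires_grad = False
linear_eval_params = [p for p in classifier.parameters() if p.requires_grad]

# Full fine-tuning (what this repo does): all weights stay trainable.
for p in encoder.parameters():
    p.requires_grad = True
finetune_params = [
    p
    for p in list(encoder.parameters()) + list(classifier.parameters())
    if p.requires_grad
]

# Only the trainable parameters are handed to the optimizer.
optimizer = torch.optim.Adam(finetune_params, lr=1e-4)
```

Freezing via `requires_grad = False` means those weights receive no gradient and are never updated, which is exactly the linear-evaluation setting; passing every parameter to the optimizer gives full fine-tuning.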

aneesh-aparajit commented 2 years ago

Oh, thanks! I was unaware of this convention; I'm pretty new to the field. Really appreciate it!