cwmok / C2FViT

This is the official Pytorch implementation of "Affine Medical Image Registration with Coarse-to-Fine Vision Transformer" (CVPR 2022), written by Tony C. W. Mok and Albert C. S. Chung.
MIT License

training data queries #12

Closed EvelyneCalista closed 5 months ago

EvelyneCalista commented 9 months ago

Hi, thanks for your great work.

I have downloaded the OASIS data from the link you provided. However, the semi-supervised and unsupervised template-matching setups require some files that I couldn't find in the provided data folder:

- semi-supervised pairwise: seg35_onehot.nii.gz
- semi-supervised template matching: seg4_mni_onehot.nii.gz
- semi-supervised template matching: MNI-maxprob-thr50-1mm_pad_RSP_oasis_onehot.nii.gz
- unsupervised template matching: seg4mni.nii.gz

Could you please explain how I can obtain these files? Thank you so much!!

cwmok commented 9 months ago

Hi @EvelyneCalista,

Thanks for your interest in our work. The files you mentioned are just one-hot encodings of the label files. See the example data: https://github.com/cwmok/C2FViT/tree/main/Data
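To illustrate the idea (this is a minimal numpy sketch, not the author's preprocessing script; the function name and label values are placeholders), one-hot encoding a segmentation label map just means stacking one binary channel per anatomical label:

```python
import numpy as np

def onehot_encode(label_map, label_values):
    # Stack one binary channel per anatomical label.
    # In practice label_map would be loaded with nibabel, e.g.:
    #   label_map = nib.load("seg35.nii.gz").get_fdata().astype(int)
    return np.stack([(label_map == v).astype(np.float32) for v in label_values], axis=0)

# Toy 2x2 "segmentation" with background 0 and labels 1, 2
seg = np.array([[0, 1],
                [2, 1]])
onehot = onehot_encode(seg, label_values=[1, 2])
print(onehot.shape)  # (2, 2, 2): one channel per label
```

The resulting array can be saved back as a NIfTI volume with nibabel to produce files like seg35_onehot.nii.gz.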

I will attach my preprocessing script below. Note that I currently don't have time to clean the code. The attached code serves as a reference only.

> unsupervised template matching: seg4_mni.nii.gz

This should be the segmentation map of the MNI152 template, but with only the 4 selected anatomical structures.

cwmok commented 9 months ago

Here is the preprocessing code I use to get the one-hot encoding label: preprocessing_script.zip

EvelyneCalista commented 9 months ago

Hi @cwmok

Thank you for the reference code. I would like to confirm: for the file 'seg4_mni.nii.gz', is the input file 'seg35.nii.gz', as in your reference code?

cwmok commented 9 months ago

Hi @EvelyneCalista ,

Yes, you're correct. Please see "preprocess_MNI152.py" for more details.
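For illustration, reducing a 35-label segmentation to the 4 selected structures amounts to relabelling: keep the chosen label IDs, map each to 1..4, and set everything else to background. This is a hedged sketch (the actual structure IDs used in preprocess_MNI152.py are not reproduced here; the lists below are placeholders):

```python
import numpy as np

def select_structures(seg35, structure_ids):
    # structure_ids: one list of original label IDs per kept structure.
    # Each kept structure is remapped to 1, 2, 3, 4; all other voxels
    # become background (0).
    out = np.zeros_like(seg35)
    for new_label, ids in enumerate(structure_ids, start=1):
        out[np.isin(seg35, ids)] = new_label
    return out

# Toy 1-D "volume" with hypothetical label IDs
seg35 = np.array([3, 7, 12, 0, 7])
seg4 = select_structures(seg35, structure_ids=[[3], [7], [12], [20]])
print(seg4)  # [1 2 3 0 2]
```

The real script would load seg35.nii.gz with nibabel, apply such a mapping with the repo's chosen structure IDs, and save the result as seg4_mni.nii.gz.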

EvelyneCalista commented 9 months ago

Hi @cwmok ,

Got it, I can run the training code. Thank you for your help! :)