IcarusWizard / MAE

PyTorch implementation of Masked Autoencoder
MIT License

train_classifier #21

Open amirrezadolatpour2000 opened 9 months ago

amirrezadolatpour2000 commented 9 months ago

Hello, why don't we simply drop the decoder and append a new linear layer after the norm layer in the train classifier for fine-tuning? Why do you take each module of the pretrained model individually and rebuild the fine-tuning model, instead of just dropping the decoder module?

IcarusWizard commented 9 months ago

Hi, this is done to disable PatchShuffle, which should be the only difference between ViT_Classifier and MAE_Encoder. One could also implement a flag to switch PatchShuffle off directly in MAE_Encoder.
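As a rough illustration of what the maintainer describes, here is a minimal sketch of a classifier that reuses the pretrained encoder's submodules while leaving out PatchShuffle. The attribute names (`patchify`, `pos_embedding`, `cls_token`, `transformer`, `layer_norm`, `shuffle`) and the `(batch, tokens, dim)` layout are assumptions for illustration; they may differ from the actual modules in this repo.

```python
import torch
import torch.nn as nn

class ViT_Classifier(nn.Module):
    # Sketch only: submodule names and tensor layout are assumed,
    # not taken verbatim from this repository.
    def __init__(self, encoder: nn.Module, num_classes: int = 10):
        super().__init__()
        # Reuse every pretrained submodule of the encoder ...
        self.patchify = encoder.patchify
        self.pos_embedding = encoder.pos_embedding
        self.cls_token = encoder.cls_token
        self.transformer = encoder.transformer
        self.layer_norm = encoder.layer_norm
        # ... but deliberately leave out encoder.shuffle (PatchShuffle),
        # and attach a fresh linear head for classification.
        self.head = nn.Linear(self.cls_token.shape[-1], num_classes)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # Assumed token layout: (batch, num_patches, dim).
        tokens = self.patchify(img) + self.pos_embedding
        cls = self.cls_token.expand(tokens.shape[0], -1, -1)
        # Prepend the [CLS] token; no shuffling or masking of patches here.
        tokens = torch.cat([cls, tokens], dim=1)
        features = self.layer_norm(self.transformer(tokens))
        # Classify from the [CLS] token.
        return self.head(features[:, 0])
```

The alternative the maintainer mentions would be a boolean flag on MAE_Encoder itself (e.g. bypassing the shuffle step when the flag is off), so the same module could serve both pretraining and fine-tuning.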