EPFL-VILAB / MultiMAE

MultiMAE: Multi-modal Multi-task Masked Autoencoders, ECCV 2022
https://multimae.epfl.ch

is it normal to see this during finetuning? #22

Closed: ucalyptus2 closed this issue 1 year ago

ucalyptus2 commented 1 year ago
_IncompatibleKeys(missing_keys=['output_adapters.cls.norm.weight', 'output_adapters.cls.norm.bias', 'output_adapters.cls.head.weight', 'output_adapters.cls.head.bias'], unexpected_keys=[])

This could be happening because the output adapter is deleted.

@dmizr, thank you for replying to previous issues.

dmizr commented 1 year ago

Hi @forkbabu ,

Yes, this is normal to see. There is no output adapter for classification during pre-training; it is only added for fine-tuning, so its weights are not present in the pre-trained checkpoint and show up as missing keys when the checkpoint is loaded.
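For illustration, here is a minimal sketch of how this kind of message arises with a standard PyTorch non-strict load. The model and key names below are made up for the example and are not the actual MultiMAE classes; the point is just that a freshly added head produces missing_keys while the rest of the checkpoint loads fine.

```python
import torch.nn as nn

# Toy stand-in for a pre-trained encoder (illustrative only).
class Encoder(nn.Module):
    def __init__(self, dim=768):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

# Fine-tuning model: the encoder plus a freshly initialized classification head.
class FineTuneModel(nn.Module):
    def __init__(self, dim=768, num_classes=1000):
        super().__init__()
        self.encoder = Encoder(dim)
        # This head does not exist in the pre-training checkpoint.
        self.cls_head = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, num_classes))

# Simulate a pre-training checkpoint that only contains encoder weights.
pretrained = Encoder()
checkpoint = {f"encoder.{k}": v for k, v in pretrained.state_dict().items()}

model = FineTuneModel()
# strict=False allows loading even though the new head's keys are absent.
msg = model.load_state_dict(checkpoint, strict=False)
print(msg)
# _IncompatibleKeys(missing_keys=['cls_head.0.weight', 'cls_head.0.bias',
#                                 'cls_head.1.weight', 'cls_head.1.bias'],
#                   unexpected_keys=[])
```

Only the newly added head appears in missing_keys, and unexpected_keys is empty, which matches the message you are seeing: the missing parameters are simply initialized from scratch and trained during fine-tuning.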

Best, David