airsplay / vokenization

PyTorch code for EMNLP 2020 Paper "Vokenization: Improving Language Understanding with Visual Supervision"
MIT License

Are vokens needed when fine-tuning? #5

Closed. VickiCui closed this issue 3 years ago.

VickiCui commented 3 years ago

Hello, thank you for your great work! I wonder: are vokens only used for pre-training? That is to say, when fine-tuning, can one directly use the pre-trained model you have released as a BERT-like model, with no need to run vokenization on the downstream data?

airsplay commented 3 years ago

Yes. Vokens are not needed in fine-tuning.
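
For reference, a minimal sketch of what this looks like in practice, assuming the released weights have been converted into a standard Hugging Face BERT checkpoint (the `CKPT_DIR` path and label count below are hypothetical placeholders): the downstream fine-tuning loop consumes only text and labels, with no voken supervision anywhere.

```python
# Minimal fine-tuning sketch. Assumes the released vokenization weights are
# available as a standard Hugging Face BERT checkpoint at CKPT_DIR
# (a hypothetical path); no vokens appear anywhere in this loop.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

CKPT_DIR = "snap/vlm/vokenization_bert"  # hypothetical: your converted checkpoint
tokenizer = BertTokenizer.from_pretrained(CKPT_DIR)
model = BertForSequenceClassification.from_pretrained(CKPT_DIR, num_labels=2)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()

# Toy downstream batch: plain text and labels only, no vokenization step.
texts = ["a great movie", "a terrible movie"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, return_tensors="pt")

outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In other words, once pre-training with voken supervision is done, the model is fine-tuned exactly like an ordinary BERT checkpoint.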

VickiCui commented 3 years ago

Thanks for your reply!