Closed · FelixLabelle closed 3 years ago
I'm trying to integrate Vokenization with BERTScore, and I'd like clarification on which tokenizer the pretrained RoBERTa + VLM model uses. Is it `roberta-base` or `bert-base-uncased`?
Thanks. I use `bert-base-uncased` for BERT + VLM (on Wiki) here, and `roberta-base` for RoBERTa + VLM (on Wiki).
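For anyone wiring this up: a minimal sketch of loading the matching tokenizer with the `transformers` library, assuming the standard Hugging Face checkpoint names (`roberta-base`, `bert-base-uncased`) mentioned above — the pairing in the comments reflects the reply in this thread, not anything verified against the vokenization code itself.

```python
from transformers import AutoTokenizer

# Tokenizer/checkpoint pairing per the maintainer's reply:
#   BERT + VLM (on Wiki)    -> bert-base-uncased
#   RoBERTa + VLM (on Wiki) -> roberta-base
MODEL_TO_TOKENIZER = {
    "bert_vlm_wiki": "bert-base-uncased",
    "roberta_vlm_wiki": "roberta-base",
}

# Example: load the tokenizer for the RoBERTa + VLM checkpoint.
tokenizer = AutoTokenizer.from_pretrained(MODEL_TO_TOKENIZER["roberta_vlm_wiki"])

# Sanity check: tokenize a sentence the way BERTScore would see it.
tokens = tokenizer.tokenize("Vokenization with BERTScore")
print(tokens)
```

Keys like `roberta_vlm_wiki` are hypothetical labels for this sketch; the point is simply that the tokenizer must match the pretrained checkpoint you score with.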
Perfect, thanks for the quick reply!