roboserg closed this issue 3 years ago
It should work with `xlm-roberta-base` if you change the `model_type` to `xlmroberta`, but not with the fine-tuned `xlm-roberta-large-finetuned-conll03-german` model.
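The `model_type` string matters because training scripts of this kind typically dispatch on it to pick the config, model, and tokenizer classes; `xlm-roberta-base` is a model *name*, not a model *type*, so it won't match any dispatch entry. A minimal sketch of that pattern (the table below is an illustrative assumption, not this repo's actual code; the class names are the ones Hugging Face `transformers` uses):

```python
# Illustrative model_type dispatch table. The class-name strings match the
# Hugging Face transformers classes, but this table is a sketch of the
# pattern, not the repo's real mapping.
MODEL_CLASSES = {
    "bert": ("BertConfig", "BertForTokenClassification", "BertTokenizer"),
    "roberta": ("RobertaConfig", "RobertaForTokenClassification", "RobertaTokenizer"),
    "xlmroberta": (
        "XLMRobertaConfig",
        "XLMRobertaForTokenClassification",
        "XLMRobertaTokenizer",
    ),
}

def resolve_model_classes(model_type: str):
    """Map a model_type string to its (config, model, tokenizer) class names."""
    try:
        return MODEL_CLASSES[model_type]
    except KeyError:
        raise ValueError(
            f"Unknown model_type {model_type!r}; expected one of {sorted(MODEL_CLASSES)}"
        )
```

So passing `model_type="xlmroberta"` with `model_name="xlm-roberta-base"` resolves correctly, while passing the model name as the type raises immediately.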
I've explained the reasons in the other issues already.
Btw, you will run into the same issue if you try to use `bert-large-cased-whole-word-masking-finetuned-squad` with 3 labels, as it has been fine-tuned on a 9-label task.
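The underlying failure is a shape mismatch: the fine-tuned checkpoint carries a classification head sized for 9 labels, which cannot be loaded into a head built for 3. A self-contained sketch of that check, using shapes only rather than real weights (the 768 hidden size is illustrative):

```python
def find_shape_mismatches(checkpoint_shapes, model_shapes):
    """Return parameter names whose checkpoint shape differs from the model's."""
    return sorted(
        name
        for name, shape in checkpoint_shapes.items()
        if name in model_shapes and model_shapes[name] != shape
    )

# Head of a checkpoint fine-tuned on a 9-label NER task (e.g. CoNLL-03 tags)
checkpoint = {"classifier.weight": (9, 768), "classifier.bias": (9,)}
# Head of a freshly built model with 3 custom labels
model = {"classifier.weight": (3, 768), "classifier.bias": (3,)}

mismatches = find_shape_mismatches(checkpoint, model)
# mismatches -> ['classifier.bias', 'classifier.weight']
```

Loading the checkpoint strictly fails on exactly these parameters, which is what the size-mismatch error in the screenshots reflects.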
I want to fine-tune a transformer for an NER task on my custom data set with 3 labels. Everything works fine with a BERT-based model. However, when using a RoBERTa-based model such as `xlm-roberta-base` or `xlm-roberta-large-finetuned-conll03-german`, I get an error:
Error screenshot for `xlm-roberta-large-finetuned-conll03-german`: https://i.imgur.com/8RcbLIW.png
Error screenshot for `xlm-roberta-base`: https://i.imgur.com/3pVITaL.png
`xlm-roberta-large-finetuned-conll03-german` is an NER task-specific model; only the output size differs between my custom labels (3) and the pre-trained labels (9). As far as I understand, this repo should automatically remove the last layer and replace it with an output layer matching the number of my custom labels, just as it does for BERT-based models. But it doesn't work.
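One common workaround for reusing a task-specific checkpoint with a different label count (an assumption about what could help here, not something this repo documents) is to drop the old classification head from the checkpoint before loading, so the backbone weights transfer while the new 3-label head keeps its fresh initialization. A sketch:

```python
def strip_classification_head(state_dict, head_prefix="classifier."):
    """Drop head parameters so only the backbone is loaded from the checkpoint;
    the new model's randomly initialized head is then trained from scratch."""
    return {k: v for k, v in state_dict.items() if not k.startswith(head_prefix)}

# Illustrative checkpoint contents (shapes stand in for real tensors):
ckpt = {
    "roberta.embeddings.word_embeddings.weight": (250002, 1024),
    "classifier.weight": (9, 1024),
    "classifier.bias": (9,),
}
backbone_only = strip_classification_head(ckpt)
# backbone_only keeps only the embedding entry; both classifier.* keys are gone
```

Recent versions of Hugging Face `transformers` expose a similar escape hatch directly, via `ignore_mismatched_sizes=True` in `from_pretrained`, which skips (and re-initializes) any weights whose shapes don't match the new config.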
How do I fine-tune the RoBERTa-based models on my custom labels? I am using the NER-specific model and loading my own labels as per the documentation.