BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. Paper: "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models" (Findings of EMNLP 2020).
Thanks for sharing the BERTje model!
We would like to try using BERTje for Semantic Role Labeling, which you mention as one of the tasks in the paper. Could you share the code for fine-tuning the model, and possibly also the fine-tuned model itself?
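In case it helps to see where we are starting from, this is roughly what we have been trying so far with the Hugging Face transformers library, treating SRL as BIO token classification on top of BERTje. This is only a minimal sketch: the Hub model name `GroNLP/bert-base-dutch-cased` and the toy label set are our assumptions, not anything from your paper.

```python
# Minimal sketch: SRL as BIO token classification with BERTje.
# Model name and label set are assumptions on our side.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "GroNLP/bert-base-dutch-cased"  # assumed Hub name for BERTje
labels = ["O", "B-ARG0", "I-ARG0", "B-ARG1", "I-ARG1", "B-V"]  # toy label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(labels)
)

# One toy example: align word-level BIO tags to subword tokens,
# labelling only the first subword of each word (-100 is ignored by the loss).
words = ["De", "kat", "eet", "de", "vis"]
word_tags = ["B-ARG0", "I-ARG0", "B-V", "B-ARG1", "I-ARG1"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
label_ids, prev = [], None
for word_id in enc.word_ids(batch_index=0):
    if word_id is None or word_id == prev:
        label_ids.append(-100)  # special token or continuation subword
    else:
        label_ids.append(labels.index(word_tags[word_id]))
    prev = word_id

outputs = model(**enc, labels=torch.tensor([label_ids]))
outputs.loss.backward()  # an optimizer step would follow in a real training loop
```

We are not sure this matches your setup (e.g. how you encode the predicate, or whether you used a different head), so the actual fine-tuning code would be very helpful.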