wietsedv/bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen, described in the EMNLP Findings 2020 paper "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models".
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

BERTje for SRL #5

Closed — dafnevk closed this issue 4 years ago

dafnevk commented 4 years ago

Thanks for sharing the BERTje model!

We would like to use BERTje for Semantic Role Labeling, which you mention as one of the tasks in the paper. Could you share the code for fine-tuning the model, and possibly also the fine-tuned model itself?

losimons commented 4 years ago

Interested in the example code as well!

wietsedv commented 4 years ago

(Sorry for the late response; I did not receive/notice GitHub notifications.)

See issue #4. I will include some sample code with the models.
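For reference while the official sample code is pending, below is a minimal sketch of how BERTje could be fine-tuned for SRL framed as token classification with the Hugging Face `transformers` library. This is not the authors' setup: the hub model name (`GroNLP/bert-base-dutch-cased`), the SRL tag set, and the toy sentence are illustrative assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed hub name for BERTje; adjust if you use a different checkpoint.
MODEL = "GroNLP/bert-base-dutch-cased"
# Placeholder BIO-style SRL tag set, not the paper's actual label inventory.
labels = ["O", "B-ARG0", "I-ARG0", "B-ARG1", "I-ARG1", "B-V"]

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForTokenClassification.from_pretrained(MODEL, num_labels=len(labels))

# Toy example: one pre-tokenized sentence with word-level SRL tags.
words = ["De", "kat", "eet", "de", "muis"]
word_tags = ["B-ARG0", "I-ARG0", "B-V", "B-ARG1", "I-ARG1"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Align word-level tags to subword tokens; special tokens get -100 so the
# loss ignores them (all subwords of a word inherit that word's tag here).
label_ids = [
    -100 if word_id is None else labels.index(word_tags[word_id])
    for word_id in enc.word_ids(batch_index=0)
]
labels_tensor = torch.tensor([label_ids])

# One optimization step: cross-entropy over per-token tag predictions.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
optimizer.zero_grad()
loss = model(**enc, labels=labels_tensor).loss
loss.backward()
optimizer.step()
```

In practice one would iterate this over batches from an annotated SRL corpus and add evaluation, but the label alignment step above is the main BERT-specific detail when moving from word-level annotations to subword tokenization.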