wietsedv/bertje
BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.
"What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models" (Findings of EMNLP 2020)
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0
133 stars, 10 forks
Update citations #17 (Closed)
andreasvc closed this issue 3 years ago
andreasvc commented 3 years ago:
Good luck with the poster session tomorrow!
wietsedv commented 3 years ago:
> Good luck with the poster session tomorrow!

Thanks! :)