BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What’s so special about BERT’s layers? A closer look at the NLP pipeline in monolingual and multilingual models"
Hello! I'm trying to run your code, and I'd like to work with the coreference data from SoNaR.
I saw this commented-out code for preparing the coreference data from SoNaR, and I was wondering whether you still have this method somewhere, as it would be very useful for me:
https://github.com/wietsedv/bertje/blob/5cf291e084b84b4cb1cdb8f1a91b7c52587e0b4b/finetuning/prepare/prepare-sonar.py#L315
Thank you!