BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models"
Hi! Could you please tell me whether the line "Books: a collection of contemporary and historical fiction novels (4.4GB)" in the paper (section 2.1: Data) refers to DBNL or to some other dataset? Many thanks! :)