BERTje is a Dutch pre-trained BERT model developed at the University of Groningen, described in the EMNLP Findings 2020 paper "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models".
Hi,
Are there any TF checkpoints available, not just the PyTorch ones?

No, I did not explicitly create a TF model, but I believe Hugging Face Transformers converts the PyTorch checkpoint for you if you run `TFAutoModel.from_pretrained("wietsedv/bert-base-dutch-cased")`.
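A minimal sketch of the conversion mentioned in the reply, assuming `transformers` and TensorFlow are installed. Passing `from_pt=True` tells `TFAutoModel` to load and convert the PyTorch weights; the `save_pretrained` call afterwards is one way to produce a reusable TF checkpoint (hypothetical output path):

```python
from transformers import AutoTokenizer, TFAutoModel

# Load the PyTorch checkpoint from the Hub and convert it to TensorFlow.
model = TFAutoModel.from_pretrained("wietsedv/bert-base-dutch-cased", from_pt=True)
tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")

# Sanity check: run a Dutch sentence through the converted model.
inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for a BERT-base model

# Optionally write out native TF weights (tf_model.h5) for later reuse.
model.save_pretrained("./bertje-tf")  # hypothetical local path
```

After `save_pretrained`, the directory can be loaded again with `TFAutoModel.from_pretrained("./bertje-tf")` without the PyTorch conversion step.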