wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.
Paper: "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models" (Findings of EMNLP 2020)
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

TF checkpoints #13

Closed vkocaman closed 3 years ago

vkocaman commented 3 years ago

Hi,

Are there any TF checkpoints available? Not the PyTorch ones.

wietsedv commented 3 years ago

No, I did not explicitly create a TF model, but I think Hugging Face Transformers converts the PyTorch checkpoint for you if you run TFAutoModel.from_pretrained("wietsedv/bert-base-dutch-cased").
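
For reference, a minimal sketch of that approach (assuming the transformers and tensorflow packages are installed; the from_pt=True flag is an assumption here, used as a fallback to convert the PyTorch weights on the fly in case no native TF checkpoint is published on the Hub):

```python
from transformers import AutoTokenizer, TFAutoModel

# Load the tokenizer and the model in TensorFlow.
# from_pt=True converts the PyTorch weights if no tf_model.h5 is available.
tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = TFAutoModel.from_pretrained("wietsedv/bert-base-dutch-cased", from_pt=True)

# Quick sanity check on a Dutch example sentence.
inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```

If you need an actual TF checkpoint on disk, you can then call model.save_pretrained("some-local-dir") to write the converted weights out.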