wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. Paper: "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models" (Findings of EMNLP 2020).
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

downloadable model is unavailable #11

Closed: jurrr closed this issue 4 years ago

jurrr commented 4 years ago

Currently, the BERTje model seems to be unavailable for download (https://bertje.s3.eu-central-1.amazonaws.com/).

On each attempt, it returns the message:

<Error>
  <Code>AccessDenied</Code>
  <Message>Access Denied</Message>
  <RequestId>CB0B32AE86ECEDB8</RequestId>
  <HostId>4d9xaP3L2LMiU++7QtKIS+T7dOUAfvy+UJsdvCFy5iOfuZ40TWEEs1SEVO4azEPY9S4tDqgCeDM=</HostId>
</Error>
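For reference, a minimal way to reproduce the failure from Python; this is only an illustrative sketch against the bucket URL quoted above (it assumes any GET against the bucket now returns 403 AccessDenied), not how the model is normally fetched:

```python
# Sketch: request the bucket root and print the S3 error body on failure.
import urllib.request
import urllib.error

url = "https://bertje.s3.eu-central-1.amazonaws.com/"

try:
    with urllib.request.urlopen(url) as resp:
        print("HTTP", resp.status)       # unexpected: the request succeeded
except urllib.error.HTTPError as err:
    print("HTTP", err.code)              # 403 for AccessDenied
    print(err.read().decode("utf-8"))    # XML error body like the one shown above
```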
wietsedv commented 4 years ago

Thanks for noticing and reporting. I have changed the download links to the files served by Hugging Face. In any case, the recommended download method is to load wietsedv/bert-base-dutch-cased directly with the Transformers library, as described on the model hub.
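For anyone landing here, a minimal sketch of that recommended route, assuming the standard `AutoTokenizer`/`AutoModel` classes from the Transformers library (the model ID is the one named above; the example sentence is just an illustration):

```python
# Sketch: load BERTje directly from the Hugging Face hub with Transformers.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = AutoModel.from_pretrained("wietsedv/bert-base-dutch-cased")

# Quick check: encode a Dutch sentence and run it through the model.
inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size)
```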

jurrr commented 4 years ago

Ok, thanks for your swift action and response, Wietse!