wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What’s so special about BERT’s layers? A closer look at the NLP pipeline in monolingual and multilingual models"
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

License for pre-trained model #6

Closed · MobiusLooper closed this issue 4 years ago

MobiusLooper commented 4 years ago

Hi!

First off, thanks a lot for making and sharing this model.

Second, is there a license that can be applied to the pre-trained model itself?

Thanks, Josh

wietsedv commented 4 years ago

(Sorry for the late response, I did not receive/notice the GitHub notifications.)

I am not an expert on this, but I think you can assume the same license that applies to the arXiv paper: https://creativecommons.org/licenses/by/4.0/

Let me know if you require anything more specific / explicit.