Pre-trained BERT Models for Ancient and Medieval Greek, and associated code for the LaTeCH 2021 paper "A Pilot Study for BERT Language Modelling and Morphological Analysis for Ancient and Medieval Greek".
I have a question about the model available on Hugging Face (`AutoModel.from_pretrained("pranaydeeps/Ancient-Greek-BERT")`): do I understand correctly that the parameters of this model have NOT been fine-tuned on treebank data, but were trained only on raw Greek texts?
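For reference, a minimal sketch of loading and inspecting the checkpoint in question, assuming the `transformers` library is installed and that the repo ships a matching tokenizer (loading `AutoTokenizer` here is an assumption, not something stated in the question):

```python
from transformers import AutoModel, AutoTokenizer

# Load the tokenizer and model weights published on Hugging Face
tokenizer = AutoTokenizer.from_pretrained("pranaydeeps/Ancient-Greek-BERT")
model = AutoModel.from_pretrained("pranaydeeps/Ancient-Greek-BERT")

# Inspecting the loaded class and config can help answer the question:
# a plain BertModel with no task-specific head suggests a base language
# model, rather than a classifier fine-tuned on treebank annotations.
print(type(model).__name__)
print(model.config)
```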