Hyperparticle / udify

A single model that parses Universal Dependencies across 75 languages. Given a sentence, it jointly predicts part-of-speech tags, morphological features, lemmas, and dependency trees.
https://arxiv.org/abs/1904.02099
MIT License

How to run the UDify+Lang experiments? #32

Open Lguyogiro opened 1 year ago

Lguyogiro commented 1 year ago

Is there an example config somewhere showing how to fine-tune on a specific treebank using the BERT weights saved from fine-tuning on all UD treebanks combined (i.e., using the released pretrained models)? This corresponds to the UDify+Lang experiments in Table 2 of the paper.
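
For reference, here is a minimal sketch of what such a config might look like, assuming UDify's AllenNLP/jsonnet-style JSON configs and assuming the multilingual run's fine-tuned BERT weights were saved to a local directory. Every path below, the `pretrained_model` key, and the epoch count are hypothetical illustrations, not verified repo settings:

```jsonnet
// Hypothetical per-treebank config for a UDify+Lang run: reuse the BERT
// weights saved after fine-tuning on all UD treebanks, then continue
// fine-tuning on a single treebank (English-EWT here as an example).
{
    "train_data_path": "data/ud/UD_English-EWT/en_ewt-ud-train.conllu",
    "validation_data_path": "data/ud/UD_English-EWT/en_ewt-ud-dev.conllu",
    "model": {
        // Point BERT at the weights dumped from the multilingual run
        // instead of the stock "bert-base-multilingual-cased" weights.
        // (key name and path are assumptions)
        "pretrained_model": "saved_models/multilingual/bert"
    },
    "trainer": {
        "num_epochs": 80  // per-language fine-tuning budget (assumed)
    }
}
```

If this is on the right track, it would presumably be launched the same way as the repo's other configs (e.g. via `train.py` with the config path and a run name), but confirmation of the exact keys and weight-export step from the authors would help.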