Closed — omerarshad closed this issue 5 years ago
Yes, there is a `fine-tune` command:
https://github.com/allenai/allennlp/blob/master/allennlp/commands/fine_tune.py
It just needs the path to the trained model archive, a config file specifying the details of the fine-tuning (what data to use, what training parameters to use, etc.), and a directory in which to put the outputs.
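A minimal sketch of what such a fine-tuning config might look like — the file name, data paths, and parameter values below are placeholders for illustration, not taken from the thread:

```jsonnet
// fine_tune.jsonnet (hypothetical file name), invoked roughly as:
//   allennlp fine-tune -m /path/to/model.tar.gz -c fine_tune.jsonnet -s /path/to/output_dir
// The model itself comes from the archive, so the config only needs
// to describe the new data and the training parameters.
{
  // New data to fine-tune on (placeholder paths).
  "train_data_path": "/path/to/new_train_data",
  "validation_data_path": "/path/to/new_dev_data",
  // Training parameters for the fine-tuning run (placeholder values;
  // a small learning rate is typical when starting from a trained model).
  "trainer": {
    "num_epochs": 5,
    "patience": 2,
    "optimizer": {
      "type": "adam",
      "lr": 1e-5
    }
  }
}
```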
And what should the format of the data be? Should it be CoNLL-2012 format?
This one is actually tricky, because the SRL model depends on old components. @DeNeutoy opened a PR with a BERT-based model, which would be better to use today.
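For reference, a BERT-based SRL configuration might look roughly like the following. The `"srl_bert"` model type, the `"srl"` reader, and the parameter names are assumptions based on later AllenNLP releases, not confirmed in this thread; paths are placeholders:

```jsonnet
// Hypothetical sketch of a BERT-based SRL config.
{
  // The SRL reader consumes CoNLL-2012-style data and produces BIO tags;
  // it needs to know which BERT vocabulary to tokenize with.
  "dataset_reader": {
    "type": "srl",
    "bert_model_name": "bert-base-uncased"
  },
  "train_data_path": "/path/to/conll2012/train",
  "validation_data_path": "/path/to/conll2012/dev",
  "model": {
    "type": "srl_bert",
    "bert_model": "bert-base-uncased",
    "embedding_dropout": 0.1
  },
  "trainer": {
    "num_epochs": 15,
    "optimizer": {
      "type": "adam",
      "lr": 5e-5
    }
  }
}
```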
Is there a link to it, and is it implemented? Secondly, what if I have data in BIO tag format?
Any progress?
@joelgrus
> that just needs the path to the trained model archive, a config file specifying the details of the fine tuning (what data to use, what training parameters to use, etc) and a directory in which to put the outputs
Can you provide an example of a fine-tune config for BERT or any other model? It would be very helpful.
Is there a way to fine-tune a pretrained AllenNLP SRL model using our own data?