allenai / scibert

A BERT model for scientific text.
https://arxiv.org/abs/1903.10676
Apache License 2.0

How were PICO and dependency parsing trained? #104

Open Santosh-Gupta opened 3 years ago

Santosh-Gupta commented 3 years ago

I'm curious about how the PICO and dependency-parsing tasks were trained using SciBERT. For PICO, I can imagine training being set up like SQuAD, where a 'question' is one of the labels and the model outputs a span. Is this how PICO was trained, or was it something else?
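To make the SQuAD-style idea concrete: it would amount to a span head on top of the encoder, i.e. two linear projections producing per-token start and end logits, with the best-scoring span taken as the answer. A minimal numpy sketch (the dimensions and random weights here are invented for illustration; this is not taken from the SciBERT repo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy encoder output: one sequence of 6 tokens, hidden size 8.
seq_len, hidden = 6, 8
hidden_states = rng.normal(size=(seq_len, hidden))

# SQuAD-style span head: two projection vectors producing
# start and end logits for every token (weights are random here).
w_start = rng.normal(size=hidden)
w_end = rng.normal(size=hidden)

start_logits = hidden_states @ w_start  # (seq_len,)
end_logits = hidden_states @ w_end      # (seq_len,)

# Decode greedily: best start, then best end at or after it.
start = int(np.argmax(start_logits))
end = start + int(np.argmax(end_logits[start:]))
print(start, end)
```

Training would then minimize cross-entropy over the gold start and end positions, exactly as in SQuAD fine-tuning.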

I'm having a trickier time coming up with a training regimen for dependency parsing.
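For dependency parsing, the usual BERT-era recipe (and, as I understand it, what AllenNLP's dependency parser implements) is biaffine attention in the style of Dozat & Manning: project each token's encoding into a "head" space and a "dependent" space, then score every (dependent, head) pair with a bilinear form. A toy numpy sketch of the arc-scoring step, with made-up sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy encoder output: 5 tokens (token 0 standing in for ROOT), hidden size 8.
n, hidden = 5, 8
h = rng.normal(size=(n, hidden))

# Biaffine arc scoring: separate projections for "head" and "dependent"
# roles, then a bilinear interaction scoring every (dependent, head) pair.
d = 4
W_head = rng.normal(size=(hidden, d))
W_dep = rng.normal(size=(hidden, d))
U = rng.normal(size=(d, d))

heads = h @ W_head  # (n, d) head representations
deps = h @ W_dep    # (n, d) dependent representations
scores = deps @ U @ heads.T  # scores[i, j] = score of token j as head of token i

# Greedy decode: each token picks its highest-scoring head.
# (Real parsers use MST decoding to guarantee a well-formed tree.)
pred_heads = scores.argmax(axis=1)
print(pred_heads)
```

Training treats each row of `scores` as a softmax over candidate heads and minimizes cross-entropy against the gold head index; a second biaffine classifier predicts the arc label.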

I tried looking at the code; it seems that training is handed off to the AllenNLP library:

https://github.com/allenai/scibert/blob/master/scripts/train_allennlp_local.sh#L35

But I'm having a hard time figuring out where in that code the SciBERT fine-tuning happens.
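If I understand the setup, the fine-tuning isn't in the AllenNLP source itself: `allennlp train` reads a Jsonnet/JSON config, and the SciBERT weights enter through the token embedder declared there, with the train script exporting the paths as environment variables. Something along these lines (field names follow AllenNLP 0.x conventions; this is my guess at the shape, not copied from the repo's actual configs):

```jsonnet
// Illustrative fragment, not the repo's real config.
"text_field_embedder": {
  "token_embedders": {
    "bert": {
      "type": "bert-pretrained",
      "pretrained_model": std.extVar("BERT_WEIGHTS"),
      // requires_grad makes the BERT weights trainable,
      // i.e. this is where fine-tuning would be switched on.
      "requires_grad": true
    }
  }
}
```

So the thing to search for may be the config files in this repo rather than the AllenNLP codebase.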

For example, searching that repo for the task 'PICO' or for 'scibert' returns no results:

https://github.com/allenai/allennlp/search?q=scibert