JiayuanDing100 opened 4 years ago
@JiayuanDing100 were you able to get any inputs?
You would follow the same recipe except replacing bert-base-cased
with one of the scibert models, allenai/scibert_scivocab_uncased
for example.
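As a minimal sketch of that swap: the only change from the BERT recipe is the checkpoint name passed to `from_pretrained`. (`num_labels=2` is an assumption for binary classification, and the `load_scibert` helper name is just for illustration; the actual call downloads the weights.)

```python
MODEL_NAME = "allenai/scibert_scivocab_uncased"  # was "bert-base-cased"

def load_scibert(num_labels=2):
    """Load the SciBERT tokenizer and a sequence-classification head.

    Requires the `transformers` package and network access to download
    the checkpoint on first use.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=num_labels
    )
    return tokenizer, model
```

Everything else in the recipe (tokenization, training loop, evaluation) stays the same, since SciBERT shares BERT's architecture and differs only in its pre-training corpus and vocabulary.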
Side note, you might find better trainers in the HF examples https://github.com/huggingface/transformers/tree/master/examples/text-classification. The blog post is a little outdated.
Thanks a lot @ibeltagy
Can you please explain how I can replace BERT with SciBERT for BertForSequenceClassification?
@jaihonikhil You could use AutoModel and load SciBERT with AutoModel.from_pretrained('allenai/scibert_scivocab_uncased'), then add a linear layer for sequence classification, just like with any other BERT model.
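The suggestion above can be sketched like this, assuming PyTorch and the Hugging Face `transformers` API: wrap any encoder loaded via AutoModel and put a linear head on the [CLS] representation. The `SciBertClassifier` class name is hypothetical, not part of either library.

```python
import torch
import torch.nn as nn


class SciBertClassifier(nn.Module):
    """A Hugging Face encoder plus a linear classification head.

    `encoder` is any model returning an output with `last_hidden_state`,
    e.g. AutoModel.from_pretrained('allenai/scibert_scivocab_uncased').
    """

    def __init__(self, encoder, hidden_size, num_labels=2):
        super().__init__()
        self.encoder = encoder
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_repr = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.classifier(cls_repr)  # (batch, num_labels) logits


# Typical wiring (downloads the checkpoint, so shown but not run here):
# from transformers import AutoModel
# encoder = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")
# model = SciBertClassifier(encoder, encoder.config.hidden_size, num_labels=2)
```

Train it with a standard cross-entropy loss over the logits; for most use cases, though, AutoModelForSequenceClassification already provides this head for you.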
Hi, thanks for your awesome work on this domain-specific BERT model. I just tried pre-trained BERT in PyTorch for binary classification using this link: https://medium.com/swlh/a-simple-guide-on-using-bert-for-text-classification-bbf041ac8d04
Is there any simple tutorial for SciBERT on binary classification in PyTorch? Thanks.