Code and dataset of EMNLP 2020 paper "Infusing Disease Knowledge into BERT for Health Question Answering, Medical Inference and Disease Name Recognition"
Hi, I really like the ideas here and appreciate your work.
I ran into a problem when using the pretrained language model for classification tasks.
The model only predicts raw tensors, but I expected binary label arrays like [[0 1 0 1 ...] ... [1 0 1 0 ...]].
Is there any minor change to the model that prevents it from being used directly? I assumed that after pretraining it would have the same structural format as the models provided by Hugging Face. (I am using simpletransformers for multi-label classification tasks.)
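For reference, here is a minimal sketch of how I expected the raw outputs to map to labels, assuming the model head emits per-label logits (the values and array names below are illustrative, not from this repo):

```python
import numpy as np

# Hypothetical raw outputs from a multi-label head, shape [batch, num_labels].
raw_outputs = np.array([[2.3, -1.1, 0.8, -0.4],
                        [-0.9, 1.7, -2.2, 0.5]])

# Sigmoid maps each logit to an independent probability;
# a 0.5 threshold then yields the binary label matrix.
probs = 1.0 / (1.0 + np.exp(-raw_outputs))
labels = (probs > 0.5).astype(int).tolist()
# e.g. [[1, 0, 1, 0], [0, 1, 0, 1]]
```

If the pretrained checkpoint skips this final step, that might explain why I only see tensors instead of label arrays.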
Really looking forward to your response. Thank you for your time; I really appreciate it. Stay safe!