heyunh2015 / diseaseBERT

Code and dataset of EMNLP 2020 paper "Infusing Disease Knowledge into BERT for Health Question Answering, Medical Inference and Disease Name Recognition"

model after pretraining phase #2

Open phc4valid opened 3 years ago

phc4valid commented 3 years ago

Hi, I really like the ideas and appreciate your work. I ran into a problem after pretraining the language model for classification tasks: the model only outputs raw tensors, but it should predict labels like [[0 1 0 1 ...] ... [1 0 1 0 ...]]. Is there some minor change in the model that prevents it from being used directly? I assumed that after pretraining it would have the same structural format as the model provided by Hugging Face. (I am using simpletransformers for multi-label classification tasks.) Really looking forward to your response. Thank you for your time, and stay safe!

heyunh2015 commented 3 years ago

Again, please try using Hugging Face transformers 2.5.1. Thanks!
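
For reference, pinning the library to the version the maintainer recommends might look like the following (a sketch assuming a pip-based environment; adjust the virtualenv setup to your workflow):

```shell
# Create an isolated environment so the old transformers version
# does not clash with other projects (assumption: venv is available)
python -m venv diseasebert-env
source diseasebert-env/bin/activate

# Pin the transformers release the authors used for this repo
pip install transformers==2.5.1
```

Note that transformers 2.5.1 predates several API changes in later releases, so checkpoints produced against it may not load cleanly under newer versions without conversion.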