slczgwh / REDN

Downstream Model Design of Pre-trained Language Model for Relation Extraction Task
MIT License

Pre-trained model & Inference #1

Open kemalaraz opened 4 years ago

kemalaraz commented 4 years ago

Where can I get the BERT pre-trained model that you used, as stated in your paper? I think you are referring to the standard BERT trained by Google Research, but I want to make sure. Also, you haven't covered inference in the README; after training, how can I run inference? Finally, where are the supplementary materials that you mention in your thesis?

Thanks in advance

slczgwh commented 4 years ago

Yes, your guess is right. Specifically, it is BERT-BASE.
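
A minimal sketch of obtaining that checkpoint, assuming the HuggingFace `transformers` package and the standard Google release `bert-base-uncased`; this repo may load the weights differently, so treat it only as one way to get BERT-BASE locally:

```python
# Assumption: using HuggingFace transformers to fetch the standard
# Google-released BERT-BASE checkpoint; not necessarily how this repo loads it.
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
```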

The inference method is the same as what we do in the eval function; check this part: f1_metric.
What we actually do is construct an entity mask M for every possible entity pair, compute the average score over it, and apply a threshold to decide whether a given relation holds for that pair.
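
A minimal sketch of the thresholding step, assuming `scores` is a tensor of shape (num_entity_pairs, num_relations) holding the averaged scores per candidate pair; the function name, shapes, and threshold value are illustrative, not the repo's actual API (the real logic lives in the eval / f1_metric code):

```python
import torch

def predict_relations(scores, threshold=0.5):
    # Assumption: scores[p, r] is the averaged score for relation r on entity pair p.
    # A relation is predicted for a pair whenever its score exceeds the threshold.
    predictions = scores > threshold
    # Return (pair_index, relation_index) tuples for the predicted relations.
    return predictions.nonzero(as_tuple=False).tolist()

# Example with dummy scores for 2 entity pairs and 3 relation types.
dummy_scores = torch.tensor([[0.1, 0.8, 0.3],
                             [0.6, 0.2, 0.9]])
print(predict_relations(dummy_scores, threshold=0.5))
# -> [[0, 1], [1, 0], [1, 2]]
```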

As for the supplementary materials, you can safely ignore them: they only describe the hyperparameters, which are already set as default values in this code.