AI21Labs / sense-bert

This is the code for loading the SenseBERT model, described in our paper from ACL 2020.
Apache License 2.0

Word in Context #6

Open Syavaprd opened 3 years ago

Syavaprd commented 3 years ago

Hello. Thank you very much for sharing your code. I'm a student and I want to reproduce your results on the WiC task. As I understand it, at inference time you obtain two supersense predictions for the target word in its two contexts, then compare them to decide whether the word is used with the same meaning in both. My question is: do you fine-tune the model, and if so, how? My guess is that you would take the supersense logits for each context, compare them (perhaps with cosine similarity), and derive the loss from that comparison.
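For what it's worth, here is a minimal sketch of the comparison step I have in mind. This is only an illustration, not the paper's actual method: the logits arrays are hypothetical stand-ins for what SenseBERT's supersense head would produce for the target word in each context, and the threshold value is something one would tune on the WiC dev set.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical supersense logits for the target word in two contexts.
# In practice these would come from the model's supersense prediction head.
logits_ctx1 = np.array([2.1, 0.3, -1.0, 0.5])
logits_ctx2 = np.array([1.9, 0.4, -0.8, 0.6])

sim = cosine_similarity(logits_ctx1, logits_ctx2)

# Decide "same meaning" if the supersense distributions are close enough;
# the 0.9 threshold here is arbitrary and would need tuning.
same_sense = sim > 0.9
```

If the model is fine-tuned, I imagine the similarity (or some function of it) would feed into a binary loss against the WiC gold label, but I'd appreciate confirmation of how it was actually done.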