Closed Risho92 closed 3 years ago
We haven't tried it yet but SBert might be more suitable for getting question embedding.
@Risho92, I have claimed this paper for MLRC2020. I have tried using SBert and it performs better than RoBERTa.
@jishnujayakumar thanks for trying that. I was also expecting SBert to perform better.
I was planning to write the code myself next month. If you have already done it, is it possible to share your repository?
Thanks @jishnujayakumar. This is so cool. Nice work!
While working on other NLP tasks, we noticed that SBert performs better than RoBERTa on sentence-level tasks, and KGQA looks like a sentence-level task. Have you tried SBert in place of RoBERTa? What is your intuition on SBert's performance?
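For readers following the SBert-vs-RoBERTa discussion above: the usual reason SBert works better on sentence-level tasks is that it pools token embeddings into a single sentence vector that is trained to be comparable under cosine similarity. Below is a minimal, self-contained sketch of that mean-pooling step using toy random vectors (this is illustrative only, not the code from the paper or from either repository):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings while ignoring padded positions --
    the pooling step SBert-style models use to turn per-token
    vectors into one sentence vector."""
    mask = attention_mask[:, :, None].astype(float)      # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)       # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)       # avoid div-by-zero
    return summed / counts

def cosine_similarity(a, b):
    """Standard cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy example: a batch of 2 "questions", 3 tokens each, 4-dim embeddings.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(2, 3, 4))
mask = np.array([[1, 1, 1], [1, 1, 0]])  # second question has one pad token
sent_vecs = mean_pool(tokens, mask)
print(cosine_similarity(sent_vecs[0], sent_vecs[1]))
```

In practice one would get `token_embeddings` from the encoder itself; the point here is only that the question embedding is a masked average of token vectors, which is what makes whole-question similarity comparisons meaningful.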