malllabiisc / EmbedKGQA

ACL 2020: Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings
Apache License 2.0

Roberta vs SBert #58

Closed Risho92 closed 3 years ago

Risho92 commented 3 years ago

While working on other NLP tasks, we noticed that SBert performs better than RoBERTa on sentence-level tasks. KGQA looks like a sentence-level task. Have you tried SBert in place of RoBERTa? What is your intuition on SBert's performance?

apoorvumang commented 3 years ago

We haven't tried it yet, but SBert might be more suitable for getting the question embedding.
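
For reference, a minimal sketch (not the repository's code) of how a question embedding could be obtained with SBert via the `sentence-transformers` package; the checkpoint name `all-MiniLM-L6-v2` and the sample questions are just illustrative choices:

```python
# Sketch: encode natural-language questions with SBert instead of
# pooling RoBERTa's hidden states. Requires: pip install sentence-transformers
from sentence_transformers import SentenceTransformer

# Any SBert checkpoint can be used here; this one produces 384-dim embeddings.
sbert = SentenceTransformer("all-MiniLM-L6-v2")

questions = [
    "what does jamaican people speak",
    "which films did [NE] star in",
]

# encode() returns one fixed-size sentence embedding per question.
question_embeddings = sbert.encode(questions, convert_to_tensor=True)
print(question_embeddings.shape)  # torch.Size([2, 384]) for this checkpoint
```

These sentence embeddings could then be projected to the KG embedding dimension in place of the RoBERTa-based question encoder, though whether that helps downstream is exactly the question raised here.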

jishnujayakumar commented 3 years ago

@Risho92, I have claimed this paper for MLRC2020. I have tried using SBert and it performs better than RoBERTa.

Risho92 commented 3 years ago

@jishnujayakumar thanks for trying that. I was also expecting SBert to perform better.

I was planning to write the code myself next month. If you have already done it, is it possible to share your repository?

jishnujayakumar commented 3 years ago

> @jishnujayakumar thanks for trying that. I was also expecting SBert to perform better.
>
> I was planning to write the code myself next month. If you have already done it, is it possible to share your repository?

Check https://github.com/jishnujayakumar/MLRC2020-EmbedKGQA

Risho92 commented 3 years ago

Thanks @jishnujayakumar. This is so cool. Nice work!!!