malllabiisc / EmbedKGQA

ACL 2020: Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings
Apache License 2.0

Do you use RoBERTa for only fbwq but not MetaQA? #47

Closed sharon-gao closed 3 years ago

sharon-gao commented 3 years ago

Hi,

When reading your code, I noticed that KGQA/LSTM has no part that loads a transformer and uses pretrained embeddings for the questions, as is done in KGQA/RoBERTa. Is that correct, and if so, why?

Best, Shuang

apoorvumang commented 3 years ago

In LSTM, we don't use any text pretraining. This is because MetaQA has a large number of questions, so the model can learn good word embeddings from those questions alone.
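To make the contrast concrete, here is a minimal sketch of an LSTM question encoder whose word embeddings are learned from scratch (as in KGQA/LSTM) instead of coming from a pretrained transformer (as in KGQA/RoBERTa). The class name, dimensions, and layout are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn

class LSTMQuestionEncoder(nn.Module):
    """Hypothetical sketch: embeddings trained end-to-end on the QA loss,
    with no text pretraining involved."""
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=200):
        super().__init__()
        # Randomly initialised embedding table, updated during training.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)
        _, (h_n, _) = self.lstm(embedded)
        # Concatenate the final forward/backward hidden states
        # as the question representation.
        return torch.cat([h_n[0], h_n[1]], dim=-1)

encoder = LSTMQuestionEncoder(vocab_size=1000)
question = torch.randint(0, 1000, (2, 7))  # batch of 2 questions, 7 tokens each
q_emb = encoder(question)
print(q_emb.shape)  # torch.Size([2, 400])
```

With enough training questions, the embedding table picks up task-specific word representations, which is why the paper can skip RoBERTa for MetaQA but uses it for the smaller WebQuestionsSP (fbwq) dataset.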