malllabiisc / EmbedKGQA

ACL 2020: Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings
Apache License 2.0
412 stars 95 forks

Question about the parameter "freeze" #95

Closed mug2mag closed 2 years ago

mug2mag commented 2 years ago

Thanks for the great work and the open code! As you state in the paper, "Since all the entities of the KG are not covered in the training set, freezing the entity embeddings after learning them during KG embedding learning phase (Section 4.2) is necessary." — so the knowledge graph embeddings are frozen in EmbedKGQA. I want to know if you have tried setting the parameter "freeze=0" to make the knowledge graph embeddings adjustable? If so, what were the results? Any reply will be very appreciated!

apoorvumang commented 2 years ago

Thanks for your interest! Yes, we tried freeze=0. There isn't much performance difference in the case of MetaQA, since the QA train dataset is large and most entities are covered. However, in the case of WebQSP there is a significant drop in performance, most likely because the model overfits to only the entities seen in the QA dataset (which is very small compared to the total number of entities).
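For readers wondering how such a flag typically works: this is not the repo's actual code, just a minimal PyTorch sketch of how a `freeze` flag usually controls whether pretrained KG entity embeddings receive gradients during QA fine-tuning (the function name and shapes here are hypothetical).

```python
import torch
import torch.nn as nn

def make_entity_embedding(pretrained: torch.Tensor, freeze: int) -> nn.Embedding:
    """Wrap pretrained KG entity embeddings for the QA model.

    freeze=1 -> embeddings are fixed (requires_grad=False), as in the paper;
    freeze=0 -> embeddings are fine-tuned along with the QA loss.
    """
    return nn.Embedding.from_pretrained(pretrained, freeze=bool(freeze))

# Example: 5 entities with 4-dimensional embeddings (dummy random values).
weights = torch.randn(5, 4)
frozen = make_entity_embedding(weights, freeze=1)
trainable = make_entity_embedding(weights, freeze=0)
```

With `freeze=1`, the optimizer never updates rows for entities unseen in the QA training set, which is the behavior the paper argues is necessary when KG coverage in the QA data is sparse.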

mug2mag commented 2 years ago

Thanks very much.