malllabiisc / EmbedKGQA

ACL 2020: Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings
Apache License 2.0

What does "positive_head" mean on line 197 of "KGQA/LSTM/main.py"? #46

Closed Ironeie closed 3 years ago

Ironeie commented 3 years ago

```python
for i_batch, a in enumerate(loader):
    model.zero_grad()
    question = a[0].to(device)
    sent_len = a[1].to(device)
    positive_head = a[2].to(device)
    positive_tail = a[3].to(device)
```

Since `question` represents the embedding of the questions, `sent_len` the length of the questions, and `positive_tail` the labels (answers) of the questions, what does `positive_head` represent?

apoorvumang commented 3 years ago

`positive_head` is the entity present in the question. Following prior work, we assume entity linking to be 'perfect', i.e. the model knows exactly which entity is in the question. There is always exactly one entity.
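As a toy illustration (a sketch with made-up names, not the repo's actual code): `positive_head` is a batch of integer entity indices, one per question, which can be used to look up each linked entity's KG embedding.

```python
import torch

# Toy sketch: positive_head holds one linked-entity index per question.
# (Illustrative names and sizes; not the repo's actual variables.)
num_entities, dim = 100, 16
entity_embeddings = torch.nn.Embedding(num_entities, dim)

# A batch of two questions, each linked to exactly one entity.
positive_head = torch.tensor([3, 42])
head_emb = entity_embeddings(positive_head)
print(head_emb.shape)  # torch.Size([2, 16])
```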

ShuangNYU commented 3 years ago

> `positive_head` is the entity present in the question. Following prior work, we assume entity linking to be 'perfect', i.e. the model knows exactly which entity is in the question. There is always exactly one entity.

Is it true that `positive_tail` is a one-hot vector, i.e. only one position is '1'?

Ironeie commented 3 years ago

> > `positive_head` is the entity present in the question. Following prior work, we assume entity linking to be 'perfect', i.e. the model knows exactly which entity is in the question. There is always exactly one entity.
>
> Is it true that `positive_tail` is a one-hot vector, i.e. only one position is '1'?

I printed `positive_tail` and found that more than one position is '1', and I don't understand why.

apoorvumang commented 3 years ago

> > > `positive_head` is the entity present in the question. Following prior work, we assume entity linking to be 'perfect', i.e. the model knows exactly which entity is in the question. There is always exactly one entity.
> >
> > Is it true that `positive_tail` is a one-hot vector, i.e. only one position is '1'?
>
> I printed `positive_tail` and found that more than one position is '1', and I don't understand why.

Multiple answers can be correct, so the target is k-hot, not 1-hot.
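Concretely, a minimal sketch of such a k-hot target (assumed names and sizes, not the repo's code): a question with several correct answer entities gets a '1' at each answer's index over the entity vocabulary.

```python
import torch

# Sketch: a question with three correct answer entities gets a k-hot
# target vector over the whole entity vocabulary (illustrative values).
num_entities = 10
answer_ids = [2, 5, 7]            # indices of all correct answers

positive_tail = torch.zeros(num_entities)
positive_tail[answer_ids] = 1.0   # k positions set to 1, not just one
print(positive_tail.sum().item())  # 3.0
```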