ibalazevic / TuckER

TuckER: Tensor Factorization for Knowledge Graph Completion
MIT License

Why set "padding_idx=0" in nn.Embedding #17

Closed THUCSTHanxu13 closed 4 years ago

THUCSTHanxu13 commented 4 years ago

Hi~ I have found that the code sets `padding_idx=0` in `nn.Embedding`, like this:

```python
self.E = torch.nn.Embedding(len(d.entities), d1, padding_idx=0)
self.R = torch.nn.Embedding(len(d.relations), d2, padding_idx=0)
```

However, this causes the gradients of the first entity embedding and the first relation embedding to always be zero, so those rows never get trained. This is very interesting and I want to know the reason for it. Thank you!
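The zero-gradient behavior described above is easy to reproduce in isolation. This is a minimal sketch (with made-up embedding sizes, not TuckER's actual dimensions) showing that `padding_idx=0` makes PyTorch zero out the gradient of row 0 during the backward pass:

```python
import torch

# Hypothetical sizes for demonstration only.
emb = torch.nn.Embedding(5, 3, padding_idx=0)

# Look up index 0 (the "padded" row) alongside normal rows.
idx = torch.tensor([0, 1, 2])
emb(idx).sum().backward()

# Row 0 receives no gradient because of padding_idx=0;
# row 1 receives a gradient of 1.0 per element.
print(emb.weight.grad[0].abs().sum().item())  # 0.0
print(emb.weight.grad[1].abs().sum().item())  # 3.0
```

This matches PyTorch's documented behavior: `padding_idx` is intended for padding tokens in sequence models, where the designated row should stay fixed and not be updated by training.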

apoorvumang commented 4 years ago

I found this issue as well. Did you try to run the code after fixing the issue?

ibalazevic commented 4 years ago

Thanks for pointing this out. This is a mistake and should be fixed. That said, I doubt it will have much influence on the overall results, given the large number of entities and relations.