RongchuanTang opened this issue 2 years ago
Hi Rongchuan, the define_loss function computes the distance |t - (h + r)|: for true triples, the smaller the value the better, and for negative samples vice versa. Therefore, in the margin-loss part we set the target to -1, so that true triples end up with smaller distances than negative samples.
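To make the sign concrete, here is a minimal sketch of how a target of -1 behaves in PyTorch's MarginRankingLoss (the tensors and the margin value are illustrative, not taken from the repo):

```python
import torch
import torch.nn as nn

# MarginRankingLoss(x1, x2, y) = max(0, -y * (x1 - x2) + margin)
criterion = nn.MarginRankingLoss(margin=1.0)

pos_dist = torch.tensor([0.2, 0.3])  # |t - (h + r)| for true triples
neg_dist = torch.tensor([1.5, 0.1])  # distances for negative samples

# target = -1 asks for x1 < x2, i.e. true triples should score smaller
target = -torch.ones_like(pos_dist)
print(criterion(pos_dist, neg_dist, target))  # only the second pair violates the margin
```

Framing both inputs as distances keeps the target fixed at -1 for every batch.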
Thanks for your reply, I've got it. I had the sign wrong before. But I still have another question: you seem to pull h + r close to t and push t_neg far away from t, whereas in TransE we usually push h + r away from t_neg. The two ways seem to be equivalent, but I wonder if there is any additional benefit to choosing the first one, lol...😄
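A quick numeric check of the near-equivalence (purely illustrative tensors, not the repo's code): once training has pulled h + r onto t, the triangle inequality bounds how much the two negative scores can differ.

```python
import torch

h_plus_r = torch.tensor([1.0, 2.0])
t        = torch.tensor([1.1, 2.1])  # already close to h + r
t_neg    = torch.tensor([3.0, 0.5])

d_classic = torch.norm(h_plus_r - t_neg, p=1)  # push h + r away from t_neg
d_variant = torch.norm(t - t_neg, p=1)         # push t_neg away from t

# the two differ by at most |h + r - t| (triangle inequality)
print(d_classic, d_variant, torch.norm(h_plus_r - t, p=1))
```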
Hi Rongchuan, in TransE we use a margin loss that takes positive scores and negative scores as input. You can find similar reference code here: https://github.com/DeepGraphLearning/KnowledgeGraphEmbedding
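For reference, a sketch of the classic TransE hinge loss from the original paper (L1 distance and the margin value are common defaults; the function name here is mine, not from the linked repo):

```python
import torch

def transe_margin_loss(h, r, t, t_neg, margin=1.0):
    """max(0, margin + d(h + r, t) - d(h + r, t_neg)), averaged over the batch."""
    pos_dist = torch.norm(h + r - t, p=1, dim=-1)
    neg_dist = torch.norm(h + r - t_neg, p=1, dim=-1)
    return torch.clamp(margin + pos_dist - neg_dist, min=0).mean()

h, r, t, t_neg = (torch.randn(4, 50) for _ in range(4))
print(transe_margin_loss(h, r, t, t_neg))
```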
OK, thanks for that~
In the function ssaga_model.forward_kg(), what does neg_losses = self.define_loss([t, t_neg]) mean? Since the pos_loss in the last line is -(h + r - t), shouldn't this be calculated as self.define_loss([t_neg, projected_t]), i.e. -(h + r - t_neg)? And shouldn't the target in self.criterion_KG be set to 1 rather than -1, so that the pos_loss is greater than the neg_loss? Maybe I misunderstood something; could the authors give a more detailed explanation? Thanks very much!
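For anyone reading later, here is how I understand the pattern being asked about, written as a self-contained sketch; everything beyond the quoted names (forward_kg, define_loss, criterion_KG, projected_t) is my assumption, not the actual ssaga_model code:

```python
import torch
import torch.nn as nn

class KGSketch(nn.Module):
    """Illustrative reconstruction only, not the repo's implementation."""
    def __init__(self, margin=1.0):
        super().__init__()
        self.criterion_KG = nn.MarginRankingLoss(margin=margin)

    def define_loss(self, pair):
        # L1 distance between the two embeddings in the pair (assumed)
        a, b = pair
        return torch.norm(a - b, p=1, dim=-1)

    def forward_kg(self, h, r, t, t_neg):
        projected_t = h + r                              # translated head
        pos_losses = self.define_loss([projected_t, t])  # |h + r - t|
        neg_losses = self.define_loss([t, t_neg])        # |t - t_neg|, as quoted
        target = -torch.ones_like(pos_losses)            # -1: want pos < neg
        return self.criterion_KG(pos_losses, neg_losses, target)
```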