Lancelot39 / KGSF

KDD2020 Improving Conversational Recommender Systems via Knowledge Graph based Semantic Fusion

Problem of computing recommendation probability #7

Closed · mwang98 closed this issue 3 years ago

mwang98 commented 3 years ago

The paper states that the probability of recommending an item is the softmax of the inner product (i.e., the similarity) between the user representation and the item's embedding. In the code, however, the line that applies the softmax is commented out, and the raw inner-product scores are passed directly to the cross-entropy loss. Could you help me understand this inconsistency between the paper and the implementation?

The statement in *Improving Conversational Recommender Systems via Knowledge Graph based Semantic Fusion*: [screenshot of the relevant passage from the paper]
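Paraphrasing the screenshot in math (my own notation, not necessarily the paper's exact symbols): with $\mathbf{p}_u$ the fused user representation and $\mathbf{n}_i$ the embedding of candidate entity $i$, the stated recommendation probability is

$$P(i \mid u) = \operatorname{softmax}\big(\mathbf{p}_u^\top \mathbf{n}_i\big) = \frac{\exp(\mathbf{p}_u^\top \mathbf{n}_i)}{\sum_{j} \exp(\mathbf{p}_u^\top \mathbf{n}_j)}.$$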

```python
# Raw inner products between the user representation and all entity embeddings.
entity_scores = F.linear(user_emb, db_nodes_features, self.output_en.bias)
#entity_scores = scores_db * gate + scores_con * (1 - gate)
#entity_scores = (scores_db + scores_con) / 2

# mask loss
#m_emb = db_nodes_features[labels.cuda()]
#mask_mask = concept_mask != self.concept_padding
mask_loss = 0  # self.mask_predict_loss(m_emb, attention, xs, mask_mask.cuda(), rec.float())

info_db_loss, info_con_loss = self.infomax_loss(con_nodes_features, db_nodes_features, con_user_emb, db_user_emb, con_label, db_label, db_con_mask)

# why is this softmax commented out?
#entity_scores = F.softmax(entity_scores.cuda(), dim=-1).cuda()

# The raw scores go straight into the loss, masked to recommendation turns.
rec_loss = self.criterion(entity_scores.squeeze(1).squeeze(1).float(), labels.cuda())
#rec_loss = self.klloss(entity_scores.squeeze(1).squeeze(1).float(), labels.float().cuda())
rec_loss = torch.sum(rec_loss * rec.float().cuda())

self.user_rep = user_emb
```
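For what it's worth, one plausible resolution: if `self.criterion` here is `torch.nn.CrossEntropyLoss` (an assumption on my part), then the raw inner products are exactly what the loss expects, because PyTorch's cross-entropy applies log-softmax internally. Leaving the explicit softmax commented out is therefore still consistent with the paper's softmax formulation at training time, and re-enabling it would effectively apply softmax twice. A minimal sketch with made-up shapes:

```python
import torch
import torch.nn.functional as F

# Hypothetical setup: batch of 4 users, 10 candidate entities.
torch.manual_seed(0)
scores = torch.randn(4, 10)          # raw inner products (logits)
labels = torch.tensor([1, 3, 5, 7])  # ground-truth entity indices

# CrossEntropyLoss = log_softmax + NLLLoss, applied to *raw* logits.
ce = F.cross_entropy(scores, labels)
manual = F.nll_loss(F.log_softmax(scores, dim=-1), labels)
print(torch.allclose(ce, manual))    # True: the softmax happens inside the loss

# Applying softmax *before* cross_entropy would softmax the scores twice,
# which changes the loss and typically flattens the gradients.
double = F.cross_entropy(F.softmax(scores, dim=-1), labels)
print(torch.allclose(ce, double))    # False
```

At inference time the distinction is also harmless: softmax is monotonic, so a top-k over the raw scores yields the same ranking as a top-k over the softmax probabilities.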
LQlq123 commented 1 year ago

I have the same doubt. How do you understand this?