Closed: SeongJinAhn closed this issue 5 years ago
https://github.com/AaronHeee/Neural-Attentive-Item-Similarity-Model/blob/751bb9ae3542f7dac43990b1772d38b07c6cb718/NAIS.py#L130
I think that A means a_ij, so L130 computes a_ij * q_j.
https://github.com/AaronHeee/Neural-Attentive-Item-Similarity-Model/blob/751bb9ae3542f7dac43990b1772d38b07c6cb718/NAIS.py#L146
But when computing y_hat_ui (self.output), L146 multiplies self.embedding_p (which already holds a_ij * q_j) by self.embedding_q again, so it looks like q_j is multiplied in twice.
Is there a reason why you multiply it again?
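In the NAIS paper the prediction is y_hat_ui = sum_j a_ij * (p_i . q_j) = p_i . (sum_j a_ij * q_j), so the second factor at L146 should be the target item's embedding p_i, which comes from a separate embedding table, rather than q_j again. Here is a minimal NumPy sketch of that scoring step; the variable names are hypothetical and only mirror the roles of the repo's tensors:

```python
# Minimal sketch of the NAIS scoring step (hypothetical names, not the repo's exact code).
import numpy as np

num_history, dim = 5, 16
q_history = np.random.randn(num_history, dim)  # q_j: embeddings of the user's historical items
p_target = np.random.randn(dim)                # p_i: embedding of the target item (separate table)
a = np.random.rand(num_history)                # a_ij: attention weights over historical items

# L130 analogue: attention-weighted sum of history embeddings, sum_j a_ij * q_j
weighted_history = (a[:, None] * q_history).sum(axis=0)

# L146 analogue: inner product with the *target* item embedding p_i, not q_j again
y_hat_ui = np.dot(weighted_history, p_target)
print(y_hat_ui)
```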
I understand it now, thanks.