carpedm20 / MemN2N-tensorflow

"End-To-End Memory Networks" in Tensorflow
http://arxiv.org/abs/1503.08895v4
MIT License
829 stars 251 forks

model output to log probability for the loss #1

Closed cesc-park closed 8 years ago

cesc-park commented 8 years ago

The model output should be changed to a log probability in order to calculate

tf.nn.softmax_cross_entropy_with_logits(logpro, target)

carpedm20 commented 8 years ago

@cesc-park I was wondering which is right and which is wrong. Did you close this issue because the original code is right?

cesc-park commented 8 years ago

@carpedm20 Yep

softmax_cross_entropy_with_logits implicitly calculates the log probability.

The phrase "logits: Unscaled log probabilities." in the TensorFlow documentation for tf.nn.softmax_cross_entropy_with_logits(logits, labels, name=None) confused me.

Sorry!
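The point of the thread can be checked with a small NumPy sketch (not the repo's code, just an illustration of the op's contract): `softmax_cross_entropy_with_logits` expects unscaled logits and applies the softmax and log internally, so the loss equals softmax followed by negative log-likelihood computed by hand.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax: subtract the row max before exponentiating.
    shifted = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return shifted / shifted.sum(axis=-1, keepdims=True)

def softmax_cross_entropy_with_logits(logits, labels):
    # Mirrors the TF op's contract: takes UNSCALED logits and computes
    # the log probabilities internally via a stable log-softmax.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -(labels * log_probs).sum(axis=-1)

logits = np.array([[2.0, 1.0, 0.1]])   # raw model outputs, not probabilities
labels = np.array([[1.0, 0.0, 0.0]])   # one-hot target

loss = softmax_cross_entropy_with_logits(logits, labels)
# Same value as taking softmax, then log, then negative log-likelihood,
# i.e. the op already does the log-probability step for the caller.
manual = -(labels * np.log(softmax(logits))).sum(axis=-1)
print(loss, manual)
```

This is why the model output did not need to be converted to a log probability before being passed to the loss.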