Hi,
The cross-entropy loss you used in TensorFlow is 'sparse_softmax_cross_entropy_with_logits()'. However, this loss function applies a softmax to the 'logits' internally. Since your LGM loss does not contain a softmax operation, I wonder whether the loss function is used correctly here? A small sketch of what I mean follows below.
I am looking forward to hearing from you.