bordakov opened this issue 6 years ago
In lecture 5, `loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits, Y_true_flat)` raises a ValueError, while `loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=Y_true_flat)` seems to work. See https://www.kadenze.com/forums/creative-applications-of-deep-learning-with-tensorflow-sessions-session-5-generative-models/threads/valueerror-only-call-sparse_softmax_cross_entropy_with_logits-with-named-arguments-labels-logits
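For reference, here is a minimal sketch of the working keyword-argument call. The tensors below are made-up stand-ins for illustration, not the lecture's actual `logits` / `Y_true_flat`:

```python
import tensorflow as tf

# Made-up example tensors (the lecture builds its own logits / Y_true_flat).
logits = tf.constant([[2.0, 0.5, 0.1],
                      [0.2, 1.5, 0.3]])   # shape [batch, num_classes]
Y_true_flat = tf.constant([0, 1])          # shape [batch], integer class labels

# Positional call -- raises ValueError in recent TensorFlow releases:
# loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits, Y_true_flat)

# Keyword arguments are required:
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=Y_true_flat,
                                                      logits=logits)
```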