Closed gthaker closed 3 years ago
Yes, I could add that, and will do so the next time I revamp that module. For multi-class, it essentially boils down to calculating the log-loss on the class that was correct.
For each data item, just do the normal log-loss calculation on whichever output neuron corresponds to the correct class. For example, if the predictions are an array of 5 probabilities summing to 1.0 and the correct answer is the first class (the first neuron), then calculate the log-loss on the first array element (the predicted probability of the first class) against an expected y of 1.0 (because it is the correct classification).
The other probabilities do not matter and cancel out.
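The hand calculation described above can be sketched in plain Python (a minimal sketch using only the standard library; the data is taken from the question below, and the averaging step assumes Keras's default batch-mean reduction):

```python
import math

# Predictions from the question: each row sums to 1.0,
# and the correct class for each row is on the diagonal.
predictions = [
    [0.90, 0.05, 0.05],  # true class: index 0
    [0.05, 0.89, 0.06],  # true class: index 1
    [0.05, 0.01, 0.94],  # true class: index 2
]
true_indices = [0, 1, 2]

# Per-sample loss: -log of the probability assigned to the true class.
# The other probabilities drop out because their one-hot targets are 0.
per_sample = [-math.log(p[i]) for p, i in zip(predictions, true_indices)]

# Keras's AUTO reduction averages the per-sample losses over the batch.
loss = sum(per_sample) / len(per_sample)
print(round(loss, 4))  # 0.0946 in float64; Keras prints 0.0945 in float32
```

So the 0.0945 reported by Keras is just -(ln 0.9 + ln 0.89 + ln 0.94) / 3, up to float32 rounding.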
The videos and the code are all wonderful. Thank you so much for taking so much time and trouble.
I was wondering if it would be possible to do a hand example of the following calculation for multi-class classification. I can't work it out by hand and get the same result as:
```python
import tensorflow as tf
from tensorflow.python.keras.utils import losses_utils

cce = tf.keras.losses.CategoricalCrossentropy(reduction=losses_utils.ReductionV2.AUTO)
truth = tf.constant([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
predictions = tf.constant([[.9, .05, .05], [.05, .89, .06], [.05, .01, .94]])
print('truth', truth)
print('predictions', predictions)
loss = cce(truth, predictions)
print('CategoricalCrossentropy Loss: ', loss.numpy())  # Loss: 0.0945
```
What is the equivalent hand calculation?