When I trained the model using your default settings (loss and classifier), the train and test loss became "nan" after a few epochs. When I checked the output probabilities of the classes, they did not add up to 1; instead they were large negative values, which eventually made the loss nan. Could you please look into this and update the code?
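For reference, this is roughly how I inspected the outputs. It is only a minimal sketch, assuming a PyTorch setup; the linear layer and the input batch below are placeholders standing in for your classifier and data, not your actual code. It shows that raw logits need not sum to 1 and that taking the log of a negative output produces nan, which matches what I observed:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
model = torch.nn.Linear(128, 10)   # placeholder for the repo's classifier head
x = torch.randn(4, 128)            # placeholder batch

logits = model(x)
print(logits.sum(dim=1))           # arbitrary values, not 1.0, and some entries are negative

probs = F.softmax(logits, dim=1)
print(probs.sum(dim=1))            # ~1.0 only after softmax is applied

# If the loss takes the log of the raw outputs (e.g. a loss that expects
# log-probabilities or probabilities), negative entries turn into nan:
print(torch.log(logits))           # nan wherever logits < 0
```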
[N.B: I used the same dataset mentioned in your paper]