luoyetx / mx-lsoftmax

mxnet version of Large-Margin Softmax Loss for Convolutional Neural Networks.
BSD 3-Clause "New" or "Revised" License

The loss suddenly becomes NaN. #15

Closed huyangc closed 7 years ago

huyangc commented 7 years ago

I set the parameters as: beta=1000, margin=4, scale=0.9997, beta_min=5.

After some iterations, the cross-entropy loss suddenly becomes NaN. I am using the C++ layer compiled with mxnet. Does anyone have an idea what causes this and how to solve it?
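For reference, a minimal sketch of how these hyper-parameters might be wired into the C++ operator. This assumes the operator is exposed as mx.sym.LSoftmax with keyword arguments beta, margin, scale, and beta_min (as suggested by the report); the surrounding network, num_classes, and feature size are placeholders, not taken from the original setup.

```python
import mxnet as mx

num_classes = 10  # placeholder; replace with the actual number of identities/classes

data = mx.sym.Variable('data')
label = mx.sym.Variable('softmax_label')

# Feature embedding before the large-margin layer (illustrative only).
embedding = mx.sym.FullyConnected(data=data, num_hidden=512, name='fc_embedding')

# Large-margin fully connected layer with the parameters from this issue.
# beta presumably anneals toward beta_min (scaled each iteration by `scale`),
# so beta=1000 with scale=0.9997 decays slowly toward beta_min=5.
fc = mx.sym.LSoftmax(data=embedding, label=label, num_hidden=num_classes,
                     beta=1000, margin=4, scale=0.9997, beta_min=5,
                     name='lsoftmax')

# Standard softmax cross-entropy on top of the margin-adjusted logits.
loss = mx.sym.SoftmaxOutput(data=fc, label=label, name='softmax')
```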

huyangc commented 7 years ago

When I use the Caffe code with the same a-softmax parameters, the same model, and the same dataset, training is fine.