Open barisozmen opened 5 years ago
I'm experimenting with wrn-16-8 (WideResNet) at this repo. During training, the loss suddenly turned into NaN. I suspect it's a numerical-stability problem.
Setting clipnorm to 1.0 was suggested at https://stackoverflow.com/questions/37232782/nan-loss-when-training-regression-network
It didn't work.
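For reference, `clipnorm` rescales a gradient whenever its L2 norm exceeds the threshold, which is meant to stop exploding gradients from driving the loss to NaN. A minimal numpy sketch of that per-tensor operation (the function name here is my own, not a Keras API):

```python
import numpy as np

def clip_norm(grad, max_norm=1.0):
    # Rescale grad so its L2 norm is at most max_norm; leave it
    # unchanged when it is already within the threshold.
    norm = np.linalg.norm(grad)
    if norm <= max_norm:
        return grad
    return grad * (max_norm / norm)

# A gradient with norm 5.0 gets scaled down to norm 1.0.
g = clip_norm(np.array([3.0, 4.0]), max_norm=1.0)
```

In Keras this is just `optimizer = SGD(clipnorm=1.0)` (or the equivalent argument on any optimizer), which is presumably what the Stack Overflow answer meant.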
It might be an issue with the data.
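If the data is the suspect, a quick sanity check is to scan the inputs and labels for NaN/inf before training. A small sketch, assuming the arrays are numpy (the variable names are placeholders):

```python
import numpy as np

def has_bad_values(arr):
    # True if the array contains any NaN or +/-inf entries,
    # either of which can poison the loss after a few steps.
    return bool(not np.all(np.isfinite(arr)))

clean = has_bad_values(np.array([1.0, 2.0, 3.0]))      # False
dirty = has_bad_values(np.array([1.0, np.nan, 3.0]))   # True
```

Running this on `x_train` and `y_train` (whatever the arrays are called in your pipeline) would rule data problems in or out quickly.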