qingyuanxingsi opened this issue 8 years ago
Having NaNs suggests that we might have a vanishing gradient problem here. In this case, what I found helpful was to change the random seed in https://github.com/fumin/ntm/blob/master/poem/train/main.go#L63 . In addition, I also found it helpful to use the pure-Go implementation https://github.com/fumin/ntm/releases/tag/pure-go instead of the BLAS-based one in the master branch. Although BLAS is faster, it seems to be more numerically unstable than pure Go with for loops. I tried to ask a numerical expert whether this is expected, but couldn't reach a satisfying conclusion yet.
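For what it's worth, a cheap way to catch this early is to scan the gradients (or weights) for non-finite values before each update and bail out gracefully instead of letting the NaNs propagate. A minimal generic Go sketch, not part of the fumin/ntm API:

```go
package main

import (
	"fmt"
	"math"
)

// checkFinite reports the first NaN or Inf in a slice of parameters
// or gradients. Returning an error instead of panicking lets the
// training loop decide how to recover, e.g. by reseeding and
// restarting. (Generic sketch; the names here are illustrative.)
func checkFinite(name string, vals []float64) error {
	for i, v := range vals {
		if math.IsNaN(v) || math.IsInf(v, 0) {
			return fmt.Errorf("%s[%d] is not finite: %v", name, i, v)
		}
	}
	return nil
}

func main() {
	grads := []float64{0.1, math.NaN(), -0.3}
	if err := checkFinite("grads", grads); err != nil {
		fmt.Println("aborting update:", err)
	}
}
```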
Why does changing the random seed help in this scenario?
As I understand it, the vanishing/exploding gradient problem depends on the particular training dynamics, as in most non-linear systems, so changing the random seed may or may not help. As for what the seed specifically affects: first, the order in which we present the data depends on it in https://github.com/fumin/ntm/blob/master/poem/poem.go#L127 . In addition, the initialization of the network also depends on the seed: https://github.com/fumin/ntm/blob/master/poem/train/main.go#L77 . Both effects are sketched below.
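To make the two effects concrete, here is a hedged Go sketch of how a single seeded RNG can drive both the weight initialization and the example presentation order. This mirrors the idea, not the actual code in poem/train/main.go; the function and the init range are made up for illustration:

```go
package main

import (
	"fmt"
	"math/rand"
)

// initAndShuffle shows the two places the seed matters: the same
// seeded *rand.Rand determines the random initial weights and the
// order in which training examples are presented. Changing the seed
// therefore changes the entire training trajectory.
func initAndShuffle(seed int64, weights []float64, order []int) {
	rng := rand.New(rand.NewSource(seed))

	// Initialization: small random weights drawn from the seeded RNG.
	for i := range weights {
		weights[i] = rng.Float64()*0.2 - 0.1
	}

	// Presentation order: a seeded shuffle of the example indices.
	rng.Shuffle(len(order), func(i, j int) {
		order[i], order[j] = order[j], order[i]
	})
}

func main() {
	weights := make([]float64, 4)
	order := []int{0, 1, 2, 3}
	initAndShuffle(63, weights, order)
	fmt.Println(weights, order)
}
```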
During my training, I find that *sVal (https://github.com/fumin/ntm/blob/master/addressing.go#L194) can easily become NaN. Can this be handled properly instead of panicking?
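For context, one common way softmax-style quantities (such as the shift weighting behind sVal) turn into NaN is overflow in math.Exp for large logits. A standard remedy is to subtract the maximum logit before exponentiating, which is mathematically equivalent but never overflows. This is a generic sketch of that trick, not the actual code in addressing.go, and not necessarily the cause here:

```go
package main

import (
	"fmt"
	"math"
)

// stableSoftmax computes softmax(logits) while avoiding overflow:
// exp(v - max) stays in (0, 1], so the sum is always finite.
func stableSoftmax(logits []float64) []float64 {
	max := math.Inf(-1)
	for _, v := range logits {
		if v > max {
			max = v
		}
	}
	out := make([]float64, len(logits))
	var sum float64
	for i, v := range logits {
		out[i] = math.Exp(v - max)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	// Logits this large overflow a naive softmax (exp(1000) = +Inf,
	// giving Inf/Inf = NaN), but remain finite with the max trick.
	fmt.Println(stableSoftmax([]float64{1000, 1001, 1002}))
}
```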