kush99993s opened 5 years ago
Hello @steph-likes-git,
I wanted to add gradient clipping to make sure we don't get exploding gradients, especially with ReLU, identity, or similar activation functions, where gradients can blow up very easily.
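For illustration, here is a minimal sketch of norm-based gradient clipping in NumPy (the function name and the `max_norm` parameter are my own choices, not from this repo): if the gradient's L2 norm exceeds a threshold, rescale it to that threshold.

```python
import numpy as np

def clip_gradient(grad, max_norm=5.0):
    """Rescale grad so its L2 norm does not exceed max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        # Scale down in place-preserving way: direction unchanged, norm capped
        grad = grad * (max_norm / norm)
    return grad

# A gradient with norm 500 gets scaled down to norm 5;
# a small gradient passes through unchanged.
big = clip_gradient(np.array([300.0, 400.0]), max_norm=5.0)
small = clip_gradient(np.array([0.1, 0.2]), max_norm=5.0)
```

The same idea could be applied per-layer during backprop, right after each layer's gradient is computed and before the weight update.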