Open mseminatore opened 1 year ago
Need to update the backpropagation code to use the correct derivative of ReLU (1 for positive inputs, 0 otherwise). May also need gradient clipping to avoid NaN weights/costs from exploding gradients.
See #25
https://machinelearningmastery.com/how-to-avoid-exploding-gradients-in-neural-networks-with-gradient-clipping/
https://ai.stackexchange.com/questions/8491/does-it-make-sense-to-apply-softmax-on-top-of-relu
https://stackoverflow.com/questions/37448557/why-are-my-tensorflow-network-weights-and-costs-nan-when-i-use-relu-activations?rq=4
https://stackoverflow.com/questions/36498127/how-to-apply-gradient-clipping-in-tensorflow
https://www.tensorflow.org/api_docs/python/tf/compat/v1/train/Optimizer#processing_gradients_before_applying_them
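A minimal sketch of what the fix could look like, assuming a NumPy-based implementation (function names here are illustrative, not from the repo). The ReLU derivative is 1 where the pre-activation input was positive and 0 elsewhere, and norm-based gradient clipping (as described in the machinelearningmastery link above) rescales the gradient when it exceeds a threshold:

```python
import numpy as np

def relu(x):
    """ReLU activation: max(0, x) elementwise."""
    return np.maximum(0.0, x)

def relu_derivative(x):
    """Derivative of ReLU w.r.t. its input: 1 where x > 0, else 0.
    Used in backprop to gate the upstream gradient."""
    return (x > 0).astype(x.dtype)

def clip_gradient_by_norm(grad, max_norm=1.0):
    """Rescale grad so its L2 norm does not exceed max_norm
    (helps prevent exploding gradients / NaN weights)."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

# Backprop through a ReLU layer: gate the upstream gradient
# by the ReLU derivative of the layer's pre-activation z.
z = np.array([-2.0, -0.5, 0.0, 0.5, 3.0])
upstream = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
local_grad = upstream * relu_derivative(z)   # [0, 0, 0, 1, 1]
clipped = clip_gradient_by_norm(local_grad, max_norm=1.0)
```

Note that softmax should stay on the output layer only (per the ai.stackexchange link); the ReLU derivative fix applies to the hidden layers.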