MichalDanielDobrzanski / DeepLearningPython

neuralnetworksanddeeplearning.com integrated scripts for Python 3.5.2 and Theano with CUDA support
MIT License

repeated calculation problem #18

Closed shm007g closed 3 years ago

shm007g commented 5 years ago

I've found that you are using the sigmoid_prime function in network.py and network2.py to compute the backpropagation deltas, but this recomputation is not necessary.

You already computed the activation list during the feedforward pass. sigmoid_prime is the derivative of the sigmoid function, which equals activation * (1 - activation).

There is no need to recompute the sigmoid derivative from z; reusing the stored activations is exactly what the original backpropagation paper suggests: just one feedforward pass followed by one backward pass.
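A minimal sketch of the identity being pointed out (the function names follow the book's network.py; the z values are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # As in network.py: the derivative re-evaluates sigmoid (two np.exp calls)
    return sigmoid(z) * (1 - sigmoid(z))

# Hypothetical weighted inputs from a feedforward pass
z = np.array([-2.0, 0.0, 3.5])

# The activations are already stored during feedforward...
a = sigmoid(z)

# ...so the derivative can be formed from them directly,
# with no extra call to np.exp:
assert np.allclose(sigmoid_prime(z), a * (1 - a))
```

In the backward pass this would mean replacing `sigmoid_prime(zs[-l])` with `activations[-l] * (1 - activations[-l])`, reusing values the feedforward pass already produced.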

shm007g commented 5 years ago

(screenshot of the suggested code change)

Doing this gives exactly the same result, and possibly an even better one, since the extra np.exp evaluations in sigmoid_prime can introduce precision problems.

MichalDanielDobrzanski commented 3 years ago

Closing, as this library is just a ported update of the neural networks Python code