Hello. I've been digging into neural networks lately. I built a single hidden layer network from scratch and wasn't sure whether my results were right or whether I had made a mistake. So I looked for other implementations to compare against and stumbled upon this repo & blog. I compared the results of your implementation and mine, and they didn't match for the same initial conditions.
After spending some time re-evaluating my code, some other examples, and the math background, I've decided to open this issue to ask a question about your training code on the following lines: https://github.com/dwhitena/gophernet/blob/09031b18f26cd2a02c040b22433bd1f6d99c67c9/main.go#L174-L176
My question is: is there a bug in your code? Shouldn't the parameter to `applySigmoidPrime` be `outputLayerInput` (resp. `hiddenLayerInput`)? In the materials I've found, the delta is computed using the sigmoid prime (the derivative of the sigmoid) evaluated at the same value that was passed to the sigmoid during forward propagation.
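For concreteness, this is the identity I'm relying on (standard calculus, not something taken from your code); here `z` stands for an element of `outputLayerInput` and `a` for the corresponding element of `output`:

```math
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
\sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr) = a(1 - a), \quad \text{where } a = \sigma(z)
```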
NOTE: alternatively, since `output = sigmoid(outputLayerInput)`, you can compute the sigmoid prime as `output*(1-output)`.
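To illustrate what I mean with a minimal, self-contained sketch (these are my own toy helpers, not the functions from your repo; the names `sigmoid` and `sigmoidPrime` are just placeholders):

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid is the logistic function 1 / (1 + e^-z).
func sigmoid(z float64) float64 {
	return 1.0 / (1.0 + math.Exp(-z))
}

// sigmoidPrime expects the pre-activation value z, i.e. the same
// argument that sigmoid received during forward propagation.
func sigmoidPrime(z float64) float64 {
	s := sigmoid(z)
	return s * (1.0 - s)
}

func main() {
	z := 0.7        // stand-in for one element of outputLayerInput
	a := sigmoid(z) // the corresponding element of output

	fmt.Println(sigmoidPrime(z)) // derivative evaluated at the pre-activation input
	fmt.Println(a * (1.0 - a))   // same value, computed from the activation
}
```

Both printed values are the same, which is the equivalence the NOTE above refers to.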
Thank you for your time & I'm looking forward to your answer.