Closed JacobTDC closed 2 years ago
The bias is functionally equivalent to a weight on a constant input. If you have some constant input to your network, and enough extra nodes, you could train an equivalent network with zero bias weights. If there are no constant inputs anywhere, I don't see a way you could manufacture one unless your transfer function returns nonzero for a zero-valued input. If you have no constant input and no way to manufacture one, you limit what the net can possibly learn.
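To make the equivalence concrete, here's a minimal sketch (plain JavaScript, not synaptic's API) showing that a neuron with bias `b` computes exactly the same activation as a zero-bias neuron given an extra constant input of 1 whose weight is `b`:

```javascript
// A neuron with an explicit bias term.
function neuronWithBias(inputs, weights, bias) {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], 0) + bias;
  return 1 / (1 + Math.exp(-sum)); // sigmoid activation
}

// The same neuron with zero bias: a constant input of 1 plays the bias role.
function neuronNoBias(inputs, weights) {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], 0);
  return 1 / (1 + Math.exp(-sum));
}

const x = [0.5, -0.3];
const w = [0.8, 0.2];
const b = 0.4;

const withBias = neuronWithBias(x, w, b);
const noBias = neuronNoBias([...x, 1], [...w, b]); // constant input 1, weight = bias

console.log(withBias, noBias); // both compute the same activation
```

This is why removing the bias only hurts when no constant input exists anywhere: without one, every neuron is forced to output the same value whenever all its inputs are zero.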
On Jun 13, 2017 8:53 AM, "JacobTDC" notifications@github.com wrote:
Is there a way to set the bias of a neuron to 0? If so, how will that affect a network if all of the neurons in a network are set to 0?
You can set all the biases in the network to 0, but you have to modify the source code to make sure the bias doesn't get modified during backpropagation.
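As a rough illustration of what that source modification would look like (this is a hypothetical update function, not synaptic's actual internals), the backpropagation step has to skip the bias update as well as zero it out initially:

```javascript
// Hedged sketch: a single gradient-descent update for one neuron,
// with a flag that freezes the bias at its initial value.
const FREEZE_BIAS = true;

function updateNeuron(neuron, delta, inputs, learningRate) {
  // Standard weight update: w_i += rate * delta * x_i
  neuron.weights = neuron.weights.map(
    (w, i) => w + learningRate * delta * inputs[i]
  );
  // Normally the bias would be updated too: neuron.bias += rate * delta.
  // Skipping this line is the "modify the source" part.
  if (!FREEZE_BIAS) {
    neuron.bias += learningRate * delta;
  }
  return neuron;
}

const n = { weights: [0.5, -0.2], bias: 0 }; // bias initialized to 0
updateNeuron(n, 0.1, [1, 2], 0.5);
console.log(n.bias, n.weights); // bias stays 0; weights still learn
```

Setting the bias to 0 without freezing it wouldn't be enough, since the very first backpropagation pass would move it away from 0 again.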
An interesting link that explains the role of bias in neural networks: https://stackoverflow.com/questions/2480650/role-of-bias-in-neural-networks