Most sources don't point out that the connection between the input layer and the hidden layer carries no weights: the input layer just sums its input and propagates it forward unchanged. Moreover, this looks like a potential cause of a tendency toward greater excitability in some neurons.
Implement forward and backward propagation for the neurons of the first layer.
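The setup above can be sketched as follows: the input layer has no weights of its own and passes values through as-is, so the first weighted connections (and the first place forward/backward propagation needs implementing) live in the first hidden layer. This is a minimal sketch assuming a fully connected layer with a sigmoid activation; the class and parameter names are hypothetical, not from any specific codebase.

```python
import math
import random

class FirstLayer:
    """First hidden layer of a small feed-forward net.

    Assumption from the text: the input layer carries no weights and
    simply propagates values forward, so the first weighted
    connections appear here.
    """

    def __init__(self, n_in, n_out, seed=0):
        rng = random.Random(seed)
        # Small symmetric random weights keep no neuron far more
        # excitable than the others at the start of training.
        self.w = [[rng.uniform(-0.1, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.b = [0.0] * n_out

    def forward(self, x):
        # The "input layer" step: x arrives unchanged, with no weights
        # applied before this point.
        self.x = list(x)
        self.a = []
        for wo, bo in zip(self.w, self.b):
            z = sum(wi * xi for wi, xi in zip(wo, self.x)) + bo
            self.a.append(1.0 / (1.0 + math.exp(-z)))  # sigmoid
        return self.a

    def backward(self, grad_a, lr=0.1):
        # Chain rule through the sigmoid: da/dz = a * (1 - a).
        grad_z = [g * a * (1.0 - a) for g, a in zip(grad_a, self.a)]
        # Gradient w.r.t. the raw inputs; the input layer itself has
        # no weights to update, so backprop stops here.
        grad_x = [sum(self.w[o][i] * grad_z[o]
                      for o in range(len(self.w)))
                  for i in range(len(self.x))]
        for o, gz in enumerate(grad_z):
            for i, xi in enumerate(self.x):
                self.w[o][i] -= lr * gz * xi
            self.b[o] -= lr * gz
        return grad_x
```

A forward pass followed by one backward step would look like `out = layer.forward([1.0, 2.0, 3.0])` then `layer.backward(grad_of_loss_wrt_out)`; the returned gradient with respect to the inputs is only needed if more layers sat below this one.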