Going through the practice results, it looks like the NN needs an activation function (such as ReLU); otherwise it's just a trivial linear model. Applying ReLU improves accuracy for neural_network.py from 92% to 95%, and for neural_network_raw.py from 92% to 94%. Also, the learning rate is too large, even for an example; in practice it is usually set somewhere between 1e-2 and 1e-4.
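The "trivial linear model" point can be sketched quickly in NumPy (shapes and weights here are illustrative, not taken from the repo's scripts): two stacked linear layers with no activation collapse into a single linear map, while inserting a ReLU between them breaks that collapse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer weights and a small batch of inputs.
W1 = rng.standard_normal((784, 256))
W2 = rng.standard_normal((256, 10))
x = rng.standard_normal((5, 784))

# Without an activation, the composition is still linear:
# x @ W1 @ W2 == x @ (W1 @ W2), i.e. one effective weight matrix.
no_activation = x @ W1 @ W2
collapsed = x @ (W1 @ W2)
assert np.allclose(no_activation, collapsed)

# With ReLU between the layers, the collapse no longer holds,
# so the model can represent nonlinear decision boundaries.
with_relu = np.maximum(x @ W1, 0) @ W2
assert not np.allclose(with_relu, collapsed)
```

This is why adding ReLU moves the accuracy above what a plain softmax/logistic regression on the same features can reach.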