Closed pswaldia closed 5 years ago
How many layers do we need to consider? For a simple dataset like MNIST, using more than 2 makes the test accuracy go to 100.00 for almost all epochs.
Let's take 2 only.
Done with the changes! Please review.
@pswaldia You misunderstood me: we don't want the accuracy for different activation functions. Instead, let's keep the activation function fixed and log the accuracy for different hyperparameters.
I made the changes, keeping the activation functions fixed and varying the learning rate and momentum with Adam as the optimizer. It is done for neural networks with 3 and 2 hidden layers. Please review.
This looks good to me. I am merging this. Could you please do this for the four (2*2) combinations?
Run just for five epochs.
Yes I'll do that!
Thanks for the PR. LGTM, but a few issues: you have to train a separate model for each hyperparameter combination, and log the results in a `.md` file, like done here. Keep replying here!
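To make the requested workflow concrete, here is a minimal sketch of the grid loop: one fresh model per (learning rate, momentum) combination, 5 epochs each, results collected into a markdown table. Everything here is an assumption for illustration — a tiny synthetic binary dataset stands in for MNIST, a hand-rolled logistic regression with Adam stands in for the actual network, and Adam's `beta1` is used as the "momentum" knob:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for MNIST: a tiny linearly separable binary dataset.
X = rng.normal(size=(200, 10))
true_w = rng.normal(size=10)
y = (X @ true_w > 0).astype(float)

def train_and_eval(lr, beta1, epochs=5):
    """Train a fresh logistic-regression model with Adam; return train accuracy."""
    w = np.zeros(10)
    m = np.zeros(10)           # first-moment estimate (the "momentum" term)
    v = np.zeros(10)           # second-moment estimate
    beta2, eps = 0.999, 1e-8
    t = 0
    for _ in range(epochs):
        for i in range(len(X)):
            t += 1
            p = 1.0 / (1.0 + np.exp(-X[i] @ w))   # sigmoid prediction
            g = (p - y[i]) * X[i]                  # gradient of log loss
            m = beta1 * m + (1 - beta1) * g
            v = beta2 * v + (1 - beta2) * g * g
            m_hat = m / (1 - beta1 ** t)           # bias correction
            v_hat = v / (1 - beta2 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    preds = (X @ w > 0).astype(float)
    return (preds == y).mean()

# 2*2 grid: a separate model is trained for each hyper-parameter combination,
# and the accuracies are logged as a markdown table (ready for a .md file).
rows = ["| lr | beta1 | accuracy |", "|---|---|---|"]
for lr in (1e-3, 1e-2):
    for beta1 in (0.8, 0.9):
        acc = train_and_eval(lr, beta1)
        rows.append(f"| {lr} | {beta1} | {acc:.4f} |")

report = "\n".join(rows)
print(report)
```

The key point from the review comment is that `train_and_eval` re-initializes the weights on every call, so no state leaks between hyperparameter combinations.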