drplanar opened 6 years ago
The main file https://github.com/liuchangjohn/sparse-evolutionary-artificial-neural-networks/blob/master/SET-MLP-Keras-Weights-Mask/dense_mlp_keras_cifar10.py is very short and worth a look.
Ran the experiment behind set_mlp_srelu_sgd_cifar10_acc.txt and reproduced the sparse result (72.4% vs. 70% for dense). It took about 6 hours.
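The sparse result comes from training a dense MLP under a binary weight mask that is periodically rewired. A minimal NumPy sketch of one SET-style rewiring step (this is an illustration of the idea, not the repo's code; the function names and the `zeta` drop fraction here are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mask(shape, density, rng):
    """Random binary mask keeping roughly a fraction `density` of connections."""
    return (rng.random(shape) < density).astype(np.float64)

def evolve_mask(weights, mask, zeta, rng):
    """One SET-style rewiring step: drop the fraction `zeta` of active
    connections with the smallest |weight|, then regrow the same number
    at random currently-inactive positions (sketch, not the repo's code)."""
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # indices of the smallest-magnitude active weights
    mags = np.abs(weights.ravel()[active])
    drop = active[np.argsort(mags)[:n_drop]]
    new_mask = mask.ravel().copy()
    new_mask[drop] = 0.0
    # regrow the same number of connections at random empty positions
    inactive = np.flatnonzero(new_mask == 0.0)
    regrow = rng.choice(inactive, size=n_drop, replace=False)
    new_mask[regrow] = 1.0
    return new_mask.reshape(mask.shape)

W = rng.standard_normal((8, 8))
M = make_mask(W.shape, density=0.25, rng=rng)
M2 = evolve_mask(W, M, zeta=0.3, rng=rng)
```

During training, the masked weights `W * M` are what the forward pass uses; rewiring keeps the total connection count constant.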
Also ran dense_mlp_keras_cifar10.py from SET-MLP-Keras-Weights-Mask (it depends on keras-contrib). It uses Keras to download the CIFAR-10 data automatically. This is the dense MLP baseline; it took about 5 hours to run, and the result is quite close to the one reported in the paper.
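For anyone who just wants to see the shape of the baseline, here is a rough sketch assuming tensorflow.keras. The hidden-layer sizes, epoch count, and the plain ReLU (standing in for the SReLU activation the repo pulls from keras-contrib) are my assumptions, not the script's exact configuration:

```python
from tensorflow import keras

def build_dense_mlp(n_hidden=1000):
    # CIFAR-10 images are 32x32x3 = 3072 values, flattened for an MLP.
    # ReLU is a stand-in here; the repo uses SReLU from keras-contrib.
    model = keras.Sequential([
        keras.Input(shape=(3072,)),
        keras.layers.Dense(n_hidden, activation="relu"),
        keras.layers.Dense(n_hidden, activation="relu"),
        keras.layers.Dense(n_hidden, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="sgd", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def train(epochs=10, batch_size=128):
    # Keras downloads CIFAR-10 automatically on the first call.
    (x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
    x_train = x_train.reshape(-1, 3072).astype("float32") / 255.0
    x_test = x_test.reshape(-1, 3072).astype("float32") / 255.0
    y_train = keras.utils.to_categorical(y_train, 10)
    y_test = keras.utils.to_categorical(y_test, 10)
    model = build_dense_mlp()
    model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
              validation_data=(x_test, y_test))
    return model
```

The long wall-clock times reported above are plausible for a model like this on CPU, since every layer is fully dense.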