christiancosgrove opened this issue 7 years ago
Hello, I wrote the semi-supervised code, but I used a completely different script from the ones in this repo. I can provide the Lasagne-based script (which also uses some of the Blocks MainLoop machinery, as in this repo) if you would like.
That would help out enormously! I'm writing a TensorFlow implementation and would like to make sure my hyperparameters and losses match.
@olimastro Did you base your code on the original Theano/Lasagne code from OpenAI (https://github.com/openai/improved-gan/blob/master/mnist_svhn_cifar10/train_cifar_feature_matching.py)? I've been using the hyperparameters in the appendix of the ALI paper for semi-supervised learning.
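For concreteness, here is a rough NumPy sketch of the discriminator objective I understand that OpenAI script to use (the K-logit formulation with an implicit "fake" class from Salimans et al. (2016)); the array names below are just placeholders, not identifiers from the OpenAI code or from this repo:

```python
import numpy as np

def log_sum_exp(logits):
    # Numerically stable log(sum_k exp(l_k)) over the K class logits, per example.
    m = logits.max(axis=1, keepdims=True)
    return m[:, 0] + np.log(np.exp(logits - m).sum(axis=1))

def discriminator_loss(logits_lab, labels, logits_unl, logits_gen):
    # Supervised term: cross-entropy over the K real classes for labeled examples.
    lse_lab = log_sum_exp(logits_lab)
    loss_lab = np.mean(lse_lab - logits_lab[np.arange(len(labels)), labels])
    # Unsupervised term: with Z(x) = sum_k exp(l_k(x)) and D(x) = Z(x) / (Z(x) + 1),
    # real unlabeled data contributes -log D(x) and generated data -log(1 - D(G(z))).
    lse_unl = log_sum_exp(logits_unl)
    lse_gen = log_sum_exp(logits_gen)
    loss_unl = 0.5 * (np.mean(np.logaddexp(0.0, lse_unl) - lse_unl)
                      + np.mean(np.logaddexp(0.0, lse_gen)))
    return loss_lab + loss_unl
```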
yes I did
@olimastro Could you still provide that script? I am trying to reproduce the results on CIFAR-10 semi-supervised learning. Thanks!
Crap, did I never put up the script? :( I will try to look for it; I am not sure I still have the exact version that produced these results. I will get back to you during the week.
Thanks, please do. That would be an enormous help!
I've been trying to reproduce your figures for semi-supervised learning on CIFAR-10 (19.98% with 1000 labels). This result is based on the technique proposed in Salimans et al. (2016), not SVMs. Is there any way you can include your code, or at least any changes to the hyperparameters in ali_cifar10.py? Thanks in advance for your help.
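For the generator side, my understanding is that this result relies on the feature-matching objective from the same paper rather than the standard GAN loss. A minimal sketch of that idea (feat_real and feat_gen are placeholder names for an intermediate discriminator layer's activations on real and generated batches; the exact norm used may differ in the original script):

```python
import numpy as np

def feature_matching_loss(feat_real, feat_gen):
    # Match the batch means of an intermediate discriminator layer between
    # real and generated data, i.e. penalize || E[f(x)] - E[f(G(z))] ||^2.
    diff = feat_real.mean(axis=0) - feat_gen.mean(axis=0)
    return np.mean(diff ** 2)
```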