Open abdullahjamal opened 4 years ago
Hi, sorry, that is more leftover debug code. You don't need that many epochs: the script autosaves checkpoints, which is why we never lowered the maximum epoch count. Since the training loop is `for epoch in range(args.epoch // 100)`, setting it anywhere from 10000 to 30000 would be quite enough. We will fix this soon; sorry for the inconvenience.
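If I read the maintainer's explanation correctly, the outer loop divides `args.epoch` by 100, so the effective number of outer iterations is much smaller than the flag value suggests. A minimal sketch of that structure (the flag name and `range(args.epoch // 100)` come from the thread; the loop body and checkpoint comment are assumptions):

```python
import argparse

parser = argparse.ArgumentParser()
# A large default like the ones discussed in the thread; the loop below
# divides it by 100, so 10000 here means only 100 outer iterations.
parser.add_argument("--epoch", type=int, default=10000)
args = parser.parse_args([])  # parse no CLI args for this sketch

outer_iterations = args.epoch // 100
for epoch in range(outer_iterations):
    # ... one meta-training step would run here (hypothetical) ...
    pass  # per the maintainer, a checkpoint is autosaved as training goes

print(outer_iterations)
```

So with `--epoch 10000` the loop body executes 100 times, and with `--epoch 30000` it executes 300 times, which is why the huge default is harmless in practice.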
Thanks, I will try those limits.
If you remember, what maximum epoch did you use in your experiments?
MNIST trains very fast; 20000 would be quite enough.
Can you please explain whether the pretrained model is actually loaded here?
Something like `meta_model.load_state_dict(pretrained_dict)` should be added at this point.
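For context, the usual PyTorch pattern is to filter the saved state dict against the current model's keys before calling `load_state_dict`. A minimal sketch of that filtering step using plain dicts (the key names are made up for illustration; in the real script `pretrained_dict` would come from `torch.load(path)` and the final step would be `meta_model.load_state_dict(model_dict)`):

```python
# Hypothetical saved weights; in practice this comes from torch.load(path).
pretrained_dict = {"conv1.weight": "w1", "conv1.bias": "b1", "fc_old.weight": "w2"}

# Hypothetical current model state dict, e.g. meta_model.state_dict().
model_dict = {"conv1.weight": None, "conv1.bias": None, "fc_new.weight": None}

# Keep only the entries whose names match the current model
# (real code would typically also check tensor shapes).
filtered = {k: v for k, v in pretrained_dict.items() if k in model_dict}
model_dict.update(filtered)

# In the real script: meta_model.load_state_dict(model_dict)
print(sorted(filtered))
```

The filtering matters because a checkpoint often contains layers (here `fc_old.weight`) that no longer exist in the model; loading it unfiltered with strict matching would raise an error.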
The default epoch number is 54600 for MNIST in meta-training. Is that number correct, or should I use a different value? https://github.com/dydjw9/MetaAttack_ICLR2020/blob/master/meta_training/mnist_meta_training/mnist_train.py#L161