YyzHarry / ME-Net

[ICML 2019] ME-Net: Towards Effective Adversarial Robustness with Matrix Estimation
http://me-net.csail.mit.edu
MIT License

mask_train_cnt is almost always 1 #5

Closed HaoerSlayer closed 4 years ago

HaoerSlayer commented 4 years ago

https://github.com/YyzHarry/ME-Net/blob/a521e1e78178034ceeac9194f76178bc2dd907c8/train_adv.py#L409 Why is there a `math.ceil`? It seems to make `args.startp + mask_train_cnt * (args.endp - args.startp) / args.mask_num` always equal to `args.endp`.
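For reference, a minimal sketch of the arithmetic being described (not the repo's actual code; `startp`, the loop fractions, and how `mask_train_cnt` is derived here are assumed for illustration, with `endp = 0.6` taken from the later comment): whenever `mask_num` is 1, `math.ceil` rounds any positive fraction up to 1, so `p` collapses to `endp` for every mask.

```python
import math

# Illustrative values only: the thread says p is always 0.6, so endp = 0.6 is assumed;
# startp and the batch fractions below are made up to show the effect of math.ceil.
startp, endp, mask_num = 0.4, 0.6, 1

for frac in [0.01, 0.25, 0.5, 0.99, 1.0]:
    mask_train_cnt = math.ceil(frac * mask_num)               # ceil of anything in (0, 1] is 1
    p = startp + mask_train_cnt * (endp - startp) / mask_num  # formula quoted in the issue
    print(f"frac={frac:.2f}  mask_train_cnt={mask_train_cnt}  p={p:.2f}")
```

With `mask_num = 1`, every iteration prints `p=0.60`, which is the behavior the issue points out.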

HaoerSlayer commented 4 years ago

Or maybe I should change `mask_num` to 10, as in train_pure?

HaoerSlayer commented 4 years ago

Also, I would appreciate more details on the adversarial training setup for MNIST and SVHN.

HaoerSlayer commented 4 years ago

Hi, I changed line 409 to remove the `math.ceil`, and the model I trained gets 79.1% accuracy on clean data and 64.4% under the PGD-7 white-box attack, which is slightly lower than the checkpoint you provided (85.5/67.3). I don't know why that happened. The problem can't be my modification, because in the currently provided `train_adv.py` the `p` used in training is always 0.6 anyway.

How many images are fed to the model in each epoch? `args.mask_num` is 1 in your setting, which seems to make the parameter useless, yet the function `get_data` still has a loop that concatenates data. Also, my best model was obtained after epoch 71; decaying the learning rate after epoch 100 doesn't help. Since Table 27 shows that more masks improve performance on both clean and adversarial data, I wonder whether `args.mask_num` should be 10?
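To make the epoch-size question concrete, here is a rough conceptual sketch of what a `get_data`-style concatenation loop amounts to (the helper names are assumed, and this omits the matrix-estimation reconstruction step ME-Net applies after masking): each of the `mask_num` passes masks the whole dataset at a different `p`, and the copies are concatenated, so one epoch sees `mask_num` times the original number of images.

```python
import numpy as np

def apply_mask(images: np.ndarray, p: float) -> np.ndarray:
    """Keep each pixel independently with probability p; zero out the rest."""
    mask = (np.random.rand(*images.shape) < p).astype(images.dtype)
    return images * mask

def build_masked_dataset(images: np.ndarray, startp: float, endp: float, mask_num: int) -> np.ndarray:
    """Concatenate mask_num masked copies of the data, one per p in the schedule."""
    copies = []
    for cnt in range(1, mask_num + 1):
        p = startp + cnt * (endp - startp) / mask_num  # p schedule discussed in the issue
        copies.append(apply_mask(images, p))
    return np.concatenate(copies, axis=0)              # mask_num * len(images) samples per epoch
```

Under this reading, `mask_num = 1` gives one masked copy at `p = endp` per epoch, while `mask_num = 10` gives ten copies at `p` values spread between `startp` and `endp`.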

YyzHarry commented 4 years ago

Yes, you can change `mask_num`. Larger values should bring better results; the default value is chosen for computational efficiency.