MadryLab / mnist_challenge

A challenge to explore adversarial robustness of neural networks on MNIST.
MIT License

Does PGD need to perform a random restart at every iteration? Is it enough to start iterative FGSM from random noise? #3

Closed lepangdan closed 6 years ago

lepangdan commented 6 years ago

I saw the line x = x_nat + np.random.uniform(-self.epsilon, self.epsilon, x_nat.shape) in the perturb function of the LinfPGDAttack class, which adds random noise to the original image, but there is no code for random restarts. I am not sure whether the random restart step can be omitted.

dtsip commented 6 years ago

We are not performing random restarts in our code. We are simply starting PGD from a different random point each time. That is, instead of starting PGD from x_nat we are moving to a random point within the epsilon L_infty ball and performing PGD from there. Could you please elaborate on your question?
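The single random initialization dtsip describes can be sketched in plain NumPy. This is a toy illustration, not the repo's TensorFlow implementation: the function names and the quadratic loss below are made up for the example, but the structure (random point in the epsilon L_infty ball, signed gradient steps, projection back onto the ball) mirrors what LinfPGDAttack.perturb does.

```python
import numpy as np

rng = np.random.default_rng(0)

def pgd_perturb(x_nat, grad_fn, epsilon, step_size, num_steps, random_start=True):
    """Minimal L_inf PGD sketch (toy code, not the repo's TensorFlow attack).

    grad_fn(x) returns the gradient of the loss being maximized w.r.t. x.
    random_start=True mimics the single random initialization in the repo;
    random_start=False starts from x_nat, i.e. plain iterative FGSM.
    """
    if random_start:
        # Start from a uniformly random point inside the epsilon L_inf ball.
        x = x_nat + rng.uniform(-epsilon, epsilon, x_nat.shape)
    else:
        x = x_nat.copy()
    for _ in range(num_steps):
        x = x + step_size * np.sign(grad_fn(x))           # signed ascent step
        x = np.clip(x, x_nat - epsilon, x_nat + epsilon)  # project onto the ball
        x = np.clip(x, 0.0, 1.0)                          # keep valid pixel range
    return x

# Toy objective: move x as far as possible from a fixed target point.
target = np.full(4, 0.5)
grad_fn = lambda x: 2.0 * (x - target)

x_nat = np.array([0.2, 0.4, 0.6, 0.8])
x_adv = pgd_perturb(x_nat, grad_fn, epsilon=0.1, step_size=0.02, num_steps=10)
```

The key point is that the random draw happens once, before the loop; every subsequent iteration is deterministic projected gradient ascent.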

lepangdan commented 6 years ago

@dtsip I just want to confirm whether I need random restarts, because the paper mentions random restarts but they are not implemented in the code. Another point I want to confirm: is PGD the same as iterative FGSM, except for adding random noise to the original image (and the random restarts)? :D

dtsip commented 6 years ago

You don't need random restarts to train robust models. The code already implements the random start by adding noise before running PGD, so simply running train.py as-is will train a robust model. (Random restarts were used in the paper for evaluation purposes only.)
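For evaluation, "random restarts" just means running the attack several times from fresh random initializations and keeping the worst-case (highest-loss) perturbation found. A minimal sketch of that wrapper, again with a toy quadratic objective standing in for the model's loss (all names here are illustrative, not from the repo):

```python
import numpy as np

rng = np.random.default_rng(0)

def pgd(x_nat, grad_fn, epsilon, step_size, num_steps):
    # One PGD run from a fresh random point in the epsilon L_inf ball.
    x = x_nat + rng.uniform(-epsilon, epsilon, x_nat.shape)
    for _ in range(num_steps):
        x = x + step_size * np.sign(grad_fn(x))
        x = np.clip(x, x_nat - epsilon, x_nat + epsilon)
        x = np.clip(x, 0.0, 1.0)
    return x

def attack_with_restarts(x_nat, loss_fn, grad_fn, epsilon, step_size,
                         num_steps, restarts):
    """Evaluation-style random restarts: rerun PGD from new random starts
    and keep the perturbation that achieves the highest loss."""
    best_x, best_loss = x_nat, loss_fn(x_nat)
    for _ in range(restarts):
        x = pgd(x_nat, grad_fn, epsilon, step_size, num_steps)
        loss = loss_fn(x)
        if loss > best_loss:
            best_x, best_loss = x, loss
    return best_x

# Toy objective: maximize squared distance from a fixed target point.
target = np.full(4, 0.5)
loss_fn = lambda x: float(np.sum((x - target) ** 2))
grad_fn = lambda x: 2.0 * (x - target)

x_nat = np.array([0.2, 0.4, 0.6, 0.8])
x_adv = attack_with_restarts(x_nat, loss_fn, grad_fn,
                             epsilon=0.1, step_size=0.02,
                             num_steps=10, restarts=5)
```

Because each restart draws a new starting point, the wrapper only strengthens the attack for evaluation; it changes nothing about how adversarial training itself works.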

Yes, iterative FGSM is the same as PGD starting from the original image. PGD is a standard method for constrained optimization that has been used in numerous settings over a long period of time; we are not the ones introducing it. In the context of adversarial examples, the L_infinity version of PGD has been referred to as I-FGSM.