gongzhitaao / tensorflow-adversarial

Crafting adversarial images
MIT License

Is there loop in your fast gradient method? #8

Closed YinYangOfDao closed 5 years ago

YinYangOfDao commented 6 years ago

As in the title: I noticed that you have `tf.while_loop` in your code. As far as I know, the fast gradient method is called "fast" precisely because it has no loop; see section 2.1 of "Adversarial Examples in the Physical World", which is linked in a comment in your code: "In this paper we refer to this method as 'fast' because it does not require an iterative procedure to compute adversarial examples, and thus is much faster than other considered methods." I would appreciate it if you could explain where I made a mistake or misunderstood.

gongzhitaao commented 6 years ago
  1. Your understanding is correct: the vanilla FGSM does not use a loop.
  2. There is, however, a variant called iterative FGSM (referred to as the basic iterative method in the paper you mentioned), which is essentially FGSM applied more than once.

For API design simplicity, I implemented both versions in one function; you can set epoch=1 to get the original non-iterative FGSM.
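The idea can be sketched outside TensorFlow as well. The function below is a hypothetical NumPy illustration (the names `fgsm`, `grad_fn`, `epochs`, and the toy loss are my own, not the repo's API): each iteration takes one signed-gradient step, and `epochs=1` reduces to the vanilla one-step FGSM.

```python
import numpy as np

def fgsm(x, grad_fn, eps=0.1, epochs=1, clip_min=0.0, clip_max=1.0):
    """Iterative FGSM sketch; epochs=1 is the vanilla one-step method.

    x        : input to perturb
    grad_fn  : function returning the gradient of the loss w.r.t. x
    eps      : step size per iteration
    epochs   : number of FGSM steps (the "loop" in question)
    """
    x_adv = np.asarray(x, dtype=float).copy()
    for _ in range(epochs):
        # One FGSM step: move x along the sign of the loss gradient,
        # then clip back into the valid input range.
        x_adv = x_adv + eps * np.sign(grad_fn(x_adv))
        x_adv = np.clip(x_adv, clip_min, clip_max)
    return x_adv

# Toy example: loss L(x) = 0.5 * (x - 0.5)^2, so grad L = x - 0.5.
grad = lambda x: x - 0.5
x_one = fgsm(0.2, grad, eps=0.1, epochs=1)   # single step
x_iter = fgsm(0.2, grad, eps=0.1, epochs=3)  # basic iterative method
```

With `epochs=1` the loop body runs once, so the same code path yields the non-iterative FGSM without a separate function.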