For the Basic Iterative Method (BIM) and Projected Gradient Descent (PGD), there are papers that use the Adam optimizer. For example, in the seminal paper 'Adversarial Risk and the Dangers of Evaluating Against Weak Attacks', the authors say "In practice, we replace the vanilla gradient update with Adam (Kingma and Ba, 2014), which tends to converge faster for this problem." (in section 4.1)
However, as far as I understand, the BIM and PGD attacks in CleverHans do not have an option for Adam optimization.
It would be beneficial to have Adam as an option for some applications.
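For illustration, here is a minimal PyTorch sketch of what such an option might look like: an L-infinity PGD loop where the usual sign-gradient step is replaced by an Adam step on the perturbation, followed by projection back into the epsilon ball. The function name `pgd_adam` and its parameters are hypothetical, not part of the CleverHans API.

```python
import torch


def pgd_adam(model, x, y, eps=0.3, lr=0.01, steps=40):
    """Hypothetical sketch: PGD where Adam replaces the sign-gradient update."""
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Maximize the classification loss by minimizing its negation with Adam.
        loss = -torch.nn.functional.cross_entropy(model(x + delta), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            # Project back into the L-infinity ball of radius eps ...
            delta.clamp_(-eps, eps)
            # ... and keep x + delta inside the valid input range [0, 1].
            delta.copy_((x + delta).clamp(0.0, 1.0) - x)
    return (x + delta).detach()
```

The projection steps after `opt.step()` are what distinguish this from an unconstrained Adam-based attack: Adam proposes the update direction, but the iterate is always pulled back into the feasible set.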
When the PGD gradient update is replaced by the off-the-shelf Adam optimizer, the attack is essentially the Carlini and Wagner attack, which is available here for PyTorch.