Open · JieHarry-Hu opened 11 months ago
✨ Short description of the bug [tl;dr]
I found that after line 520 in fab.py there is no implementation for the 'L1' norm, so when norm is set to 'L1', res is never assigned and an error is raised.
💬 Detailed code and results
```python
if self.norm == 'Linf':
    res = (x_to_fool - adv_curr).abs().view(x_to_fool.shape[0], -1).max(1)[0]
elif self.norm == 'L2':
    res = ((x_to_fool - adv_curr) ** 2).view(x_to_fool.shape[0], -1).sum(dim=-1).sqrt()
elif self.norm == 'L1':
    res = (x_to_fool - adv_curr).abs().view(x_to_fool.shape[0], -1).sum(dim=-1)
acc_curr = torch.max(acc_curr, res > self.eps)
```
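For reference, here is a minimal standalone sketch of the per-sample distance computation that the added 'L1' branch provides; the function name and tensors below are illustrative only, not taken from fab.py:

```python
import torch

def per_sample_distance(x, adv, norm):
    """Per-sample distance between clean and adversarial batches (flattened per example)."""
    diff = (x - adv).view(x.shape[0], -1)
    if norm == 'Linf':
        return diff.abs().max(dim=1)[0]
    elif norm == 'L2':
        return (diff ** 2).sum(dim=-1).sqrt()
    elif norm == 'L1':
        return diff.abs().sum(dim=-1)
    # Without the 'L1' branch, norm='L1' falls through here and no result is defined,
    # which is the error described above.
    raise ValueError(f"unsupported norm: {norm}")

# Example usage with a random batch
x = torch.rand(4, 3, 32, 32)
adv = x + 0.01 * torch.randn_like(x)
print(per_sample_distance(x, adv, 'L1'))
```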
Yes, there is a missing line here; you can modify it and submit it to the repo. 🥰🥰🥰