Harry24k / adversarial-attacks-pytorch

PyTorch implementation of adversarial attacks [torchattacks].
https://adversarial-attacks-pytorch.readthedocs.io/en/latest/index.html
MIT License
1.79k stars 337 forks source link

[BUG] miss of res for L1 in torchattack.fab.py #163

Open JieHarry-Hu opened 11 months ago

JieHarry-Hu commented 11 months ago

✨ Short description of the bug [tl;dr]

I find that after line 520 in fab.py there is no implementation for the L1 norm, so when norm is set to 'L1', `res` is never assigned and an error is raised.

💬 Detailed code and results

        if self.norm == 'Linf':
            res = (x_to_fool - adv_curr).abs().view(x_to_fool.shape[0], -1).max(1)[0]
        elif self.norm == 'L2':
            res = ((x_to_fool - adv_curr) ** 2).view(x_to_fool.shape[0], -1).sum(dim=-1).sqrt()
        elif self.norm == 'L1':
            res = (x_to_fool - adv_curr).abs().view(x_to_fool.shape[0], -1).sum(dim=-1)
        acc_curr = torch.max(acc_curr, res > self.eps)
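
As a standalone sketch of the fix (the helper name `perturbation_norm` and the example tensors are illustrative, not part of torchattacks; the branch logic mirrors the snippet above, including the proposed `'L1'` case):

```python
import torch

def perturbation_norm(x, adv, norm):
    # Flatten each sample and measure the perturbation size under the chosen norm,
    # mirroring the Linf/L2 branches in fab.py plus the missing L1 branch.
    diff = (x - adv).view(x.shape[0], -1)
    if norm == 'Linf':
        return diff.abs().max(dim=1)[0]
    elif norm == 'L2':
        return (diff ** 2).sum(dim=-1).sqrt()
    elif norm == 'L1':
        # The branch the issue reports as missing: per-sample sum of absolute values.
        return diff.abs().sum(dim=-1)
    raise ValueError(f"unsupported norm: {norm}")

# Two samples of shape (3, 4, 4), each perturbed uniformly by 0.1.
x = torch.zeros(2, 3, 4, 4)
adv = x + 0.1
print(perturbation_norm(x, adv, 'Linf'))  # 0.1 per sample
print(perturbation_norm(x, adv, 'L1'))    # 48 * 0.1 = 4.8 per sample
```

Without the `'L1'` branch (or a final `else`), setting `norm='L1'` falls through all conditions and the subsequent use of `res` raises a `NameError`, which is the failure the issue describes.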
rikonaka commented 11 months ago

Yes, the L1 branch is missing here; you can add it and submit a PR to the repo. 🥰🥰🥰