Trusted-AI / adversarial-robustness-toolbox

Adversarial Robustness Toolbox (ART) - Python Library for Machine Learning Security - Evasion, Poisoning, Extraction, Inference - Red and Blue Teams
https://adversarial-robustness-toolbox.readthedocs.io/en/latest/
MIT License

Setting retain_graph to False in adversarial_patch_pytorch attack #2220

Open mphute opened 12 months ago

mphute commented 12 months ago

In adversarial_patch_pytorch.py, line 191: loss.backward(retain_graph=True)

However, retain_graph=True keeps the computation graph alive at every backward call, which causes very high RAM usage (processing 6 frames takes 40 GB with max_iter = 10). This heavily impacts our tracking defense, which improves significantly with a larger number of frames. Setting retain_graph = False allows us to use 20 frames for our defense, i.e. allows us to set batch_size = 20.
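The memory growth described above can be reproduced in a small stand-alone example (this is a toy sketch, not ART's tracking code): when the loss at each step depends on state carried over from earlier steps and that state is never detached, every backward call must retain the graph, so the graph and its saved activations grow with the number of frames processed.

```python
import torch

# Toy stand-in for processing a sequence of frames where each step's loss
# depends on state carried over from earlier steps (hypothetical example,
# not ART's actual implementation).
w = torch.randn(8, requires_grad=True)
state = torch.zeros(8)

for frame in range(5):
    state = state + w * frame          # graph chains back through all frames
    loss = state.pow(2).sum()
    loss.backward(retain_graph=True)   # must retain: next step reuses the chain
# -> graph (and RAM) grows linearly with the number of frames

# Detaching the carried state resets the chain, so the default
# retain_graph=False works and each step's graph is freed immediately:
state = torch.zeros(8)
for frame in range(5):
    state = state.detach() + w * frame  # fresh, one-step graph per frame
    loss = state.pow(2).sum()
    loss.backward()                     # retain_graph=False (the default)
```

The first loop is the pathological pattern: retain_graph=True is genuinely required there because each backward traverses graph segments built in earlier iterations. The second loop shows the bounded-memory alternative.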

We request that retain_graph be set to False: loss.backward(retain_graph=False)

We reviewed a number of online posts, such as these 1, 2, 3, 4, and they suggest that retain_graph would typically be set to False unless True is required for a specific purpose. They point out that a new graph is created on every backward call, so there is no need to retain the old one.

We tested this by setting retain_graph to False and running the tracking scenarios. The scenarios ran without any issues and we did not observe any changes in the metrics (e.g. HOTA). Therefore, we would like to request that retain_graph be set to False.
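The argument above can be illustrated with a minimal optimization loop (a toy sketch, not the patch attack itself): because the forward pass is re-run every iteration, a brand-new autograd graph is built each time, so backward() with the default retain_graph=False is sufficient and each graph is freed right after use.

```python
import torch

# Toy stand-in for iteratively optimizing a patch tensor.
patch = torch.zeros(3, requires_grad=True)
target = torch.tensor([1.0, 2.0, 3.0])
opt = torch.optim.SGD([patch], lr=0.1)

first = last = None
for _ in range(50):
    opt.zero_grad()
    loss = ((patch - target) ** 2).sum()  # fresh graph built this iteration
    loss.backward()                       # retain_graph=False by default
    opt.step()
    first = loss.item() if first is None else first
    last = loss.item()

# retain_graph=True is only required to call backward a second time
# on the SAME graph:
loss = ((patch - target) ** 2).sum()
loss.backward(retain_graph=True)  # graph kept alive for another pass
loss.backward()                   # second backward on the same graph succeeds
```

This matches the behavior reported in the issue: with one backward per forward pass, retain_graph=True buys nothing except retained memory.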

beat-buesser commented 11 months ago

Hi @mphute Thank you very much for raising this issue! I seem to remember that we discussed this setting in an earlier issue. Let me check the older issues to see whether there was a reason for this setting and come back to you here.

Are you able to train successful patches with loss.backward(retain_graph=False) without your defense being applied?

mphute commented 11 months ago

Yes, when I ran my experiments I saw that the adversarial attack performs similarly in the "undefended" case too.