This paper is accepted by CVPR 2021 (Oral).
This paper proposes a dual attention suppression attack approach, which exploits both the model attention and the human attention. Specifically, we distract the model attention to obtain better attack ability, and we evade the human attention to help improve the naturalness of the generated camouflage.
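The model-attention distraction idea can be illustrated with a minimal sketch. This is not the paper's exact loss: it assumes a generic 2-D attention map (e.g. a Grad-CAM-style saliency map) and uses a simple spatial-spread penalty, so minimizing the loss disperses attention away from a single salient peak.

```python
def distraction_loss(attention):
    """Negative spatial spread of a 2-D attention map (list of lists).

    Minimizing this value pushes attention mass away from one salient
    peak, i.e. it "distracts" the model attention. Illustrative only;
    the paper's actual objective differs.
    """
    total = sum(sum(row) for row in attention) + 1e-8
    h, w = len(attention), len(attention[0])
    # Attention centroid (weighted mean position).
    cy = sum(y * attention[y][x] for y in range(h) for x in range(w)) / total
    cx = sum(x * attention[y][x] for y in range(h) for x in range(w)) / total
    # Weighted squared distance from the centroid.
    spread = sum(
        attention[y][x] * ((y - cy) ** 2 + (x - cx) ** 2)
        for y in range(h) for x in range(w)
    ) / total
    return -spread  # more spread out -> lower loss
```

A uniform map scores lower (better, for the attacker) than a map concentrated on one pixel.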
You need:

- a data directory (e.g. src/data)
- a 3D model file (.obj) and its texture file (.mtl) (e.g. src/audi_et_te.obj and src/audi_et_te.mtl)
- a .txt file listing the faces which need to be trained (e.g. src/all_faces.txt)

To train, run:

python train.py --datapath=[path to dataset] --content=[path to seed content] --canny=[path to edge mask]
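The exact file formats train.py expects are defined in the repository code; as a generic sketch, assuming a standard Wavefront .obj and a faces list with one face index per non-empty line, the inputs could be loaded like this:

```python
def load_obj_vertices(path):
    """Parse vertex positions (the 'v' lines) from a Wavefront .obj file."""
    verts = []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):
                verts.append(tuple(float(x) for x in line.split()[1:4]))
    return verts

def load_face_ids(path):
    """Read the faces-to-train list (e.g. all_faces.txt): one index per line."""
    with open(path) as f:
        return [int(line) for line in f if line.strip()]
```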
Results will be stored in src/logs/, including:

- loss.txt
- texture.npy (the trained texture file)

To test, run:

python test.py --texture=[path to texture]

Results will be stored in src/acc.txt.
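The value in src/acc.txt reflects how often the detector still recognizes the camouflaged object; a lower accuracy means a stronger attack. The actual computation in test.py may differ, but a hedged sketch of such a metric and its output file is:

```python
def detection_accuracy(detections):
    """Fraction of test images on which the target is still detected.

    detections: iterable of booleans, True if the detector found the
    camouflaged object in that image. Illustrative assumption, not
    necessarily the metric test.py implements.
    """
    detections = list(detections)
    if not detections:
        raise ValueError("no detection results")
    return sum(detections) / len(detections)

def write_accuracy(path, detections):
    """Store the accuracy in a small text file (mirroring src/acc.txt)."""
    with open(path, "w") as f:
        f.write(f"{detection_accuracy(detections):.4f}\n")
```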