git-disl / TOG

Real-time object detection is one of the key applications of deep neural networks (DNNs) for real-world mission-critical systems. While DNN-powered object detection systems enable many life-enriching opportunities, they also open doors for misuse and abuse. This project presents a suite of adversarial objectness gradient attacks, coined TOG, which can cause state-of-the-art deep object detection networks to suffer from untargeted random attacks or even targeted attacks with three types of specificity: (1) object-vanishing, (2) object-fabrication, and (3) object-mislabeling. Apart from tailoring an adversarial perturbation for each input image, we further demonstrate TOG as a universal attack, which trains a single adversarial perturbation that generalizes to unseen inputs with negligible attack time cost. We also apply TOG as an adversarial patch attack, a form of physical attack, showing its ability to optimize a visually confined patch filled with malicious patterns that deceives well-trained object detectors into misbehaving purposefully.
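At a high level, such attacks perturb an input image by following the gradient of the detector's loss. The sketch below illustrates this general idea with a generic iterative (PGD-style) perturbation loop. It is a minimal illustration only: detector, detection_loss, and the TensorFlow setup are hypothetical placeholders, not the actual TOG API.

import tensorflow as tf

def craft_adversarial_example(detector, detection_loss, x, y_target,
                              eps=8/255., eps_step=2/255., n_iter=10):
    # Iteratively perturb x to increase the detector's loss (untargeted attack),
    # keeping the perturbation inside an L-infinity ball of radius eps.
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    x_adv = tf.identity(x)
    for _ in range(n_iter):
        with tf.GradientTape() as tape:
            tape.watch(x_adv)
            loss = detection_loss(detector(x_adv), y_target)
        grad = tape.gradient(loss, x_adv)
        x_adv = x_adv + eps_step * tf.sign(grad)           # gradient ascent step
        x_adv = tf.clip_by_value(x_adv, x - eps, x + eps)  # project into the eps-ball around x
        x_adv = tf.clip_by_value(x_adv, 0.0, 1.0)          # keep pixel values valid
    return x_adv

Targeted variants (e.g., object-vanishing or object-mislabeling) would instead descend on a loss computed against a chosen target output rather than ascending the original loss.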

How to save the attacked image to local disk? #5

Closed. luoolu closed this issue 3 years ago.

luoolu commented 3 years ago

I need your help

khchow-gt commented 3 years ago

Following the variable names in our demonstration notebook, attacked images are stored in the variables whose names start with "x_adv". They are NumPy arrays that can be saved:

import numpy as np

x_adv = ****                          # this variable stores the attacked image (a NumPy array)
path_to_destination = "x_adv.npy"     # any destination path; np.save appends ".npy" if it is missing
np.save(path_to_destination, x_adv)
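If the goal is a viewable image file (e.g., PNG) rather than a .npy array, one option is to convert the array with Pillow. This assumes x_adv is an H x W x 3 float array with pixel values in [0, 1]; scale accordingly if your values are already in [0, 255].

from PIL import Image
import numpy as np

# hypothetical: x_adv is an H x W x 3 float array with values in [0, 1]
img = Image.fromarray((np.clip(x_adv, 0.0, 1.0) * 255).astype(np.uint8))
img.save("x_adv.png")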
luoolu commented 3 years ago

Thanks!