git-disl / TOG

Real-time object detection is one of the key applications of deep neural networks (DNNs) for real-world mission-critical systems. While DNN-powered object detection systems enable many life-enriching opportunities, they also open doors for misuse and abuse. This project presents a suite of adversarial objectness gradient attacks, coined as TOG, which can cause state-of-the-art deep object detection networks to suffer from untargeted random attacks or even targeted attacks with three types of specificity: (1) object-vanishing, (2) object-fabrication, and (3) object-mislabeling. Apart from tailoring an adversarial perturbation for each input image, we further demonstrate TOG as a universal attack, which trains a single adversarial perturbation that generalizes to unseen inputs at negligible attack time cost. Also, we apply TOG as an adversarial patch attack, a form of physical attack, showing its ability to optimize a visually confined patch filled with malicious patterns, deceiving well-trained object detectors into misbehaving purposefully.
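All of the attack modes above are gradient-based perturbation attacks against the detector's objective. As a rough illustration only, and not the repository's actual implementation, an iterative sign-gradient loop against a hypothetical detection_loss callable might look like the sketch below; model, targets, eps, eps_iter, and n_iter are all assumed names and illustrative values.

```python
import tensorflow as tf

def iterative_attack(model, x, targets, detection_loss,
                     eps=8 / 255.0, eps_iter=2 / 255.0, n_iter=10):
    """Illustrative iterative sign-gradient attack on a detection loss.

    `detection_loss` is a hypothetical callable: depending on the `targets`
    it is evaluated against, the same loop can push a detector toward
    vanished, fabricated, or mislabeled detections.
    """
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    x_adv = tf.identity(x)
    for _ in range(n_iter):
        with tf.GradientTape() as tape:
            tape.watch(x_adv)
            loss = detection_loss(model, x_adv, targets)
        grad = tape.gradient(loss, x_adv)
        # Untargeted step: ascend the loss; a targeted variant descends instead.
        x_adv = x_adv + eps_iter * tf.sign(grad)
        # Project back into the epsilon ball around x and the valid pixel range.
        x_adv = tf.clip_by_value(x_adv, x - eps, x + eps)
        x_adv = tf.clip_by_value(x_adv, 0.0, 1.0)
    return x_adv
```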

Can't find model.bbox_transform and nets.vgg16 #24

Open shiyuf1004 opened 1 year ago

shiyuf1004 commented 1 year ago

For demo_frcnn.ipynb, in FRCNN.py, the following imports can't be found:

from model.bbox_transform import clip_boxes, bbox_transform_inv
from nets.vgg16 import vgg16
from model.config import cfg

eruditus-vir commented 1 year ago

did you manage to find this?
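The missing modules (model.bbox_transform, nets.vgg16, model.config) match the lib/ layout of the tf-faster-rcnn codebase that Faster R-CNN demos commonly build on. The following is a minimal sketch of one possible workaround, assuming a local checkout of that codebase; the path below is hypothetical and this is not a confirmed fix from the maintainers.

```python
import os
import sys

# Hypothetical path to a local tf-faster-rcnn checkout; adjust to your setup.
TF_FASTER_RCNN_LIB = os.path.expanduser("~/tf-faster-rcnn/lib")
sys.path.insert(0, TF_FASTER_RCNN_LIB)

# With lib/ on the path, the imports in FRCNN.py should resolve,
# provided the checkout actually contains these modules.
from model.bbox_transform import clip_boxes, bbox_transform_inv
from nets.vgg16 import vgg16
from model.config import cfg
```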