Open mehulrastogi opened 3 years ago
The following can be used to get useful stats for the tutorial:

```python
import numpy as np

# Apply the optimized perturbation to the original image
attack_image = obj.perturbation_image(attack_result.x, image)

# Model confidence before and after the attack
prior_probs = obj.model_predict(image)
predicted_probs = obj.model_predict(attack_image)

predicted_class = np.argmax(predicted_probs)
actual_class = original_label

# The attack succeeds if the prediction no longer matches the true label
success = predicted_class != actual_class

# Drop in confidence for the true class
cdiff = prior_probs[actual_class] - predicted_probs[actual_class]
```
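For the tutorial it might be convenient to bundle these into a single helper. The sketch below is only an illustration built on the snippet above; `attack_stats` is a hypothetical name, and `obj` is assumed to expose `perturbation_image` and `model_predict` as used there.

```python
import numpy as np

def attack_stats(obj, attack_result, image, original_label):
    """Hypothetical helper: summarize a single attack for the tutorial."""
    attack_image = obj.perturbation_image(attack_result.x, image)
    prior_probs = obj.model_predict(image)
    predicted_probs = obj.model_predict(attack_image)
    predicted_class = int(np.argmax(predicted_probs))
    return {
        "predicted_class": predicted_class,
        "actual_class": original_label,
        "success": predicted_class != original_label,
        "confidence_drop": prior_probs[original_label] - predicted_probs[original_label],
    }
```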
Hey, I would like to work on this
Great @Shreyas-Bhat, you can take it up. Comment on #78 and #79 too so we can assign those to you.
Noting down the list of tasks to be completed for the tutorial. The implementation for the attack is in #65.
Paper Link
For the sake of simplicity, we are focusing on one model and one dataset for now: VGG16 and CIFAR-10 (the standard CIFAR-10, which is less noisy than the Kaggle CIFAR-10).
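As a rough starting point for that setup, here is a minimal loading sketch, assuming a PyTorch/torchvision workflow (the tutorial's actual framework, preprocessing, and fine-tuning procedure may differ):

```python
import torch
import torchvision
from torchvision import transforms

# Standard CIFAR-10 test split from torchvision (not the Kaggle variant)
transform = transforms.Compose([
    transforms.Resize(224),  # VGG16 expects 224x224 inputs
    transforms.ToTensor(),
])
testset = torchvision.datasets.CIFAR10(root="./data", train=False,
                                        download=True, transform=transform)
loader = torch.utils.data.DataLoader(testset, batch_size=32, shuffle=False)

# Pretrained VGG16; the final classifier layer has to be adapted
# (and fine-tuned) for CIFAR-10's 10 classes before running attacks.
model = torchvision.models.vgg16(pretrained=True)
model.classifier[6] = torch.nn.Linear(4096, 10)
model.eval()
```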