cg563 / simple-blackbox-attack

Code for ICML 2019 paper "Simple Black-box Adversarial Attacks"
MIT License

CIFAR-10 result on ResNet-18 #16

Closed Heimine closed 3 years ago

Heimine commented 3 years ago

Hi, thanks for the excellent work!

I have a minor doubt: I tried your method on CIFAR-10 with ResNet-18 (the default implementation), using the default hyperparameters in your code with the pixel attack and epsilon = 0.2. I got an average of 1316 queries, and the attack success rate is only about 80% after all 3072 iterations have been used. Is this normal, or did I miss something critical?

Thanks in advance.

cg563 commented 3 years ago

What you're getting seems unusual to me. Can you post the arguments you used? Also a log of the program prints would be very helpful. Thanks!

Heimine commented 3 years ago

Thanks for answering! Here are the arguments I used:

device=None, dataset='cifar10', num_runs=1000, batch_size=500, num_iters=10000, 
log_every=200, epsilon=0.2, linf_bound=0.0, freq_dims=32, order='rand', stride=7, targeted=False, 
pixel_attack=True

And here is the log of the program:

utils/simba_util.py:99: UserWarning: volatile was removed and now has no effect. Use `with torch.no_grad():` instead.
  input_var = torch.autograd.Variable(input.cuda(), volatile=True)
simba_attack.py:214: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
  preds[idx], _ = utils.get_preds(model, images[idx], args.dataset, batch_size=batch_size)
simba_attack.py:38: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
  probs = torch.index_select(torch.nn.Softmax()(output).data, 1, y)
Iteration 200: queries = 176.6720, prob = 0.5641, remaining = 0.2500
Iteration 400: queries = 270.1660, prob = 0.5538, remaining = 0.2300
Iteration 600: queries = 359.2860, prob = 0.5457, remaining = 0.2160
Iteration 800: queries = 444.7180, prob = 0.5422, remaining = 0.2120
Iteration 1000: queries = 527.6720, prob = 0.5390, remaining = 0.2040
Iteration 1200: queries = 607.7640, prob = 0.5359, remaining = 0.1980
Iteration 1400: queries = 685.4320, prob = 0.5318, remaining = 0.1900
Iteration 1600: queries = 760.8820, prob = 0.5303, remaining = 0.1880
Iteration 1800: queries = 835.2080, prob = 0.5279, remaining = 0.1840
Iteration 2000: queries = 908.2820, prob = 0.5266, remaining = 0.1820
Iteration 2200: queries = 981.0820, prob = 0.5266, remaining = 0.1820
Iteration 2400: queries = 1053.8800, prob = 0.5266, remaining = 0.1820
Iteration 2600: queries = 1126.6801, prob = 0.5266, remaining = 0.1820
Iteration 2800: queries = 1199.4800, prob = 0.5266, remaining = 0.1820
Iteration 3000: queries = 1272.2800, prob = 0.5266, remaining = 0.1820
Iteration 3072: queries = 1298.4880, prob = 0.5266, remaining = 0.1820
Iteration 200: queries = 182.2260, prob = 0.5722, remaining = 0.2780
Iteration 400: queries = 280.3240, prob = 0.5533, remaining = 0.2360
Iteration 600: queries = 370.3400, prob = 0.5433, remaining = 0.2180
Iteration 800: queries = 455.8560, prob = 0.5407, remaining = 0.2120
Iteration 1000: queries = 538.4780, prob = 0.5357, remaining = 0.2040
Iteration 1200: queries = 619.6680, prob = 0.5346, remaining = 0.2020
Iteration 1400: queries = 699.1080, prob = 0.5325, remaining = 0.1980
Iteration 1600: queries = 776.8140, prob = 0.5289, remaining = 0.1920
Iteration 1800: queries = 852.8600, prob = 0.5279, remaining = 0.1900
Iteration 2000: queries = 928.1500, prob = 0.5268, remaining = 0.1880
Iteration 2200: queries = 1003.3500, prob = 0.5268, remaining = 0.1880
Iteration 2400: queries = 1078.5500, prob = 0.5268, remaining = 0.1880
Iteration 2600: queries = 1153.4800, prob = 0.5258, remaining = 0.1860
Iteration 2800: queries = 1227.8020, prob = 0.5257, remaining = 0.1860
Iteration 3000: queries = 1301.4940, prob = 0.5244, remaining = 0.1840
Iteration 3072: queries = 1327.9900, prob = 0.5244, remaining = 0.1840

Please let me know if you need other information as well. Thanks a lot!
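As an aside, the two deprecation warnings at the top of the log are harmless but easy to silence. A minimal sketch of the modern equivalents (the function and variable names here are placeholders, not the repo's actual code):

```python
import torch
import torch.nn as nn

def get_probs(model, x, y):
    """Return the softmax probabilities of classes y for a batch x."""
    # `volatile=True` was removed in PyTorch 0.4; torch.no_grad() is the
    # replacement for gradient-free inference.
    with torch.no_grad():
        output = model(x)
    # Softmax now requires an explicit dim; dim=1 is the class dimension.
    probs = nn.Softmax(dim=1)(output)
    return torch.index_select(probs, 1, y)
```

Neither change affects the attack's behavior, only the warning output.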

cg563 commented 3 years ago

I don't see anything obvious in the log... Before diving deeper, I have a few more questions:

Heimine commented 3 years ago

Here are answers to the questions you mentioned:

  1. I only use correctly predicted examples.
  2. I load the CIFAR-10 dataset first and do the normalization with the normalize() function in run_simba.py, which calls apply_normalization() in utils.
  3. I printed the first few iterations and checked that the starting probability is near 1; here's a snippet of the log:
    Iteration 1: queries = 1.6560, prob = 0.9832, remaining = 1.0000
    Iteration 2: queries = 3.3460, prob = 0.9812, remaining = 0.9930
    Iteration 3: queries = 5.0180, prob = 0.9797, remaining = 0.9890
    Iteration 4: queries = 6.6900, prob = 0.9768, remaining = 0.9890
    Iteration 5: queries = 8.3720, prob = 0.9753, remaining = 0.9880
    Iteration 6: queries = 10.0450, prob = 0.9733, remaining = 0.9850
    Iteration 7: queries = 11.6870, prob = 0.9714, remaining = 0.9800
    Iteration 8: queries = 13.3600, prob = 0.9698, remaining = 0.9760
    Iteration 9: queries = 15.0120, prob = 0.9673, remaining = 0.9740
    Iteration 10: queries = 16.6800, prob = 0.9665, remaining = 0.9690
    Iteration 11: queries = 18.3090, prob = 0.9628, remaining = 0.9670
    Iteration 12: queries = 19.9320, prob = 0.9593, remaining = 0.9630
    Iteration 13: queries = 21.5380, prob = 0.9555, remaining = 0.9620
    Iteration 14: queries = 23.1610, prob = 0.9543, remaining = 0.9610
    Iteration 15: queries = 24.7990, prob = 0.9524, remaining = 0.9590
    Iteration 16: queries = 26.3880, prob = 0.9456, remaining = 0.9550
    Iteration 17: queries = 27.9800, prob = 0.9385, remaining = 0.9460
    Iteration 18: queries = 29.5530, prob = 0.9355, remaining = 0.9320
    Iteration 19: queries = 31.1080, prob = 0.9300, remaining = 0.9300
    Iteration 20: queries = 32.7010, prob = 0.9280, remaining = 0.9230
  4. Yes, I tried changing epsilon to 0.5, which brings the average queries down to 320 at a 90% success rate.

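For reference, the per-channel normalization described in item 2 can be sketched as below. The mean/std values are the commonly used CIFAR-10 statistics; the repo's utils may use slightly different numbers, so treat them as illustrative:

```python
import torch

# Commonly cited CIFAR-10 per-channel statistics (illustrative values).
CIFAR10_MEAN = [0.4914, 0.4822, 0.4465]
CIFAR10_STD = [0.2471, 0.2435, 0.2616]

def apply_normalization(images, mean=CIFAR10_MEAN, std=CIFAR10_STD):
    """Normalize a batch of images (N, 3, H, W) with values in [0, 1]."""
    mean_t = torch.tensor(mean).view(1, 3, 1, 1)
    std_t = torch.tensor(std).view(1, 3, 1, 1)
    return (images - mean_t) / std_t
```

The key point for the attack is that perturbations are applied in the unnormalized [0, 1] pixel space and normalization happens only before the forward pass.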
cg563 commented 3 years ago

Sorry for the late response.

I think I managed to reproduce your problem. For CIFAR-10, it is easier for SimBA to do a targeted attack than an untargeted one. Can you check whether this is the case for you as well?

If the targeted attack works, you can simulate an untargeted attack by always attacking a random target. The caveat is that you would need access to the probability of the target class rather than the correct class, but in the case of CIFAR-10 this is okay.
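A minimal sketch of the random-target sampling step this suggests (the targeted attack routine itself is the repo's, so only the label sampling is shown; names are placeholders):

```python
import torch

def random_targets(labels, num_classes=10):
    """For each true label, pick a uniformly random *different* class."""
    # Offsets in [1, num_classes - 1] guarantee the target differs from
    # the true label after the modular shift.
    offsets = torch.randint(1, num_classes, labels.shape)
    return (labels + offsets) % num_classes
```

The targeted attack would then maximize the probability of the sampled target class instead of minimizing the true-class probability.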