eth-sri / mn-bab

[ICLR 2022] Complete Verification via Multi-Neuron Relaxation Guided Branch-and-Bound
https://www.sri.inf.ethz.ch/publications/ferrari2022complete

Confusion of normalization setup for VNNCOMP networks #4

Open JacksonZyy opened 1 year ago

JacksonZyy commented 1 year ago

Hi,

I am reading your documentation and config files to try out MN-BAB. I noticed that you set up normalization as follows in cifar10_conv_small.json, which seems reasonable, as the original ONNX model from ERAN indeed includes the Sub and Div layers.

  "normalization_means": [
    0.4914,
    0.4822,
    0.4465
  ],

But I wonder why you set up the same normalization in configs/cifar10_resnet_3b2_bn.json. I noticed that this network is from VNNCOMP and does not contain normalization layers such as Sub/Div. Isn't it wrong to add this normalization, since it changes the computation? I would be grateful if you could provide some insight into this problem. Thank you!
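To make the concern concrete, here is a minimal sketch (not MN-BAB code; the std values are assumed, typical CIFAR-10 statistics, and only the means come from the config above) of why applying normalization in the verifier matters when the ONNX graph has no Sub/Div layers of its own:

```python
import numpy as np

# Means taken from the config snippet above; stds are an assumption
# (commonly used CIFAR-10 values), shown only for illustration.
means = np.array([0.4914, 0.4822, 0.4465])
stds = np.array([0.2023, 0.1994, 0.2010])

# A toy channels-first "image" with all pixels at 0.5 in [0, 1].
x = np.full((3, 2, 2), 0.5)

# If the verifier normalizes, the network sees a shifted/scaled input:
x_norm = (x - means[:, None, None]) / stds[:, None, None]

# For a model whose ONNX graph already contains Sub/Div, this matches
# the intended preprocessing. For a model without those layers (as in
# the VNNCOMP ResNet mentioned above), the verified function differs
# from the raw network applied to x.
print(x_norm[:, 0, 0])
```

So the question reduces to whether the VNNCOMP property specifications for that ResNet are stated over raw inputs or over pre-normalized inputs; adding the Sub/Div step in the verifier is only correct in the latter case.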