aknirala closed this issue 1 month ago
Hi and thank you for your interest in the work!
The tools references are linked to the OVAL branch-and-bound framework (https://github.com/oval-group/oval-bab).
As mentioned in the README, we used the framework to verify the trained models.
In order to install it, please follow the instructions at: https://github.com/oval-group/oval-bab?tab=readme-ov-file#installation
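The linked README is the authoritative source, but the installation can be sketched roughly as follows (a non-authoritative summary assuming a standard pip-installable repository; exact steps, environment setup, and submodule handling may differ — check the OVAL-BaB README):

```shell
# Sketch only: consult the oval-bab README for the exact, up-to-date steps.
git clone https://github.com/oval-group/oval-bab.git
cd oval-bab
# Install the package into the current Python environment
pip install .
```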
Please note that this dependency is not required by the training process, and that trained networks can be alternatively verified through any network verifier supporting feedforward ReLU networks.
Thank you. How long would a typical verification run (via verify.py) take? Say for MNIST with 0.1 perturbation?
The verification time of a complete verifier depends heavily on the dataset, model, and perturbation radius. For the verification settings in the README, MNIST 0.1 should take a few hours; on the other hand, ImageNet64 at 1/255 can take 3-4 days. You can get an estimate of the overall verification time by running verification on a subset of the images, for instance by setting --end_idx 200.
Verification times can be significantly reduced (at the expense of the overall number of verified properties) by decreasing the timeout, for instance using --oval_bab_timeout 60, or by only using inexpensive incomplete verifiers (pass --crown_ver to verify.py). See Table 6 in the appendix of the paper for the verified accuracy corresponding to CROWN-IBP and IBP, for instance.
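Putting the flags from this thread together, the suggested invocations look roughly like this (the flag names --end_idx, --oval_bab_timeout, and --crown_ver come from the replies above; any other arguments verify.py requires are not shown and would need to be filled in from the README):

```shell
# Estimate total runtime by verifying only the first 200 images
python verify.py --end_idx 200

# Trade verified properties for speed: cap the per-property timeout at 60s
python verify.py --oval_bab_timeout 60

# Cheapest option: incomplete verification only (CROWN), no branch-and-bound
python verify.py --crown_ver
```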
It seems that the library uses a custom library named tools; in var_utils.py there are many references to it.
How can I get this library?