Verified-Intelligence / auto_LiRPA

auto_LiRPA: An Automatic Linear Relaxation based Perturbation Analysis Library for Neural Networks and General Computational Graphs
https://arxiv.org/pdf/2002.12920

Not all layers are storing upper and lower bounds. #46

Closed · haydn-jones closed this 1 year ago

haydn-jones commented 1 year ago

I'm interested in retrieving upper and lower bounds on the output of all layers, rather than only bounds on the last layer. I've noticed that when looping over the layers stored in a BoundedModule, a good number of them are not storing their bounds after I've called compute_bounds (i.e. they do not have a lower, upper, or interval attribute). Is there some way for me to retrieve bounds on all layers?

shizhouxing commented 1 year ago

Hi @haydn-jones, it is because not all the intermediate bounds are needed for bounding the last layer, so many of them may be skipped. If you want the intermediate bounds of a particular layer, you may set the final_node_name argument to a node other than the last layer: https://github.com/Verified-Intelligence/auto_LiRPA/blob/master/auto_LiRPA/bound_general.py#L1107
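
A minimal sketch of that call (assuming bmod is your BoundedModule, inp is your BoundedTensor, and '/input' stands in for a node name from your own graph):

# Bound an intermediate node directly by naming it as the final node.
# '/input' is a placeholder; substitute a node name printed for your graph.
lb, ub = bmod.compute_bounds(x=(inp,), method='CROWN', final_node_name='/input')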

haydn-jones commented 1 year ago

Ok, that makes sense. Suppose I have a deep network: I could run compute_bounds with final_node_name set appropriately N times, but that would be quite costly in terms of runtime. Is this the only way?

shizhouxing commented 1 year ago

What are the missing layers in your case? If it is a ReLU network, then only layers after the ReLU activations are missing, while the layers before the ReLU activations have available bounds (pre-activation bounds). In this case, you may just apply ReLU on the pre-activation bounds to get the post-activation bounds.
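
Since ReLU is elementwise and monotonically non-decreasing, this is a one-liner. A minimal sketch, assuming pre_lb and pre_ub are the pre-activation bounds read from the layer before the ReLU:

# Clamping at zero applies ReLU to both bounds; by monotonicity, the results
# are valid post-activation lower and upper bounds.
post_lb = pre_lb.clamp(min=0)
post_ub = pre_ub.clamp(min=0)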

haydn-jones commented 1 year ago

I seem to have identified part of the issue. When the method is CROWN (and CROWN-Optimized, and perhaps others), I don't get bounds on every layer, but when it is CROWN-IBP I do. I've attached some code to reproduce the issue:

import torch
import torch.nn as nn
from auto_LiRPA import BoundedModule, BoundedTensor, PerturbationLpNorm
import numpy as np

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(2, 8),
            nn.ReLU(),
            nn.Linear(8, 16),
            nn.ReLU(),
            nn.Linear(16, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, x):
        return self.layers(x)

def print_bounds(module):
    # Walk every node in the BoundedModule's computational graph.
    for name, mod in module._modules.items():
        op = type(mod).__name__
        # Skip nodes that merely wrap buffers, parameters, or inputs.
        if op in ('BoundBuffers', 'BoundParams', 'BoundInput'):
            continue
        print("##########################")
        print(name)
        print(op)
        # Report which bound attributes compute_bounds() populated.
        if hasattr(mod, 'lower'):
            print("Has Lower")
        if hasattr(mod, 'upper'):
            print("Has Upper")
        if hasattr(mod, 'interval'):
            print("Has Interval")

net = Net()

x = torch.randn(1, 2)

# With CROWN, several intermediate nodes end up without bounds.
bmod = BoundedModule(net, x)
inp = BoundedTensor(x, ptb=PerturbationLpNorm(norm=np.inf, eps=0.1))
lb, ub = bmod.compute_bounds(x=(inp,), method='CROWN')

print_bounds(bmod)
print("+++++++++++++++++++++++++++++++++++++++")

# With CROWN-IBP, every node gets intermediate (IBP) bounds.
bmod = BoundedModule(net, x)
inp = BoundedTensor(x, ptb=PerturbationLpNorm(norm=np.inf, eps=0.1))
lb, ub = bmod.compute_bounds(x=(inp,), method='CROWN-IBP')
print_bounds(bmod)

In this example, the first two linear layers don't have bounds under CROWN but under CROWN-IBP they do.

haydn-jones commented 1 year ago

I got myself mixed up above by looking at multiple models, sorry about that. I've attached an ONNX model and a script to show the issue. For example, the first two linear layers do not have bounds under CROWN, but they do under CROWN-IBP. files.zip

shizhouxing commented 1 year ago

Hi @haydn-jones , this is what I saw:

[screenshot of the script's printed output: the linear nodes have bounds, while the ReLU nodes do not]

It is normal that the ReLU nodes do not have bounds: they are not needed for bounding the output of the model, so their bounds are not computed by default to save computation.

Their bounds can be obtained by applying ReLU to the bounds of the corresponding previous layer. For example, the bounds for "/10" follow from applying ReLU to the bounds of "/input".
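
A minimal sketch of that lookup (assuming the node names "/input" and "/10" match what your graph prints, and using the internal _modules dict the same way your print_bounds script does):

import torch

pre = bmod._modules['/input']      # node feeding the ReLU
lb_10 = torch.relu(pre.lower)      # valid lower bound for node "/10"
ub_10 = torch.relu(pre.upper)      # valid upper bound for node "/10"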

It is not recommended to use CROWN-IBP here unless your model has been trained with CROWN-IBP or IBP. CROWN-IBP simply computes the intermediate bounds using IBP, which generally gives looser bounds.

haydn-jones commented 1 year ago

Right, that's what I get when I run CROWN-IBP: all layers have bounds. However, if you scroll up, the script has also printed the nodes from a plain CROWN run, which gives results like this:

[screenshot of the CROWN output, where several nodes have no bound attributes]

Linear nodes /29, /30, /33, and /36 (among others) don't have bounds.

Ideally I'd like to be using CROWN-Optimized to get tighter bounds, but I'm currently not able to retrieve bounds for every layer with it.

shizhouxing commented 1 year ago

I see. I was looking at the model defined in PyTorch. For the ONNX model, the problem is that it has three linear layers before the first ReLU. In this case, the bounds of the first two linear layers are not needed by CROWN either. Is there a particular reason to use multiple consecutive linear layers? Otherwise, I think they can be merged into one layer, which is mathematically equivalent (see the sketch below).

[screenshot of the ONNX graph, showing three consecutive linear (Gemm) nodes before the first ReLU]
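
For reference, composing two affine layers with no activation in between is itself affine, so the merge is exact. A minimal sketch of that merge (my own illustration, assuming two consecutive nn.Linear layers that both have biases):

import torch
import torch.nn as nn

def merge_linear(l1: nn.Linear, l2: nn.Linear) -> nn.Linear:
    # l2(l1(x)) = W2 @ (W1 @ x + b1) + b2 = (W2 @ W1) @ x + (W2 @ b1 + b2)
    merged = nn.Linear(l1.in_features, l2.out_features)
    with torch.no_grad():
        merged.weight.copy_(l2.weight @ l1.weight)
        merged.bias.copy_(l2.weight @ l1.bias + l2.bias)
    return merged
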
shizhouxing commented 1 year ago

@haydn-jones If you really want their bounds, you may use the way I mentioned in my first reply.
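
A sketch of that per-node loop (my own illustration, reusing the node filtering from your print_bounds script; note that each call repeats a full backward bound pass, so this is expensive for deep networks):

for name, node in bmod._modules.items():
    # Skip wrapper nodes, as in print_bounds above.
    if type(node).__name__ in ('BoundBuffers', 'BoundParams', 'BoundInput'):
        continue
    # Treat this node as the final node to force its bounds to be computed.
    node_lb, node_ub = bmod.compute_bounds(x=(inp,), method='CROWN',
                                           final_node_name=name)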

haydn-jones commented 1 year ago

Alright, sounds good. Thanks for the help!