Verified-Intelligence / auto_LiRPA

auto_LiRPA: An Automatic Linear Relaxation based Perturbation Analysis Library for Neural Networks and General Computational Graphs
https://arxiv.org/pdf/2002.12920

`nan` CROWN bounds when clamping #63

Open cherrywoods opened 9 months ago

cherrywoods commented 9 months ago

Describe the bug
I am trying to bound a clamp operation. Using torch.clamp directly produces an error stating that Cast is an unsupported operation, so I replaced it with torch.minimum(torch.maximum(x, mins), maxs). This no longer reports any unsupported operations, but compute_bounds with method="CROWN" yields nan bounds everywhere, while IBP produces the correct bounds.

To Reproduce

import torch
from torch import nn
from auto_LiRPA import BoundedModule, BoundedTensor, PerturbationLpNorm

class Test(nn.Module):
    def __init__(self):
        super().__init__()
        x = nn.Parameter(0.5 * torch.ones(1, 4))
        y = nn.Parameter(0.75 * torch.ones(1, 4))
        self.register_buffer("x", x)
        self.register_buffer("y", y)
    def forward(self, z):
        return torch.minimum(torch.maximum(z, self.x), self.y)

module = BoundedModule(Test(), torch.empty(1, 4))
ptb = PerturbationLpNorm(x_L=torch.zeros(1, 4), x_U=torch.ones(1, 4))
t = BoundedTensor(torch.zeros(1, 4), ptb)
bounds = module.compute_bounds(x=(t,), method="ibp")  # produces the correct bounds
print(bounds)
# (tensor([[0.5000, 0.5000, 0.5000, 0.5000]], grad_fn=<MinimumBackward0>), tensor([[0.7500, 0.7500, 0.7500, 0.7500]], grad_fn=<MinimumBackward0>))
bounds = module.compute_bounds(x=(t,), method="CROWN")  # produces nan
print(bounds)
# (tensor([[nan, nan, nan, nan]], grad_fn=<ViewBackward0>), tensor([[nan, nan, nan, nan]], grad_fn=<ViewBackward0>))

System configuration:
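One possible workaround (a sketch, not from this thread) is to express a clamp with constant bounds through ReLUs, which are well supported by both IBP and CROWN: for lo <= hi elementwise, clamp(z, lo, hi) = lo + relu(z - lo) - relu(z - hi). The module below (hypothetical, named `ClampViaReLU` here) checks this identity against torch.clamp in plain PyTorch; wrapping it in BoundedModule would follow the same pattern as the reproduction script above.

```python
import torch
from torch import nn

class ClampViaReLU(nn.Module):
    # Hypothetical workaround module:
    # clamp(z, lo, hi) = lo + relu(z - lo) - relu(z - hi),
    # valid whenever lo <= hi elementwise.
    def __init__(self, lo, hi):
        super().__init__()
        self.register_buffer("lo", lo)
        self.register_buffer("hi", hi)

    def forward(self, z):
        return self.lo + torch.relu(z - self.lo) - torch.relu(z - self.hi)

lo = 0.5 * torch.ones(1, 4)
hi = 0.75 * torch.ones(1, 4)
m = ClampViaReLU(lo, hi)

# Inputs below, inside, and above the clamp range.
z = torch.tensor([[0.0, 0.6, 1.0, 0.75]])
print(m(z))
print(torch.clamp(z, 0.5, 0.75))  # should match the line above
```

Whether CROWN then gives tight bounds for this composition is a separate question, but it avoids the unsupported Cast operation and the nan results seen with minimum/maximum.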