aleximmer / Laplace

Laplace approximations for Deep Learning.
https://aleximmer.github.io/Laplace
MIT License

How can I use pretrained models with inplace ReLUs? #139

Closed arunpatro closed 4 months ago

arunpatro commented 7 months ago
import torch
import torchvision

from laplace import Laplace
from laplace.curvature import AsdlGGN
from backpack import extend

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Pre-trained model, extended for BackPACK
net = torchvision.models.resnet18(pretrained=True).to(device)
net = extend(net)
model = net

# User-specified LA flavor
la = Laplace(model, 'classification',
             subset_of_weights='all',
             hessian_structure='lowrank',
             # backend=AsdlGGN,
             )

la.fit(trainloader)
# la.optimize_prior_precision(method='CV', val_loader=valloader)

The error I get:

...
File .../torch/nn/functional.py, in relu
-> 1469     result = torch.relu_(input)
   1470 else:
   1471     result = torch.relu(input)

RuntimeError: Output 0 of BackwardHookFunctionBackward is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function.

This fails because I am not able to replace the inplace ReLUs in the pretrained model. How can I fix this?
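For context, the inplace flag can usually be switched off directly on the ReLU modules of a pretrained network; it is not clear from this thread whether that alone avoids the BackPACK hook error, so the snippet below is only a sketch of that workaround.

import torch.nn as nn

# Sketch: disable the inplace flag on every ReLU in the pretrained network.
# Whether this is enough to satisfy BackPACK's backward hooks here is not confirmed.
for module in net.modules():
    if isinstance(module, nn.ReLU):
        module.inplace = False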

aleximmer commented 7 months ago

Hi Arun, have you tried using the AsdlGGN backend yet?
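For reference, a minimal sketch of that suggestion, assuming torchvision, device, and trainloader from the snippet above and an otherwise unchanged setup: the ASDL backend does not go through backpack.extend, so the pretrained model can be passed as-is (whether it supports hessian_structure='lowrank' is not confirmed in this thread).

from laplace import Laplace
from laplace.curvature import AsdlGGN

# Plain pretrained model -- no backpack.extend(), so the inplace ReLUs stay untouched
net = torchvision.models.resnet18(pretrained=True).to(device)

la = Laplace(net, 'classification',
             subset_of_weights='all',
             hessian_structure='lowrank',  # assumed unchanged from the original snippet
             backend=AsdlGGN)
la.fit(trainloader)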