Closed NizarIslah closed 1 year ago
Hi,
thanks for reporting your issue. For BackPACK to work with second-order extensions, you should build your model as an nn.Sequential of nn.Modules. Note that you can use PyTorch's nn.Flatten instead of defining your own version.
I believe you can simply replace your model with the following code and it should work:
from torch import nn
model = nn.Sequential(
    nn.Conv2d(3, 6, 5),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(6, 16, 5),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120),
    nn.ReLU(),
    nn.Linear(120, 84),
    nn.ReLU(),
    nn.Linear(84, 10),
)
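As a sanity check of the flattened feature size (assuming 3×32×32 inputs, as in CIFAR-10; the input size is an assumption, it is not stated in the thread), the convolution/pooling stack leaves 16 channels of 5×5 maps, so nn.Flatten produces 16 * 5 * 5 = 400 features:

```python
import torch
from torch import nn

# Spatial sizes for a 3x32x32 input (no padding, stride 1 convolutions):
# 32 -> Conv2d(3, 6, 5)  -> 28 -> MaxPool2d(2) -> 14
# 14 -> Conv2d(6, 16, 5) -> 10 -> MaxPool2d(2) -> 5
# Flattened: 16 channels * 5 * 5 = 400 features.
features = nn.Sequential(
    nn.Conv2d(3, 6, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(6, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
)
x = torch.randn(1, 3, 32, 32)
out = features(x)
print(out.shape)  # torch.Size([1, 400])
```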
Cheers, Felix
It works, thanks
Hi,
Really confused about why I am getting this error. Here is the network I am trying to extend with BackPACK to compute individual gradients:
class Flatten(nn.Module):
    def forward(self, input):
        return input.view(input.size(0), -1)

class LeNet(nn.Module):
    def __init__(self):
        super(LeNet, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.relu = nn.ReLU()
        self.mp = nn.MaxPool2d(2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.flatten = Flatten()
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

I also get AttributeError: 'Parameter' object has no attribute 'grad_batch'. Any help would be appreciated, thanks.