Closed: imirzadeh closed this issue 4 years ago.
I tried with PyTorch 0.4.0 and I still get the error.
Apparently the problem is with the cross entropy loss, because I can run the tests without error when I use MSE loss instead.
Okay, my bad.
I had some modules in the model that were not used in the forward pass.
For example, in the class below I register nn.BatchNorm1d, but I never call it in forward(). However, when I call loss_grad = torch.autograd.grad(loss, model.parameters(), create_graph=True), model.parameters() still includes the batchnorm parameters, which never entered the computation graph.
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, hidden_layers, config):
        super(MLP, self).__init__()
        self.W1 = nn.Linear(784, hidden_layers[0])
        self.relu = nn.ReLU(inplace=True)
        # ..... (remaining layers elided)
        # Registered as a submodule with parameters, but never called in forward().
        self.batchnorm = nn.BatchNorm1d(hidden_layers[0])

    def forward(self, x):
        out = self.relu(self.W1(x))  # no batchnorm in the forward pass!
        return out
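For what it's worth, a minimal sketch of how one can work around this without deleting the unused module, assuming `model` and `loss` are already defined: torch.autograd.grad accepts allow_unused=True, which returns None for parameters that never entered the graph.

import torch

# Tolerate parameters (e.g. the unused batchnorm) that did not take part
# in the forward pass; their gradients come back as None.
loss_grad = torch.autograd.grad(
    loss, model.parameters(), create_graph=True, allow_unused=True
)

# Replace the None entries with zeros if downstream code expects one
# gradient tensor per parameter.
loss_grad = [
    g if g is not None else torch.zeros_like(p)
    for g, p in zip(loss_grad, model.parameters())
]

Removing the unused submodule from the model (as done above) is of course the cleaner fix.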
Hi,
I can't use this code to compute Hessian eigenvalues. Could it be because I'm using a newer version of PyTorch?
I'm on torch 1.4.0 with Python 3.6.9.
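In case it helps with debugging, here is a minimal sketch of the standard way to estimate the top Hessian eigenvalue in PyTorch: power iteration on Hessian-vector products built from torch.autograd.grad with create_graph=True. This is a generic illustration, not necessarily how this repository implements it; the function name and arguments are hypothetical.

import torch

def top_hessian_eigenvalue(loss, params, iters=20):
    """Estimate the largest Hessian eigenvalue of `loss` w.r.t. `params`
    using power iteration on Hessian-vector products."""
    params = list(params)
    # First-order gradients; create_graph=True lets us differentiate again.
    grads = torch.autograd.grad(loss, params, create_graph=True)

    # Random unit-norm starting vector with the same shapes as the parameters.
    v = [torch.randn_like(p) for p in params]
    norm = torch.sqrt(sum((u ** 2).sum() for u in v))
    v = [u / norm for u in v]

    eigenvalue = 0.0
    for _ in range(iters):
        # Hessian-vector product: d(grads . v) / d(params).
        hv = torch.autograd.grad(grads, params, grad_outputs=v, retain_graph=True)
        # Rayleigh quotient v^T H v as the current eigenvalue estimate.
        eigenvalue = sum((h * u).sum() for h, u in zip(hv, v)).item()
        # Normalize Hv to get the next iterate.
        norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
        v = [h / (norm + 1e-12) for h in hv]
    return eigenvalue

# Usage (assuming model, inputs, targets, criterion are defined):
# loss = criterion(model(inputs), targets)
# lambda_max = top_hessian_eigenvalue(loss, model.parameters())

If a plain script like this runs on torch 1.4.0 but the repository code does not, the failure is more likely a version-specific API change than a problem with the autograd machinery itself.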