ghost opened this issue 7 years ago
I think it happened because the default arguments of some reduction functions changed recently. Check all the "sum", "mean", and similar calls and set keepdim=True.
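For reference, here is a minimal standalone sketch (not code from this repo) of what that default change looks like and why keepdim=True restores the old shape:

```python
import torch

x = torch.randn(4, 3)

# Newer PyTorch versions drop the reduced dimension by default,
# so code that expects a (4, 1) column tensor breaks.
s_dropped = x.sum(1)                # shape: (4,)
s_kept = x.sum(1, keepdim=True)     # shape: (4, 1)

print(s_dropped.shape, s_kept.shape)
```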
Which PyTorch version are you using?
I tried setting keepdim=True with PyTorch 0.3, but it did not work.
I got the same error. My PyTorch version is 0.2.
Try this.
return torch.cat((x1.view(-1, 1), x2.view(-1, 1)), 1)
I solved this problem. In utils.py, change the return statement to return torch.cat((x1.view(-1, 1), x2.view(-1, 1)), 1), and in meta_optimizer.py update line 135 to inputs = Variable(torch.cat((preprocess_gradients(flat_grads), flat_params.data.unsqueeze(1), loss.unsqueeze(1)), 1)).
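To illustrate why the dimension argument has to sit outside the tuple of tensors, here is a small standalone sketch (not the repo's actual utils.py):

```python
import torch

x1 = torch.randn(5)
x2 = torch.randn(5)

# Reshape each tensor into a column vector, then concatenate along dim=1
# to get a (5, 2) tensor. Note where the tuple closes: the trailing 1 is
# the dim argument of torch.cat, not a member of the tuple.
out = torch.cat((x1.view(-1, 1), x2.view(-1, 1)), 1)
print(out.shape)  # torch.Size([5, 2])
```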
Hi, I am using PyTorch 0.2, and when I run the script it generates the following errors:
Is it because I am using a different version of PyTorch?