Falling back to previous checkpoint.
PSNR_masked 11.899794 PSNR 11.899794
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-16-f44ea9e7035f> in <module>()
50 # Run
51 p = get_params(OPT_OVER, net, net_input)
---> 52 optimize(OPTIMIZER, p, closure, LR=LR, num_iter=num_iter)
/nfs1/code/aniruddha/deep-image-prior/utils/common_utils.py in optimize(optimizer_type, parameters, closure, LR, num_iter)
227 for j in range(num_iter):
228 optimizer.zero_grad()
--> 229 closure()
230 optimizer.step()
231 else:
<ipython-input-16-f44ea9e7035f> in closure()
25
26 for new_param, net_param in zip(last_net, net.parameters()):
---> 27 net_param.copy_(new_param.cuda())
28
29 return total_loss*0
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
When the backtracking condition triggers (restoring the network parameters from the last checkpoint), it fails with the error above. I can reproduce this consistently.
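Recent PyTorch versions forbid in-place operations (such as `copy_`) on leaf tensors that require grad while autograd is tracking. A minimal sketch of the fix, using a stand-in `nn.Linear` in place of the actual network and a `last_net` list of saved parameters as in the closure from the traceback, is to wrap the restore loop in `torch.no_grad()` (the `.cuda()` call is dropped here so the snippet runs on CPU):

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 4)  # stand-in for the deep-image-prior network
# Saved checkpoint, analogous to `last_net` in the closure
last_net = [p.detach().clone() for p in net.parameters()]

# Perturb the parameters to simulate a bad optimization step
with torch.no_grad():
    for p in net.parameters():
        p.add_(1.0)

# Calling net_param.copy_(new_param) directly raises:
#   RuntimeError: a leaf Variable that requires grad has been used
#   in an in-place operation.
# Wrapping the copy in torch.no_grad() tells autograd not to track
# the in-place write, so the restore succeeds:
with torch.no_grad():
    for new_param, net_param in zip(last_net, net.parameters()):
        net_param.copy_(new_param)

# Parameters now match the checkpoint again
for new_param, net_param in zip(last_net, net.parameters()):
    assert torch.equal(net_param, new_param)
```

An equivalent workaround is `net_param.data.copy_(new_param)`, though `torch.no_grad()` is the form recommended by the PyTorch documentation.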