Closed — lon91ong closed this issue 4 years ago
Bug description: the `grad_fn` test code produces output in PyTorch 1.3.1 that does not match the book's explanation.
```
y = x + 2
...
print(y.grad_fn)  # output: None
```
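The reported behavior can be reproduced when the leaf tensor is created without gradient tracking. This is a minimal sketch (the tensor shape is an assumption, not taken from the book):

```python
import torch

# Assumed setup: if x is created WITHOUT requires_grad=True,
# autograd builds no graph, so derived tensors have no grad_fn.
x = torch.ones(2, 2)   # requires_grad defaults to False
y = x + 2
print(y.grad_fn)       # None: y is not tracked by autograd
```

By contrast, creating `x` with `requires_grad=True` gives `y` a `grad_fn` as the book describes.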
The first code block in Section 2.3.3 raises an error when run:
```
out.backward()  # equivalent to out.backward(torch.tensor(1.))
print(x.grad)
```
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-23-5b0e34440c47> in <module>
----> 1 out.backward() # equivalent to out.backward(torch.tensor(1.))
      2 print(x.grad)

/mnt/e/Programs/SageMath/local/lib/python3.7/site-packages/torch/tensor.py in backward(self, gradient, retain_graph, create_graph)
    164                 products. Defaults to ``False``.
    165         """
--> 166         torch.autograd.backward(self, gradient, retain_graph, create_graph)
    167
    168     def register_hook(self, hook):

/mnt/e/Programs/SageMath/local/lib/python3.7/site-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
     97     Variable._execution_engine.run_backward(
     98         tensors, grad_tensors, retain_graph, create_graph,
---> 99         allow_unreachable=True)  # allow_unreachable flag
    100
    101

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
```
Version info: pytorch: 1.3.1+cpu ...
After restarting the kernel, the error inexplicably went away!
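A kernel restart fixing it is consistent with `x` having been redefined without `requires_grad=True` in an earlier cell. Below is a hedged sketch of the full Section 2.3.3 example as it should run; the shape `(2, 2)` and the intermediate tensors `z` and `out` are assumptions matching the usual version of this example:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)  # leaf tensor with gradient tracking
y = x + 2
print(y.grad_fn)      # an AddBackward0 object, not None

z = y * y * 3
out = z.mean()
out.backward()        # equivalent to out.backward(torch.tensor(1.))
print(x.grad)         # d(out)/dx = 4.5 for every element
```

Here `out = mean(3 * (x + 2)**2)`, so each partial derivative is `6 * (x + 2) / 4 = 4.5` at `x = 1`, and no RuntimeError is raised because `x` requires grad.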