ezyang / pytorch-unattached

Tensors and Dynamic neural networks in Python with strong GPU acceleration
http://pytorch.org

Give a better error message when num_derivatives is too low #244

Open ezyang opened 7 years ago


At the moment, if you set `num_derivatives` lower than the number of derivatives the function actually computes, you get an opaque error like:

```
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test/test_jit.py", line 539, in test_mini_wlm
    z.sum().backward()
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/variable.py", line 158, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
RuntimeError: vector::_M_range_check
```
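The `vector::_M_range_check` message comes from an unguarded bounds-checked access in the C++ layer, so the user never learns that `num_derivatives` is the problem. A minimal sketch of the kind of up-front check that would produce a better message (the function and parameter names here are hypothetical, for illustration only, not the actual PyTorch internals):

```python
# Hypothetical sketch: validate the derivative index before indexing,
# so the user sees what went wrong instead of "vector::_M_range_check".
def get_derivative(derivatives, index, num_derivatives):
    # Fail with a message that names num_derivatives explicitly.
    if index >= num_derivatives or index >= len(derivatives):
        raise RuntimeError(
            "function computed derivative #%d, but num_derivatives "
            "was declared as %d; increase num_derivatives"
            % (index, num_derivatives)
        )
    return derivatives[index]
```

With a check like this, the traceback above would end in a message that points directly at the misconfigured `num_derivatives` instead of an internal `std::vector` range check.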