Xanthorapedia opened this issue 2 years ago
PS: For my hypothetical use case above, I was able to work around it with:
```python
def func2(x_, p_):
    return SE3.Identity(x_.shape[:-1]).retr(x_).act(p_).sum()

print(torch.autograd.functional.jacobian(func2, (x, p)))
```
Output:
```
(tensor([[1000., 1000., 1000., 0., 0., 0.]]), tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        ...,
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]]))
```
Hi, thanks for creating this wonderful library! ❤️ However, I ran into the following bug when using it with `torch.autograd`. For example, if I try to find the Jacobian w.r.t. a group parameter:
The problem seems to be that during the preprocessing stage of `torch.autograd`, the input data is cloned, but for a `LieGroupParameter` this only retrieves the tangent-space data. Somehow the forward function was happy with those one-dimension-short inputs and didn't throw an error until the backward call. The same error appears if I do:

It would be nice if you could help look into the above error. Also, it's probably a good idea for `LieGroup.__init__` or the op implementations to do some sanity checking on the input dimensions so that the problem is caught earlier. Otherwise, thank you for your hard work! 😃
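The suggested sanity check could look something like the following. This is a minimal pure-Python sketch, not the library's actual code: the class name `LieGroupBase`, the attribute `EMBEDDED_DIM`, and the 7-component SE3 storage layout (quaternion + translation) are all assumptions for illustration.

```python
# Hypothetical sketch of the dimension check suggested above; not the
# library's actual implementation. Assumes an SE3-like group whose
# elements are stored with 7 components (quaternion + translation).
class LieGroupBase:
    EMBEDDED_DIM = 7  # assumed storage size of one SE3 element

    def __init__(self, data):
        # data is expected to have shape (..., EMBEDDED_DIM); failing here
        # surfaces shape bugs at construction time instead of deep inside
        # a later backward() call
        if data.shape[-1] != self.EMBEDDED_DIM:
            raise ValueError(
                f"expected trailing dimension {self.EMBEDDED_DIM}, "
                f"got shape {tuple(data.shape)}"
            )
        self.data = data
```

With a check like this, the one-dimension-short tangent data (6 components instead of 7) produced by the cloning step would be rejected immediately with a clear message, rather than surfacing as a confusing error in the backward call.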