BoChenYS / BPnP

Back-propagatable PnP
MIT License

Tensor inversion RuntimeError and the possibility to use customized forward PnP solver #7

Closed · qiyan98 closed this issue 3 years ago

qiyan98 commented 4 years ago

Hi Bo CHEN,

Thanks for sharing the wonderful code. I have some questions regarding the BPnP implementations and I would appreciate it if you could give me a hand:

  1. torch.inverse() RuntimeError. I encounter this error when running demoPoseEst.py; the other two demo scripts work fine. The complete error message is:

    (bpnp) qiyan@qiyan-qdlpc:~/Documents/BPnP$ python demoPoseEst.py
    i:    0, loss:21164.001953125
    i:    1, loss:21162.431640625
    i:    2, loss:21160.376953125
    i:    3, loss:21158.195312500
    i:    4, loss:21156.015625000
    i:    5, loss:21153.923828125
    i:    6, loss:21151.771484375
    i:    7, loss:21149.613281250
    i:    8, loss:21147.451171875
    i:    9, loss:21145.251953125
    i:   10, loss:21143.001953125
    Traceback (most recent call last):
    File "demoPoseEst.py", line 67, in <module>
    loss.backward()
    File "/home/qiyan/anaconda3/envs/bpnp/lib/python3.7/site-packages/torch/tensor.py", line 195, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
    File "/home/qiyan/anaconda3/envs/bpnp/lib/python3.7/site-packages/torch/autograd/__init__.py", line 99, in backward
    allow_unreachable=True)  # allow_unreachable flag
    File "/home/qiyan/anaconda3/envs/bpnp/lib/python3.7/site-packages/torch/autograd/function.py", line 77, in apply
    return self._forward_cls.backward(self, *args)
    File "/home/qiyan/Documents/BPnP/BPnP.py", line 96, in backward
    inv_J_fy = torch.inverse(J_fy)
    RuntimeError: inverse_cuda: U(6,6) is zero, singular U.

    I tried using CPU instead of CUDA and the problem remains. BTW, as you suggested here, torch 1.4.0 and torchvision 0.5.0 are used. Accordingly, kornia 0.2.2 is installed for compatibility as shown here, rather than the latest version.

  2. Can we use another, customized PnP solver in the BPnP forward pass? https://github.com/BoChenYS/BPnP/blob/114f731dbf2b287f818b8afdce0ac479068fa4f1/BPnP.py#L20-L42 For the vanilla BPnP forward function, the output pose does not carry autograd information, so it seems fine to use alternative solvers for the 2D-3D correspondences. Do you think it is feasible to adapt BPnP to other PnP solvers by replacing lines 32 and 36? For example, solvers based on P3P/EPnP, which OpenCV also supports.

Many thanks! Have a nice day & weekend. Qi Yan

BoChenYS commented 3 years ago

Hi Qi, I couldn't reproduce your first error, so I'm not sure what went wrong. It seems that you get a singular Jacobian J_fy, which is highly unlikely. If it happens a lot, can you try adding small values to the diagonal of J_fy to see if that solves the problem, such as: J_fy = J_fy + 1e-12*torch.eye(m)
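The idea behind the suggested fix can be checked in isolation; a minimal sketch, using a deliberately rank-deficient stand-in for J_fy rather than the real Jacobian:

```python
import torch

# Deliberately singular 6x6 stand-in for J_fy (rank one).
m = 6
J_fy = torch.ones(m, m, dtype=torch.float64)

# torch.inverse(J_fy) would raise a RuntimeError here (on torch 1.4);
# a tiny diagonal perturbation makes the LU factorisation well defined.
J_reg = J_fy + 1e-12 * torch.eye(m, dtype=torch.float64)
inv_J = torch.inverse(J_reg)
```

The resulting inverse is of course badly conditioned, but the backward pass can proceed instead of crashing.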

For the second question, the answer is yes. The PnP solver can be replaced with other solvers in the forward pass, though the effect will depend on your application and the solver.

Cheers Bo

qiyan98 commented 3 years ago

Hi Bo,

Thanks for your helpful feedback! For the first one, unfortunately adding small values does not solve it. The issue can be avoided by using NumPy matrix inversion instead: inv_J_fy = torch.from_numpy(np.linalg.inv(J_fy.cpu().numpy())).to(device). This might not be an elegant solution, though.
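Another option worth trying (my own suggestion, not something tested in the repo) is the Moore-Penrose pseudo-inverse, which stays on-device, agrees with torch.inverse for non-singular inputs, and remains finite for singular ones because it is SVD-based:

```python
import torch

# Hypothetical drop-in for `inv_J_fy = torch.inverse(J_fy)` in BPnP.py:
# torch.pinverse never raises on singular input.
J_fy = torch.ones(6, 6, dtype=torch.float64)  # singular stand-in
inv_J_fy = torch.pinverse(J_fy)
```

Whether the pseudo-inverse gives meaningful gradients when J_fy is truly singular is a separate question, but it avoids the hard crash.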

For the second one, could you please elaborate a bit? In my understanding, it works because of the stationarity condition required by Eq. (13): as long as a local minimum is obtained, we can freely use alternative PnP solvers.

Many thanks, Qi Yan

BoChenYS commented 3 years ago

Hi Qi,

Yes, you are right. I should have added that the objective of the PnP solver is used in the backward pass to construct the f function. Currently, backward assumes that the objective is the sum of squared reprojection residuals, which I think is true for most PnP solvers. But if your PnP solver specifically optimises a different objective, then you might want to watch out for the gradients.
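To make that concrete, here is a sketch (my own illustration, not the repo's code) of the kind of f objective described above: the sum of squared reprojection residuals over 2D points, an angle-axis pose, 3D points, and intrinsics K.

```python
import torch

# Sketch of the objective assumed by the backward pass: sum of squared
# reprojection residuals. pts2d = 2D points, (rvec, tvec) = pose in
# angle-axis form, pts3d = 3D points, K = camera intrinsics.
def reprojection_objective(pts2d, rvec, tvec, pts3d, K):
    # Rodrigues formula: rotate each 3D point by the angle-axis vector.
    theta = torch.norm(rvec)
    k = rvec / theta
    rot = (pts3d * torch.cos(theta)
           + torch.cross(k.expand_as(pts3d), pts3d, dim=1) * torch.sin(theta)
           + k * (pts3d @ k.view(3, 1)) * (1 - torch.cos(theta)))
    cam = rot + tvec                    # rigid transform into camera frame
    proj = cam @ K.t()                  # apply intrinsics
    proj = proj[:, :2] / proj[:, 2:3]   # perspective divide
    return ((proj - pts2d) ** 2).sum()
```

At a stationary point of this f, the implicit function theorem yields the pose gradients; a solver that minimises a different cost (say, an algebraic EPnP objective) would correspond to a different f.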

Hope that helps.

Bo

qiyan98 commented 3 years ago

Thanks for your message!