magicleap / SuperGluePretrainedNetwork

SuperGlue: Learning Feature Matching with Graph Neural Networks (CVPR 2020, Oral)

Deriving permutation matrix from Sinkhorn results #131

Closed henokyen closed 1 year ago

henokyen commented 1 year ago

Hello,

One of the authors of the PolyWorld paper (https://arxiv.org/pdf/2111.15491.pdf) suggested that I adapt your Sinkhorn implementation. I have a couple of questions:

  1. That paper states that a permutation matrix is computed by running 100 Sinkhorn iterations; the permutation matrix encodes the fact that each vertex is connected to exactly one other vertex. I ran your Sinkhorn implementation (lines 143 & 152 in superglue.py), but the output is not a permutation matrix. How do I derive a permutation matrix from the Sinkhorn results?

  2. I suspect that indices0 and indices1 are the key to building a permutation matrix. What exactly do indices0 and indices1 represent? (See the sketch after this list.)

     ```python
     indices0 = torch.where(valid0, indices0, indices0.new_tensor(-1))
     indices1 = torch.where(valid1, indices1, indices1.new_tensor(-1))
     ```
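For context, these indices come out of a mutual-argmax step applied to the Sinkhorn output in SuperGlue's forward pass. Below is a minimal sketch of that idea (not the repo's exact code): it assumes `scores` is the (B, M+1, N+1) log-assignment matrix returned by `log_optimal_transport`, with the last row and column acting as the dustbin, and the name `hard_matches` is just illustrative.

```python
import torch

def hard_matches(scores: torch.Tensor):
    """Turn a (B, M+1, N+1) log-assignment matrix from Sinkhorn into hard matches.

    The last row/column is the dustbin; points assigned to it are unmatched.
    Keeping only mutual argmax pairs guarantees each point gets at most one
    partner, which is what makes the result permutation-like.
    """
    inner = scores[:, :-1, :-1]                      # drop the dustbin row/column
    max0, max1 = inner.max(2), inner.max(1)          # best column per row, best row per column
    indices0, indices1 = max0.indices, max1.indices  # shapes (B, M) and (B, N)

    # keep a pair (i, j) only if i picks j AND j picks i
    m, n = indices0.shape[1], indices1.shape[1]
    mutual0 = torch.arange(m, device=scores.device)[None] == indices1.gather(1, indices0)
    mutual1 = torch.arange(n, device=scores.device)[None] == indices0.gather(1, indices1)

    # -1 marks an unmatched point, exactly as in the two lines quoted above
    indices0 = torch.where(mutual0, indices0, indices0.new_tensor(-1))
    indices1 = torch.where(mutual1, indices1, indices1.new_tensor(-1))
    return indices0, indices1
```

So indices0[b, i] is the index of the point in the second image matched to point i of the first image (or -1 if unmatched), and indices1 is the same mapping in the other direction.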

henokyen commented 1 year ago

I figured it out
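For anyone who lands on this issue later: one possible way to turn the matched indices into an explicit 0/1 permutation-like matrix (not necessarily what was done here) is to scatter ones at the valid matches. `to_permutation_matrix` below is a hypothetical helper; `indices0` is assumed to be the (B, M) tensor of matches with -1 for unmatched points, and `n` the number of points in the second image.

```python
import torch

def to_permutation_matrix(indices0: torch.Tensor, n: int) -> torch.Tensor:
    """Build a (B, M, N) binary matrix with P[b, i, j] = 1 iff point i matched point j.

    Rows with index -1 (unmatched) stay all-zero, so every row and every
    column contains at most a single one, i.e. a partial permutation matrix.
    """
    b, m = indices0.shape
    perm = indices0.new_zeros((b, m, n), dtype=torch.float32)
    valid = indices0 > -1
    batch_idx, row_idx = valid.nonzero(as_tuple=True)
    perm[batch_idx, row_idx, indices0[batch_idx, row_idx]] = 1.0
    return perm
```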

lansfair commented 1 year ago

Have you managed to reproduce the PolyWorld code?