Hi, nice work! DiffMatch warps the source image robustly with dense correspondences even under corruption. I am trying to use it for relative pose estimation and visual localization, so I would like to extract high-quality sparse correspondences. Other dense matchers such as PDC-Net and DKM provide uncertainty estimates for selecting reliable sparse matches, but I could not find a similar mechanism in DiffMatch. How can I extract accurate sparse correspondences from the predicted dense warp? Thanks a lot.
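For reference, here is the kind of post-processing I had in mind, in case it helps frame the question: a minimal NumPy sketch that samples source points on a grid and keeps only matches passing a forward-backward cycle-consistency check. The function names and the assumption that the dense output can be read as an H x W x 2 flow field are mine, not from the DiffMatch repo.

```python
import numpy as np

def warp_flow(flow_bwd, pts):
    """Bilinearly sample the backward flow field at continuous (x, y) points."""
    h, w = flow_bwd.shape[:2]
    x, y = pts[:, 0], pts[:, 1]
    x0 = np.clip(np.floor(x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, h - 2)
    dx, dy = x - x0, y - y0
    return (flow_bwd[y0, x0] * ((1 - dx) * (1 - dy))[:, None]
            + flow_bwd[y0, x0 + 1] * (dx * (1 - dy))[:, None]
            + flow_bwd[y0 + 1, x0] * ((1 - dx) * dy)[:, None]
            + flow_bwd[y0 + 1, x0 + 1] * (dx * dy)[:, None])

def sparse_from_dense(flow_fwd, flow_bwd, max_err=1.0, stride=8):
    """Sample a grid of source points, map them through the forward flow,
    and keep matches whose forward-backward cycle error is below max_err."""
    h, w = flow_fwd.shape[:2]
    ys, xs = np.mgrid[0:h:stride, 0:w:stride]
    src = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    tgt = src + flow_fwd[src[:, 1].astype(int), src[:, 0].astype(int)]
    # Drop points warped outside the target image.
    in_bounds = ((tgt[:, 0] >= 0) & (tgt[:, 0] <= w - 1)
                 & (tgt[:, 1] >= 0) & (tgt[:, 1] <= h - 1))
    src, tgt = src[in_bounds], tgt[in_bounds]
    # Cycle consistency: src -> tgt -> back should land near src.
    back = tgt + warp_flow(flow_bwd, tgt)
    err = np.linalg.norm(back - src, axis=1)
    keep = err < max_err
    return src[keep], tgt[keep]
```

The cycle error could serve as a crude stand-in for the learned uncertainty that PDC-Net and DKM provide, but I would rather use a confidence measure intrinsic to DiffMatch if one exists.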