Thank you for your attention. The release of the SuperGlue model is under preparation, as we need to consider license and legal issues. We will complete it as soon as possible.
Congratulations on your paper's acceptance! I am wondering whether the weights of the SuperGlue model also need license approval before release?
@gujiaqivadin Since MagicLeap was involved in proposing SuperGlue and it is a commercial company, we have to be careful with the licensing.
Would love to see the LightGlue version instead, since both the training and inference code are Apache 2.0!
@zebin-dm @gujiaqivadin @pablovela5620
Thank you for your attention; we have now released gim_lightglue and appreciate your patience.
This is awesome, thank you. Are the gim_lightglue weights a drop-in replacement? If I wanted to use them with the original LightGlue repo or in hloc, could I just swap in the new weights and expect things to work? I'd love to do some testing with them.
@pablovela5620
Thank you 😊 It may not be a direct drop-in replacement; the ckpt file needs to be reorganized so that the original LightGlue can read it. Also, because gim_lightglue was trained with 2048 keypoints, it performs normally at 2048 points, but with more points, such as 4096, it may behave strangely since LightGlue never encountered that many during training. I therefore suggest using the parameters provided by demo.py when utilizing gim_lightglue.
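For reference, here is a minimal sketch of what such a checkpoint reorganization might look like with the original cvg/LightGlue package. The checkpoint filename and the `model.` key prefix below are assumptions about the gim_lightglue ckpt layout, so inspect your file and adjust accordingly:

```python
# Sketch: remap a gim_lightglue checkpoint so the original LightGlue
# (cvg/LightGlue) can load it. The filename and the "model." prefix
# are assumptions -- check the actual keys in your ckpt.
import torch
from lightglue import LightGlue, SuperPoint

CKPT = "gim_lightglue.ckpt"  # hypothetical path to the released checkpoint

state = torch.load(CKPT, map_location="cpu")
state = state.get("state_dict", state)  # Lightning-style ckpts nest weights here

# Keep only the matcher weights and strip the assumed "model." prefix.
matcher_state = {
    k[len("model."):]: v for k, v in state.items() if k.startswith("model.")
}

# Stick to 2048 keypoints, matching the training setup and demo.py.
extractor = SuperPoint(max_num_keypoints=2048).eval()
matcher = LightGlue(features="superpoint").eval()

missing, unexpected = matcher.load_state_dict(matcher_state, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```

If the printed missing/unexpected keys are non-empty, the prefix or key names differ from the assumption above and the remapping needs to be adapted.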
Congratulations, and thanks for your work. Could you please release the SuperGlue model?