verlab / accelerated_features

Implementation of XFeat (CVPR 2024). Do you need robust and fast local feature extraction? You are in the right place!
https://www.verlab.dcc.ufmg.br/descriptors/xfeat_cvpr24
Apache License 2.0

how to increase the weight of reliability loss #35

Closed noahzn closed 1 month ago

noahzn commented 1 month ago

Hi, I want to increase the weight of reliability loss, which term should I adjust? The naming of losses confuses me.

noahzn commented 1 month ago

If I only increase the number here, it doesn't lead to better training.

guipotje commented 1 month ago

> If I only increase the number here, it doesn't lead to better training.

Hi @noahzn, this is the correct place to increase the reliability loss weighting. Could you please provide more details on what happens? Does the loss not improve, or are the matching results worse at inference?
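For context, training losses are typically combined as a weighted sum, so raising one weight scales only that term's contribution to the gradient. Here is a minimal, hypothetical sketch of that pattern (the function and term names are illustrative, not XFeat's actual training code):

```python
# Hypothetical sketch: a total loss assembled from weighted terms.
# Names ("descriptor", "keypoint", "reliability") are illustrative only.

def total_loss(loss_terms, weights):
    """Combine individual loss values with per-term weights."""
    return sum(weights[name] * value for name, value in loss_terms.items())

# Example scalar loss values for one training step.
terms = {"descriptor": 1.2, "keypoint": 0.8, "reliability": 0.5}

# Doubling the reliability weight scales only that term:
weights = {"descriptor": 1.0, "keypoint": 1.0, "reliability": 2.0}

print(total_loss(terms, weights))  # 1.2 + 0.8 + 2.0 * 0.5
```

In an autograd framework the same weighted sum is what gets backpropagated, so a larger weight amplifies the reliability term's gradients relative to the other objectives rather than guaranteeing that term decreases.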

noahzn commented 1 month ago

[image: training loss curves] When I train on my own dataset, I consistently observe that the reliability loss increases.

guipotje commented 1 month ago

Hi @noahzn, sorry for the delay. I experienced this during my training as well. My hypothesis is that reliability loss is easier to optimize when the descriptors are random (i.e., when the network is initialized with random weights). However, as training progresses and the descriptors become non-random, the network must learn to identify 'reliable' descriptors in the embedding space.