Closed by noahzn 1 month ago
If I only increase the number here, it doesn't lead to better training.
Hi @noahzn, this is the correct place to increase the reliability loss weighting. Could you please provide more details on what happens? Does the loss not improve, or are the matching results worse at inference time?
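For anyone else wondering how a weight like this usually acts, here is a minimal sketch of a weighted multi-term loss. The function and argument names (`cosim_loss`, `peaky_loss`, `reliability_loss`, `reliability_weight`) are illustrative placeholders, not the repository's actual identifiers:

```python
def total_loss(cosim_loss: float, peaky_loss: float,
               reliability_loss: float, reliability_weight: float = 1.0) -> float:
    """Combine individual loss terms into one scalar objective.

    Raising reliability_weight scales only the reliability term,
    so the optimizer trades off the other terms against it.
    """
    return cosim_loss + peaky_loss + reliability_weight * reliability_loss

# Doubling the weight doubles the reliability term's contribution:
base = total_loss(0.5, 0.3, 0.2, reliability_weight=1.0)     # 0.5 + 0.3 + 0.2 = 1.0
boosted = total_loss(0.5, 0.3, 0.2, reliability_weight=2.0)  # 0.5 + 0.3 + 0.4 = 1.2
```

Note that increasing the weight only changes the relative gradient magnitudes; as discussed below, it does not guarantee the reliability term itself will decrease.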
When I trained on my own dataset, I consistently observed that the reliability loss increased.
Hi @noahzn, sorry for the delay. I experienced this during my training as well. My hypothesis is that reliability loss is easier to optimize when the descriptors are random (i.e., when the network is initialized with random weights). However, as training progresses and the descriptors become non-random, the network must learn to identify 'reliable' descriptors in the embedding space.
Hi, I want to increase the weight of the reliability loss. Which term should I adjust? The naming of the losses confuses me.