Open whu-lyh opened 9 months ago
In this example the repulsion went down at the same time. I guess both losses are fighting each other: the contrastive loss wants the positives to be close, while the repulsion loss wants some margin between points. One would need to check which parts are increasing and which are decreasing to do a better analysis.
But I guess it's also not that problematic. What we are interested in is that the validation metrics are high; the loss is just a means to an end.
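To see which term drives the curve, one can log the two loss components separately during validation. A minimal, hypothetical helper (the name `log_loss_components` and the term names are illustrative, not the repo's API):

```python
def log_loss_components(history, **components):
    """Append each named loss term to a running history dict.

    Plotting the per-term curves shows whether the contrastive or the
    repulsion part is the one increasing on the validation set.
    """
    for name, value in components.items():
        history.setdefault(name, []).append(float(value))
    return history


# Usage: call once per validation step with the individual loss values.
history = {}
log_loss_components(history, contrastive=0.8, repulsion=0.1)
log_loss_components(history, contrastive=0.6, repulsion=0.3)
```

Here the total loss could stay flat while `history["repulsion"]` reveals that the repulsion term alone is rising.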
Thanks, got it. I also have another question about the max_dist argument of repulsion_loss: what is its usage, and what value should it be?
We want to repulse points which are too close together, to avoid the descriptors collapsing to one position. If a point is closer to another point than max_dist, the loss tries to push them apart.
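In other words, a hinge on pairwise descriptor distances: pairs farther apart than max_dist contribute zero loss. A minimal sketch of this idea (an assumption about the general shape, not the repo's exact implementation):

```python
import torch

def repulsion_loss(desc, max_dist=0.5):
    """Hinge-style repulsion over all descriptor pairs.

    desc: (N, D) tensor of descriptors (ideally L2-normalized).
    Any pair closer than max_dist is pushed apart; pairs already
    separated by at least max_dist contribute zero loss.
    """
    dist = torch.cdist(desc, desc)                         # (N, N) pairwise distances
    mask = ~torch.eye(desc.shape[0], dtype=torch.bool)     # drop self-distances
    return torch.clamp(max_dist - dist[mask], min=0).mean()
```

With normalized descriptors the distances live in [0, 2], so max_dist is the margin below which collapse is penalized; two identical descriptors incur the full margin as loss, while well-separated ones incur none.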
Thanks louis. I tried your momentum contrast design for my place recognition task, meaning I set up the positive and negative samples, but the model does not seem to converge and the loss fluctuates heavily. Do you have any advice? Thanks.
Are the features at the end (the descriptors) normalized? Make sure they have, for example, norm = 1.
I believe yes. Here is the normalization code:

```python
pc_input = pc_input.view((-1, 1, pts_num, pc_channels))
pc_embeds = self.pc_encoder(pc_input)
# normalize the pc global feature
pc_global_feat = F.normalize(self.pc_proj(pc_embeds), dim=-1)
pc_global_feat = pc_global_feat.view(batch_size, -1, self.embed_dim)
```
You also have a key and query encoder?
Yes, just the same as yours :)
My encoder is a transformer-based model plus an aggregation layer.
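For reference, the key/query pair with a MoCo-style momentum update can be sketched as follows (a minimal sketch: the tiny encoder stands in for the transformer + aggregation stack, and the momentum value is an assumption):

```python
import copy
import torch
import torch.nn as nn

# Hypothetical tiny encoder standing in for the transformer + aggregation stack.
query_encoder = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
key_encoder = copy.deepcopy(query_encoder)
for p in key_encoder.parameters():
    p.requires_grad = False  # the key encoder receives no gradients

@torch.no_grad()
def momentum_update(query_enc, key_enc, m=0.999):
    # Exponential moving average: key weights drift slowly toward query weights.
    for q, k in zip(query_enc.parameters(), key_enc.parameters()):
        k.mul_(m).add_(q, alpha=1.0 - m)
```

A key encoder that lags the query encoder too little (m too small) or an unnormalized output are common causes of the heavy loss fluctuation described above.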
Thanks. For 1 & 3: it works but costs much more time, which is why I tried your strategy. For 2: the learning rate is tricky to set, and I am struggling to tune it :sob: For 4: I tried triplet loss, but the results are not good enough...
Hi, I retrained your model with your datasets and found that the loss on the validation set is increasing, especially the repulsion part. Can you give me some tips on why this happens? Thanks!