-
https://twitter.com/alfcnz/status/1133372277876068352
Unfortunately that triplet loss is flawed. The most offending negative sample has zero gradient. That power of 2 should be a power of ½. I feel …
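The tweet's point: with squared Euclidean distances, the gradient flowing through the negative term scales with the anchor-negative offset itself, so the hardest negative (one sitting almost on top of the anchor) pushes back with almost zero force; taking the square root (the "power of ½") makes that gradient unit-norm. A minimal sketch of the effect (function name and toy tensors are mine, not from the thread):

```python
import torch

def triplet_loss(anchor, pos, neg, margin=1.0, squared=True):
    # Squared Euclidean distances; optionally take the square root
    # (the "power of 1/2" the tweet argues for).
    d_ap = ((anchor - pos) ** 2).sum(-1)
    d_an = ((anchor - neg) ** 2).sum(-1)
    if not squared:
        d_ap = d_ap.clamp(min=1e-12).sqrt()
        d_an = d_an.clamp(min=1e-12).sqrt()
    return torch.relu(d_ap - d_an + margin).mean()

a = torch.zeros(1, 8)
p = torch.ones(1, 8)
# "Most offending" negative: almost coincides with the anchor.
n = torch.full((1, 8), 1e-4, requires_grad=True)

triplet_loss(a, p, n, squared=True).backward()
grad_squared = n.grad.norm().item()   # tiny: the push-away signal vanishes

n.grad = None
triplet_loss(a, p, n, squared=False).backward()
grad_sqrt = n.grad.norm().item()      # unit-norm gradient regardless of distance
```

With the squared distance the gradient norm on `n` is about 2‖a − n‖ ≈ 5.7e-4, while with the plain Euclidean distance it is exactly 1.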
-
Hello, does anybody have the pre-trained model?
I used the "CASIA-WebFace" and "MS-Celeb-1M" pre-trained models, but one-to-one comparison is still not accurate. Images of two different p…
-
Hi!
I am trying to fine-tune the last layer of the ArcFace model, freezing all the weights of the network except those belonging to the last layers. The problem is that the…
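The usual PyTorch recipe for this setup is to toggle `requires_grad` per parameter; a small sketch (the helper name and the layer-name matching convention are mine, not from the issue):

```python
import torch.nn as nn

def freeze_all_but_last(model: nn.Module, last_layer_prefix: str):
    # Freeze every parameter, then re-enable gradients only for the
    # parameters whose name starts with the given prefix.
    for name, p in model.named_parameters():
        p.requires_grad = name.startswith(last_layer_prefix)
    # Return just the trainable parameters, e.g. to hand to an optimizer.
    return [p for p in model.parameters() if p.requires_grad]

# Toy model: in nn.Sequential, parameter names are "0.weight", "2.bias", ...
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))
trainable = freeze_all_but_last(model, "2")   # keep only the last Linear
```

Passing only `trainable` to the optimizer avoids wasting memory on frozen parameters.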
-
![image](https://user-images.githubusercontent.com/13074332/96235954-fd75cc00-0fcd-11eb-8a5b-bfa1ea0db6ae.png)
tensor([0.8285, 0.7775, 0.6481, 0.6993, 0.6650, 0.7633, 0.7279, 0.6936, 0.6447,
…
-
First of all, thank you very much for the code you provided. I have two questions I hope you can help with:
1. When I train a handwritten-digit network with the batch_hard_triplet_loss method, I find that if the network has no BN layer, after training for a while the loss gets stuck at the margin value and stops decreasing, and the accuracy (computed with KNN) drops to 0. I inspected _pairwise_distances and found that the distance matrix had also pretty much all become ?. What causes this? Is a BN layer required?
2. I…
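The symptoms described above (loss pinned exactly at the margin, distance matrix near zero) are consistent with embedding collapse: the encoder maps every input to nearly the same point, so all pairwise distances vanish and the batch-hard hinge evaluates to exactly the margin. For reference, a hypothetical reimplementation of such a `_pairwise_distances` helper, with the standard clamp-and-mask tricks against negative values and NaN gradients at zero distance (the code below is my sketch, not the repository's):

```python
import torch

def pairwise_distances(emb, squared=False, eps=1e-16):
    # ||x_i - x_j||^2 = ||x_i||^2 - 2 <x_i, x_j> + ||x_j||^2
    dot = emb @ emb.t()
    sq_norms = dot.diagonal()
    d2 = sq_norms.unsqueeze(0) - 2.0 * dot + sq_norms.unsqueeze(1)
    d2 = d2.clamp(min=0.0)            # floating-point negatives -> 0
    if squared:
        return d2
    mask = (d2 == 0.0).float()        # avoid d(sqrt)/dx blowing up at 0
    d = torch.sqrt(d2 + mask * eps)
    return d * (1.0 - mask)

emb = torch.randn(6, 4)
dist = pairwise_distances(emb)        # matches torch.cdist(emb, emb)
```

If every row of `emb` is (nearly) identical, every entry of `dist` is (nearly) zero, which is exactly the collapsed regime the question describes.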
-
https://bindog.github.io/blog/2019/10/23/why-triplet-loss-works/
0x00 Introduction to triplet loss. There is a very important direction in deep learning called metric learning, and one of its representative methods is triplet loss. The basic idea of triplet loss is clear: make samples of the same class…
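For completeness, the standard triplet objective the excerpt is introducing, for an anchor \(a\), positive \(p\), negative \(n\), embedding \(f\), and margin \(\alpha\):

```latex
\mathcal{L}(a, p, n) = \max\left( \lVert f(a) - f(p) \rVert_2^2 \;-\; \lVert f(a) - f(n) \rVert_2^2 \;+\; \alpha,\; 0 \right)
```

It drives the anchor-positive distance to be smaller than the anchor-negative distance by at least the margin \(\alpha\).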
-
In version 1.2.0, Centroid Triplet Loss is unstable. When the batch size is small, the error below always occurs; with a larger batch size it happens less often.
`File "/root/miniconda3/lib/pyt…
-
Hi Everyone,
Thank you for your continuous efforts, I'm really enjoying the new release.
One concept that confuses me is CachedGISTEmbedLoss versus CachedMultipleNegativesRankingLoss.
From what…
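As I understand the sentence-transformers documentation, the Cached* variants compute the same loss as their non-cached counterparts but use gradient caching (GradCache) so the effective batch of in-batch negatives can be much larger than GPU memory would otherwise allow. The underlying in-batch-negatives objective of MultipleNegativesRankingLoss can be sketched in plain PyTorch (the function name and the scale value are my assumptions, not the library's API):

```python
import torch
import torch.nn.functional as F

def in_batch_negatives_loss(q_emb, d_emb, scale=20.0):
    # Each query's positive is the same-index document; every other
    # document in the batch serves as a negative.
    q = F.normalize(q_emb, dim=-1)
    d = F.normalize(d_emb, dim=-1)
    scores = scale * q @ d.t()               # (B, B) scaled cosine similarities
    labels = torch.arange(q.size(0))         # diagonal entries are the positives
    return F.cross_entropy(scores, labels)

# Perfectly matched query/document embeddings -> near-zero loss.
loss = in_batch_negatives_loss(torch.eye(4), torch.eye(4))
```

The cached variants split the batch into mini-batches, cache the loss gradients with respect to the embeddings, and replay them through the encoder, trading extra compute for a much larger pool of negatives.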
-
Thank you very much for your excellent work.
One thing I am confused about is the definition of the `crossmodal loss function` and the `coseparation loss function`. In train.py, why random numbers …
-
I tried to train the model myself.
I downloaded CASIA-WebFace, which had already been cleaned, and then followed the instructions in the wiki "Triplet loss training" to train the model.
But it seems the accuracy …