Hi @BoomShakaY
If you have labels for your dataset, you only need to prepare your dataset like Market with $n$ subfolders and change the class number $n$ in the config file. For example, if your dataset contains 300 classes, your dataset folder should contain 300 subfolders.
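A minimal sketch of that layout check, assuming a Market-like structure where a placeholder dataset root contains one subfolder per class:

```python
import os

# Hypothetical path; replace with your own dataset root.
# Market-1501-style layout: one subfolder per class/identity, e.g.
#   your_dataset/train_all/0001/*.jpg, your_dataset/train_all/0002/*.jpg, ...
data_root = 'your_dataset/train_all'

class_dirs = [d for d in os.listdir(data_root)
              if os.path.isdir(os.path.join(data_root, d))]
# This count is the class number you should set in the config file.
print('%d class subfolders found' % len(class_dirs))
```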
You need to train your own teacher model to use `KLDivLoss`.
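For reference, a minimal sketch of the kind of teacher-student distillation term that `KLDivLoss` is used for; the temperature and function name here are assumptions, not the repo's exact implementation:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KLDivLoss expects log-probabilities as input and probabilities as target;
    # the temperature T softens both distributions, and T*T rescales the gradients.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction='batchmean') * (T * T)
```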
Thanks for your reply!
I want to extract other kinds of features based on ReID, just like you. Yes, I have labels, and the class number is only 9. If I change `class_num` directly from 751 to 9, won't it deviate from the original intention?
I am just wondering: if I add a classifier besides the last two classifiers of `f_netAB` and use another loss to constrain it, will it work well? I'm still trying because I'm a green hand at coding. By the way, thanks for your code again, it really helps a lot!
@BoomShakaY You mean adding a third classifier? It will work if you set an appropriate loss weight. You may give it a try.
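A minimal sketch of such a third head and its weighted loss, assuming the appearance code `f` is a flat feature vector; the feature dimension, class count, and weight below are placeholders:

```python
import torch.nn as nn

class ExtraClassifier(nn.Module):
    """Third classifier head on top of the appearance code f."""
    def __init__(self, feat_dim=1024, num_extra_classes=9):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_extra_classes)

    def forward(self, f):
        return self.fc(f.view(f.size(0), -1))

# In the training step, add it to the existing objective with a tunable weight:
#   extra_logits = extra_classifier(f)
#   loss_total = loss_id + loss_teacher + extra_w * nn.CrossEntropyLoss()(extra_logits, extra_labels)
```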
Thanks for your code! I have a question about how the loss Lprim constrains the encoder to extract the appearance code. If I want to change the loss to extract other kinds of features (e.g., weather), where in the code should I modify it?
If I change Lprim, then the teacher model is unused.
I have seen in #40 that you said: "`f` is the appearance code for image generation; we do not want the generation losses to update `f`. Thus, we use the `detach` here. In this way, `f` is mainly updated via the re-id related losses." The re-id related losses, including `CrossEntropyLoss` and `KLDivLoss`, are calculated against labels, so I'm confused. Thank you!
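A toy sketch of the `detach` mechanism quoted above (standalone modules, not the repo's actual networks): the generator only ever sees `f.detach()`, so the reconstruction gradient never reaches the appearance encoder, while the classifier losses on `f` do update it:

```python
import torch
import torch.nn as nn

encoder = nn.Linear(8, 4)      # stands in for the appearance encoder producing f
decoder = nn.Linear(4, 8)      # stands in for the generator
classifier = nn.Linear(4, 9)   # re-id / attribute classifier on f

x = torch.randn(2, 8)
labels = torch.tensor([0, 3])

f = encoder(x)
recon_loss = nn.MSELoss()(decoder(f.detach()), x)        # detach: no gradient into encoder
id_loss = nn.CrossEntropyLoss()(classifier(f), labels)   # gradient does flow into encoder

(recon_loss + id_loss).backward()
# encoder.weight.grad now comes only from id_loss, not from the generation loss.
```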