yxgeee / MMT

[ICLR-2020] Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification.
https://yxgeee.github.io/projects/mmt

Does the classifier C^t need to be reset after each clustering? #32

Closed haohang96 closed 4 years ago

haohang96 commented 4 years ago

Both the paper you referred to, ''Unsupervised Person Re-identification: Clustering and Fine-tuning'', and another unsupervised paper, DeepCluster, mention that the parameters of the classifier head need to be reset after each clustering step, because there is no correspondence between the pseudo-label assignments of two consecutive clustering rounds.

However, I cannot find such an operation in Algorithm 1 of your paper. Is it omitted, or does the proposed method not need to reset the classifier head after each clustering?

Thanks!

yxgeee commented 4 years ago

Hi, it is omitted in the paper. As for the code, you can refer to https://github.com/yxgeee/MMT/blob/master/examples/mmt_train_dbscan.py#L202
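
For reference, below is a minimal sketch of that kind of centroid-based re-initialization; the names `model.classifier`, `features`, and `pseudo_labels` are illustrative assumptions, not taken verbatim from the repository.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch: after clustering, overwrite the classifier weights with
# the (normalized) mean feature of each cluster, i.e. "reset" the head.
# Assumptions: `features` is an (N, D) tensor of sample features,
# `pseudo_labels` is an (N,) tensor of cluster ids in [0, num_clusters),
# and `model.classifier` is an nn.Linear(D, num_clusters, bias=False).
@torch.no_grad()
def reset_classifier_from_centroids(model, features, pseudo_labels, num_clusters):
    dim = features.size(1)
    centroids = torch.zeros(num_clusters, dim, device=features.device)
    for c in range(num_clusters):
        centroids[c] = features[pseudo_labels == c].mean(dim=0)
    centroids = F.normalize(centroids, dim=1)  # keep the same scale as normalized features
    model.classifier.weight.data.copy_(centroids)
```

Note that when the next clustering round produces a different number of clusters, the classification layer itself would typically have to be rebuilt with the new output dimension before the centroids are copied in.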

haohang96 commented 4 years ago

It seems that the cluster centroid vectors are used as the classifier weights. Is that right?

yxgeee commented 4 years ago

Yes

haohang96 commented 4 years ago

Since the classifier C^t is initialized from the centroid vectors, should the learning rate of C^t be smaller?

From the code, I think the classifier weights are treated the same as the conv-layer weights (including back-propagation and the EMA update), except for the special initialization from the centroid vectors.

Is there any special treatment of the weights of C^t that I have missed?

Thanks!

yxgeee commented 4 years ago

There is no special treatment for the classifiers in my experiments except the centroid initialization. You could try different learning rates, but I am not sure it would help, since the cluster centroids are dynamically updated.
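
For readers skimming the thread, the "same treatment as the conv-layer weights" discussed above amounts to a standard mean-teacher EMA update applied uniformly to every parameter, classifier included. The sketch below is illustrative; `alpha` and the function name are assumptions, not the repository's exact code.

```python
import torch

# Illustrative mean-teacher update: every teacher parameter, classifier weights
# included, is moved towards the corresponding student parameter with EMA
# momentum `alpha` (e.g. 0.999).
@torch.no_grad()
def ema_update(teacher, student, alpha=0.999):
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.data.mul_(alpha).add_(s_param.data, alpha=1.0 - alpha)
```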