tim-learn / SHOT

Code released for our ICML 2020 paper "Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation"
MIT License

Difference between code and paper #6

Closed · valencebond closed 4 years ago

valencebond commented 4 years ago

Thanks for your code and paper; the proposed new problem is interesting. However, there are some inconsistencies between the code and the paper.

  1. According to the paper, the total loss consists of three parts: an entropy loss Lent, a diversity loss Ldiv, and a negated cross-entropy loss. In the code, classifier_loss corresponds to the negated cross-entropy loss, entropy_loss corresponds to Lent, and gentropy_loss corresponds to Ldiv.

  2. As the paper states, the cross-entropy term carries a minus sign and Ldiv enters with a plus sign, but in the code the cross-entropy loss is positive and Ldiv is subtracted.
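To make the sign question concrete, here is a minimal NumPy sketch of the information-maximization part of the objective as the thread describes it (the actual repo uses PyTorch; the variable names mirror the ones quoted above, and the toy softmax values are hypothetical). The point is that subtracting gentropy_loss in code is the same as adding a Ldiv defined with the opposite sign in the paper:

```python
import numpy as np

def entropy(p, eps=1e-5):
    # per-sample entropy H(p) = -sum_k p_k log p_k
    return -np.sum(p * np.log(p + eps), axis=1)

# toy batch of softmax outputs (hypothetical values)
softmax_out = np.array([[0.7, 0.2, 0.1],
                        [0.1, 0.8, 0.1]])

# Lent: mean per-sample entropy (minimized -> confident predictions)
entropy_loss = np.mean(entropy(softmax_out))

# mean prediction over the batch
msoftmax = softmax_out.mean(axis=0)

# gentropy_loss: entropy of the mean prediction
# (maximized -> diverse predictions across classes)
gentropy_loss = -np.sum(msoftmax * np.log(msoftmax + 1e-5))

# If the paper defines Ldiv = sum_k p_bar_k log p_bar_k, then
# Ldiv == -gentropy_loss, so minimizing
#   entropy_loss - gentropy_loss   (code)
# is identical to minimizing
#   Lent + Ldiv                    (paper)
im_loss = entropy_loss - gentropy_loss
ldiv = np.sum(msoftmax * np.log(msoftmax + 1e-5))
print(np.isclose(entropy_loss + ldiv, im_loss))  # the two forms agree
```

So a sign flip in the definition of Ldiv, not a different objective, accounts for the apparent mismatch.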

Please correct me if I am wrong about any of this.

tim-learn commented 4 years ago

I don't think our paper says that "cross entropy loss is negative and Ldiv is positive". Please check carefully what the KL loss is and what the cross-entropy loss is, especially the placement of the minus sign.
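The distinction the reply points at can be checked numerically: the entropy of the mean prediction equals log K minus its KL divergence from the uniform distribution, so a single minus sign (plus a constant) is exactly what separates an entropy-style formulation from a KL-style one. A small NumPy check, with a hypothetical mean prediction:

```python
import numpy as np

K = 3
p_bar = np.array([0.5, 0.3, 0.2])   # hypothetical mean prediction over a batch
u = np.full(K, 1.0 / K)             # uniform reference distribution

# entropy of the mean prediction: H(p_bar) = -sum_k p_k log p_k
H = -np.sum(p_bar * np.log(p_bar))

# KL divergence from the uniform distribution
kl = np.sum(p_bar * np.log(p_bar / u))

# identity: H(p_bar) = log K - KL(p_bar || u),
# so maximizing H is the same as minimizing the KL term,
# and a diversity loss written as sum_k p_k log p_k differs
# from -H only by sign, and from the KL form only by log K
print(np.isclose(H, np.log(K) - kl))  # True by the identity above
```

Whether a given term in the paper "is negative" therefore depends on which of these equivalent forms it is written in.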