roysubhankar / dwt-domain-adaptation

Code for paper "Unsupervised Domain Adaptation using Feature-Whitening and Consensus Loss" (CVPR 2019)

Code to replicate the paper results #3

Open viniciusarruda opened 4 years ago

viniciusarruda commented 4 years ago

Hi, first of all, nice work! I would like to obtain the source code to replicate the other experiments, such as MNIST <-> SVHN. Could you make it available? Thanks.

vcoyette commented 4 years ago

Hi, I am also struggling to replicate the results from the paper on the digit datasets, starting with the easiest one, the MNIST - USPS adaptation.

I think there are two things that may differ from the Office implementation in the repo:

Many thanks !

roysubhankar commented 4 years ago

Hi, I will try to upload the code for MNIST->USPS by this weekend.

vcoyette commented 4 years ago

Thank you !

roysubhankar commented 4 years ago

Hi @vcoyette ,

I have uploaded the code for usps <-> mnist. This implementation contains DWT with the entropy loss, but you can easily replace the entropy loss with the MEC loss by following the implementation for the Office-Home dataset.
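For reference, the target-side entropy loss mentioned above can be sketched in a few lines. This is a minimal numpy illustration (not code from the repo, which uses PyTorch): the loss is the mean Shannon entropy of the softmax predictions on unlabeled target samples, so minimizing it pushes the network toward confident predictions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_loss(logits, eps=1e-8):
    """Mean Shannon entropy of the predicted class distributions.

    Minimizing this on unlabeled target samples rewards confident
    (low-entropy) predictions.
    """
    p = softmax(logits)
    return float(-(p * np.log(p + eps)).sum(axis=1).mean())

# Confident logits give low entropy; uniform logits give high entropy.
confident = np.array([[10.0, 0.0, 0.0]])
uniform = np.array([[1.0, 1.0, 1.0]])
assert entropy_loss(confident) < entropy_loss(uniform)
```

Note that, exactly as discussed later in this thread, minimizing entropy alone can be degenerate: predicting one class for every sample has zero entropy, which is why the weighting and the alignment layers matter.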

  1. Yes, you are right, that's the order of the operations. For the usps <-> mnist case we use DWT layers for the first two conv layers and then BN-based alignment layers for the FC layers. We may have omitted these details in the paper.

  2. Horizontal flip is not used for the digits experiments. The rest is the same.
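The BN-based alignment mentioned in point 1 can be illustrated with a toy layer that keeps separate normalization statistics per domain, so source and target features are each standardized with their own mean and variance. This is a hypothetical numpy simplification for intuition (no affine parameters, no conv support), not the repo's actual layer:

```python
import numpy as np

class DomainBN:
    """Toy BN-style domain alignment for FC features.

    Keeps separate running statistics for 'source' and 'target', so
    each domain's features are whitened to the same standardized space.
    """

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.momentum = momentum
        self.eps = eps
        # One (running_mean, running_var) pair per domain.
        self.stats = {d: (np.zeros(num_features), np.ones(num_features))
                      for d in ('source', 'target')}

    def __call__(self, x, domain, training=True):
        mean, var = self.stats[domain]
        if training:
            batch_mean, batch_var = x.mean(axis=0), x.var(axis=0)
            # Update only the chosen domain's running statistics.
            mean = (1 - self.momentum) * mean + self.momentum * batch_mean
            var = (1 - self.momentum) * var + self.momentum * batch_var
            self.stats[domain] = (mean, var)
            return (x - batch_mean) / np.sqrt(batch_var + self.eps)
        # At test time, normalize with that domain's running statistics.
        return (x - mean) / np.sqrt(var + self.eps)
```

The key design point is that a single shared BN would mix source and target statistics; per-domain statistics are what let a shifted target distribution be mapped onto the same standardized space as the source.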

vcoyette commented 4 years ago

Hi @roysubhankar,

That is indeed the BN alignment I was missing after the FC layers. Without it, the network rapidly gets stuck predicting the same number on the target dataset (e.g. always 4, or always 5). That's probably the network overfitting to minimize the entropy. I tried reducing the lambda on the entropy loss, which resolves the problem, but the results are not as good as with the BN (~86-87%). I didn't really optimize the value of lambda, though.

Anyway, thanks for taking the time to answer, and congratulations, these are amazing results!