xiangjjj / implicit_alignment

Code for ICML2020 "Implicit Class-Conditioned Domain Alignment for Unsupervised Domain Adaptation"

Creation Details for Label Shift #2

Closed huang123ying closed 4 years ago

xiangjjj commented 4 years ago

If you are asking about the Office-Home dataset, you can download it from Google Drive: https://drive.google.com/file/d/0B81rNlvomiwed0V1YUxQdC1uOTg/view. I've also uploaded it to BaiduNetDisk https://pan.baidu.com/s/1XEzXOBECxV5_x6LnwbrNHw (extraction code: qvq6) in case you can't access Google Drive.

huang123ying commented 4 years ago

Thanks! I have downloaded it from Google Drive. Did you evaluate your model on the Digits dataset?

xiangjjj commented 4 years ago

In our paper, we evaluated MNIST<->SVHN in Section 4.4.4 under a simulated class distribution shift. The intention was to show that implicit alignment can improve DANN under class distribution shift; therefore, we didn't experiment with other domain adaptation techniques on digits. We have not evaluated on the standard Digits benchmark. The code for the digits experiments is in a private repo, as its data preprocessing pipeline is quite different from the other datasets.

huang123ying commented 4 years ago

I am interested in the creation details for the label shift and imbalance, so could you send me the imbalanced label files for the Digits dataset? Thanks!

xiangjjj commented 4 years ago

I see. The Office-Home-RS-UT split was created in COAL using a Pareto distribution.

For the digits dataset, I used a Pareto distribution to simulate extreme imbalance and a triangle-like distribution to simulate mild imbalance. This was implemented as a wrapper around the digits datasets in PyTorch. I will release the dataset wrapper for my digits experiments before next Monday, August 17th. (It needs a bit of refactoring.)
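
For reference, here is a minimal, hypothetical sketch of how such a wrapper could look on top of a torchvision digits dataset. The helper names `imbalanced_class_counts` / `make_imbalanced_subset`, the decay parameter `alpha`, and the use of MNIST are illustrative assumptions, not the released code:

```python
# Hypothetical sketch: impose a Pareto-like (extreme) or triangle-like (mild)
# class imbalance on a torchvision digits dataset by subsampling each class.
import numpy as np
from torch.utils.data import Subset
from torchvision import datasets, transforms


def imbalanced_class_counts(base_count, num_classes=10, mode="pareto", alpha=2.0):
    """Per-class sample counts: power-law (Pareto-like) decay for extreme
    imbalance, or a linearly decreasing ("triangle") profile for mild imbalance."""
    ranks = np.arange(1, num_classes + 1).astype(float)
    if mode == "pareto":
        weights = ranks ** (-alpha)            # heavy head, long tail
    elif mode == "triangle":
        weights = num_classes + 1 - ranks      # linear decay
    else:
        raise ValueError(f"unknown mode: {mode}")
    weights /= weights.max()
    return (base_count * weights).astype(int)


def make_imbalanced_subset(dataset, counts, seed=0):
    """Subsample a torchvision digits dataset (assumes a `.targets` attribute,
    as in MNIST/USPS) so that class c keeps at most counts[c] examples."""
    rng = np.random.RandomState(seed)
    targets = np.asarray(dataset.targets)
    keep = []
    for c, n in enumerate(counts):
        idx = np.where(targets == c)[0]
        keep.extend(rng.choice(idx, size=min(int(n), len(idx)), replace=False))
    return Subset(dataset, sorted(keep))


if __name__ == "__main__":
    mnist = datasets.MNIST("./data", train=True, download=True,
                           transform=transforms.ToTensor())
    counts = imbalanced_class_counts(base_count=5000, mode="pareto")
    print("per-class counts:", counts)
    print("imbalanced size:", len(make_imbalanced_subset(mnist, counts)))
```

With `mode="pareto"` the per-class counts fall off as a power law (e.g., 5000, 1250, 555, ...), while `mode="triangle"` gives a linear decay for the milder setting.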

xiangjjj commented 4 years ago

Hi, I've just uploaded two files:

HESTIA00 commented 1 year ago

Hi, can you tell me where the code for the COAL model is? I have been studying this model recently.