KaiyangZhou / Dassl.pytorch

A PyTorch toolbox for domain generalization, domain adaptation and semi-supervised learning.
MIT License

Low accuracy with SelfEnsembling (mnist -> mnist_m) #30

Open georgepachitariu opened 3 years ago

georgepachitariu commented 3 years ago

Hi, thanks for sharing your code.

I tried it out with trainer=SelfEnsembling, source_domain=mnist, and target_domain=mnist_m. I was expecting around 95% accuracy on the target-domain test subset, but I wasn't able to get more than 65%.

Can you please check whether I am missing any important parameters? I ran it like this:

        python tools/train.py \
            --backbone resnet18 \
            --root "datasets" \
            --trainer SelfEnsembling \
            --source-domains "mnist" \
            --target-domains "mnist_m" \
            --output-dir "$job_dir" \
            --dataset-config-file "configs/datasets/da/digit5.yaml" \
            DATALOADER.K_TRANSFORMS 2 \
            DATALOADER.TRAIN_X.BATCH_SIZE 128 \
            DATALOADER.NUM_WORKERS 10 \
            TRAINER.SE.EMA_ALPHA 0.999 \
            OPTIM.LR 0.0003 \
            OPTIM.MAX_EPOCH 200

I also ran a small hyperparameter sweep over LR (3e-3, 3e-4, 3e-5) and EMA_ALPHA (0.99, 0.999, 0.9999), but I didn't find a combination with a better score.
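For reference, the sweep above can be scripted as a simple grid loop. This is a dry-run sketch that only prints each command instead of launching training; the output-dir naming scheme is hypothetical:

```shell
#!/usr/bin/env bash
# Dry-run grid sweep over the LR and EMA_ALPHA values mentioned above.
# Commands are collected and printed, not executed.
jobs=()
for lr in 3e-3 3e-4 3e-5; do
  for alpha in 0.99 0.999 0.9999; do
    # Hypothetical per-run output directory encoding the hyperparameters
    job_dir="output/se_mnist2mnistm_lr${lr}_a${alpha}"
    jobs+=("python tools/train.py --backbone resnet18 --root datasets \
--trainer SelfEnsembling --source-domains mnist --target-domains mnist_m \
--output-dir $job_dir --dataset-config-file configs/datasets/da/digit5.yaml \
DATALOADER.K_TRANSFORMS 2 OPTIM.LR $lr TRAINER.SE.EMA_ALPHA $alpha")
  done
done
# Print one command per line (pipe to bash, or to a job scheduler, to run)
printf '%s\n' "${jobs[@]}"
```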

KaiyangZhou commented 3 years ago

Is 95% the reported number? If so, have you checked whether your parameter settings match the paper's?

For the SGD optimizer, I'd suggest using a larger LR, e.g. 0.002 or 0.001
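A minimal sketch of that suggestion, assuming Dassl's `OPTIM.NAME` config key selects the optimizer; the command is only echoed here rather than launched:

```shell
#!/usr/bin/env bash
# Same run as in the original report, overriding the optimizer settings
# per the reply above (OPTIM.NAME is an assumption about the config key).
cmd="python tools/train.py --backbone resnet18 --root datasets \
--trainer SelfEnsembling --source-domains mnist --target-domains mnist_m \
--dataset-config-file configs/datasets/da/digit5.yaml \
OPTIM.NAME sgd OPTIM.LR 0.002"
echo "$cmd"
```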