bregmangh (issue closed 3 years ago):
When we set 'algorithm=DANN' and run the code directly, we cannot reproduce the results reported in the paper. Here is the output of the run. Why? Thanks.

Args:
	algorithm: DANN
	checkpoint_freq: None
	data_dir: /dataset/DG/
	dataset: PACS
	holdout_fraction: 0.2
	hparams: None
	hparams_seed: 0
	output_dir: train_output
	save_model_every_checkpoint: False
	seed: 0
	skip_model_save: False
	steps: None
	task: domain_generalization
	test_envs: [0]
	trial_seed: 0
	uda_holdout_fraction: 0
HParams:
	batch_size: 32
	beta1: 0.5
	class_balanced: False
	d_steps_per_g_step: 1
	data_augmentation: True
	grad_penalty: 0.0
	lambda: 1.0
	lr: 5e-05
	lr_d: 5e-05
	lr_g: 5e-05
	mlp_depth: 3
	mlp_dropout: 0.0
	mlp_width: 256
	nonlinear_classifier: False
	resnet18: False
	resnet_dropout: 0.0
	weight_decay: 0.0
	weight_decay_d: 0.0
	weight_decay_g: 0.0

env0_in_acc  env0_out_acc  env1_in_acc  env1_out_acc  env2_in_acc  env2_out_acc  env3_in_acc  env3_out_acc  epoch  gen_loss  mem_gb  step  step_time
0.2092739475  0.1931540342  0.2953091684  0.3119658120  0.3083832335  0.2724550898  0.2414122137  0.2242038217  0.0000000000  0.8979552984  7.9268550873  0  0.7343387604

disc_loss  env0_in_acc  env0_out_acc  env1_in_acc  env1_out_acc  env2_in_acc  env2_out_acc  env3_in_acc  env3_out_acc  epoch  gen_loss  mem_gb  step  step_time
1.2359811942  0.8078096400  0.7652811736  0.8528784648  0.8482905983  0.9812874251  0.9580838323  0.8374681934  0.8382165605  7.1856287425  -0.685258171  8.2017307281  300  0.4992028658
11.189239563  0.4655277608  0.4621026895  0.7425373134  0.7371794872  0.7559880240  0.7335329341  0.7468193384  0.7414012739  14.371257485  3.4806638861  8.2017307281  600  0.4944124389
238.60761779  0.1061622941  0.1442542787  0.2739872068  0.3055555556  0.1474550898  0.1886227545  0.2512722646  0.2369426752  21.556886227  5.7296134837  8.2017307281  900  0.4963528244
2082.6883511  0.2245271507  0.2371638142  0.3187633262  0.3290598291  0.5022455090  0.4760479042  0.3985368957  0.3885350318  28.742514970  873.57823590  8.2017307281  1200  0.4959800331
1.0357982743  0.2556436852  0.2567237164  0.3928571429  0.3696581197  0.5381736527  0.4940119760  0.4109414758  0.4152866242  35.928143712  3.9007051329  8.2017307281  1500  0.4922414263
24.230077548  0.4002440513  0.3936430318  0.5986140725  0.5854700855  0.7020958084  0.6437125749  0.5540712468  0.5528662420  43.113772455  569.27031482  8.2017307281  1800  0.4939069223
1.1945504181  0.3837705918  0.3911980440  0.5772921109  0.5641025641  0.7215568862  0.6706586826  0.6141857506  0.6394904459  50.299401197  1.0432499917  8.2017307281  2100  0.5033365639
382.80088246  0.4051250763  0.4034229829  0.6753731343  0.6752136752  0.8203592814  0.7754491018  0.7019720102  0.7082802548  57.485029940  392.48228594  8.2017307281  2400  0.4943927431
429.56468669  0.3715680293  0.3716381418  0.6988272921  0.7179487179  0.8248502994  0.7485029940  0.7659033079  0.7605095541  64.670658682  2884.0271361  8.2017307281  2700  0.4916381876
1.2151116145  0.4130567419  0.4107579462  0.6023454158  0.6047008547  0.7754491018  0.7365269461  0.6669847328  0.6433121019  71.856287425  0.6902019918  8.2017307281  3000  0.4960203767
244.13614944  0.3343502135  0.3496332518  0.6476545842  0.6517094017  0.7133233533  0.6826347305  0.7045165394  0.6917197452  79.041916167  314.87799603  8.2017307281  3300  0.4902488399
70.464921302  0.5009151922  0.4938875306  0.6087420043  0.5940170940  0.7717065868  0.7664670659  0.6428117048  0.6471337580  86.227544910  170.52687933  8.2017307281  3600  0.4870569730
2427.8547357  0.1696156193  0.1809290954  0.3678038380  0.3632478632  0.3016467066  0.2814371257  0.4736005089  0.4840764331  93.413173652  1438.1708783  8.2017307281  3900  0.4942047254
890.91614838  0.1848688225  0.1882640587  0.4024520256  0.4273504274  0.5254491018  0.4880239521  0.3555979644  0.3350318471  100.59880239  1838.5080866  8.2017307281  4200  0.4956636135
9005.7036366  0.3123856010  0.2787286064  0.4205756930  0.4059829060  0.5853293413  0.5658682635  0.3619592875  0.3656050955  107.78443113  1594.6786078  8.2017307281  4500  0.4901909248
10238.970430  0.2733374009  0.2591687042  0.5042643923  0.4935897436  0.5209580838  0.4670658683  0.5063613232  0.4777070064  114.97005988  12094.406260  8.2017307281  4800  0.4918605773
385962.36794  0.1928004881  0.2542787286  0.3155650320  0.3354700855  0.3031437126  0.3353293413  0.3005725191  0.3095541401  119.76047904  111583.26512  8.2017307281  5000  0.4945144749
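For context, the run above was launched roughly as follows. This is a minimal sketch reconstructed from the Args block: it assumes the standard domainbed.scripts.train entry point and infers the flag names from the printed argument names, so adjust it if your fork differs.

```python
# Sketch (assumption): launch command reconstructed from the Args printed above.
# The module path and flag names mirror the standard DomainBed train script.
import subprocess

subprocess.run(
    [
        "python", "-m", "domainbed.scripts.train",
        "--data_dir", "/dataset/DG/",
        "--dataset", "PACS",
        "--algorithm", "DANN",
        "--test_envs", "0",          # env0 (art_painting) held out for testing
        "--output_dir", "train_output",
        "--hparams_seed", "0",       # 0 = default hyperparameters
        "--trial_seed", "0",
        "--seed", "0",
    ],
    check=True,
)
```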