wwx1474446236 opened 1 year ago
Your config is right; the problem came from a bug, see https://github.com/arthurdouillard/dytox/blob/main/erratum_distributed.md
You can see updated results for dytox and dytox+ in Table 17 of the latest version of my paper. Unfortunately, I've never re-run dytox++ because it's slower with SAM.
Hi sir! There is no dytox++ configuration for imagenet100 or imagenet1000 in your code, so I appended the following options to imagenet_dytox_plus.yaml to stand in for imagenet_dytox_plusplus.yaml:
```yaml
# SAM
sam_rho: 3.0
sam_adaptive: true # ASAM
sam_skip_first: true
sam_mode: [tr]
```
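For context on why dytox++ trains slower (as noted above), the (A)SAM update these options control can be sketched as below. This is a hedged illustration, not the repo's actual implementation; `grad_fn` is a hypothetical gradient closure, and NumPy stands in for PyTorch:

```python
import numpy as np

def sam_step(w, grad_fn, lr=0.1, rho=3.0, adaptive=True):
    """One (A)SAM update step.

    Two gradient evaluations per step (one at w, one at the
    perturbed point) are why SAM roughly doubles training time.
    """
    g = grad_fn(w)
    if adaptive:
        # ASAM (sam_adaptive: true): scale-invariant perturbation
        tg = np.abs(w) * g
        e = rho * np.abs(w) * tg / (np.linalg.norm(tg) + 1e-12)
    else:
        # plain SAM: perturb along the normalized gradient
        e = rho * g / (np.linalg.norm(g) + 1e-12)
    # gradient at the worst-case perturbed weights
    g_sharp = grad_fn(w + e)
    return w - lr * g_sharp
```

On a simple quadratic loss (`grad_fn = lambda x: x`) this converges, but every step pays for two backward passes, matching the slowdown mentioned for dytox++.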
and I ran the following command (using 2 GPUs):
```bash
bash train.sh 0,1 --options options/data/imagenet100_10-10.yaml options/data/imagenet100_order1.yaml options/model/imagenet_dytox_plus.yaml --name dytox --data-path MY_PATH_TO_DATASET --output-basedir PATH_TO_SAVE_CHECKPOINTS --memory-size 2000
```
The final result was "acc": 70.64, "avg_acc": 78.93. Compared with "acc": 72.46, "avg_acc": 80.76 in your Appendix Table 10, my result is much lower.
Can you update the configuration of dytox++ for imagenet100 and imagenet1000?