Open caisl1 opened 1 year ago
Have you figured out the setting on CIFAR? I found the mentioned setting a bit odd: the feature dim is 512 and the MLP dim is 128 for ResNet-32, while other benchmarked methods such as LDAM and PaCo use 64 as the feature dim. The comparison is unfair if the backbone hyper-parameters differ from those of the benchmarked methods.
I think the paper mentions 512 as the hidden-layer dimension of the MLP (a 2-layer MLP: 64 -> 512 -> 128).
Perhaps you can refer to the implementation here; I found that the code reproduces the reported performance.
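To make the dimension question concrete, here is a minimal sketch of what a 2-layer MLP projection head with those dimensions would look like. This is only my reading of the thread (backbone feature dim 64, hidden dim 512, projection dim 128), written with NumPy rather than the authors' actual code, so the layer names and initialization are assumptions:

```python
import numpy as np

# Assumed dims from the discussion: the ResNet-32 backbone still outputs
# 64-d features; 512 is only the hidden layer of the projection head.
FEAT_DIM, HIDDEN_DIM, PROJ_DIM = 64, 512, 128

rng = np.random.default_rng(0)
W1 = rng.standard_normal((FEAT_DIM, HIDDEN_DIM)) * 0.01   # 64 -> 512
W2 = rng.standard_normal((HIDDEN_DIM, PROJ_DIM)) * 0.01   # 512 -> 128

def project(features: np.ndarray) -> np.ndarray:
    """2-layer MLP head: linear -> ReLU -> linear."""
    hidden = np.maximum(features @ W1, 0.0)  # ReLU on the 512-d hidden layer
    return hidden @ W2                        # 128-d projection output

# Backbone features for a batch of 8 images (dim 64, unchanged).
batch_features = rng.standard_normal((8, FEAT_DIM))
projections = project(batch_features)
print(projections.shape)  # (8, 128)
```

If this reading is right, the backbone itself is unchanged relative to LDAM/PaCo (feature dim 64), and only the contrastive head uses the larger 512-d hidden layer.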
The authors mention in the paper that their method achieves SOTA on the CIFAR dataset, but there is no implementation code for this on GitHub. Could you provide the implementation code for the CIFAR dataset?