Hi! I retrained all models with attention_module='cbam_block' (except the Inception series), but the accuracy did not improve compared to the models without an attention module. I didn't modify the training script main.py.
Did you train on the CIFAR-10 dataset? What results did you get?
Thanks!
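For context, CBAM refines a feature map with channel attention followed by spatial attention. Below is a minimal NumPy sketch of what a CBAM-style block typically computes, per the original paper's design; it is not this repository's implementation, and all weight/function names here are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(maps, kernel):
    """Naive same-padding convolution of (2, H, W) maps with a (2, k, k) kernel."""
    _, H, W = maps.shape
    k = kernel.shape[-1]
    p = k // 2
    padded = np.pad(maps, ((0, 0), (p, p), (p, p)))
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return out

def cbam_block(x, w1, w2, w_spatial):
    """CBAM-style attention on a feature map x of shape (C, H, W).

    w1 (C//r, C) and w2 (C, C//r) form the shared channel-attention MLP;
    w_spatial (2, k, k) is the spatial-attention conv kernel.
    (Illustrative sketch, not the repo's cbam_block.)
    """
    # Channel attention: shared MLP applied to avg- and max-pooled descriptors.
    avg = x.mean(axis=(1, 2))
    mx = x.max(axis=(1, 2))
    ca = sigmoid(w2 @ np.maximum(w1 @ avg, 0.0) + w2 @ np.maximum(w1 @ mx, 0.0))
    x = x * ca[:, None, None]
    # Spatial attention: pool across channels, then convolve the 2-map stack.
    pooled = np.stack([x.mean(axis=0), x.max(axis=0)])
    sa = sigmoid(conv2d_same(pooled, w_spatial))
    return x * sa[None, :, :]
```

Since the attention weights sit in (0, 1), the block rescales features rather than adding capacity, which may explain why gains on a small dataset like CIFAR-10 can be marginal.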