Closed: mzhaoshuai closed this issue 1 year ago
Hi, thanks for your wonderful work!
When I load the pre-trained weights of CoOp, it reports the error in the title. https://github.com/azshue/TPT/blob/63ecbace79694205d7884e63fdc3137a200f0b0e/tpt_classification.py#L118
I fixed it by removing the subscript.
Thank you for bringing this up! This should be a typo when I was cleaning up the code. You are welcome to create a pull request.
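For readers who hit the same error, here is a minimal sketch of the failure mode being described. It is not the actual TPT code: the `Prompt` module, the `ctx` name, and the shapes are all hypothetical stand-ins, chosen only to show how a stray subscript on a checkpoint tensor changes its shape and makes `load_state_dict` raise a size-mismatch error.

```python
# Hypothetical sketch, not the actual TPT code: a stray subscript on a
# checkpoint tensor changes its shape, so load_state_dict fails.
import torch
import torch.nn as nn

class Prompt(nn.Module):
    def __init__(self, n_ctx: int = 4, dim: int = 512):
        super().__init__()
        # learnable prompt context, analogous to CoOp's ctx parameter
        self.ctx = nn.Parameter(torch.zeros(n_ctx, dim))

model = Prompt()
ckpt = {"ctx": torch.randn(4, 512)}

model.load_state_dict(ckpt)  # shapes match: loads fine

try:
    # a stray [0] selects one row, (4, 512) -> (512,), so loading fails
    model.load_state_dict({"ctx": ckpt["ctx"][0]})
except RuntimeError:
    print("size mismatch: remove the subscript")
```

Removing the subscript restores the expected `(n_ctx, dim)` shape, which matches the fix described above.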
Could you please explain why AugMix is not applied to the ImageNet, ImageNet-A, ImageNet-V2, ImageNet-R, and ImageNet-Sketch datasets?
I am pretty curious about this.
Again, thanks for your work! 😄
Thank you for your interest in our work! When evaluating TPT on natural distribution shifts, we use ImageNet as the validation set. We found that strong data augmentation does not improve much on ImageNet. We conjecture this has something to do with CLIP's pretraining procedure: CLIP is trained with random crops as the only data augmentation.
When evaluating cross-dataset generalization, we use the validation split of each dataset for hyperparameter tuning. We found that stronger augmentation performs better than random crop on the majority of these datasets. This is probably because images in many of these fine-grained datasets are more object-centric than ImageNet images, so random cropping alone works less well for them.
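For context on what "stronger augmentation" adds, AugMix (Hendrycks et al.) combines several short chains of simple image ops with Dirichlet weights and then blends the result back into the original image with a Beta-sampled coefficient. Below is a minimal NumPy sketch of that mixing step; the toy op set and parameter names are illustrative, not TPT's implementation.

```python
import numpy as np

def augmix(image, operations, severity=3, width=3, depth=-1, alpha=1.0, rng=None):
    """Minimal AugMix-style mixing: `width` chains of randomly chosen ops are
    combined with Dirichlet weights, then blended with the original image
    using a Beta-sampled coefficient m."""
    rng = rng or np.random.default_rng()
    w = rng.dirichlet([alpha] * width)  # convex weights over the chains
    m = rng.beta(alpha, alpha)          # skip-connection strength
    mixed = np.zeros_like(image, dtype=np.float64)
    for i in range(width):
        aug = image.astype(np.float64)
        chain_len = depth if depth > 0 else int(rng.integers(1, 4))
        for _ in range(chain_len):
            op = operations[int(rng.integers(len(operations)))]
            aug = op(aug, severity)
        mixed += w[i] * aug
    return (1 - m) * image.astype(np.float64) + m * mixed

# toy "ops" standing in for real image transforms
ops = [lambda x, s: np.clip(x + s, 0, 255),              # brighten
       lambda x, s: np.clip(x * (1 - 0.05 * s), 0, 255)]  # darken
out = augmix(np.full((8, 8), 128.0), ops, rng=np.random.default_rng(0))
```

When only random resized crops are used, as described above for ImageNet and its distribution-shifted variants, this mixing step is simply skipped.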
Thanks for your reply!