RanXu2000 / continual-mae

[CVPR2024] Continual-MAE: Adaptive Distribution Masked Autoencoders for Continual Test-Time Adaptation
MIT License

About pre-trained vit model #1

Open consortiumE opened 1 month ago

consortiumE commented 1 month ago

Hello, thank you for sharing the code. Regarding the classification experiments (CIFAR-100 to CIFAR-100-C and ImageNet to ImageNet-C), could you provide download links for the source pre-trained models?

Yangsenqiao commented 1 month ago

Hi, thanks for your interest in our work. You can download the files from ViDA.

consortiumE commented 1 month ago

Your excellent work inspired me to study CTTA, and thank you very much for the source model you provided. Using it, I reproduced the paper's results on the CIFAR-100-C dataset. However, after adding the MAE method, the model showed a high error rate across the 15 corrupted domains of CIFAR-100-C; the problem may lie in my own adaptation of the code. If it is convenient for you, could you provide the complete code for running CIFAR-100-C and ImageNet-C?
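For context, the "MAE method" here masks a fraction of the ViT patch tokens and reconstructs the rest; a bug in the masking step is a common cause of unexpectedly high error rates. Below is a minimal sketch of standard MAE-style per-sample random masking in NumPy (a hypothetical helper for illustration, not the repo's actual code, which adapts the masking to the target distribution):

```python
import numpy as np

def random_masking(x: np.ndarray, mask_ratio: float, rng: np.random.Generator):
    """MAE-style per-sample random masking.

    x: (N, L, D) batch of patch embeddings.
    Returns the kept patches (N, L_keep, D), a binary mask (N, L) where
    1 = masked, and ids_restore for unshuffling after the decoder.
    """
    N, L, D = x.shape
    len_keep = int(L * (1 - mask_ratio))

    noise = rng.random((N, L))               # one uniform score per patch
    ids_shuffle = np.argsort(noise, axis=1)  # ascending: lowest scores are kept
    ids_restore = np.argsort(ids_shuffle, axis=1)

    # Gather the kept patches for the encoder.
    ids_keep = ids_shuffle[:, :len_keep]
    x_kept = np.take_along_axis(x, ids_keep[:, :, None], axis=1)

    # Build the binary mask in the original patch order.
    mask = np.ones((N, L))
    mask[:, :len_keep] = 0
    mask = np.take_along_axis(mask, ids_restore, axis=1)
    return x_kept, mask, ids_restore
```

A quick sanity check worth running when debugging: with `mask_ratio=0.75` and 16 patches, exactly 4 patches per sample should survive and `mask.sum(axis=1)` should be 12 everywhere.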