HuangChiEn closed this issue 8 months ago.
Thanks for the question. In `tune_*.py`, "tune" means hyper-parameter tuning. This is different from the notion of fine-tuning or visual prompt tuning.
You are correct that this paper does not do any pre-training process.
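To illustrate the distinction, here is a minimal sketch of what a `tune_*.py`-style hyper-parameter search might look like: a grid search over candidate learning rates and weight decays, scoring each by validation accuracy. The function and variable names are hypothetical and not taken from this repo.

```python
from itertools import product

def train_and_eval(lr, wd):
    # Hypothetical placeholder: in the real repo this would load the
    # pretrained weights, run tuning on the target dataset with the
    # given (lr, wd), and return validation accuracy. Here we return
    # a dummy score that peaks at lr=1e-3, wd=1e-4 for illustration.
    return 1.0 / (1.0 + abs(lr - 1e-3) + abs(wd - 1e-4))

# Candidate hyper-parameter grid (illustrative values).
lrs = [1e-2, 1e-3, 1e-4]
wds = [1e-3, 1e-4]

# Pick the (lr, wd) pair with the best validation score.
best = max(product(lrs, wds), key=lambda p: train_and_eval(*p))
print(best)
```

Hyper-parameter tuning like this searches over training settings; fine-tuning or visual prompt tuning instead refers to how the pretrained model's parameters are adapted during training itself.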
I have seen the issue, and I wonder whether this repo provides no script or code for pretraining (supervised pretraining, or MAE/MoCo pretraining on ImageNet-22k)?
Instead, this repo directly loads the pretrained weights and applies various kinds of tuning? So, train.py is the script for tuning on the VTAB-Structured, VTAB-Natural, and VTAB-Specialized subsets? And tune_vtab.py and tune_fgvc.py are for tuning on the VTAB-Caltech101 subset and the CUB dataset, respectively?
Any clarification would be appreciated!