Closed msw6468 closed 3 years ago
Hi, thanks for your interest in our work!
This ablation study aims to find out how different random-initialization strategies perform. The procedure simply decides whether to randomly initialize the novel-class weights before the few-shot fine-tuning stage. This is different from few-shot fine-tuning itself, which jointly fine-tunes the base classes and the novel classes on a balanced few-shot dataset. Here the novel set is the same as the novel classes used in few-shot fine-tuning.
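To make the distinction concrete, here is a minimal numpy sketch of the two initialization options for the novel-class rows of the box classifier. All names and sizes (`feat_dim`, `n_base`, `n_novel`, the stand-in weight matrices) are hypothetical illustrations, not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
feat_dim, n_base, n_novel = 1024, 60, 20  # hypothetical sizes

# Classifier weights learned during base-class training (stand-in values).
base_weights = rng.normal(0.0, 0.01, size=(n_base, feat_dim))

# Option A: randomly initialize the novel-class rows.
novel_random = rng.normal(0.0, 0.01, size=(n_novel, feat_dim))

# Option B: copy the rows from a predictor fine-tuned on the novel set
# alone (simulated here with random numbers as a placeholder).
novel_pretrained = rng.normal(0.0, 0.01, size=(n_novel, feat_dim))

def build_classifier(novel_rows):
    """Stack base rows with the chosen novel rows into one
    (n_base + n_novel)-way classifier for balanced fine-tuning."""
    return np.concatenate([base_weights, novel_rows], axis=0)

W = build_classifier(novel_pretrained)  # or build_classifier(novel_random)
print(W.shape)  # (80, 1024)
```

Either way, the combined classifier is then jointly fine-tuned on the balanced few-shot dataset; only the starting point of the novel-class rows differs.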
Hope it helps.
Feel free to reopen it if there is still an issue.
Just to clarify: fine-tuning a predictor on the novel set means training only on the novel set, without any images of the base classes. Then you resume training of this model on a balanced set of both base classes and novel classes.
Hello. First, thank you for providing the code!
And my question is, what is the meaning of "fine-tuning a predictor on the novel set and using the classifier's weights as initialization"?
Is that process the same as "few-shot fine-tuning"? Is that "novel set" different from the "novel classes"?
Best regards,