littlepae opened this issue 2 years ago
Hi @littlepae, thank you for appreciating EasyFSL!
In TransductiveFinetuning.__init__() we set the backbone to not require grad (here), so when you try to compute the gradient during loss.backward(), it raises this error.
Before training, call model.requires_grad_(True), and then for evaluation call model.requires_grad_(False).
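To make the fix concrete, here is a minimal sketch of the toggle described above, using a tiny `nn.Sequential` as a hypothetical stand-in for the tutorial's backbone (the names are illustrative, not EasyFSL's):

```python
import torch
from torch import nn

# Hypothetical tiny backbone standing in for the tutorial's CNN.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 3))

# Before the training loop: make every parameter track gradients,
# undoing the freezing done in the method's __init__.
model.requires_grad_(True)
assert all(p.requires_grad for p in model.parameters())

# ... run the classical training loop here ...

# Before evaluation: freeze the parameters again.
model.requires_grad_(False)
assert not any(p.requires_grad for p in model.parameters())
```

`nn.Module.requires_grad_` flips the flag on every parameter recursively, so one call on the wrapping model is enough.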
Silently disabling gradients for the backbone in the initialization of a transductive method is clearly bad practice: it causes errors like this one, which are hard for users to debug. Either we freeze the backbone only during the forward
method and unfreeze it afterwards, or we signal the freezing in the logs. Option 1 seems best.
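Option 1 could look roughly like the following sketch: a hypothetical wrapper (not EasyFSL's actual implementation) that saves each parameter's `requires_grad` state, freezes the backbone only while `forward` runs, and restores the state afterwards:

```python
import torch
from torch import nn

class FreezeBackboneDuringForward(nn.Module):
    """Sketch of option 1: the backbone is frozen only while forward runs,
    and each parameter's previous requires_grad state is restored afterwards.
    Hypothetical wrapper for illustration, not EasyFSL code."""

    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone

    def forward(self, x):
        # Remember each parameter's state, freeze, compute, then restore.
        saved = [p.requires_grad for p in self.backbone.parameters()]
        self.backbone.requires_grad_(False)
        try:
            return self.backbone(x)
        finally:
            for p, state in zip(self.backbone.parameters(), saved):
                p.requires_grad_(state)

backbone = nn.Linear(4, 2)
wrapper = FreezeBackboneDuringForward(backbone)
out = wrapper(torch.randn(3, 4))
# After forward, the backbone is trainable again.
assert all(p.requires_grad for p in backbone.parameters())
```

The `try`/`finally` guarantees the flags are restored even if the forward pass raises, so the freezing never leaks out of the method.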
I'm marking this as enhancement, thank you for pointing this out to me!
Problem
Hello. This is a nice repo, and it made it easy for me to understand and implement FSL models in my project. But I would like to ask: how do I implement and train the Transductive Fine-tuning model?
Since this model uses classical training (if I understand correctly), I used the same classical-training tutorial and just replaced
PrototypicalNetworks
with TransductiveFinetuning
in few_shot_classifier()
But in the training stage, this error shows up:
So I commented out the
loss.backward()
and optimizer.step()
lines, since they already exist in the forward
function of the Transductive Fine-tuning model. But the next problem is that the model's loss does not decrease during training, and I still don't know why. Can you tell me how to fix this error, and how to implement Transductive Fine-tuning (or other classical-training models) in EasyFSL?
Thanks a lot