TRANDONGXYZ closed this issue 1 year ago
Hi, the optimizer only optimizes capnet, so the VGG's weights will not change:
optimizer = Adam(capnet.parameters(), lr=opt.lr, betas=(opt.beta1, 0.999))
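This behavior can be illustrated with a minimal sketch. The two `nn.Linear` modules below are stand-ins for the real VGG extractor and capsule network; the point is only that an optimizer updates the parameters it was constructed with, so weights outside it stay fixed even though they still receive gradients:

```python
import torch
import torch.nn as nn

vgg_ext = nn.Linear(4, 4)   # stand-in for the VGG feature extractor
capnet = nn.Linear(4, 2)    # stand-in for the capsule network

# Only capnet's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(capnet.parameters(), lr=1e-3, betas=(0.9, 0.999))

before = vgg_ext.weight.clone()
x = torch.randn(8, 4)
loss = capnet(vgg_ext(x)).sum()
loss.backward()
optimizer.step()

# vgg_ext still accumulates gradients, but its weights never change,
# because the optimizer holds no reference to them.
print(torch.equal(before, vgg_ext.weight))  # True
```

So "freezing" here happens implicitly through the optimizer's parameter list, not through `requires_grad`.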
Regarding the weight-freezing code: it was used for another experiment in which I tried to fine-tune part of the VGG (not included in this repository).
Why are the weights of VGG-19 not saved after training? Does that affect the testing process?
You have initialized vgg_ext and capnet, but you only save capnet.
Also, the code doesn't really freeze the first 9 layers, because when we check with the command, all parameters are equal to True.
Thank you.
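For reference, explicitly freezing the first 9 layers would mean setting `requires_grad = False` on their parameters, which is what the check above would detect. A sketch, using a stack of convolutions as a stand-in for the VGG-19 feature extractor (the real module's layer layout may differ):

```python
from itertools import islice
import torch.nn as nn

# Stand-in for the VGG-19 feature extractor (12 conv layers).
vgg_ext = nn.Sequential(*[nn.Conv2d(3, 3, 3, padding=1) for _ in range(12)])

# Explicitly freeze the first 9 layers.
for layer in islice(vgg_ext.children(), 9):
    for p in layer.parameters():
        p.requires_grad = False

# The check described in the question: without the loop above,
# every entry in this list would be True.
print([p.requires_grad for p in vgg_ext.parameters()])
```

If all entries print True, no layer is frozen via `requires_grad`; in this repository the VGG is kept fixed only because its parameters are never given to the optimizer.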