sksq96 / pytorch-summary

Model summary in PyTorch similar to `model.summary()` in Keras
MIT License

In transfer learning as fixed feature extractor, all parameters are counted as trainable parameters. #156

Open brpy opened 3 years ago

brpy commented 3 years ago

The parameters of layers with `requires_grad=False` are counted as trainable parameters.
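
For reference, a minimal sketch of how the two counts can be computed directly from a module (the helper name is my own):

```python
import torch.nn as nn

def count_params(model: nn.Module):
    # A frozen parameter has requires_grad=False and should be
    # excluded from the trainable count.
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    return total, trainable
```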

esherman9 commented 3 years ago

Could you explain the situation a bit more? Did you encounter this after creating a new model or after loading from a saved checkpoint? If recreating from a save, my understanding is that the `requires_grad` flags are re-initialized, since they are not stored in the `state_dict` but on the `nn.Parameter` objects themselves. https://discuss.pytorch.org/t/how-to-save-the-requires-grad-state-of-the-weights/52906
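
A quick sketch of what I mean, with a toy module standing in for the real model:

```python
import torch.nn as nn

# Freeze a parameter, then round-trip through a state_dict.
model = nn.Linear(4, 2)
model.weight.requires_grad = False

state = model.state_dict()        # stores tensor values only, no flags
restored = nn.Linear(4, 2)
restored.load_state_dict(state)   # copies values into the new module's params

print(model.weight.requires_grad)     # False
print(restored.weight.requires_grad)  # True -- the flag was not restored
```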

brpy commented 3 years ago

Thanks for the reply. It was a VGG16 model in which I set `requires_grad=False` for most of the initial layers. I didn't use a checkpoint.

The model worked as intended, but torchsummary counted the non-trainable layers as trainable and reported a huge number of trainable parameters.
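
From memory, the setup was roughly this (the exact layers I froze may have differed):

```python
import torchvision.models as models
from torchsummary import summary

# Fixed feature extractor: freeze the convolutional backbone.
# (Pretrained weights are not needed just to reproduce the counts.)
model = models.vgg16()
for param in model.features.parameters():
    param.requires_grad = False

# Counting directly: only the classifier parameters remain trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable (manual): {trainable:,}")

# torchsummary, however, reports every parameter as trainable here.
summary(model, (3, 224, 224), device="cpu")
```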

Sorry, it was a while ago that I encountered this, so I cannot provide more info.