Honestly, the pytorch-summary implementation is a bit strange. The usual way to get the total number of parameters is:
```python
total_num_params = sum(p.numel() for p in model.parameters())
trainable_num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
```
Here the numel() method returns exactly what you want: the total number of elements in the tensor.
This works perfectly with convkan.
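For instance, here is a minimal self-contained check. It uses a plain nn.Conv2d stack as a stand-in, since the counting logic above is identical for any nn.Module, convkan layers included:

```python
import torch.nn as nn

# Toy stand-in model; any nn.Module (including a convkan network) is counted the same way
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))

# Freeze the first conv layer so the total and trainable counts differ
for p in model[0].parameters():
    p.requires_grad = False

total_num_params = sum(p.numel() for p in model.parameters())
trainable_num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(total_num_params, trainable_num_params)  # prints: 5088 4640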
EDIT: by the way, you can check out torchinfo, the successor of pytorch-summary.
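If you go that route, a minimal sketch of a torchinfo call looks like this (the input_size here is a made-up example shape; adjust it to your model):

```python
from torchinfo import summary

# Hypothetical input shape: (batch, channels, height, width)
summary(model, input_size=(1, 3, 32, 32))
```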
Using pytorch-summary, I saw the parameter count reported as zero, but I also found a solution. See this issue: https://github.com/sksq96/pytorch-summary/issues/65