sksq96 / pytorch-summary

Model summary in PyTorch similar to `model.summary()` in Keras
MIT License

Forward/backward pass size with relu layers #163

Open xtyDoge opened 3 years ago

If the ReLU layer is implemented with `torch.nn.ReLU(inplace=True)`, it doesn't allocate any additional memory during the forward pass. But its output is still counted in the forward/backward pass size summary.
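
A minimal sketch of the behavior being described: with `inplace=True`, the ReLU output is the same tensor as its input (same underlying storage), so no extra activation memory is allocated, whereas the default `nn.ReLU()` produces a fresh tensor. The layer shapes and sizes below are arbitrary examples.

```python
import torch
import torch.nn as nn

x = torch.randn(4, 8)

# In-place ReLU mutates and returns its input tensor:
# the "output" shares the input's storage, so no extra memory is used.
inplace_out = nn.ReLU(inplace=True)(x)
print(inplace_out.data_ptr() == x.data_ptr())  # True: same storage

y = torch.randn(4, 8)

# The default ReLU allocates a new output tensor.
plain_out = nn.ReLU()(y)
print(plain_out.data_ptr() == y.data_ptr())  # False: separate storage
```

A shape-based summary (like pytorch-summary's) counts the output size of every registered layer, so it reports the same forward/backward pass size either way, even though the in-place variant adds no real memory cost.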