Open xtyDoge opened 3 years ago
If a ReLU layer is implemented with `torch.nn.ReLU(inplace=True)`, it does not allocate any additional memory during the forward pass, but its output is still counted in the memory summary.
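For reference, a minimal sketch demonstrating the behavior in question: with `inplace=True`, the ReLU reuses the input tensor's storage rather than allocating a new output tensor, so a summary tool that sums per-layer output sizes would over-count the activation memory.

```python
import torch
import torch.nn as nn

# In-place ReLU overwrites the input tensor's storage instead of
# allocating a fresh output tensor during the forward pass.
relu = nn.ReLU(inplace=True)

x = torch.randn(2, 3)
ptr_before = x.data_ptr()
y = relu(x)

# The output shares the same underlying storage as the input,
# so no extra activation memory is allocated for this layer.
assert y.data_ptr() == ptr_before
print(y is x)  # True: the input tensor itself was returned
```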