zhanghang1989 / PyTorch-Encoding

A CV toolkit for my papers.
https://hangzhang.org/PyTorch-Encoding/
MIT License

About inplace in ReLU and Dropout #399

Open TomMao23 opened 3 years ago

TomMao23 commented 3 years ago

I noticed that a lot of your code is written like this:

nn.ReLU(True),           # inplace=True
nn.Dropout(0.1, False),  # p=0.1, inplace=False

Why do you use the inplace option in the activation function but not in dropout? Does this have any special meaning?
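For context, here is a minimal sketch (my own illustration, not code from this repo) of what the inplace flag does in both modules, assuming a recent PyTorch:

```python
import torch
import torch.nn as nn

# inplace=True: the module overwrites its input tensor instead of
# allocating a new one, saving memory but mutating the input.
x = torch.randn(4)
relu_inplace = nn.ReLU(inplace=True)
y = relu_inplace(x)
print(y is x)   # True: x itself was modified in place

# inplace=False (the default): a new output tensor is returned and
# the input is left untouched.
x2 = torch.randn(4)
relu_copy = nn.ReLU(inplace=False)
y2 = relu_copy(x2)
print(y2 is x2) # False: x2 is unchanged

# nn.Dropout takes the same flag as its second argument, so
# nn.Dropout(0.1, False) is just the default out-of-place behavior
# with drop probability 0.1.
drop = nn.Dropout(p=0.1, inplace=False)
z = drop(torch.randn(4))
```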