There's another activation on the return statement. See https://github.com/e-lab/ENet-training/blob/master/train/models/decoder.lua#L96
Thanks for your answer. Perhaps I didn't explain it clearly. The activation on the return statement is the one marked in red. What I mean is that you add an unnecessary PReLU after the 1x1 expansion block, marked in blue, see https://github.com/davidtvs/PyTorch-ENet/blob/a67d048ec837849eb79dfb8ec51b629a9738b362/models/enet.py#L440 @davidtvs
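For reference, here is a minimal sketch of the ordering I'm describing (the class name and channel sizes are illustrative, not the repository's actual code): PReLU sits between the convolutions of the extension branch, the 1x1 expansion is followed only by BatchNorm and dropout, and the only activation after it is the one applied to the sum with the main branch.

```python
import torch.nn as nn

class RegularBottleneckSketch(nn.Module):
    """Illustrative sketch of the regular bottleneck's extension branch."""

    def __init__(self, channels=64, internal_ratio=4, dropout_prob=0.1):
        super().__init__()
        internal = channels // internal_ratio

        self.ext_branch = nn.Sequential(
            # 1x1 projection, followed by BN + PReLU (between convolutions)
            nn.Conv2d(channels, internal, kernel_size=1, bias=False),
            nn.BatchNorm2d(internal),
            nn.PReLU(),
            # main 3x3 convolution, again followed by BN + PReLU
            nn.Conv2d(internal, internal, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(internal),
            nn.PReLU(),
            # 1x1 expansion: BN and spatial dropout only, no PReLU here
            nn.Conv2d(internal, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.Dropout2d(p=dropout_prob),
        )
        # the activation on the return statement: applied after the addition
        self.out_activation = nn.PReLU()

    def forward(self, x):
        return self.out_activation(x + self.ext_branch(x))
```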
You are correct. I see it now. I'll make the change in the repository when I manage to retrain the corrected models.
Thanks for raising this to my attention!
Fixed now. Thanks!
@davidtvs The official code does not have any activation function in the 1x1 expansion block of any bottleneck. The paper says that Batch Normalization and PReLU are placed between all convolutions, not after all of them. This is the official code fragment for the bottleneck block; I cannot find any activation function after the region marked in red. I hope you can look into this.