davidtvs / PyTorch-ENet

PyTorch implementation of ENet
MIT License
389 stars · 129 forks

a question about UpsamplingBottleneck block #50

Closed cedricgsh closed 3 years ago

cedricgsh commented 3 years ago

@davidtvs The official code does not have any activation function in the 1x1 expansion block of any Bottleneck block. The paper says that Batch Normalization and PReLU are placed between all convolutions, not after all of them. Below is the relevant fragment of the official code for the Bottleneck block; I cannot find any activation function after the region marked in red. I hope you can check this problem.

[screenshot of the official Torch Bottleneck code]

davidtvs commented 3 years ago

There's another activation on the return statement. See https://github.com/e-lab/ENet-training/blob/master/train/models/decoder.lua#L96

cedricgsh commented 3 years ago

Thanks for your answer. Maybe my expression was not very clear. The activation on the return statement is the one marked in red. What I mean is that you add a needless PReLU after the 1x1 expansion block, marked in blue; see https://github.com/davidtvs/PyTorch-ENet/blob/a67d048ec837849eb79dfb8ec51b629a9738b362/models/enet.py#L440

[screenshot of enet.py with the extra PReLU marked in blue]

@davidtvs
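To make the structural point concrete, here is a minimal sketch of a bottleneck's extension branch with the discussed placement: each convolution is followed by BatchNorm, PReLU sits *between* convolutions, the 1x1 expansion ends with BatchNorm only, and the single output PReLU comes after the residual addition (as in the official Torch decoder). `SimplifiedBottleneck` is a hypothetical, stripped-down module for illustration, not the actual `RegularBottleneck` from `enet.py`:

```python
# Hypothetical simplified bottleneck illustrating the activation placement
# discussed in this issue; not the actual RegularBottleneck from enet.py.
import torch
import torch.nn as nn


class SimplifiedBottleneck(nn.Module):
    def __init__(self, channels, internal_ratio=4):
        super().__init__()
        internal = channels // internal_ratio
        self.ext_branch = nn.Sequential(
            # 1x1 projection: conv -> BN -> PReLU
            nn.Conv2d(channels, internal, kernel_size=1, bias=False),
            nn.BatchNorm2d(internal),
            nn.PReLU(),
            # 3x3 main convolution: conv -> BN -> PReLU
            nn.Conv2d(internal, internal, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(internal),
            nn.PReLU(),
            # 1x1 expansion: conv -> BN only; no PReLU here (the point of this issue)
            nn.Conv2d(internal, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # The single activation is applied after the residual addition
        self.out_activation = nn.PReLU()

    def forward(self, x):
        return self.out_activation(x + self.ext_branch(x))


x = torch.randn(1, 16, 8, 8)
y = SimplifiedBottleneck(16)(x)
print(tuple(y.shape))  # (1, 16, 8, 8)
```

With this layout the extension branch contributes a pre-activation residual, and the PReLU on the return path is the only nonlinearity after the expansion, matching the red-marked region in the official code.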

davidtvs commented 3 years ago

You are correct. I see it now. I'll make the change in the repository when I manage to retrain the corrected models.

Thanks for raising this to my attention!

davidtvs commented 3 years ago

Fixed now. Thanks!