ZitongYu / CDCN

Central Difference Convolutional Networks (CVPR'20)

ReLU Activation for depth map. Why not sigmoid? #39

Closed · himansh1314 closed this issue 2 years ago

himansh1314 commented 3 years ago

Why is the ReLU activation function used as the final activation in the lastconv layers? Shouldn't it be sigmoid, since the values of the depth map need to be in the range 0–255? ReLU is an unbounded function, so wouldn't sigmoid be a better activation for depth map generation?
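For context, here is a minimal PyTorch sketch (not the repository's code) contrasting the two choices being discussed. The plain `nn.Conv2d`, the channel count, and the class names are illustrative assumptions; CDCN's actual lastconv layers are built from its `Conv2d_cd` blocks.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: two possible final depth-regression heads.
# ReLU leaves the output unbounded above; sigmoid squashes it into (0, 1),
# which matches a depth label normalized to [0, 1] (e.g. divided by 255).

class DepthHeadReLU(nn.Module):
    def __init__(self, in_channels=128):          # in_channels is an assumption
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 1, kernel_size=3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.conv(x))             # values in [0, +inf)


class DepthHeadSigmoid(nn.Module):
    def __init__(self, in_channels=128):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 1, kernel_size=3, padding=1)
        self.act = nn.Sigmoid()

    def forward(self, x):
        return self.act(self.conv(x))             # values in (0, 1)


if __name__ == "__main__":
    feat = torch.randn(2, 128, 32, 32)            # dummy feature map
    print(DepthHeadReLU()(feat).max())            # can exceed 1
    print(DepthHeadSigmoid()(feat).max())         # always below 1
```

Whether sigmoid actually helps depends on how the ground-truth depth labels are scaled during training (raw 0–255 values versus values normalized to [0, 1]).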

milky-tea-1997 commented 2 years ago

Excuse me, I have the same question. Do you know the reason?