TimoSaemann / ENet

ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation

Shouldn't BN layers be removed in the final prototxt? [BUG?] #30

Closed mitalbert closed 7 years ago

mitalbert commented 7 years ago

The tutorial says that after running the BN absorber, the BN layers should be removed from the final prototxt files. However, the script removed only the dropout layers and changed the BN layers' parameters; it did not remove the BN layers themselves. They are also not removed in https://github.com/TimoSaemann/ENet/blob/master/prototxts/enet_deploy_final.prototxt

Edit: Apparently only the BN layers that directly follow conv layers are merged. The remaining 3 BN layers make no big difference, which is why I'm closing the issue.
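For context, "absorbing" a BN layer into the preceding conv layer works because both are affine transforms at inference time, so they can be collapsed into a single conv with rescaled weights and a shifted bias. The sketch below is not the repo's BN absorber script; it is a minimal NumPy illustration of the folding math, with hypothetical parameter names (`W`, `b`, `gamma`, `beta`, `mean`, `var`):

```python
import numpy as np

def fold_bn_into_conv(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm parameters into the preceding conv layer.

    W: conv weights with output channels on axis 0, e.g. (C_out, C_in, kH, kW)
    b: conv bias, shape (C_out,)
    gamma, beta, mean, var: BN scale, shift, running mean/variance, shape (C_out,)

    BN(conv(x)) = gamma * (W*x + b - mean) / sqrt(var + eps) + beta
                = (scale * W) * x + (scale * (b - mean) + beta)
    """
    scale = gamma / np.sqrt(var + eps)                      # per-channel factor
    W_folded = W * scale.reshape(-1, *([1] * (W.ndim - 1))) # rescale each filter
    b_folded = (b - mean) * scale + beta                    # absorb mean/shift
    return W_folded, b_folded
```

This is exactly why only BN layers that directly follow a conv layer can be removed: the folding needs a conv (or other affine) layer immediately before the BN to absorb the scale and shift into.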