titu1994 / DenseNet

DenseNet implementation in Keras
MIT License

"#14 Fix missing link from last up block to Conv2D" #15

Closed bguisard closed 7 years ago

bguisard commented 7 years ago

Including 'x_up' in the upsampling dense blocks allows the 'include_top' layer to get the full feature map at the end of the upsampling branch.

This commit also changes the 'concatenate' inside __dense_block, as the previous method was failing to run on the TensorFlow backend.

This was tested using Keras 2.0.2 and TensorFlow 1.1.0
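For reference, the per-layer concatenation pattern inside a dense block can be sketched with NumPy stand-ins (illustrative only — the real `__dense_block` uses Keras layers, and the `nb_layers`/`growth_rate` values here are made up):

```python
import numpy as np

def dense_block(x, nb_layers, growth_rate):
    """Illustrative dense block: each 'layer' is stubbed as a random
    feature map with `growth_rate` channels; the key point is that every
    layer's output is concatenated onto the running feature map along the
    channel axis (channels-last, as on the TensorFlow backend)."""
    for _ in range(nb_layers):
        # stand-in for the real BN -> ReLU -> Conv2D(growth_rate, 3x3) path
        new_features = np.random.rand(*x.shape[:-1], growth_rate)
        # concatenate the new features onto the accumulated tensor
        x = np.concatenate([x, new_features], axis=-1)
    return x

x = np.zeros((1, 8, 8, 16))  # batch, H, W, 16 input channels
out = dense_block(x, nb_layers=4, growth_rate=12)
print(out.shape)             # (1, 8, 8, 64): 16 + 4 * 12 channels
```

In Keras 2 the equivalent per-iteration call is `keras.layers.concatenate([x, new_features], axis=concat_axis)`.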

I was having issues running the concatenate with your list method. My code works on the TensorFlow backend, but I didn't test it with the Theano backend.

I also double-checked that the dimensions make sense by comparing against the Lasagne implementation from Simon Jégou. The only thing that is still odd is that this has fewer connections than the table in their paper suggests. I trained it to find vehicles in a video stream and got very good results, but it's possible that something is still missing.

titu1994 commented 7 years ago

Lol. I got the "bug". The default upsampling technique is subpixel convolution. However, if you switch to upsample_type="deconv", you will now get exactly 9.4 million parameters.

Sorry for the confusion, and I will push the change now.
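The parameter gap between the two upsampling modes comes down to the conv that feeds the pixel shuffle: for an upscale factor r, a subpixel block needs r² times as many output channels as a plain transposed conv of the same kernel size. A rough sketch (the channel counts here are illustrative, not the actual network's, and this is a generic pixel shuffle, not the repo's exact SubPixel layer):

```python
import numpy as np

def depth_to_space(x, r):
    """Rearrange an (H, W, C*r*r) tensor into (H*r, W*r, C) — the pixel
    shuffle at the heart of subpixel upsampling."""
    h, w, crr = x.shape
    c = crr // (r * r)
    x = x.reshape(h, w, r, r, c)
    x = x.transpose(0, 2, 1, 3, 4)  # interleave the r-blocks into rows/cols
    return x.reshape(h * r, w * r, c)

# Weight counts for 2x upsampling with k=3 kernels, C channels in and out
# (biases ignored; numbers are illustrative, not the 9.4M breakdown):
C, k, r = 16, 3, 2
subpixel_params = k * k * C * (C * r * r)  # conv producing C*r*r channels
deconv_params   = k * k * C * C            # a single transposed conv
print(subpixel_params, deconv_params)      # 9216 vs 2304
```

So with everything else equal, a subpixel path carries r² times the weights of the deconv path, which is why switching upsample_type changes the total parameter count.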

bguisard commented 7 years ago

Nice! I completely forgot that we are using a different upsampling method. Great to see that it's all working properly.

Good luck with your project.