ShichenLiu / CondenseNet

CondenseNet: Lightweight CNN for mobile devices
MIT License

Full Dense Connectivity #17

Closed ghost closed 5 years ago

ghost commented 6 years ago

Hi, this is not an issue report. I am trying to reproduce CondenseNet in TensorFlow. In which part of the code is the statement from the paper, "we connect input layers to all subsequent layers in the network, even if these layers are located in different dense blocks", implemented? I can only find one place inside the dense layers where the torch.cat function is used, and there it only connects the inputs within a single dense block. Thanks in advance.

ShichenLiu commented 5 years ago

Hi Elio,

When we talk about full connectivity, we mean the connectivity between two dense blocks. DenseNets use transition blocks to reduce the number of channels (i.e., halve them with a 1x1 convolution) for better computational efficiency. Since CondenseNet targets a smaller computational budget and the channel counts are relatively small, transition blocks are no longer needed. Full connectivity is then achieved simply by pooling the outputs of the previous blocks to the current resolution and concatenating them.
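
To make the idea concrete, here is a minimal PyTorch sketch (not the repository's actual code; the module names and channel sizes are made up for illustration) of replacing DenseNet's 1x1-conv transition layers with pooling plus concatenation of all earlier block outputs:

```python
# Sketch: full dense connectivity across blocks.
# Instead of a transition layer, earlier block outputs are average-pooled
# to the current spatial size and concatenated before the next block.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FullyDenseNet(nn.Module):
    def __init__(self, block_channels=(64, 128, 256)):
        super().__init__()
        self.stem = nn.Conv2d(3, block_channels[0], 3, padding=1)
        self.blocks = nn.ModuleList()
        in_ch = block_channels[0]
        for out_ch in block_channels:
            # each entry is a stand-in for a whole dense block
            self.blocks.append(nn.Sequential(
                nn.BatchNorm2d(in_ch), nn.ReLU(inplace=True),
                nn.Conv2d(in_ch, out_ch, 3, padding=1)))
            # the next block sees this block's output plus all earlier ones
            in_ch += out_ch

    def forward(self, x):
        features = [self.stem(x)]          # outputs of all previous stages
        for i, block in enumerate(self.blocks):
            h, w = features[-1].shape[2:]
            if i > 0:
                h, w = h // 2, w // 2      # each later stage halves the resolution
            # pool every stored feature map to the current resolution and
            # concatenate: this is what replaces the transition layers
            pooled = [F.adaptive_avg_pool2d(f, (h, w)) for f in features]
            features.append(block(torch.cat(pooled, dim=1)))
        return features[-1]
```

The key point is that `torch.cat` operates on feature maps coming from different blocks, which is only possible because pooling has first brought them to a common spatial resolution.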