tdeboissiere / DeepLearningImplementations

Implementation of recent Deep Learning papers
MIT License

concat_axis missing for the transition layer call inside Densenet #77

Open haramoz opened 6 years ago

haramoz commented 6 years ago

Hello,

I am trying to learn how to use DenseNet, so I started with this repository. When I run the code, the call to the transition layer inside the DenseNet function raises an error. Looking at line 176, `x = transition(x, nb_filter, dropout_rate=dropout_rate,...)` clearly does not pass `concat_axis` as the second argument, even though the signature of the transition function requires it. Do you think that is a mistake? Could you please let me know? Thank you very much.

Best, Haramoz

XiangxiangXu commented 6 years ago

I ran into the same problem and fixed it by adding the `concat_axis` argument manually.
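For reference, a minimal sketch of why the original call fails and what the fix looks like. The signature below is a hypothetical stand-in (it only echoes its arguments), not the repo's actual `transition` implementation:

```python
# Hypothetical stand-in for the DenseNet transition block (assumed argument
# order: x, concat_axis, nb_filter, ...), used to illustrate the bug.
def transition(x, concat_axis, nb_filter, dropout_rate=None, weight_decay=1e-4):
    """Echo the arguments so we can inspect how Python bound them."""
    return {"concat_axis": concat_axis, "nb_filter": nb_filter,
            "dropout_rate": dropout_rate, "weight_decay": weight_decay}

nb_filter = 16
concat_axis = -1  # channels-last backends keep the feature axis last

# Call as written on line 176: concat_axis is omitted, so Python binds
# nb_filter to the concat_axis slot and reports nb_filter as missing.
try:
    transition("x", nb_filter, dropout_rate=0.2)
except TypeError:
    buggy_call_failed = True

# Fixed call: concat_axis passed explicitly as the second positional argument.
fixed = transition("x", concat_axis, nb_filter, dropout_rate=0.2)
assert buggy_call_failed
assert fixed["concat_axis"] == -1 and fixed["nb_filter"] == 16
```

In the real code the fix is the same one-line change: insert `concat_axis` between `x` and `nb_filter` in the call.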

haramoz commented 6 years ago

Ah, thanks for writing back. Yes, that fixed the issue; I had done it the same way. However, I now have other concerns about the implementation: where are the bottleneck layers? And instead of a compression parameter we only have weight decay here. I am in the middle of evaluating this implementation and may soon move to another one or write my own. I just wanted to share my thoughts, @XiangxiangXu. Let me know if you agree or disagree. Cheers!