Closed: sdudeck closed this issue 6 years ago
Yes indeed, there is a bug in `CntkDenseConverter` in `cntk_converters.py`. This line:
`if self.activation:`
should be:
`if self.activation is not None:`
I'll push a fix.
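For anyone who runs into the same pitfall: in Python, `if x:` tests truthiness, not presence. If the activation happens to be stored as something falsy, for example an enum member whose underlying value is 0, the truthy check silently drops it, whereas `is not None` only skips a genuinely absent activation. A minimal sketch of the failure mode (the `ActivationType` enum below is hypothetical, not the actual ELL type):

```python
from enum import IntEnum

# Hypothetical stand-in for how an importer might encode activations;
# the real field in cntk_converters.py may be a different type.
class ActivationType(IntEnum):
    relu = 0      # bool(ActivationType.relu) is False, since IntEnum is an int
    sigmoid = 1
    tanh = 2

activation = ActivationType.relu

if activation:                 # buggy check: relu is falsy, so it is skipped
    print("truthiness check: activation applied")

if activation is not None:     # fixed check: only a missing activation is skipped
    print("is-not-None check: activation applied")
```

Only the second branch fires for `relu`, which matches the reported symptom: the activation silently vanishes from the exported model.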
Thanks a lot - changing this line did the trick :-)
Hello, I am trying to import a simple CNTK model consisting of a single fully connected layer (18 inputs, 2 outputs, relu activation). It seems as if the ELL import skips the activation function entirely: regardless of which activation function I define for the layer (cntk.tanh, cntk.sigmoid, cntk.relu, None), the resulting ELL JSON file looks exactly the same (apart from the weights). Is this a known issue? I think my problem with importing a darknet model a couple of months ago was quite similar, in that the activation layer was not imported properly.
Thank you very much, Sven
Model_ell.zip Model_cntk.zip
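For reference, a minimal sketch of the kind of model described above, assuming the CNTK 2.x Python API (the file name is illustrative, not taken from the attachments):

```python
import cntk

# One fully connected layer: 18 inputs, 2 outputs, relu activation,
# matching the model described in the report.
features = cntk.input_variable(18)
model = cntk.layers.Dense(2, activation=cntk.relu)(features)

# Save in the CNTK model format that the ELL importer consumes.
model.save('Model.cntk')
```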