masakinakada opened 7 years ago
Hi! Right now, softmax and ReLU activations are implemented. Adding a linear activation should be easy. Would you add it? It would go in the function keras::LayerActivation::compute_output.
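For reference, a linear activation is just the identity, so the change amounts to passing the input through unchanged. Below is a minimal standalone sketch of that idea; the free-function form, parameter names, and `std::vector<double>` representation are assumptions for illustration, not the actual `keras::LayerActivation::compute_output` signature.

```cpp
#include <algorithm>
#include <cmath>
#include <string>
#include <vector>

// Sketch of an activation layer's compute step (hypothetical API, not
// the real keras2cpp interface): applies relu, softmax, or linear.
std::vector<double> compute_output(const std::string& activation,
                                   const std::vector<double>& input) {
    std::vector<double> out = input;
    if (activation == "relu") {
        // ReLU: clamp negative values to zero.
        for (double& v : out) v = std::max(0.0, v);
    } else if (activation == "softmax") {
        // Softmax: exponentiate and normalize so outputs sum to 1.
        double sum = 0.0;
        for (double& v : out) { v = std::exp(v); sum += v; }
        for (double& v : out) v /= sum;
    }
    // "linear": no branch needed; the input is returned unchanged.
    return out;
}
```

Adding "linear" this way requires no new code path at all, only making sure the unknown-activation case passes data through instead of failing.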
Yes, I added it! As you said, the linear layer was easy. I just wanted to make sure the other parts are fine too!
Thank you, @pplonski!
Great! Would you like to pull your changes?
push my change?
Sure, please make a pull request.
Hi @pplonski, thank you so much for this amazing Keras converter. I have an MLP trained with Keras and want to use it in C++.
Does this work with an MLP that uses ReLU and linear activation functions? I noticed you mentioned you focused on CNNs.
Thank you very much in advance. Masaki