pluskid / Mocha.jl

Deep Learning framework for Julia

back-propagation for softmax layer #160

Closed jeff-regier closed 8 years ago

jeff-regier commented 8 years ago

Does SoftmaxLayer not support back-propagation because you haven't needed that functionality yet, perhaps because softmax is usually followed by a logistic loss, so you can use SoftmaxLossLayer instead? (Do you mainly use SoftmaxLayer to normalize blobs from a data layer?) Or is implementing back-propagation for SoftmaxLayer likely to be challenging (or to not scale well) for some reason? I'll work on it and submit a PR if you don't foresee any problems.
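For reference, the backward pass has a simple closed form: with `y = softmax(x)` and upstream gradient `dy = ∂L/∂y`, the softmax Jacobian `y_i * (δ_ij - y_j)` gives `∂L/∂x_i = y_i * (dy_i - Σ_j dy_j * y_j)`. A minimal sketch in plain Julia (illustration only, not Mocha.jl's actual implementation; the name `softmax_backward` and its signature are hypothetical):

```julia
# Minimal sketch of a softmax backward pass (not Mocha.jl's code).
# `y` is the forward output softmax(x) computed along `dims`, and
# `dy` is the upstream gradient ∂L/∂y.
function softmax_backward(y::AbstractArray, dy::AbstractArray; dims=1)
    # Per-sample inner product Σ_j dy_j * y_j, broadcast back over `dims`.
    s = sum(dy .* y; dims=dims)
    # ∂L/∂x_i = y_i * (dy_i - Σ_j dy_j * y_j)
    return y .* (dy .- s)
end
```

Note that when softmax feeds directly into a multinomial logistic loss, the combined gradient collapses to `y - t` (prediction minus one-hot target), which is why fusing the two into a single SoftmaxLossLayer is both cheaper and numerically more stable.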

pluskid commented 8 years ago

@jeff-regier back-propagation for softmax was actually implemented a while ago. Sorry, I forgot to update the documentation. Just updated it! Thanks for reporting!