kasungayan opened this issue 6 years ago
Thanks for the suggestion; you could implement it using the universal learner. We also welcome community contributions of new features like this.
How about this? https://github.com/souravsingh/nips-cocob-cntk
Thanks for sharing this, @kkc0923. Perhaps this can be integrated into an upcoming CNTK release, @KeDengMS?
This implementation uses UserLearner, which may not be optimal in performance. It would be better to use the universal learner and write the update as a CNTK expression instead of NumPy. We welcome a third-party pull request that adds it under bindings/python/cntk/contrib/learners, together with the necessary tests.
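For reference, the universal learner lets you express the whole update as a graph of CNTK ops. A minimal sketch of the pattern (with a plain SGD step standing in for the COCOB update, and the 0.01 step size purely illustrative) might look like this:

```python
import cntk as C

def my_update(parameters, gradients):
    # Build the update once as CNTK ops; any learner-specific state (e.g. the
    # COCOB accumulators) would be held in C.constant tensors and updated with
    # C.assign inside the same graph, instead of being tracked in NumPy.
    updates = []
    for p, g in zip(parameters, gradients):
        updates.append(C.assign(p, p - 0.01 * g))  # placeholder SGD step
    return C.combine(updates)

# learner = C.universal(my_update, z.parameters)   # z being the model Function
# trainer = C.Trainer(z, (loss, metric), [learner])
```

Since the update function is traced once into a graph, per-parameter state created inside it persists across minibatches, which is what a COCOB port would rely on.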
Hi Team,
COntinuous COin Betting (COCOB), originally introduced in [1], is a stochastic subgradient descent (SGD) algorithm that does not require setting a learning rate. This would be very handy for deep learning tasks, since there is no learning rate to tune, and the paper reports convergence on par with or faster than well-tuned momentum-based optimizers. Several other deep learning frameworks have added implementations of it recently.
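For context, my reading of the per-coordinate COCOB-Backprop update in [1] is roughly the following (variable names follow the paper, alpha defaults to 100; this is only a sketch of the algorithm, not a CNTK integration):

```python
import numpy as np

def cocob_backprop_step(w, g, state, alpha=100.0):
    # One COCOB-Backprop step; state = (w1, L, G, reward, theta) as in [1].
    w1, L, G, reward, theta = state
    L      = np.maximum(L, np.abs(g))                # running max of |gradient|
    G      = G + np.abs(g)                           # running sum of |gradient|
    reward = np.maximum(reward - g * (w - w1), 0.0)  # accumulated "winnings"
    theta  = theta - g                               # sum of negative gradients
    beta   = theta / (L * np.maximum(G + L, alpha * L))  # betting fraction
    w_new  = w1 + beta * (L + reward)                # note: no learning rate
    return w_new, (w1, L, G, reward, theta)

# Example initial state for a 3-dimensional weight vector:
# w = np.zeros(3)
# state = (w.copy(), np.full(3, 1e-8), np.zeros(3), np.zeros(3), np.zeros(3))
```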
Do you have any plans to integrate this feature into the CNTK stack in the near future?
Thanks in advance.
[1] F. Orabona and T. Tommasi, "Training Deep Networks without Learning Rates Through Coin Betting" (the COCOB paper).