microsoft / CNTK

Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
https://docs.microsoft.com/cognitive-toolkit/

[Feature] Training Deep Networks without Learning Rates Through Coin Betting #2992

Open kasungayan opened 6 years ago

kasungayan commented 6 years ago

Hi Team,

COntinuous COin Betting (COCOB), originally introduced in [1], is a novel algorithm for stochastic subgradient descent (SGD) that does not require any learning-rate setting. This would be very beneficial for deep learning tasks, since there is no learning rate to tune, and the paper reports that it converges considerably faster than traditional momentum-based optimizers. I have seen several other deep learning frameworks implement it recently.
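For reference, the per-coordinate COCOB-Backprop update (Algorithm 2 in [1]) boils down to the following. This NumPy sketch is my own rough transcription; the variable names and the small `eps` guard are mine, so please check it against the paper:

```python
import numpy as np

def cocob_backprop_step(w, grad, state, alpha=100.0, eps=1e-8):
    """One COCOB-Backprop step for a single parameter array (rough transcription of [1]).

    `state` holds per-coordinate arrays: w1 (initial weights), L (max |gradient| seen),
    G (sum of |gradient|), reward (accumulated winnings), theta (sum of negative gradients).
    """
    g = -grad                                             # the paper bets on negative gradients
    state['L']      = np.maximum(state['L'], np.abs(g))   # max |gradient| seen per coordinate
    state['G']     += np.abs(g)                           # sum of |gradient|
    state['reward'] = np.maximum(state['reward'] + (w - state['w1']) * g, 0.0)
    state['theta'] += g
    bet = state['theta'] / (state['L'] * np.maximum(state['G'] + state['L'],
                                                    alpha * state['L']) + eps)
    return state['w1'] + bet * (state['L'] + state['reward'])
```

The step size is derived entirely from these accumulated gradient statistics, which is why there is no learning rate to tune.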

Do you have any plans to integrate this feature into the CNTK stack in the near future?

Thanks in advance.

[1] F. Orabona and T. Tommasi, "Training Deep Networks without Learning Rates Through Coin Betting," NIPS 2017.

ke1337 commented 6 years ago

Thanks for the suggestion; you may implement it using the universal learner. We also welcome community contributions of new features like this.
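As a starting point, a custom rule under the universal learner is just a function from (parameters, gradients) to a CNTK expression built with `C.assign`, along the lines of the AdaGrad-style sketch below. The 0.01 step size and names are arbitrary, and this is not COCOB itself, only the plumbing:

```python
import cntk as C

def my_adagrad(parameters, gradients):
    # Per-parameter accumulators live as constants that C.assign overwrites each step.
    accumulators = [C.constant(0, shape=p.shape, dtype=p.dtype) for p in parameters]
    updates = []
    for p, g, a in zip(parameters, gradients, accumulators):
        accum_new = C.assign(a, a + g * g)                 # running sum of squared gradients
        updates.append(C.assign(p, p - 0.01 * g / C.sqrt(accum_new + 1e-6)))
    return C.combine(updates)

# z is the model function; the learner then plugs into a Trainer like any built-in learner.
learner = C.universal(my_adagrad, z.parameters)
```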

KichangKim commented 6 years ago

How about this? https://github.com/souravsingh/nips-cocob-cntk

kasungayan commented 6 years ago

Thanks for sharing this, @kkc0923. Perhaps this can be integrated into an upcoming CNTK release, @KeDengMS?

ke1337 commented 6 years ago

This implementation uses UserLearner, which may not be optimal in performance. It might be better to use the universal learner, writing the learner as a CNTK expression instead of NumPy (see the sketch below). We welcome a third-party pull request to add it under bindings/python/cntk/contrib/learners, together with the necessary tests.
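To illustrate, a COCOB-style update expressed as CNTK ops for the universal learner might look roughly like this. It is an untested sketch that only shows the overall shape; the state handling, `C.assign` ordering, and the exact COCOB-Backprop formulas would need to be verified against the paper before a real contribution:

```python
import cntk as C

def cocob_update(parameters, gradients, alpha=100.0, eps=1e-8):
    updates = []
    for p, g in zip(parameters, gradients):
        # Per-coordinate COCOB state, held in constants that C.assign overwrites each step.
        w1     = C.constant(p.value)                            # snapshot of the initial weights
        L      = C.constant(eps, shape=p.shape, dtype=p.dtype)  # max |gradient| seen so far
        G      = C.constant(0,   shape=p.shape, dtype=p.dtype)  # sum of |gradient|
        reward = C.constant(0,   shape=p.shape, dtype=p.dtype)  # accumulated winnings
        theta  = C.constant(0,   shape=p.shape, dtype=p.dtype)  # sum of (negative) gradients
        zeros  = C.constant(0,   shape=p.shape, dtype=p.dtype)  # for the max(., 0) clip

        neg_g      = -g
        L_new      = C.element_max(L, C.abs(neg_g))
        G_new      = G + C.abs(neg_g)
        reward_new = C.element_max(reward + (p - w1) * neg_g, zeros)
        theta_new  = theta + neg_g
        bet        = theta_new / (L_new * C.element_max(G_new + L_new, alpha * L_new))

        updates += [C.assign(L, L_new), C.assign(G, G_new),
                    C.assign(reward, reward_new), C.assign(theta, theta_new),
                    C.assign(p, w1 + bet * (L_new + reward_new))]
    return C.combine(updates)

learner = C.universal(cocob_update, z.parameters)  # z being the model function
```

Keeping the whole update on the CNTK graph is what avoids the per-minibatch round trip through NumPy that UserLearner incurs.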