RUCAIBox / RecBole

A unified, comprehensive and efficient recommendation library
https://recbole.io/
MIT License

[💡SUG] Sparse Optimizer training #558

Closed: rowedenny closed this issue 3 years ago

rowedenny commented 3 years ago

Is your feature request related to a problem? Please describe. Some optimizers in PyTorch allow sparse updates, which fit recommendation settings particularly well: each batch only touches a subset of the embedding table. Applying sparse optimization may speed up training.

Describe the solution you'd like Allow users to specify sparse optimizers.

Additional context https://pytorch.org/docs/stable/optim.html?highlight=sparseadam#torch.optim.SparseAdam
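For context, here is a minimal plain-PyTorch sketch (not RecBole code) of what a sparse update looks like: with `sparse=True`, `nn.Embedding` produces sparse gradients, and `torch.optim.SparseAdam` only updates the embedding rows that appear in the batch. The table size, ids, and loss below are made up purely for illustration.

```python
import torch
import torch.nn as nn

# Toy embedding table; sparse=True makes backward produce sparse gradients.
embedding = nn.Embedding(num_embeddings=1000, embedding_dim=32, sparse=True)
optimizer = torch.optim.SparseAdam(embedding.parameters(), lr=0.001)

batch_ids = torch.tensor([3, 17, 42])   # items interacted with in this batch
loss = embedding(batch_ids).sum()       # dummy loss, just to get gradients
loss.backward()                         # embedding.weight.grad is a sparse tensor
optimizer.step()                        # only rows 3, 17 and 42 are updated
optimizer.zero_grad()
```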

EliverQ commented 3 years ago

Hi, @rowedenny. Thank you for your suggestion! We have added the SparseAdam optimizer in #570. You can use it by setting 'learner' to 'sparse_adam' in your .yaml file. If you want to know more about config settings, please see https://recbole.io/docs/user_guide/config_settings.html
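For anyone reading this later, a minimal sketch of the suggested setup, assuming RecBole's `run_recbole` quick-start entry point and the documented `learner` config key; the model and dataset names are placeholders, and SparseAdam only works when the model's embeddings actually produce sparse gradients.

```python
from recbole.quick_start import run_recbole

# Equivalent to writing `learner: sparse_adam` in the .yaml config file.
run_recbole(
    model='BPR',          # placeholder model name
    dataset='ml-100k',    # placeholder dataset name
    config_dict={'learner': 'sparse_adam'},
)
```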