Closed: rowedenny closed this issue 3 years ago
Hi, @rowedenny. Thank you for your suggestion! We have added a SparseAdam optimizer in #570. You can use it by setting `learner` to `sparse_adam` in your .yaml file. If you want to know more about config settings, please see https://recbole.io/docs/user_guide/config_settings.html
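For reference, a minimal config sketch (assuming the standard RecBole YAML layout, where `learner` names the optimizer and `learning_rate` is its companion setting; the value shown is just an example):

```yaml
# .yaml config: select the sparse optimizer added in #570
learner: sparse_adam
learning_rate: 0.001
```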
**Is your feature request related to a problem? Please describe.**
Some optimizers in PyTorch allow sparse updates, which fits recommendation settings especially well: in each batch we only touch part of the embedding table. Applying sparse optimization may speed up training.
**Describe the solution you'd like**
Allow users to specify sparse optimizers.
**Additional context**
https://pytorch.org/docs/stable/optim.html?highlight=sparseadam#torch.optim.SparseAdam
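For anyone landing here, a minimal PyTorch sketch of what the request is about (the table size, batch, and loss are hypothetical; `sparse=True` on the embedding and `torch.optim.SparseAdam` are the relevant pieces):

```python
import torch
import torch.nn as nn

# Toy embedding table; sparse=True makes backward produce sparse gradients
# covering only the rows actually looked up in this batch.
num_items, dim = 1000, 16
embedding = nn.Embedding(num_items, dim, sparse=True)

# SparseAdam applies Adam updates only to the rows that received gradients,
# instead of touching the full table on every step.
optimizer = torch.optim.SparseAdam(embedding.parameters(), lr=1e-3)

item_ids = torch.tensor([3, 42, 7])      # the few items seen in this batch
loss = embedding(item_ids).pow(2).sum()  # dummy loss over the looked-up rows
loss.backward()                          # embedding.weight.grad is a sparse tensor
optimizer.step()
optimizer.zero_grad()
```

The benefit grows with the size of the embedding table: a dense optimizer pays for every row on every step, while a sparse one pays only for the rows in the batch.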