TheMody / No-learning-rates-needed-Introducing-SALSA-Stable-Armijo-Line-Search-Adaptation

SaLSa Optimizer implementation (No learning rates needed)

Adam SLS vs Adam SaLSa #2

Closed · katealenic closed 2 months ago

katealenic commented 2 months ago

It's an interesting paper!

I found the implementation for AdamSLS; is that Adam + SLS? And where can I find the implementation for Adam SaLSa?

Thanks for your help!

TheMody commented 2 months ago

Adam + SaLSA as described in the paper is `from salsa.SaLSA import SaLSA`. If you set the momentum parameter to (0, 0, 0.9), you get SGD + SaLSA. Adam + SLS, as described in the paper and in previous publications, is `from salsa.adam_sls import AdamSLS`.
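
For anyone landing here later, a minimal usage sketch based on the reply above. The closure-based `step` and the `momentum` keyword name are assumptions (line-search optimizers in PyTorch typically need a closure to re-evaluate the loss); check the repo's README for the exact interface:

```python
# A minimal sketch, assuming SaLSA exposes the standard PyTorch optimizer
# interface with a closure-based step (typical for line-search methods).
import torch
from salsa.SaLSA import SaLSA

model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss()

# Adam + SaLSA: no learning rate is passed; the line search adapts the step size.
optimizer = SaLSA(model.parameters())
# SGD + SaLSA (per the reply above), assuming the parameter is named `momentum`:
# optimizer = SaLSA(model.parameters(), momentum=(0, 0, 0.9))

x, y = torch.randn(4, 10), torch.randint(0, 2, (4,))

def closure():
    # The line search may call this several times to re-evaluate the loss.
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    return loss

optimizer.step(closure)
```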