kozistr / pytorch_optimizer

optimizer & lr scheduler & loss function collections in PyTorch
https://pytorch-optimizers.readthedocs.io/en/latest/
Apache License 2.0

Ranger sign inversion #232

Open · i404788 opened this issue 7 months ago

i404788 commented 7 months ago

Describe the bug

From my experiments, it seems the update sign for Ranger is inverted: all other optimizers (including Ranger21) step in the opposite direction to Ranger.
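For context on where the sign lives: in a generic Ranger-style step (RAdam inner update followed by Lookahead interpolation), descent comes from a single negative step size, and the Lookahead blend can only shrink the step, not flip it. An illustrative toy sketch (not this repository's actual code):

```python
import torch

# Toy tensors standing in for optimizer state (illustrative only).
p = torch.tensor([1.0, 1.0])
slow = p.clone()                            # Lookahead slow weights
exp_avg = torch.tensor([0.5, 0.5])          # first-moment estimate of the gradient
denom = torch.tensor([1.0, 1.0])            # sqrt of second moment + eps
lr, alpha = 0.1, 0.5

step_size = lr                              # bias-corrected, always positive
p.add_(exp_avg / denom, alpha=-step_size)   # descent: the only place the sign enters
slow.add_(p - slow, alpha=alpha)            # Lookahead interpolation, 0 < alpha < 1
p.copy_(slow)                               # shrinks the step but cannot flip its sign
print(p)  # tensor([0.9750, 0.9750]) — moved against the gradient direction
```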

Note that I'm testing context-free step directions/magnitudes using a 'perfect' gradient (scaled by 4), so if Ranger somehow reverses course when gradients from different directions are accumulated, that would be missed by my test. Hyperparameters: {'betas': (0.003344506587403595, 0.9685357345548955), 'lr': 0.4616639698903086} (found through a hyperparameter search, also done for the other optimizers), evaluated on the Ackley (dim=2) function.

(I didn't want to create a PR before discussing if this might be intended)

To Reproduce
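Roughly, the single-step check looks like this (a minimal sketch rather than my exact harness; I'm reading the 'perfect' gradient as the scaled autograd gradient here):

```python
import math

import torch
from pytorch_optimizer import Ranger


def ackley(p: torch.Tensor) -> torch.Tensor:
    # 2-D Ackley function; global minimum f(0, 0) = 0.
    a, b, c = 20.0, 0.2, 2.0 * math.pi
    return (
        -a * torch.exp(-b * torch.sqrt(0.5 * (p ** 2).sum()))
        - torch.exp(0.5 * torch.cos(c * p).sum())
        + a
        + math.e
    )


p = torch.tensor([2.0, 1.5], requires_grad=True)
optimizer = Ranger(
    [p],
    lr=0.4616639698903086,
    betas=(0.003344506587403595, 0.9685357345548955),
)

start = p.detach().clone()
ackley(p).backward()
p.grad *= 4.0  # the 'perfect' gradient, scaled by 4
optimizer.step()

step = p.detach() - start
# A descent step should point (roughly) against the gradient,
# so this cosine similarity should come out positive.
cos = torch.nn.functional.cosine_similarity(step, -p.grad, dim=0)
print(f'step: {step.tolist()}, cos(step, -grad): {cos.item():+.3f}')
```

For a descending optimizer the printed cosine should be positive; Ranger is the only one where it comes out negative for me.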

Log

Ranger: [image attached]

For comparison, SGD: [image attached]

kozistr commented 6 months ago

hi! sorry for the late reply

the current Ranger implementation in this repository is based on the implementation here.

As far as I can tell, there's no significant difference between the original implementation and mine (please let me know if I've missed something).

could you please run the same check against the original implementation and see whether it reproduces the result?
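For instance, a quick side-by-side of the first step on a fixed synthetic gradient (a sketch; the import path for the original implementation is an assumption):

```python
import torch
from pytorch_optimizer import Ranger as RepoRanger
from ranger import Ranger as OriginalRanger  # assumed import path for the original

for name, cls in (('pytorch_optimizer', RepoRanger), ('original', OriginalRanger)):
    p = torch.tensor([2.0, 1.5], requires_grad=True)
    opt = cls([p], lr=0.1)
    start = p.detach().clone()
    p.grad = torch.ones_like(p)  # fixed synthetic gradient
    opt.step()
    print(name, (p.detach() - start).tolist())  # both steps should share a sign
```

If both print steps in the same (negative) direction, the sign itself matches and the difference must come from somewhere else.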

thank you!