juntang-zhuang / Adabelief-Optimizer

Repository for NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients"
BSD 2-Clause "Simplified" License

AttributeError: 'AdaBeliefOptimizer' object has no attribute '_set_hyper' #66

Open SamMohel opened 11 months ago

SamMohel commented 11 months ago

I'm facing a problem with AdaBeliefOptimizer:

AttributeError: 'AdaBeliefOptimizer' object has no attribute '_set_hyper'

optimizer = AdaBeliefOptimizer(learning_rate=1e-3, epsilon=1e-14, rectify=False)

juntang-zhuang commented 11 months ago

Are you using the PyTorch or the TensorFlow version? I don't think I have written or used any function named `_set_hyper`.

SamMohel commented 11 months ago

The TensorFlow version, with TF 2.15.0. The exact error is:

~/.local/lib/python3.10/site-packages/adabelief_tf/AdaBelief_tf.py:148, in AdaBeliefOptimizer.__init__(self, learning_rate, beta_1, beta_2, epsilon, weight_decay, rectify, amsgrad, sma_threshold, total_steps, warmup_proportion, min_lr, name, print_change_log, **kwargs)
    145     print(Style.RESET_ALL)
    146 # ------------------------------------------------------------------------------
--> 148 self._set_hyper("learning_rate", kwargs.get("lr", learning_rate))
    149 self._set_hyper("beta_1", beta_1)
    150 self._set_hyper("beta_2", beta_2)

https://github.com/juntang-zhuang/Adabelief-Optimizer/blob/update_0.2.0/pypi_packages/adabelief_tf0.2.1/adabelief_tf/AdaBelief_tf.py#L148
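For context, the likely cause is a Keras API change rather than a bug in the package: `_set_hyper` was a method of the old `tf.keras.optimizers.Optimizer` base class, and starting with TF 2.11 the default optimizer base class was replaced by a new implementation that no longer defines it (the old one survives as `tf.keras.optimizers.legacy.Optimizer`). The sketch below reproduces the error mechanics with stand-in classes; the class names are illustrative, not the real Keras classes.

```python
# Stand-in for the old (legacy) Keras optimizer base class, which
# provides _set_hyper for registering hyperparameters.
class LegacyOptimizerBase:
    def __init__(self):
        self._hyper = {}

    def _set_hyper(self, name, value):
        self._hyper[name] = value


# Stand-in for the new optimizer base class (default since TF 2.11),
# which has no _set_hyper method.
class NewOptimizerBase:
    def __init__(self):
        self._hyper = {}


# A subclass written against the legacy API, like adabelief_tf's
# AdaBeliefOptimizer, but resolving to the new base class at runtime.
class AdaBeliefLikeOptimizer(NewOptimizerBase):
    def __init__(self, learning_rate=1e-3):
        super().__init__()
        # This is the call that fails under the new base class:
        self._set_hyper("learning_rate", learning_rate)


try:
    AdaBeliefLikeOptimizer()
except AttributeError as e:
    # AttributeError: 'AdaBeliefLikeOptimizer' object has no attribute '_set_hyper'
    print(e)
```

If this is indeed the cause, possible workarounds are pinning an older TensorFlow (pre-2.11) or an optimizer implementation written against the current Keras optimizer API.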

juntang-zhuang commented 11 months ago

No idea about the TensorFlow version, it was contributed by another user. Maybe you can try the TensorFlow Addons implementation: https://www.tensorflow.org/addons/api_docs/python/tfa/optimizers/AdaBelief

SamMohel commented 11 months ago

Thanks, I already used `optimizer = optim.AdaBelief(model.parameters(), lr=1e-3)` and it worked, but the accuracy is no greater than with the Adam optimizer. My task is classification.