tensorflow/addons

Useful extra functionality for TensorFlow 2.x maintained by SIG-addons
Apache License 2.0

LAMB optimizer fails to work with tf.keras.optimizers.schedules.PolynomialDecay #2798

Closed (ma7555 closed this issue 1 year ago)

ma7555 commented 1 year ago

System information

Describe the bug

The LAMB optimizer fails when given a tf.keras.optimizers.schedules.PolynomialDecay learning rate schedule; model.fit raises the TypeError below.

File c:\Users\ma7mo\mambaforge\envs\bsc\lib\site-packages\keras\utils\generic_utils.py:965, in Progbar.update(self, current, values, finalize)
    963 value_base = max(current - self._seen_so_far, 1)
    964 if k not in self._values:
--> 965     self._values[k] = [v * value_base, value_base]
    966 else:
    967     self._values[k][0] += v * value_base

TypeError: unsupported operand type(s) for *: 'PolynomialDecay' and 'int'
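
The failure is inside Keras's progress-bar value averaging, not inside LAMB itself: Progbar.update multiplies each logged value by an integer (v * value_base), which only works for numeric values. The traceback shows a PolynomialDecay object reaching that multiplication. A minimal sketch of the failing operation in isolation (illustrative only, not the library code path):

import tensorflow as tf

# LearningRateSchedule objects define no __mul__, so the numeric
# averaging done by Progbar.update raises a TypeError on them.
schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    1e-2, 1000, end_learning_rate=0.0001,
)
value_base = 1
schedule * value_base  # TypeError: unsupported operand type(s) for *: 'PolynomialDecay' and 'int'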

Code to reproduce the issue

import tensorflow as tf
import tensorflow_addons as tfa

# Pass a PolynomialDecay schedule directly as the LAMB learning rate
LR = tf.keras.optimizers.schedules.PolynomialDecay(
    1e-2,
    1000,
    end_learning_rate=0.0001,
)
model = ...
model.compile(loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=['accuracy'],
              optimizer=tfa.optimizers.LAMB(LR, weight_decay=0)
              )
model.fit(...)

ma7555 commented 1 year ago

My bad, I forgot to remove a ReduceLROnPlateau callback from the callbacks list. Please ignore.
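
For anyone landing here from the same traceback: ReduceLROnPlateau manages the learning rate as a plain float and writes it into the training logs, so it should not be combined with a LearningRateSchedule such as PolynomialDecay. A minimal sketch of a working setup, keeping the same optimizer arguments as in the report above; the model architecture, callback, and data names below are placeholders, not from the original report:

import tensorflow as tf
import tensorflow_addons as tfa

# Placeholder binary classifier so the example is self-contained
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    1e-2, 1000, end_learning_rate=0.0001,
)
model.compile(loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=['accuracy'],
              optimizer=tfa.optimizers.LAMB(lr_schedule, weight_decay=0))

# No ReduceLROnPlateau here: the schedule already controls the learning rate.
callbacks = [tf.keras.callbacks.EarlyStopping(patience=3)]
# model.fit(x_train, y_train, callbacks=callbacks)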