tensorflow / addons

Useful extra functionality for TensorFlow 2.x maintained by SIG-addons
Apache License 2.0

ExponentialCyclicalLearningRate - TypeError: Cannot convert 1.0 to EagerTensor of dtype int64 #2799

Open ImSo3K opened 1 year ago

ImSo3K commented 1 year ago

System information

Describe the bug

When I use ExponentialCyclicalLearningRate and fit my model with a TensorBoard callback, I get the following error: TypeError: Cannot convert 1.0 to EagerTensor of dtype int64

After a little bit of debugging, I have found out that the issue is here: https://github.com/tensorflow/addons/blob/b2dafcfa74c5de268b8a5c53813bc0b89cadf386/tensorflow_addons/optimizers/cyclical_learning_rate.py#L86-L102

Specifically at:

            return initial_learning_rate + (
                maximal_learning_rate - initial_learning_rate
            ) * tf.maximum(tf.cast(0, dtype), (1 - x)) * self.scale_fn(mode_step)

It seems that self.scale_fn(mode_step) fails internally when computing self.gamma ** x, because x (mode_step) is of type int64. I saw a similar issue in https://github.com/tensorflow/addons/issues/2593 with a fix that was supposedly about to be merged, but since I'm using the latest version I guess the merge never landed.
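The failure can be reproduced outside the schedule. A minimal sketch (assuming the default scale_fn, which raises the Python float gamma to the step tensor): when the exponent tensor is int64, TensorFlow tries to convert the float base into the tensor's integer dtype, which is exactly the reported error:

```python
import tensorflow as tf

# Mirrors self.gamma ** mode_step in the default scale_fn:
# Python float ** int64 tensor makes the tensor's __rpow__ try to
# convert 1.0 into an int64 EagerTensor, which raises TypeError.
mode_step = tf.constant(3, dtype=tf.int64)
try:
    _ = 1.0 ** mode_step
except TypeError as e:
    print(e)  # Cannot convert 1.0 to EagerTensor of dtype int64
```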

Code to reproduce the issue

Same as https://github.com/tensorflow/addons/issues/2593

Potential Fix

Changing self.scale_fn(mode_step) to self.scale_fn(step_as_dtype) does work for that specific line, since step_as_dtype is of type float32; I just don't know whether it could break other code paths that depend on it.
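As a sketch of why the proposed change works (assuming step_as_dtype holds the step cast to the schedule's float dtype, as in the lines linked above): casting the int64 step to float32 before the exponentiation avoids the conversion error entirely.

```python
import tensorflow as tf

gamma = 1.0
mode_step = tf.constant(3, dtype=tf.int64)

# gamma ** mode_step would raise TypeError; casting the step to the
# schedule's float dtype first (what step_as_dtype holds) succeeds.
scaled = gamma ** tf.cast(mode_step, tf.float32)
print(scaled.numpy())  # 1.0
```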