System information

OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10
TensorFlow version and how it was installed (source or binary): 2.10.0 binary (with pip)
TensorFlow-Addons version and how it was installed (source or binary): 0.19.0 binary (with pip)
Python version: 3.9.7
Is GPU used? (yes/no): no
Describe the bug

When I use ExponentialCyclicalLearningRate and fit my model with a TensorBoard callback, I get the following error:

TypeError: Cannot convert 1.0 to EagerTensor of dtype int64
After a little bit of debugging, I have found out that the issue is here: https://github.com/tensorflow/addons/blob/b2dafcfa74c5de268b8a5c53813bc0b89cadf386/tensorflow_addons/optimizers/cyclical_learning_rate.py#L86-L102

Specifically, `self.scale_fn(mode_step)` fails internally when trying to compute `self.gamma ** x`, because `x` (`mode_step`) is of type `int64`.
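The failure can be isolated to the exponentiation itself. This is a minimal sketch of my own, not library code; `gamma` defaults to 1.0 in ExponentialCyclicalLearningRate, which matches the 1.0 in the error message:

```python
import tensorflow as tf

gamma = 1.0  # default gamma of ExponentialCyclicalLearningRate
step = tf.constant(1, dtype=tf.int64)  # optimizer iterations are int64

# This is effectively what scale_fn does with mode_step: Python tries to
# convert gamma to the tensor's dtype (int64) before tf.pow, and fails.
gamma ** step  # TypeError: Cannot convert 1.0 to EagerTensor of dtype int64
```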
I saw a similar issue here: https://github.com/tensorflow/addons/issues/2593, with a fix that was supposedly about to be merged, but since I'm using the latest version I guess the merge was never implemented.

Code to reproduce the issue
Same as https://github.com/tensorflow/addons/issues/2593
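For convenience, here is a minimal sketch that triggers the error directly, without the full model.fit/TensorBoard setup (the learning-rate values below are placeholders I picked, not taken from that issue). TensorBoard hits the same code path because it evaluates the schedule at the optimizer's int64 iteration counter:

```python
import tensorflow as tf
import tensorflow_addons as tfa

lr_schedule = tfa.optimizers.ExponentialCyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,
)

# Evaluating the schedule at an int64 step, as TensorBoard does with the
# optimizer's iteration counter, reproduces the error:
step = tf.constant(1, dtype=tf.int64)
lr_schedule(step)  # TypeError: Cannot convert 1.0 to EagerTensor of dtype int64
```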
Potential Fix

Change `self.scale_fn(mode_step)` to `self.scale_fn(step_as_dtype)`: since `step_as_dtype` is of type `float32`, the computation works for that specific line. I just don't know if it can potentially break anything else that depends on `mode_step`.
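To make the suggestion concrete, here is the change in context. This is an abbreviated excerpt of CyclicalLearningRate.__call__ from the file linked above, not a tested patch:

```python
# Abbreviated from CyclicalLearningRate.__call__ in
# tensorflow_addons/optimizers/cyclical_learning_rate.py:
step_as_dtype = tf.cast(step, dtype)  # dtype comes from initial_learning_rate (float32)
cycle = tf.floor(1 + step_as_dtype / (2 * step_size))
x = tf.abs(step_as_dtype / step_size - 2 * cycle + 1)

# Current line: when scale_mode == "iterations", mode_step is the raw int64
# `step`, so scale_fn computes gamma ** int64 and fails:
# mode_step = cycle if self.scale_mode == "cycle" else step

# Proposed: use the float32 cast of the step instead. `cycle` is already
# derived from step_as_dtype, so keeping the branch preserves the
# scale_mode == "cycle" behavior:
mode_step = cycle if self.scale_mode == "cycle" else step_as_dtype

return initial_learning_rate + (
    maximal_learning_rate - initial_learning_rate
) * tf.maximum(tf.cast(0, dtype), (1 - x)) * self.scale_fn(mode_step)
```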