Open AdamPrystupiuk opened 2 years ago
For me `modrelu` became functional again when I replaced its code in cvnn/activations.py
from:

```python
def modrelu(z: Tensor, b: float = 1., c: float = 1e-3) -> Tensor:
    abs_z = tf.math.abs(z)
    return tf.cast(tf.keras.activations.relu(abs_z + b), dtype=z.dtype) * z / tf.cast(abs_z + c, dtype=z.dtype)
```
to:

```python
def modrelu(z: Tensor, b: float = 1., c: float = 1e-3) -> Tensor:
    abs_z = tf.math.abs(z)
    r_relu = tf.keras.activations.relu(abs_z + b)
    return tf.complex(r_relu * tf.math.real(z) / (abs_z + c),
                      r_relu * tf.math.imag(z) / (abs_z + c))
```
It might be less efficient, but it solved the bug for me.
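For reference, the formula both versions implement is `relu(|z| + b) * z / (|z| + c)`: the ReLU acts on the magnitude of `z` (offset by the bias `b`), and the result rescales `z` while preserving its phase; `c` only guards against division by zero. A minimal NumPy sketch of that formula (the function name `modrelu_np` and the sample values are my own, not from the cvnn library) shows the expected behavior, small-magnitude inputs being zeroed while larger ones keep their phase:

```python
import numpy as np

def modrelu_np(z, b=1.0, c=1e-3):
    # modrelu formula: relu(|z| + b) * z / (|z| + c).
    # Multiplying the complex z by a real, non-negative scale
    # keeps the phase of z and only changes its magnitude.
    abs_z = np.abs(z)
    r_relu = np.maximum(abs_z + b, 0.0)  # real-valued ReLU on the magnitude
    scale = r_relu / (abs_z + c)
    return scale * z

z = np.array([3 + 4j, -0.1 + 0.1j], dtype=np.complex64)
out = modrelu_np(z, b=-2.0)
# |3 + 4j| = 5, so relu(5 - 2) = 3 and the sample passes (rescaled);
# |-0.1 + 0.1j| ≈ 0.14, so relu(0.14 - 2) = 0 and the sample is zeroed.
```

With a negative bias, modrelu acts as a magnitude threshold: inputs whose modulus falls below `-b` are suppressed entirely, which is the intended behavior of the activation.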
Hello! Recently TensorFlow was updated from 2.8.0 to 2.8.2 on Google Colab, and since then my code involving custom activation functions has stopped working properly (it works fine after reinstalling the older version of TensorFlow).
The error I get is the following (`c_modu` is the new activation function I added in cvnn/activations.py):
Any input on why that is would be greatly appreciated!