Arthaj-Octopus opened this issue 11 months ago
Hey, can you make the following changes and check?
1) Remove ReLU from the imports, as it is not used anywhere in the code.
2) Instead of x = Activation(activation=PReLU())(x), try x = PReLU()(x), since PReLU is implemented as a Layer.
This might solve the problem; if not, let me know :) happy to help you further.
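For reference, here is a minimal sketch of the suggested change; the input shape and layer sizes are illustrative, not taken from the original code:

```python
from tensorflow.keras.layers import Input, Dense, PReLU
from tensorflow.keras.models import Model

# Illustrative input shape and layer sizes; not from the original code.
inputs = Input(shape=(32,))
x = Dense(64)(inputs)

# Call PReLU directly as a layer instead of wrapping it in Activation(...),
# since PReLU carries trainable weights (its alpha parameters).
x = PReLU()(x)

outputs = Dense(1)(x)
model = Model(inputs, outputs)
model.summary()
```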
Hello everyone,

There seems to be an issue with the PReLU activation layer: it raises an error whenever it is called. I have tried several networks and I always get the same error; if I replace PReLU with any other activation layer, such as ReLU or ELU, the error does not occur.

For example, the error occurs both when PReLU is called directly as a layer, PReLU()(x), and when it is set as the activation function of an Activation layer. I am using TensorFlow 2.14.0.
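A minimal sketch of the two usages described above; the original reproduction snippet is not included in the issue, so the network here is illustrative only:

```python
from tensorflow.keras.layers import Input, Dense, Activation, PReLU, ReLU
from tensorflow.keras.models import Model

# Illustrative network; not the reporter's original code.
inputs = Input(shape=(32,))
x = Dense(64)(inputs)

# Usage reported to raise the error: wrapping a PReLU instance
# in an Activation layer.
x = Activation(activation=PReLU())(x)

# Calling PReLU directly as a layer was also reported to fail:
# x = PReLU()(x)

# Per the report, swapping in ReLU (or ELU) avoids the error:
# x = ReLU()(x)

outputs = Dense(1)(x)
model = Model(inputs, outputs)
```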