keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

PReLU Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor #18520

Open Arthaj-Octopus opened 11 months ago

Arthaj-Octopus commented 11 months ago

Hello everyone. There seems to be an issue with the PReLU activation layer, as it gives the error:

Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor.

Call arguments received by layer "activation" (type Activation):
  • inputs=tf.Tensor(shape=(None, None, 64), dtype=float32)

whenever it is called. I have tried this with several networks and always get the same issue; if I replace it with any other activation layer, such as ReLU or ELU, the error does not occur.

For example:

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Conv1D, BatchNormalization, Activation, Add
from tensorflow.keras.layers import ReLU, PReLU
from tensorflow.keras.optimizers import Adam

def resnet_block(inputs, filters, kernel_size, stride):
    # Shortcut connection
    shortcut = inputs

    # First convolutional layer
    x = Conv1D(filters, kernel_size, strides=stride, padding='same')(inputs)
    x = BatchNormalization()(x)
    x = Activation(activation=PReLU())(x)  # the reported error is raised here

    # Second convolutional layer
    x = Conv1D(filters, kernel_size, padding='same')(x)
    x = BatchNormalization()(x)

    # Shortcut connection for identity mapping
    if stride > 1 or inputs.shape[-1] != filters:
        shortcut = Conv1D(filters, 1, strides=stride, padding='same')(inputs)
        shortcut = BatchNormalization()(shortcut)

    # Add shortcut connection to the main path
    x = Add()([x, shortcut])
    x = PReLU()(x)  # calling PReLU directly as a layer triggers the same error

    return x

# Example usage
input_shape = (None, 1)  # variable-length sequences with one feature
inputs = tf.keras.layers.Input(shape=input_shape)
x = resnet_block(inputs, filters=64, kernel_size=3, stride=1)
model = tf.keras.Model(inputs=inputs, outputs=x)
model.summary()
model.compile(
    optimizer=Adam(),
    loss="mse",
)
# Placeholder data so the example is self-contained
x_data = np.random.rand(100, 32, 1)
y_data = np.random.rand(100, 32, 64)
model.fit(x_data, y_data, epochs=5, batch_size=16, validation_split=0.11, shuffle=True, verbose=1)

Note that the error occurs whether you call it directly as a layer, PReLU()(x), or set it as the activation function of an Activation layer. I am using TensorFlow 2.14.0.
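For reference, here is a minimal sketch that isolates the failure, assuming the unknown (None) time dimension is what trips PReLU's per-element alpha weights:

import tensorflow as tf
from tensorflow.keras.layers import Input, PReLU

# The time dimension is None, matching the shape (None, None, 64) in the report
inputs = Input(shape=(None, 64))
x = PReLU()(inputs)  # raises: Attempt to convert a value (None) ... to a Tensor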

ajayspatil7 commented 2 months ago

Hey, can you make the following changes and check once?

1) Remove ReLU from the imports, as it is not used anywhere in the code.
2) Instead of x = Activation(activation=PReLU())(x), can you try PReLU()(x), since PReLU is implemented as a layer?

This might solve the problem; if not, let me know :) Happy to help you further.
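For what it's worth, here is a minimal sketch of suggestion 2 applied to the block above. The shared_axes=[1] argument is an assumption beyond the suggestion itself: with input_shape=(None, 1), the tensor reaching PReLU has a None time dimension, and PReLU's default per-element alpha weight cannot be built with an unknown size, so sharing alpha across the time axis may be needed as well:

import tensorflow as tf
from tensorflow.keras.layers import Input, Conv1D, BatchNormalization, PReLU

inputs = Input(shape=(None, 1))
x = Conv1D(64, 3, padding='same')(inputs)
x = BatchNormalization()(x)

# Suggestion 2: call PReLU directly instead of wrapping it in Activation.
# shared_axes=[1] (an assumption) shares the alpha weight across the
# variable-length time axis, giving it a fully defined shape of (1, 64).
x = PReLU(shared_axes=[1])(x)

model = tf.keras.Model(inputs, x)
model.summary()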