philipperemy / keras-tcn

Keras Temporal Convolutional Network.

The following Variables were used a Lambda layer's call (tf.nn.bias_add_252), but are not present in its tracked objects #209

Closed: Kartik-Singhal26 closed this issue 2 years ago

Kartik-Singhal26 commented 3 years ago

Describe the bug

WARNING:tensorflow: The following Variables were used a Lambda layer's call (tf.nn.bias_add_252), but are not present in its tracked objects: <tf.Variable 'tcn_17/residual_block_11/conv1D_0/bias:0' shape=(32,) dtype=float32> It is possible that this is intended behavior, but it is more likely an omission. This is a strong indication that this layer should be formulated as a subclassed Layer rather than a Lambda layer.

WARNING:tensorflow: The following Variables were used a Lambda layer's call (tf.nn.convolution_119), but are not present in its tracked objects: <tf.Variable 'tcn_17/residual_block_11/conv1D_1/kernel:0' shape=(4, 32, 32) dtype=float32> It is possible that this is intended behavior, but it is more likely an omission. This is a strong indication that this layer should be formulated as a subclassed Layer rather than a Lambda layer.

WARNING:tensorflow: The following Variables were used a Lambda layer's call (tf.nn.bias_add_253), but are not present in its tracked objects: <tf.Variable 'tcn_17/residual_block_11/conv1D_1/bias:0' shape=(32,) dtype=float32> It is possible that this is intended behavior, but it is more likely an omission. This is a strong indication that this layer should be formulated as a subclassed Layer rather than a Lambda layer.
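For context (this example is not from the original report): this class of warning generally appears when a raw TensorFlow op that touches a tf.Variable is called inside the Keras functional API, because Keras wraps the op in an implicit Lambda-style layer that cannot track the variable. A minimal sketch that should reproduce the same message on TF 2.x:

import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical reproduction (not from this issue): a raw tf.nn op that closes over an
# external Variable gets wrapped in an implicit Lambda-style layer, which cannot track it.
external_bias = tf.Variable(tf.zeros(8), name='external_bias')

inputs = layers.Input(shape=(8,))
x = layers.Dense(8, use_bias=False)(inputs)
x = tf.nn.bias_add(x, external_bias)  # expected to emit the "Variables were used a Lambda layer's call" warning
model = Model(inputs, x)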

Paste a snippet

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping, CSVLogger
from tcn import TCN

batch_size, timesteps, input_dim = None, None, X_train_transformed.shape[2]
inp = Input(batch_shape=(batch_size, timesteps, input_dim))

out1 = TCN(nb_filters=64,        # number of filters to use in each layer
           kernel_size=3,        # size of the kernel in each layer
           nb_stacks=2,          # number of stacks of residual blocks to use
           activation='tanh',
           dilations=(2, 4),
           use_batch_norm=False,
           dropout_rate=0.2,     # fraction of input units to drop
           padding='causal',     # 'same' for a non-causal network
           return_sequences=True)(inp)  # the TCN layers are here

out2 = TCN(nb_filters=16,
           kernel_size=3,
           nb_stacks=2,
           dilations=(2, 4),
           activation='tanh',
           use_batch_norm=False,
           # dropout_rate=0.2,
           padding='causal',
           return_sequences=True)(out1)

out3 = TCN(nb_filters=32,
           kernel_size=4,
           nb_stacks=2,
           dilations=(2, 4),
           activation='tanh',
           use_batch_norm=False,
           # dropout_rate=0.2,
           padding='causal',
           return_sequences=False)(out2)

out = Dense(1, activation='linear')(out3)
TCNmodel_Try1 = Model(inputs=[inp], outputs=[out])

# Save the best model
filepath = "TCN1_best.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='val_coeff_determination', verbose=1,
                             save_best_only=True, mode='max')
callbacks_list = [checkpoint]

# Early stopping
es = EarlyStopping(monitor='val_coeff_determination', mode='max', verbose=1, patience=45)

# Log history
csv_logger = CSVLogger('training.log', separator=',', append=False)

# Compile the model
TCNmodel_Try1.compile(optimizer='adam', loss='mae', metrics=['RootMeanSquaredError', 'mse'])
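Note that the callbacks monitor val_coeff_determination, which is not produced by the metrics passed to compile() in the snippet, so presumably a custom R² metric is defined elsewhere in the original code. A minimal sketch of such a metric, assuming the standard coefficient-of-determination formula:

from tensorflow.keras import backend as K

def coeff_determination(y_true, y_pred):
    # R^2 = 1 - SS_res / SS_tot (hypothetical helper, not part of the reported code)
    ss_res = K.sum(K.square(y_true - y_pred))
    ss_tot = K.sum(K.square(y_true - K.mean(y_true)))
    return 1 - ss_res / (ss_tot + K.epsilon())

Passing metrics=[coeff_determination, 'RootMeanSquaredError', 'mse'] to compile() would make val_coeff_determination available to the ModelCheckpoint and EarlyStopping callbacks.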

Dependencies: tensorflow 2.5.0

philipperemy commented 2 years ago

@Kartik-Singhal26 thanks for reporting. This WARNING seems pretty harmless. You should not worry about it.

I cannot reproduce it with tensorflow 2.5.0 on my laptop (CPU), nor with 2.7.0.

So I will close this issue. Feel free to comment if I am wrong about it.
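If the warning is too noisy during training, it can usually be silenced through TensorFlow's own logger; this is only a suppression sketch, not a fix for the underlying tracking message:

import tensorflow as tf

# Hide WARNING-level messages, including the Lambda-layer variable-tracking warning.
tf.get_logger().setLevel('ERROR')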