WARNING:tensorflow:
The following Variables were used a Lambda layer's call (lambda), but
are not present in its tracked objects:
<tf.Variable 'attention_sequence_pooling_layer/local_activation_unit/kernel:0' shape=(16, 1) dtype=float32>
<tf.Variable 'attention_sequence_pooling_layer/local_activation_unit/bias:0' shape=(1,) dtype=float32>
It is possible that this is intended behavior, but it is more likely
an omission. This is a strong indication that this layer should be
formulated as a subclassed Layer rather than a Lambda layer.
WARNING:tensorflow:
The following Variables were used a Lambda layer's call (lambda), but
are not present in its tracked objects:
<tf.Variable 'attention_sequence_pooling_layer/local_activation_unit/kernel:0' shape=(16, 1) dtype=float32>
<tf.Variable 'attention_sequence_pooling_layer/local_activation_unit/bias:0' shape=(1,) dtype=float32>
It is possible that this is intended behavior, but it is more likely
an omission. This is a strong indication that this layer should be
formulated as a subclassed Layer rather than a Lambda layer.
Traceback (most recent call last):
  File "/home/lyw/PycharmProjects/DSIN/code/train_din.py", line 49, in
    att_hidden_size=(64, 16,))
  File "/home/lyw/PycharmProjects/DSIN/code/models/din.py", line 87, in DIN
    query_emb, keys_emb])
  File "/home/lyw/anaconda3/envs/py36/lib/python3.6/site-packages/tensorflow/python/keras/engine/base_layer.py", line 922, in call
    outputs = call_fn(cast_inputs, *args, **kwargs)
  File "/home/lyw/anaconda3/envs/py36/lib/python3.6/site-packages/tensorflow/python/autograph/impl/api.py", line 265, in wrapper
    raise e.ag_error_metadata.to_exception(e)
AttributeError: in user code:

    /home/lyw/anaconda3/envs/py36/lib/python3.6/site-packages/deepctr/layers/sequence.py:198 call  *
        outputs._uses_learning_phase = attention_score._uses_learning_phase
    AttributeError: 'Tensor' object has no attribute '_uses_learning_phase'
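The failing line in deepctr/layers/sequence.py copies the Keras-1.x `_uses_learning_phase` flag from one tensor to another, an attribute that tensors in TF 2.x no longer carry, hence the `AttributeError`. Upgrading to a newer deepctr release that targets TF 2.x is the usual remedy; as a local workaround, the attribute copy can simply be guarded. A minimal sketch of that guard, using stand-in objects rather than the real Keras tensors:

```python
# Sketch of the workaround: skip the TF1-era `_uses_learning_phase` copy
# when the attribute is absent (as it is on TF 2.x tensors).
# `FakeTensor` is a stand-in for the tensors in deepctr/layers/sequence.py.
class FakeTensor:
    pass

outputs, attention_score = FakeTensor(), FakeTensor()

# Original line 198 (fails on TF 2.x):
#     outputs._uses_learning_phase = attention_score._uses_learning_phase
# Guarded version:
if hasattr(attention_score, '_uses_learning_phase'):
    outputs._uses_learning_phase = attention_score._uses_learning_phase

# Neither object gained the attribute, and no exception was raised.
print(hasattr(outputs, '_uses_learning_phase'))
```

The guard preserves the old behavior when the attribute exists (TF 1.x-style tensors) and is a no-op otherwise, which matches how later framework versions dropped the learning-phase bookkeeping entirely.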