Open hccho2 opened 4 years ago
Due to cross-interactions between graph tensors and the Python code flow deep inside TensorFlow, a quick workaround is to switch back to graph mode with tf.compat.v1.disable_eager_execution().
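A minimal sketch of that workaround (assuming TensorFlow 2.x, where tf.compat.v1 is available). The key detail is that the switch has to happen at startup, before any tensors, layers, or models are created:

```python
import tensorflow as tf

# Workaround: fall back to graph (non-eager) mode. This must run
# before any TensorFlow objects are constructed.
tf.compat.v1.disable_eager_execution()

# From here on, TensorFlow builds a static graph instead of
# executing operations eagerly.
print(tf.executing_eagerly())  # False after the switch
```

After this call the rest of the model code runs in TF1-style graph mode, which avoids the symbolic/eager clash described below at the cost of losing eager debugging.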
@failure-to-thrive Are cross-interactions my fault? Or is it a bug?
@hccho2 Seems it's not your fault.
class BahdanauAttention(_BaseAttentionMechanism):
And in _BaseAttentionMechanism:
Also note that this layer does not work with Keras model when
model.compile(run_eagerly=True)
due to the fact that this layer is stateful. The support for that will be added in a future version.
/cc @qlzh727
I don't know whether your case could be resolved with the manual memory reset introduced in the PR that fixed https://github.com/tensorflow/addons/issues/535
This is a known issue when the Keras functional API, stateful layers, and eager mode are used at the same time.
The Keras functional API creates symbolic tensors that are saved in stateful layers, but the layer is executed in eager mode where the TensorFlow runtime does not expect to find symbolic tensors.
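A plain-Python sketch of that failure pattern (no TensorFlow required; every class and name here is illustrative, not the actual TF Addons code): a stateful layer caches the symbolic placeholder it was built with, and a later eager call trips over it because a symbolic tensor carries no concrete values.

```python
class SymbolicTensor:
    """Stands in for the placeholder the Keras functional API creates."""
    def __init__(self, name):
        self.name = name


class StatefulAttention:
    """Caches its memory at build time, like _BaseAttentionMechanism."""
    def __init__(self):
        self.memory = None

    def setup_memory(self, memory):
        # Saved state: may be a symbolic tensor from functional-API tracing.
        self.memory = memory

    def call(self, query):
        # The eager runtime expects concrete values here; a cached
        # symbolic tensor has none, so computation on it fails.
        if isinstance(self.memory, SymbolicTensor):
            raise TypeError(
                f"cannot evaluate symbolic tensor {self.memory.name!r} eagerly")
        return query + self.memory


layer = StatefulAttention()
layer.setup_memory(SymbolicTensor("keras_input:0"))  # functional-API build
try:
    layer.call(1.0)                                  # eager execution
except TypeError as e:
    print("eager call failed:", e)
```

This is why the error only appears when the functional API, a stateful layer, and eager mode are combined: remove any one of the three and the cached state is either concrete or never evaluated eagerly.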
Has this been resolved?
No, the issue is still open.
Q1. What is the cause of the error? If I set memory_sequence_length=None, there is no error.
Q2. a, b, c are non-numeric tensors. Why are there no numerical values?
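For context on why memory_sequence_length matters: when it is set, the attention mechanism masks memory positions beyond each sequence's true length, and that masking path is where the extra tensor work happens. A plain-Python sketch of what such a mask does (illustrative only, not the TF Addons implementation):

```python
import math

NEG_INF = float("-inf")


def mask_attention_scores(scores, seq_len):
    """Replace scores past seq_len with -inf so softmax ignores them."""
    return [s if i < seq_len else NEG_INF for i, s in enumerate(scores)]


def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


scores = [2.0, 1.0, 0.5, 0.1]   # one query scored against 4 memory steps
masked = mask_attention_scores(scores, seq_len=2)
weights = softmax(masked)
print(weights)  # positions 2 and 3 (padding) get zero attention weight
```

With memory_sequence_length=None this masking step is skipped entirely, which is consistent with the error disappearing in that case.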