@thushv89 Please solve this issue
Has anyone solved these issues?
Hi,
Sorry about the delay and thanks for raising this. Yes, there is an unknown issue with the `tf.keras.backend.rnn` operation in tensorflow>2.6. I haven't been able to look into this in much depth yet. I have tried a few potential fixes, but they all failed, so it looks like something major has shifted since tensorflow>2.6.
The easiest solution would be to downgrade to tensorflow==2.6 until the issue is understood, but I will keep you posted on the findings.
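For anyone trying the workaround, a quick way to confirm which TensorFlow version is actually active after downgrading (the 2.6 pin is just the version suggested above, not a guaranteed fix):

```python
# Minimal version check after downgrading, e.g. via `pip install tensorflow==2.6.*`
import tensorflow as tf

print(tf.__version__)  # expect a 2.6.x release
assert tf.__version__.startswith("2.6"), "still running a newer TensorFlow"
```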
Hi @thushv89, any news? I just tried tensorflow==2.6 but I still have the same problem...
What about using the Keras additive attention layer, `attn_out = AdditiveAttention()([decoder_outputs, encoder_outputs])`? Would that be a fix?
@rudy-becarelli: Apologies about the delay. Could you try tensorflow==2.5.0?
@AliMi001: It would be a close substitute but not a complete one. For example, you can see here how TensorFlow uses `AdditiveAttention` for implementing Bahdanau attention. However, it does not compute attention sequentially, the way a sequence would be processed in an RNN. Rather, it computes all outputs first and then applies attention on top of that. I'm not sure what the performance difference is, but it is different from the original Bahdanau attention proposed in the paper.
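For illustration, here is a minimal sketch of what swapping the custom layer for Keras' built-in `AdditiveAttention` could look like in a seq2seq summarization model. The layer names, sizes, and architecture below are assumptions for the example, not taken from the tutorial, and as noted above this applies attention over all decoder outputs at once rather than step by step:

```python
import tensorflow as tf
from tensorflow.keras.layers import (Input, Embedding, LSTM, AdditiveAttention,
                                     Concatenate, Dense, TimeDistributed)

# Hypothetical sizes for illustration only
vocab_size, latent_dim, max_text_len = 10000, 256, 100

# Encoder
encoder_inputs = Input(shape=(max_text_len,))
enc_emb = Embedding(vocab_size, latent_dim)(encoder_inputs)
encoder_outputs, state_h, state_c = LSTM(
    latent_dim, return_sequences=True, return_state=True)(enc_emb)

# Decoder, initialised with the final encoder states
decoder_inputs = Input(shape=(None,))
dec_emb = Embedding(vocab_size, latent_dim)(decoder_inputs)
decoder_outputs, _, _ = LSTM(
    latent_dim, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])

# Built-in additive (Bahdanau-style) attention:
# query = decoder outputs, value = encoder outputs,
# computed over the whole sequence in one shot
attn_out = AdditiveAttention()([decoder_outputs, encoder_outputs])

# Concatenate the context with the decoder outputs and project to the vocabulary
decoder_concat = Concatenate(axis=-1)([decoder_outputs, attn_out])
outputs = TimeDistributed(Dense(vocab_size, activation="softmax"))(decoder_concat)

model = tf.keras.Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")
```

Note that, unlike the custom `AttentionLayer`, this does not return the attention weights by default; on recent TensorFlow versions you can pass `return_attention_scores=True` in the call if you need them for visualisation.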
I have tried tensorflow==2.5.0 with no success
Good news! A fix is on the way on the tf2-fix branch (https://github.com/thushv89/attention_keras/tree/tf2-fix). It should work fine once merged.
Let me know if there are any issues.
I cannot access the repo.
I have a similar problem. Is this issue solved?
Try replacing `tensorflow.python.keras.layers` with `tensorflow.keras.layers` in your imports and see if that works.
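Concretely, that suggestion amounts to switching the private API path in the layer's imports to the public one; a sketch (the exact names imported in the attention layer's source may differ):

```python
# Before: private API path, breaks on newer TensorFlow releases
# from tensorflow.python.keras.layers import Layer

# After: public Keras API path
from tensorflow.keras.layers import Layer
```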
I'm trying to re-implement the text summarization tutorial here. I'm getting the following error when I use the `AttentionLayer`:
How can I overcome this error? I've added my software stack below: