AUT-Data-Group / PREDICT-Persian-Reverse-Dictionary

The first intelligent Persian reverse dictionary

Error on using AdditiveAttention #1

Open Kobe972 opened 7 months ago

Kobe972 commented 7 months ago

`AdditiveAttention` does not output attention weights. The line `weights, context_vector = AdditiveAttention(name='attention')([state_h, lstm])` should be changed to `context_vector = AdditiveAttention(name='attention')([state_h, lstm])`, and everything related to `weights` removed. See https://stackoverflow.com/questions/63289566/keras-attention-layer-on-sequence-to-sequence-model-typeerror-cannot-iterate-ov.

My TensorFlow version: 2.8.0
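For context, the pattern the original code expected — attention that returns both the weights and the context vector — can be sketched in plain numpy. This is an illustrative Bahdanau-style additive attention, not the project's `attention.py`; the names `additive_attention`, `W1`, `W2`, and `v` are made up for the example. (Newer TensorFlow versions can also return both values from the built-in layer via `return_attention_scores=True`.)

```python
import numpy as np

def additive_attention(query, values, W1, W2, v):
    """Bahdanau-style additive attention (illustrative sketch).

    query:  shape (d,)    -- e.g. the decoder state `state_h`
    values: shape (T, d)  -- e.g. the encoder outputs `lstm`
    W1:     shape (d, k)
    W2:     shape (d, k)
    v:      shape (k,)
    Returns (weights, context_vector), like the project's custom layer.
    """
    # score_t = v . tanh(W1^T query + W2^T values_t), one score per timestep
    scores = np.tanh(query @ W1 + values @ W2) @ v   # (T,)
    # Softmax over timesteps to get the attention weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # (T,), sums to 1
    # Context vector is the weight-averaged value sequence
    context_vector = weights @ values                 # (d,)
    return weights, context_vector

# Toy usage with random inputs
rng = np.random.default_rng(0)
d, k, T = 4, 3, 5
query = rng.normal(size=(d,))
values = rng.normal(size=(T, d))
W1, W2, v = rng.normal(size=(d, k)), rng.normal(size=(d, k)), rng.normal(size=(k,))
weights, context_vector = additive_attention(query, values, W1, W2, v)
```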

amingheibi commented 7 months ago

@arm-on

arm-on commented 7 months ago

> `AdditiveAttention` does not output attention weights. The line `weights, context_vector = AdditiveAttention(name='attention')([state_h, lstm])` should be changed to `context_vector = AdditiveAttention(name='attention')([state_h, lstm])`, and everything related to `weights` removed. See https://stackoverflow.com/questions/63289566/keras-attention-layer-on-sequence-to-sequence-model-typeerror-cannot-iterate-ov.
>
> My TensorFlow version: 2.8.0

Hi, and thanks for letting us know about the problem.

Just a quick check: have you used the attention.py file? It is inside the project's Google Drive folder. The `AdditiveAttention` defined there may differ slightly from the one TensorFlow ships today.

Also, the TensorFlow version we used back then was 2.3, so I suggest trying the code with that version.
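To try the suggestion above, one way to pin the environment is a sketch like the following (assumptions: a Python 3.8 or older interpreter is available, since TensorFlow 2.3 wheels were not published for later Python versions; the environment name `venv-tf23` is arbitrary):

```shell
# Create an isolated environment for the project (use Python <= 3.8).
python -m venv venv-tf23
source venv-tf23/bin/activate

# Pin TensorFlow to the version the authors report using.
pip install tensorflow==2.3.0
```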