tensorflow / addons

Useful extra functionality for TensorFlow 2.x maintained by SIG-addons
Apache License 2.0

seq2seq.BahdanauAttention raise TypeError: probability_fn is not an instance of str #2820

Open BrandonStudio opened 1 year ago

BrandonStudio commented 1 year ago

System information

Describe the bug

Calling BahdanauAttention with the default probability_fn value ("softmax") raises a TypeError.

Tried to debug and found that probability_fn was already a function (not a str) at the point where the type check runs.

Code to reproduce the issue

import tensorflow_addons as tfa
tfa.seq2seq.BahdanauAttention(1, 1, 1, normalize=False, name='BahdanauAttention')
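The failure pattern can be illustrated without TensorFlow at all. Below is a minimal self-contained sketch, assuming the mechanism described later in the thread: a subclass resolves the "softmax" string to a callable before delegating to a parent __init__ whose annotation is still str, so a runtime type check rejects it. The typechecked decorator here is a toy stand-in for typeguard's, and the class bodies are illustrative, not the actual tensorflow_addons code.

```python
# Toy reproduction of the "probability_fn is not an instance of str" pattern.
# typechecked below is a simplified stand-in for typeguard's decorator.
import functools
import inspect


def typechecked(func):
    """Raise TypeError when an argument's runtime type does not match
    its annotation (simplified stand-in for typeguard.typechecked)."""
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        for name, value in bound.arguments.items():
            ann = sig.parameters[name].annotation
            if ann is inspect.Parameter.empty:
                continue  # unannotated params (e.g. self) are not checked
            if isinstance(ann, type) and not isinstance(value, ann):
                raise TypeError(f"{name} is not an instance of {ann.__name__}")
        return func(*args, **kwargs)

    return wrapper


def _softmax(scores):
    return scores  # placeholder for the real probability function


class AttentionMechanism:
    @typechecked
    def __init__(self, probability_fn: str = "softmax"):
        self.probability_fn = probability_fn


class BahdanauAttention(AttentionMechanism):
    def __init__(self, probability_fn: str = "softmax"):
        # The subclass resolves the string to a callable *before*
        # delegating, so the parent's str annotation no longer matches.
        fn = _softmax if probability_fn == "softmax" else probability_fn
        super().__init__(probability_fn=fn)  # TypeError raised here


try:
    BahdanauAttention()
except TypeError as e:
    print(e)  # probability_fn is not an instance of str
```

The key point is ordering: the string-to-function conversion happens before the decorated parent __init__ runs its check, so even the default value trips the TypeError.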

Other info / logs

We-here commented 1 year ago

Hello, have you solved this issue? I also get the same error when initializing tfa.seq2seq.LuongAttention.

bhack commented 1 year ago

Do you have a very minimal gist to run to reproduce this?

BrandonStudio commented 1 year ago

@We-here unfortunately no, I have not found a way to bypass the type check.

BrandonStudio commented 1 year ago

@bhack I think the code above is enough. The exception is thrown while initializing the instance of the class, not afterwards.

bhack commented 1 year ago

Yes, because _process_probability_fn was supposed to transform the str back into the function.

Can you send a PR to the specific test to help reproduce this case?

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/seq2seq/tests/attention_wrapper_test.py

@seanpmorgan

HybridNeos commented 1 year ago

Also seeing this issue on google colab.

DangMinh24 commented 1 year ago

@BrandonStudio I skipped the type checking by commenting out @typechecked on AttentionMechanism and its derived classes.

BrandonStudio commented 1 year ago

@DangMinh24 Did you just modify the library source code?

DangMinh24 commented 1 year ago

@BrandonStudio Yeah, I modified the tensorflow-addons code in my local environment.
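For readers who would rather not edit the installed package, the same "disable the decorator" idea can sometimes be applied by swapping in a no-op decorator before the classes are defined. The sketch below is a self-contained toy (the typechecked name and the class body only mimic the pattern in tensorflow_addons; it does not patch the real library):

```python
# Toy illustration of neutralizing a runtime type-check decorator.
# In real use one would have to replace the decorator before the
# library classes are defined at import time; here the class is built
# explicitly under a chosen decorator to show the effect.

def noop_typechecked(func):
    """No-op replacement: skips all runtime type checking."""
    return func


def build_attention_class(typechecked):
    """Define the class under a given decorator, mimicking import time."""

    class AttentionMechanism:
        @typechecked
        def __init__(self, probability_fn: str = "softmax"):
            # With the no-op decorator, a callable slips through
            # even though the annotation says str.
            self.probability_fn = probability_fn

    return AttentionMechanism


Attention = build_attention_class(noop_typechecked)
mech = Attention(probability_fn=lambda scores: scores)  # no TypeError
print(callable(mech.probability_fn))  # True
```

This mirrors the workaround in the thread (removing @typechecked) without touching installed files, at the cost of losing all of the decorator's type checking, so it is a stopgap rather than a fix.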