ningshixian / LSTM_Attention

Attention-based LSTM/Dense implemented in Keras
https://github.com/ningshixian/LSTM_Attention

Problem at line 167 of self_attention, thanks! #7

Closed Guo-Alex closed 4 years ago

Guo-Alex commented 4 years ago

Hi, when I run this program, line 167 raises an error: TypeError: add_weight() got multiple values for argument 'name'. I'm using anaconda3 + python3.6 + keras2.3.1 + tensorflow2.1.0. Do you have any suggestions?

Guo-Alex commented 4 years ago

Problem solved. It was a mismatch between the formal parameters of add_weight() and the arguments being passed: change (input_shape[-1], input_shape[-1],) --> shape=(input_shape[-1], input_shape[-1],), i.e. pass the shape tuple explicitly as the shape= keyword argument instead of positionally.
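
For anyone hitting the same error: this is the usual symptom of code written for an older Keras Layer.add_weight signature (shape first) running against a newer version where name comes first among the keyword parameters. The shape tuple then binds positionally to name, and the explicit name='...' keyword collides with it. A minimal sketch of the mechanism, using a hypothetical stand-in function (not the real Keras API) whose parameter order mirrors the newer signature:

```python
# Hypothetical stand-in mirroring the newer add_weight parameter order:
# name comes before shape.
def add_weight(name=None, shape=None, initializer=None):
    """Toy stand-in: just records what it was called with."""
    return (name, shape)

# Old-style call: the shape tuple binds positionally to `name`,
# then name='W' collides with it -> TypeError.
try:
    add_weight((64, 64), name='W')
except TypeError as e:
    print(e)  # add_weight() got multiple values for argument 'name'

# Fixed call, as in the comment above: pass shape as a keyword.
w = add_weight(name='W', shape=(64, 64))
print(w)
```

In the repo's custom layer this corresponds to changing self.add_weight((input_shape[-1], input_shape[-1],), name=..., ...) into self.add_weight(shape=(input_shape[-1], input_shape[-1],), name=..., ...), which works regardless of the parameter order.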