gabrielloye / Attention_Seq2seq-Translation (34 stars, 23 forks)
Issues
#7 Why using tanh function (karimmahalian, opened 1 year ago, 0 comments)
#6 bug in BahdanauDecoder (moseshu, closed 2 years ago, 1 comment)
#5 discrepancy with original Bahdanau paper on the self.weight parameter (xiaolongwu0713, opened 3 years ago, 0 comments)
#4 question about discrepancy with Pytorch NLP seq2seq tutorial (xiaolongwu0713, opened 3 years ago, 0 comments)
#3 Bug: In Luong Decoder (thakursc1, opened 4 years ago, 3 comments)
#2 attn_combine is defined but not used in BahdanauDecoder forward process (RHzhongzju, closed 5 years ago, 1 comment)
#1 evaluation on unseen examples (TheZaraKhan, opened 5 years ago, 0 comments)