lancopku / Prime
A simple module that consistently outperforms self-attention and the Transformer model on major NMT datasets, achieving SoTA performance.
87 stars · 9 forks
Issues
#9 · IWSLT'14 DE-EN Numbers — nikhil-iyer-97, opened 4 years ago, 2 comments
#8 · Reproducing IWSLT14-de-en results — dguo98, closed 4 years ago, 3 comments
#7 · The multi-scale gate is modeled by parameterized weights rather than depending on the input data, so why term it 'dynamically' instead of 'adaptively'? — luogen1996, closed 4 years ago, 1 comment
#6 · anyone running into 'nan' — xinqipony, closed 4 years ago, 4 comments
#5 · TypeError: argument of type 'NoneType' is not iterable — jude-nlp, closed 4 years ago, 1 comment
#4 · TypeError: argument of type 'NoneType' is not iterable — jude-nlp, closed 4 years ago, 0 comments
#3 · TypeError: argument of type 'NoneType' is not iterable — jude-nlp, closed 4 years ago, 0 comments
#2 · Spelling in the paper appendix — lumaku, closed 4 years ago, 1 comment
#1 · muse code? — xinqipony, closed 4 years ago, 1 comment