lsdefine / attention-is-all-you-need-keras
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
702 stars · 188 forks
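Several of the issues below concern the repository's core attention computation (#25 ScaledDotProductAttention, #23 dimension in GetSubMask, #28 the mask of attention). For context, here is a minimal NumPy sketch of scaled dot-product attention with a causal mask; the function name, shapes, and mask convention are illustrative, not the repo's actual API:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k: (seq_len, d_k); v: (seq_len, d_v).
    mask: optional (seq_len, seq_len) array of 0/1; positions where
    mask == 0 are blocked from attending.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (seq_q, seq_k) similarities
    if mask is not None:
        scores = np.where(mask == 0, -1e9, scores)  # push blocked slots to ~-inf
    # Numerically stable row-wise softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# A lower-triangular (causal) mask of the kind a decoder sub-mask builds:
L, d = 4, 8
rng = np.random.default_rng(0)
q = k = v = rng.normal(size=(L, d))
causal = np.tril(np.ones((L, L)))
out, w = scaled_dot_product_attention(q, k, v, mask=causal)
```

With the causal mask, row i of the attention weights is nonzero only for positions 0..i, so the first query attends solely to itself; this is the behavior the mask-related issues below are debugging.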
Issues
#38 · Licence · BDUG · opened 1 year ago · 0 comments
#37 · after embedding layer · lidongxing · opened 4 years ago · 1 comment
#36 · reshape may not match · pengxingang · opened 4 years ago · 0 comments
#35 · startup error · rafaleo · opened 4 years ago · 8 comments
#34 · Update transformer.py · zhanjunlang · closed 2 years ago · 0 comments
#33 · Layer norm at the end of the encoder? · salihgunduz · opened 4 years ago · 0 comments
#32 · Time series forecasting? · salihgunduz · closed 4 years ago · 0 comments
#31 · Difference between decode_sequence_fast and decode_sequence_readout? · renjithamadeus · opened 4 years ago · 0 comments
#30 · Using the approach for video encoding · kristosh · opened 4 years ago · 0 comments
#29 · Why do different inputs give the same output? · maozezhong · opened 5 years ago · 4 comments
#28 · the mask of attention · zjjzyl · opened 5 years ago · 0 comments
#26 · seq2seq: confused about shapes · thomasyue · closed 5 years ago · 1 comment
#25 · ScaledDotProductAttention · t-kong · opened 5 years ago · 0 comments
#24 · the test demo · chenjun2hao · opened 5 years ago · 0 comments
#23 · dimension in GetSubMask · ichenjia · opened 5 years ago · 0 comments
#22 · Using the transformer instead of a simple LSTM layer · basma-b · opened 5 years ago · 2 comments
#21 · Why weren't K and V passed from the top encoder to the bottom decoder? · ichenjia · closed 5 years ago · 0 comments
#20 · K.mean() in computing loss doesn't make any sense. · mayurnewase · opened 5 years ago · 0 comments
#19 · Transformer encoder layer instead of Bidirectional LSTM · Eugen2525 · opened 5 years ago · 1 comment
#18 · After running your demo I get an erroneous result; why? · flyboyer · opened 5 years ago · 0 comments
#17 · Skip-connection in Transformer · hoangcuong2011 · opened 5 years ago · 1 comment
#16 · Issue with attention mask · LorrinWWW · closed 5 years ago · 2 comments
#15 · Reshape: dimension mismatch · shashwattrivedi · closed 5 years ago · 1 comment
#14 · Decoding a sentence gives the same translation · mayurnewase · opened 5 years ago · 0 comments
#13 · 'nan' loss when using layer normalization · McKracken · opened 5 years ago · 1 comment
#12 · I may have found a point that should be changed · alphanlp · opened 5 years ago · 2 comments
#11 · When I run pinyin_main.py I get a UserWarning like the one below · alphanlp · opened 5 years ago · 2 comments
#10 · embedding dropout · JulesGM · closed 5 years ago · 0 comments
#9 · Keras and Tensorflow Versions · amirveyseh · opened 5 years ago · 1 comment
#8 · Save model to json · jingyuanz · opened 6 years ago · 2 comments
#7 · pure language model · XiaoLiuAI · opened 6 years ago · 1 comment
#6 · mask for decoder · XiaoLiuAI · closed 6 years ago · 6 comments
#5 · Is the LayerNormalization class in the transformer needed? · chaitjo · closed 6 years ago · 2 comments
#4 · How to perform translation? · lchunleo · closed 5 years ago · 4 comments
#3 · Issues with Keras Lambda Layers · wfmonster · closed 6 years ago · 3 comments
#2 · MultiHeadAttention · AMSakhnov · closed 6 years ago · 1 comment
#1 · LayerNormalization · AMSakhnov · closed 6 years ago · 1 comment