lucidrains / local-attention
An implementation of local windowed attention for language modeling
MIT License · 370 stars · 40 forks
Issues (sorted newest first)
#24 update deprected · Abdelrahman350 · opened 1 month ago · 0 comments
#23 update deprected · Abdelrahman350 · closed 1 month ago · 0 comments
#22 'SinusoidalEmbeddings' object has no attribute 'apply_rotary_pos_emb' · MarcusLoppe · closed 2 months ago · 6 comments
#21 sequence length 2 must be divisible by window size 8 for local attention · JonasLi-19 · closed 5 months ago · 0 comments
#20 May I be allowed to delete einops and replace it with the operations provided by torch? · wencan · closed 2 months ago · 1 comment
#19 LocalTransformer Encoder Layer · AmitMY · opened 8 months ago · 0 comments
#18 The look_around function seems to be incorrect · datvuthanh · closed 1 year ago · 1 comment
#17 About the performance · ThyrixYang · closed 1 year ago · 2 comments
#16 Attention weight · emanuele-mincato · opened 1 year ago · 0 comments
#15 Wrong shape for attention bias vs sim tensor · inspirit · closed 1 year ago · 1 comment
#14 xPos Rotary Embeddings · ilya16 · closed 1 year ago · 5 comments
#13 Fixing error message {t} not defined. · gieoon · closed 1 year ago · 0 comments
#12 A bug of torch.arange for long sequence with fp16 data type · renll · closed 1 year ago · 1 comment
#11 Which is exactly the attention pattern? · beleen23 · closed 1 year ago · 3 comments
#10 Transformer implementation with local attention · serkansulun · closed 1 year ago · 1 comment
#9 More control over attention masking · Mindful · opened 3 years ago · 1 comment
#8 Include einops in setup script · Mindful · closed 3 years ago · 1 comment
#7 replace shaws with rotary embeddings · lucidrains · closed 3 years ago · 0 comments
#6 Maybe the function `shift` can be simpler and clearer. · lartpang · closed 3 years ago · 1 comment
#5 Bug in exact_window_size masking for Causal Attention? · xravitejax · closed 3 years ago · 3 comments
#4 Local Attention vs Standard when length < window_size · gautierdag · closed 3 years ago · 2 comments
#3 Please add option for exact window_size masking · usamec · closed 3 years ago · 2 comments
#2 question about the look around operation · benywon · closed 3 years ago · 2 comments
#1 question about the local attention · benywon · closed 3 years ago · 0 comments
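Several of the issues above revolve around the same two mechanics: the sequence must be divisible by the window size (#21's error message), and each window attends to itself plus a number of preceding windows via the "look around" operation (#2, #11, #18). A minimal stdlib-only sketch of that attention pattern, assuming the bucketing scheme the issue titles describe (the function name `local_attention_pattern` and the simplification of omitting `look_forward` are illustrative, not part of the library's API):

```python
def local_attention_pattern(seq_len, window_size, look_backward=1, causal=True):
    """Map each query position to the key positions it may attend to.

    The sequence is split into contiguous windows of `window_size`
    (hence `seq_len` must be divisible by it, as in issue #21's error),
    and each window sees itself plus `look_backward` preceding windows.
    """
    if seq_len % window_size != 0:
        raise ValueError(
            f"sequence length {seq_len} must be divisible "
            f"by window size {window_size} for local attention"
        )
    pattern = {}
    for q in range(seq_len):
        w = q // window_size                             # query's window index
        lo = max(0, (w - look_backward) * window_size)   # earliest visible key
        hi = (w + 1) * window_size                       # end of query's own window
        pattern[q] = [k for k in range(lo, hi) if not causal or k <= q]
    return pattern

# With window_size=2 over a length-4 sequence, position 3 sees its own
# window {2, 3} plus the one window behind it, {0, 1}:
print(local_attention_pattern(4, 2)[3])  # [0, 1, 2, 3]
```

Note this is only the mask pattern, not the attention computation itself; the real implementation gathers the looked-around windows into tensors and applies the mask to the similarity logits.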