OpenNLPLab / lightning-attention
Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models
MIT License · 182 stars · 15 forks
Issues (newest first)
#    Title                                                                    Author           Status                Comments
#13  question about benchmark                                                 Nalilik          closed 4 months ago   2
#12  Does using lightning-attention need retraining?                          ranpin           closed 5 months ago   4
#11  No-decay unusable                                                        jmercat          closed 4 months ago   2
#10  The methods for saving the Lightning-Attention model                     wsleepybear      closed 6 months ago   1
#9   The code stuck when running example_lightning_attn.py                    yang-yk          closed 6 months ago   2
#8   Tests fail                                                               catid            closed 7 months ago   2
#7   Add environment variable checks for BLOCK_SIZE and CBLOCK_SIZE           relic-yuexi      opened 7 months ago   1
#6   Can the lightning_attn support V100?                                     relic-yuexi      closed 1 month ago    11
#5   TypeError("unhashable type: 'tensor'")                                   caihaihua057200  closed 4 months ago   7
#4   When running lightning_attn_func two or more times, an error occurred.   wsleepybear      closed 7 months ago   10
#3   assert d in supports_dim and e in supports_dim ?                         XintianHan       opened 7 months ago   4
#2   No module named 'lightning_attn'                                         EddieEduardo     closed 4 months ago   11
#1   Cannot run the triton kernels                                            jmercat          closed 4 months ago   1