ayulockin / neurips-llm-efficiency-challenge
Starter pack for NeurIPS LLM Efficiency Challenge 2023.
https://llm-efficiency-challenge.github.io/challenge
Apache License 2.0 · 118 stars · 44 forks
Issues
#14 · Error in finetuning while working in Google Colab. · opened by Vaibhav170216 14 hours ago · 0 comments
#13 · can not install PyTorch 2.1 nightly in Mac · opened by ruqianq 7 months ago · 0 comments
#12 · Issue about flash-atten · opened by mamba824824 8 months ago · 3 comments
#11 · Issue with flash-attention · opened by kandeldeepak46 11 months ago · 0 comments
#10 · LoRA finetune error · opened by pandeydeep9 11 months ago · 0 comments
#9 · ImportError: libcublas.so.11: cannot open shared object file: No such file or directory · opened by kandeldeepak46 11 months ago · 0 comments
#8 · Go from LoRA to submission? · opened by nkasmanoff 1 year ago · 0 comments
#7 · QLoRA finetune throws error · opened by nahidalam 1 year ago · 0 comments
#6 · Set up submission pipeline without sudo · by tranxuantuyen, closed 1 year ago · 3 comments
#5 · OOM error while running LoRa · opened by bmanikan 1 year ago · 11 comments
#4 · Minor readme update · by nahidalam, closed 1 year ago · 1 comment
#3 · Error building flash attention · by nahidalam, closed 1 year ago · 3 comments
#2 · Minor README.md update · by akhilravidas, closed 1 year ago · 1 comment
#1 · Where is Flash Attention used in the finetuning process? · by nahidalam, closed 1 year ago · 1 comment