HKUNLP / ChunkLlama
[ICML'24] Data and code for our paper "Training-Free Long-Context Scaling of Large Language Models"
Apache License 2.0 · 275 stars · 14 forks
Issues
| # | Title | Author | Status | Comments |
|---|-------|--------|--------|----------|
| #19 | Perplexity validation on PG19 error and Passkey Retrieval error | khfs | opened 1 week ago | 8 |
| #18 | Does it support vllm now? | skykiseki | opened 1 month ago | 0 |
| #17 | An error occurred when I used flash_decoding_chunkllama in run_chunkllama_100k.py | smilelite | opened 1 month ago | 5 |
| #16 | Does it support batch inference? | SkyAndCloud | opened 1 month ago | 1 |
| #15 | Thanks for your research; can you provide a CUDA version of the code? | nanmi | opened 1 month ago | 4 |
| #14 | Does it support the Qwen1.5-72B-Chat-GPTQ-Int4 quantized model? | huliangbing | closed 1 month ago | 4 |
| #13 | Finetune error | MarsMeng1994 | opened 2 months ago | 2 |
| #12 | Validation issue on long-context extension without support from the training process | yiakwy-xpu-ml-framework-team | closed 2 months ago | 3 |
| #11 | Needle test with Llama 2 | MarsMeng1994 | closed 2 months ago | 5 |
| #10 | Does this work with Llama 3? | thomasgauthier | closed 2 months ago | 3 |
| #9 | Fix filename error | liveuptosyf | opened 2 months ago | 0 |
| #8 | A confusing issue | lilhongxy | closed 3 months ago | 8 |
| #7 | Info about compared models in your paper | Jihuai-wpy | opened 3 months ago | 2 |
| #6 | Llama-2-70b-chat-hf model fails to pass the 125k passkey test | 2793145003 | closed 3 months ago | 8 |
| #5 | Issue of CLEX results | guanzhchen | closed 4 months ago | 1 |
| #4 | Is P_k wrong in the code release? | DavideHe | closed 3 months ago | 6 |
| #3 | How do I use it in vllm deployment? | jchang98 | opened 4 months ago | 6 |
| #2 | Add gitignore file to ignore Mac OSX .DS_Store and Python temp files | dikarel | closed 4 months ago | 0 |
| #1 | Fix typo: MyMuPDF => PyMuPDF | dikarel | closed 3 weeks ago | 0 |