hao-ai-lab / LookaheadDecoding
[ICML 2024] Break the Sequential Dependency of LLM Inference Using Lookahead Decoding
https://arxiv.org/abs/2402.02057
Apache License 2.0 · 1.15k stars · 67 forks
Issues
| # | Title | Author | Status | Last updated | Comments |
|-----|-------|--------|--------|--------------|----------|
| #17 | NameError: name 'F' is not defined | CSWellesSun | closed | 11 months ago | 1 |
| #16 | Any analysis on the impact on accuracy | qizzzh | closed | 1 year ago | 2 |
| #15 | Support for GPTQ? | liHai001 | open | 1 year ago | 4 |
| #14 | How to generate the n-grams - which to keep, which to discard? | bobqianic | open | 1 year ago | 27 |
| #13 | Lookahead Decoding Development Roadmap | Viol2000 | open | 1 year ago | 6 |
| #12 | Does the module support fine-tuned Llama2? | spring1915 | closed | 11 months ago | 4 |
| #11 | Support Baichuan models | ghost | closed | 11 months ago | 1 |
| #10 | How about batching throughput and energy consumption | Light-of-Hers | open | 1 year ago | 2 |
| #9 | Is there a plan to rebuild the code in a clear style? | hhhh12345678 | closed | 11 months ago | 1 |
| #8 | Question on Initial guess tokens | DaehanKim | closed | 1 year ago | 1 |
| #7 | AttributeError: module 'lade' has no attribute 'config_pading' | liHai001 | closed | 1 year ago | 1 |
| #6 | Gumbel noise for sampling in-place of greedy decoding | knagrecha | closed | 6 months ago | 36 |
| #5 | Fix broken reference | bilal-aamer | closed | 1 year ago | 0 |
| #4 | Broken links in ReadMe | bilal-aamer | closed | 1 year ago | 0 |
| #3 | import necessary packages | WrRan | closed | 1 year ago | 0 |
| #2 | import necessary packages | WrRan | closed | 1 year ago | 0 |
| #1 | Integration with other open-source libraries | shermansiu | open | 1 year ago | 19 |