hao-ai-lab / LookaheadDecoding

[ICML 2024] Break the Sequential Dependency of LLM Inference Using Lookahead Decoding
https://arxiv.org/abs/2402.02057
Apache License 2.0
1.15k stars · 67 forks

Does Lade support top-p/top-k/temperature sampling? #29

Open AlvL1225 opened 11 months ago

AlvL1225 commented 11 months ago

Amazing work! Does this support sampling decoding?

Viol2000 commented 11 months ago

Hi, thanks for your interest! Currently, we do not support sampling, but we plan to add it later!

shermansiu commented 11 months ago

See #6.
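For context on what the question is asking for: greedy decoding always picks the highest-probability token, while sampling decoding draws the next token from a filtered, temperature-scaled distribution. A minimal, repo-agnostic sketch of top-k/top-p/temperature sampling over raw logits is below; the function name and structure are illustrative only and are not taken from the LookaheadDecoding codebase.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """Sample a token id from raw logits using temperature scaling,
    top-k, and top-p (nucleus) filtering.

    Illustrative sketch only; not the LookaheadDecoding implementation.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)

    if top_k > 0:
        # Mask everything outside the k highest-logit tokens.
        kth_largest = np.sort(logits)[-top_k]
        logits = np.where(logits < kth_largest, -np.inf, logits)

    # Numerically stable softmax over the (possibly masked) logits.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    if top_p < 1.0:
        # Keep the smallest set of tokens whose cumulative probability
        # reaches top_p; zero out the rest and renormalize.
        order = np.argsort(probs)[::-1]
        cumulative = np.cumsum(probs[order])
        cutoff = np.searchsorted(cumulative, top_p) + 1
        keep = np.zeros_like(probs, dtype=bool)
        keep[order[:cutoff]] = True
        probs = np.where(keep, probs, 0.0)
        probs /= probs.sum()

    return int(rng.choice(len(probs), p=probs))
```

With `top_k=1` this degenerates to greedy decoding, which is why supporting sampling in a speculative scheme like Lookahead Decoding is non-trivial: the verification step must accept or reject drafted tokens against the sampled distribution, not just against the argmax.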