hao-ai-lab / LookaheadDecoding

[ICML 2024] Break the Sequential Dependency of LLM Inference Using Lookahead Decoding
https://arxiv.org/abs/2402.02057
Apache License 2.0

Support Baichuan models #11

Closed ghost closed 9 months ago

ghost commented 10 months ago

Thanks for sharing this repo. Could you please provide a model file to support Baichuan models?

Viol2000 commented 10 months ago

Hi, support for models beyond LLaMA will be available later.