microsoft / LLMLingua

To speed up LLMs' inference and enhance LLMs' perception of key information, LLMLingua compresses the prompt and KV-Cache, achieving up to 20x compression with minimal performance loss.
https://llmlingua.com/
MIT License
4.18k stars 222 forks

[bug]: Can not use local model as input #148

Closed zjcDM closed 1 month ago

zjcDM commented 2 months ago

Describe the bug

I downloaded a model and stored it at a local path, like this: image

When I pass the local model path to LLMLingua, it raises an error: image image image

Please tell me how to fix this. Thanks.

iofu728 commented 2 months ago

Hi @zjcDM, thanks for your support. It seems like you are encountering the same issue as in #106, where your environment cannot connect to openaipublic.blob.core.windows.net.

You can follow the solution in #106 and comment out the tiktoken-related code.
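As an alternative to commenting out the tiktoken code, tiktoken can read its BPE files from a local cache via the `TIKTOKEN_CACHE_DIR` environment variable, so no connection to openaipublic.blob.core.windows.net is needed. This is a minimal sketch; the cache directory path is a placeholder, and you would populate it with cache files copied from a machine that does have internet access.

```python
import os

# Sketch: point tiktoken at a pre-populated local cache so it never needs
# to contact openaipublic.blob.core.windows.net. The directory path is a
# placeholder; fill it with the cached encoding files copied from a
# machine with internet access.
os.environ["TIKTOKEN_CACHE_DIR"] = "/path/to/tiktoken_cache"

# The variable must be set before tiktoken loads an encoding, e.g.:
# import tiktoken
# enc = tiktoken.get_encoding("cl100k_base")
```

Setting the variable inside Python only works if it runs before the first tiktoken call; exporting it in the shell before launching the script is the safer option.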

zjcDM commented 1 month ago

Thanks for the help.