KwaiKEG / KwaiAgents

A generalized information-seeking agent system with Large Language Models (LLMs).

With `use_local_llm`, the local deployment service still tries to download model files from Hugging Face #27

Closed dgo2dance closed 7 months ago

dgo2dance commented 7 months ago

Command:

```
kagentsys --query="Who is Andy Lau's wife?" --llm_name="kagentlms_qwen_7b_mat" \
  --use_local_llm --local_llm_host="https://127.0.0.1" --local_llm_port=80 --lang="zh"
```

With `use_local_llm`, the error is:

```
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like kwaikeg/kagentlms_qwen_7b_mat is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
```

Will the local deployment service also download model files from Hugging Face?

ScarletPan commented 7 months ago

Nope, but the tokenizer is loaded from Hugging Face. Can you set a proxy?
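If outbound access to huggingface.co is blocked, one option is to route the tokenizer download through a proxy via the standard `HTTP_PROXY`/`HTTPS_PROXY` environment variables, which the underlying HTTP libraries honor. A minimal sketch (the proxy address `http://127.0.0.1:7890` is a placeholder, not from this thread):

```shell
# Point HTTP(S) traffic at your proxy -- hypothetical address, replace with your own.
export HTTP_PROXY="http://127.0.0.1:7890"
export HTTPS_PROXY="http://127.0.0.1:7890"

# Then rerun the original command, e.g.:
# kagentsys --query="Who is Andy Lau's wife?" --llm_name="kagentlms_qwen_7b_mat" \
#   --use_local_llm --local_llm_host="https://127.0.0.1" --local_llm_port=80 --lang="zh"
echo "proxy set to $HTTPS_PROXY"
```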

Vincentyua commented 7 months ago

This issue arises because the system implements a prompt-truncation strategy, which requires initializing the corresponding tokenizer. If you are using a local model, you can replace `model_name` with the local model path in `KwaiAgents/kwaiagents/agents/kagent.py`, eliminating the need to re-download it from Hugging Face.
