nohup $PY_path/bin/python -m fastchat.serve.controller --host $IP >> ./serve.log &
nohup $PY_path/bin/python -m fastchat.serve.vllm_worker --model-path /data/model/modelscope_hub/qwen/kagentlms_qwen_7b_mat --dtype half --controller-address http://$IP:21001 --trust-remote-code >> model.log &
nohup $PY_path/bin/python -m fastchat.serve.openai_api_server --host $IP --port 21010 --controller-address http://$IP:21001 >> openai.log &
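For reference, once all three services are up, the OpenAI-compatible endpoint can also be exercised from Python (a minimal sketch; substitute your own $IP, and the model name is the one the worker registers):

import requests

# Hit the openai_api_server started above; replace <IP> with the host it binds to ($IP).
resp = requests.post(
    "http://<IP>:21010/v1/chat/completions",
    json={
        "model": "kagentlms_qwen_7b_mat",
        "messages": [{"role": "user", "content": "hello"}],
    },
    timeout=60,
)
print(resp.status_code, resp.json())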
The error may occur when KAgentSys tries to load the tokenizer from Hugging Face. Can you simply run the following code to check this idea?
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "kwaikeg/kagentlms_qwen_7b_mat",
    use_fast=False,
    padding_side='left',
    trust_remote_code=True,
)
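If that Hub download fails because the machine cannot reach huggingface.co, you can point the same check at the local copy the vllm_worker is already serving (a sketch; the path is the --model-path from the launch command above, assuming the tokenizer files live in that directory):

from transformers import AutoTokenizer

# Load from the local directory instead of the Hub, so no network access is needed.
tokenizer = AutoTokenizer.from_pretrained(
    "/data/model/modelscope_hub/qwen/kagentlms_qwen_7b_mat",
    use_fast=False,
    padding_side='left',
    trust_remote_code=True,
)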
Hi, I'm running into the same problem. How did you solve it?
I have checked that

curl http://10.22.51.10:21010/v1/chat/completions -H "Content-Type: application/json" -d '{"model": "kagentlms_qwen_7b_mat", "messages": [{"role": "user", "content": "刘德华是谁"}]}'

returns OK (the query means "who is Andy Lau"), but when I run

kagentsys --query="刘德华老婆是谁?" --llm_name="kagentlms_qwen_7b_mat" --use_local_llm --local_llm_host="10.22.51.10" --local_llm_port=21010 --lang="zh"

(the query means "who is Andy Lau's wife?"), I get the error: We couldn't connect to 'https://huggingface.co'.
How can I fix this?
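One thing worth trying is forcing the Hugging Face stack into offline mode, so it only reads files already in the local cache instead of contacting huggingface.co. A minimal Python sketch; the two environment variables are standard Hugging Face switches, they only help if the model files are already cached locally, and they must be set before transformers is imported:

import os

# Standard Hugging Face offline switches; set them before importing
# transformers/huggingface_hub, and note they require the files to be cached locally.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import AutoTokenizer  # imported after the switches take effect

For the kagentsys CLI, the equivalent is exporting the same two variables in the shell before launching it.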