Aseisman opened 5 days ago
I put the local model chatglm-6b into llm_models and configured it as documented. After starting the service and sending a hello command, an error is returned. How can I verify that the local model has actually started correctly?

Output when running python examples/llm_api.py:
model_config.py[line:33] - ERROR: No module named 'zdatafront'
.....................
LLM models currently started: ['chatglm-6b'] @ cpu
.......................
ERROR: config: {'model_path': 'THUDM/chatglm-6b', 'device': 'cpu'}, chatglm-6b, dict_keys(['gpt-3.5-turbo'])
.............................................
Traceback (most recent call last):
File "/home/codefuse-chatbot/examples/llm_api.py", line 830, in start_main_server
controller_started.wait() # wait for the controller to finish starting
File "/opt/conda/envs/devopsgpt/lib/python3.9/multiprocessing/managers.py", line 1085, in wait
return self._callmethod('wait', (timeout,))
File "/opt/conda/envs/devopsgpt/lib/python3.9/multiprocessing/managers.py", line 810, in _callmethod
kind, result = conn.recv()
File "/opt/conda/envs/devopsgpt/lib/python3.9/multiprocessing/connection.py", line 250, in recv
buf = self._recv_bytes()
File "/opt/conda/envs/devopsgpt/lib/python3.9/multiprocessing/connection.py", line 414, in _recv_bytes
buf = self._recv(4)
File "/opt/conda/envs/devopsgpt/lib/python3.9/multiprocessing/connection.py", line 379, in _recv
chunk = read(handle, remaining)
File "/home/codefuse-chatbot/examples/llm_api.py", line 735, in f
raise KeyboardInterrupt(f"{signalname} received")
KeyboardInterrupt: SIGINT received
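On the verification question, here is a minimal sketch of how the model server could be probed once llm_api.py is up, assuming it exposes an OpenAI-compatible HTTP endpoint. The base URL, port, and the /v1/models route below are assumptions that must match your local model_config/server settings; they are not values confirmed by the logs above:

```python
import requests

# Assumed address of the OpenAI-compatible server started by llm_api.py;
# adjust host/port to whatever your local config actually uses.
API_BASE = "http://127.0.0.1:8888/v1"

# 1. List the models the server reports as available.
resp = requests.get(f"{API_BASE}/models", timeout=10)
resp.raise_for_status()
print(resp.json())  # chatglm-6b should appear in the returned model list

# 2. Send a minimal chat completion to confirm the model worker answers.
payload = {
    "model": "chatglm-6b",
    "messages": [{"role": "user", "content": "hello"}],
}
resp = requests.post(f"{API_BASE}/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If chatglm-6b does not show up in the model list, the worker has probably not loaded or registered, which would be consistent with the config lookup above only knowing dict_keys(['gpt-3.5-turbo']).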