xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

Version 0.14.4 fails to start #2204

Closed knightcn1983 closed 1 month ago

knightcn1983 commented 2 months ago

System Info

Installed in a conda environment (Python 3.11) on macOS with `pip install -U xinference`.

Running Xinference with Docker?

Version info

xinference=0.14.4

The command used to start Xinference

In addition to the custom-model-path error mentioned in the earlier issue, the following error also appears:

```
XINFERENCE_HOME=/Users/qytian/Desktop/localai/chat xinference-local --host 0.0.0.0 --port 1234
Traceback (most recent call last):
  File "/opt/anaconda3/envs/xinference/bin/xinference-local", line 5, in <module>
    from xinference.deploy.cmdline import local
  File "/opt/anaconda3/envs/xinference/lib/python3.11/site-packages/xinference/__init__.py", line 37, in <module>
    _install()
  File "/opt/anaconda3/envs/xinference/lib/python3.11/site-packages/xinference/__init__.py", line 34, in _install
    install_model()
  File "/opt/anaconda3/envs/xinference/lib/python3.11/site-packages/xinference/model/__init__.py", line 19, in _install
    llm_install()
  File "/opt/anaconda3/envs/xinference/lib/python3.11/site-packages/xinference/model/llm/__init__.py", line 278, in _install
    user_defined_llm_family = CustomLLMFamilyV1.parse_obj(json.load(fd))
                                                          ^^^^^^^^^^^^^
  File "/opt/anaconda3/envs/xinference/lib/python3.11/json/__init__.py", line 293, in load
    return loads(fp.read(),
           ^^^^^^^^^
  File "", line 707, in read
  File "", line 507, in read
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 3131: invalid start byte
```

Reproduction / 复现过程

```
XINFERENCE_HOME=/Users/qytian/Desktop/localai/chat xinference-local --host 0.0.0.0 --port 1234
```

This produces the same `UnicodeDecodeError` traceback shown above.

Expected behavior / 期待表现

Fix the issue.
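For anyone hitting the same crash: the traceback shows `json.load` failing because one of the user-defined model JSON files under `XINFERENCE_HOME` contains bytes that are not valid UTF-8 (byte `0x80` at position 3131). A minimal sketch to locate the offending file, assuming the custom model definitions are stored as `*.json` files somewhere under `XINFERENCE_HOME` (the helper name and directory layout are assumptions, not part of the Xinference API):

```python
import pathlib

def find_invalid_utf8(root: pathlib.Path) -> list[tuple[pathlib.Path, int, int]]:
    """Return (path, byte offset, offending byte) for every *.json file
    under *root* whose contents are not valid UTF-8 -- the exact condition
    that makes json.load raise UnicodeDecodeError at startup."""
    bad = []
    for path in sorted(root.rglob("*.json")):
        data = path.read_bytes()
        try:
            data.decode("utf-8")
        except UnicodeDecodeError as exc:
            bad.append((path, exc.start, data[exc.start]))
    return bad

if __name__ == "__main__":
    import os
    # Assumption: ~/.xinference is the default when XINFERENCE_HOME is unset.
    home = pathlib.Path(os.environ.get("XINFERENCE_HOME",
                                       pathlib.Path.home() / ".xinference"))
    for path, offset, byte in find_invalid_utf8(home):
        print(f"{path}: invalid UTF-8 at byte {offset} (0x{byte:02x})")
```

Deleting or re-saving the reported file as UTF-8 should let `xinference-local` start again; whether the bad bytes came from a corrupted download or a non-UTF-8 editor encoding would need to be confirmed separately.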

github-actions[bot] commented 1 month ago

This issue is stale because it has been open for 7 days with no activity.

github-actions[bot] commented 1 month ago

This issue was closed because it has been inactive for 5 days since being marked as stale.