xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

[BUG] XINFERENCE_MODEL_SRC not working? #1581

Open QiuZiXian opened 4 months ago

QiuZiXian commented 4 months ago

Describe the bug

Environment: Docker install of v0.11.3. Launching bge-reranker-large uses ModelScope, but launching baichuan-inc/Baichuan-13B-Base uses Hugging Face? XINFERENCE_MODEL_SRC=modelscope does not seem to work.

To Reproduce

To help us to reproduce this bug, please provide the information below:

1. Install v0.11.3 via Docker.
2. First try to launch a rerank model; it fails.
3. Following the troubleshooting guide, remove the container and the volume, then restart a container with `-e XINFERENCE_MODEL_SRC=modelscope`.
4. Launch bge-reranker-large from the web UI; the log shows ModelScope is used. After a short while, I don't know whether it succeeded or not: `xinference list` shows the model, but the web UI can't find it. (screenshot)
5. Launching Baichuan from the web UI uses Hugging Face? (screenshot)
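Step 3 only takes effect if the environment variable actually reaches the Xinference process inside the new container. A minimal sketch for checking this (the variable name comes from this report; the `"<unset>"` fallback and the idea that an unset value means the Hugging Face default is used are assumptions, not confirmed Xinference internals):

```python
import os

# XINFERENCE_MODEL_SRC is the switch from this bug report. If it is
# unset inside the container, Xinference would presumably fall back to
# its Hugging Face default, which would explain Baichuan being pulled
# from Hugging Face instead of ModelScope.
src = os.environ.get("XINFERENCE_MODEL_SRC", "<unset>")
print(f"XINFERENCE_MODEL_SRC = {src}")
```

This can be run inside the running container, e.g. via `docker exec <container> python -c '...'`, to confirm the `-e` flag from step 3 was applied.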

  1. Docker
  2. v0.11.3
  3. Versions of crucial packages.

Expected behavior

Launch all models from ModelScope.

Additional context

(screenshot)

github-actions[bot] commented 2 months ago

This issue is stale because it has been open for 7 days with no activity.