Describe the bug
Environment: v0.11.3 installed via Docker.
Launching bge-reranker-large uses ModelScope, but launching baichuan-inc/Baichuan-13B-Base uses Hugging Face?
Does XINFERENCE_MODEL_SRC=modelscope not work?
To Reproduce
To help us to reproduce this bug, please provide information below:
1. Installed v0.11.3 via Docker.
2. First tried to launch a rerank model; it failed.
3. Following the troubleshooting guide, removed the container and the volume, then restarted a container with `-e XINFERENCE_MODEL_SRC=modelscope`.
4. Launched bge-reranker-large from the web UI; the log shows it uses ModelScope. After a short wait, I cannot tell whether it succeeded: `xinference list` shows the model, but the web UI cannot find it.
5. Launched baichuan from the web UI; it uses Hugging Face?
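For reference, the restart in step 3 looked roughly like the following. The container name, mount path, and image tag are assumptions based on the standard Xinference Docker instructions, not the exact command I ran:

```shell
# Remove the old container (and its volume), then start a fresh one
# with the model source pinned to ModelScope (names/paths illustrative):
docker rm -f xinference
docker run -d --name xinference \
  -e XINFERENCE_MODEL_SRC=modelscope \
  -v ~/.xinference:/root/.xinference \
  -p 9997:9997 --gpus all \
  xprobe/xinference:v0.11.3 \
  xinference-local -H 0.0.0.0
```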
Expected behavior
All models should be launched from ModelScope when XINFERENCE_MODEL_SRC=modelscope is set.
Additional context
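For step 4, one way to tell whether the launch actually succeeded is to compare what the CLI and the server's REST API report. These commands assume the default local endpoint and are a sketch, not output captured from my setup:

```shell
# What the CLI sees on the running server:
xinference list --endpoint http://127.0.0.1:9997

# The web UI talks to the same server; the OpenAI-compatible
# models route should list the running model as well:
curl http://127.0.0.1:9997/v1/models
```

If the model appears in the first command but not the second, that would point at a server-side registration issue rather than a UI caching problem.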