xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

How do I change the default Hugging Face model location after installing xinference via pip? #1636

Open chenqp opened 3 weeks ago

chenqp commented 3 weeks ago

When xinference is installed via Docker, you can change the default Hugging Face model location with -v </your/home/path>/.cache/huggingface:/root/.cache/huggingface. But after installing via pip, setting HF_HOME has no effect: a huggingface directory is still created under XINFERENCE_HOME and models are downloaded into it. How can I point the huggingface model directory to a specific location?

chenqp commented 3 weeks ago

Also, the Docker and pip installs download Hugging Face models to different locations. In Docker, with -v </your/home/path>/.cache/huggingface:/root/.cache/huggingface, models are downloaded under huggingface/hub, but with pip they are downloaded directly into the huggingface directory, so the symlink approach doesn't work either.
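The layout mismatch described above can be sketched as follows. All paths here are hypothetical stand-ins for illustration (the real cache lives wherever the Docker -v mount pointed on the host); the idea is that a symlink only helps if it targets the hub/ level that actually holds the model snapshots, not the cache root one level up.

```python
import os
import tempfile

# Hypothetical directories standing in for the two installs' caches.
base = tempfile.mkdtemp()
docker_cache = os.path.join(base, "docker-cache")    # was mounted as /root/.cache/huggingface
pip_cache = os.path.join(base, "pip-huggingface")    # where the pip install looks

# Under Docker, models land one level down, in a hub/ subdirectory.
os.makedirs(os.path.join(docker_cache, "hub", "models--foo--bar"))

# Linking the pip-side directory to the hub/ level (rather than to the
# cache root) makes the model snapshots visible despite the layout gap.
os.symlink(os.path.join(docker_cache, "hub"), pip_cache)

print(sorted(os.listdir(pip_cache)))
```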

ChengjieLi28 commented 2 weeks ago

@chenqp Try specifying the HUGGINGFACE_HUB_CACHE environment variable when starting xinference:

HUGGINGFACE_HUB_CACHE=<your path> xinference-local xxx
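For context on why this variable works where HF_HOME did not: as of recent huggingface_hub versions, the hub cache directory is resolved with a precedence of HF_HUB_CACHE, then HUGGINGFACE_HUB_CACHE, then $HF_HOME/hub, then ~/.cache/huggingface/hub. The sketch below mirrors that resolution logic as I understand it (it is not the library's actual code); note that HUGGINGFACE_HUB_CACHE points at the hub directory itself, while HF_HOME gets a hub/ suffix appended.

```python
import os

def resolve_hub_cache(env):
    """Sketch of huggingface_hub's cache resolution:
    HF_HUB_CACHE > HUGGINGFACE_HUB_CACHE > $HF_HOME/hub > ~/.cache/huggingface/hub.
    `env` is a plain dict so the behavior is easy to demonstrate."""
    hf_home = env.get("HF_HOME") or os.path.join(
        env.get("XDG_CACHE_HOME", os.path.expanduser("~/.cache")), "huggingface"
    )
    default = os.path.join(hf_home, "hub")
    return env.get("HF_HUB_CACHE") or env.get("HUGGINGFACE_HUB_CACHE") or default

# With HUGGINGFACE_HUB_CACHE set, models land directly in that directory:
print(resolve_hub_cache({"HUGGINGFACE_HUB_CACHE": "/data/hf-cache"}))
# With only HF_HOME set, a hub/ subdirectory is appended:
print(resolve_hub_cache({"HF_HOME": "/data/hf-home"}))
```

This also explains the observation earlier in the thread that the Docker setup ends up with a huggingface/hub layout while a direct cache variable points at the hub directory itself.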

qinxuye commented 2 weeks ago

Has the problem been resolved?