xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

Custom embedding model registration is missing model_id #1582

Open edisonzf2020 opened 3 months ago

edisonzf2020 commented 3 months ago

Describe the bug

Following the documentation, I registered a custom embedding model, but the model is not downloaded automatically from Hugging Face; there is no model_id input option in the registration form.
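For reference, a minimal sketch of a possible workaround, assuming the custom embedding spec fields (`model_name`, `dimensions`, `max_tokens`, `language`, `model_id`) and the RESTful client API shown in the docs; the exact field names, values, and `register_model` signature may differ between versions. The idea is to register the spec through the client instead of the Web UI, so that `model_id` can be supplied explicitly:

```python
import json

from xinference.client import RESTfulClient

# Hypothetical spec for a custom embedding model. Field names follow the
# documented template; the dimensions/max_tokens values are examples only.
custom_embedding_spec = {
    "model_name": "custom-bge-base-zh",
    "dimensions": 768,
    "max_tokens": 512,
    "language": ["zh"],
    # model_id tells Xinference which Hugging Face repo to download from;
    # this is the field that appears to be missing from the Web UI form.
    "model_id": "BAAI/bge-base-zh-v1.5",
}

# Endpoint of the locally deployed (Docker) Xinference server; adjust as needed.
client = RESTfulClient("http://127.0.0.1:9997")

# register_model expects the spec as a JSON string; persist=True keeps the
# registration across restarts (signature may vary by version).
client.register_model(
    model_type="embedding",
    model=json.dumps(custom_embedding_spec),
    persist=True,
)
```

The CLI equivalent should be something like `xinference register --model-type embedding --file custom-embedding.json --persist`, per the custom-model docs.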

To Reproduce

To help us reproduce this bug, please provide the information below. What the documentation describes:

[screenshot of the documentation]

My steps are shown in the screenshot below:

[screenshot 2024-06-04 22:40:20]
  1. Your Python version: deployed via Docker, latest image from the official repository.
  2. The version of xinference you use: deployed via Docker, latest image from the official repository.
  3. Versions of crucial packages.
  4. Full stack of the error.
  5. Minimized code to reproduce the error.
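To check whether a registration submitted through the Web UI actually carries a usable `model_id`, a short sketch using the same assumed client API; the listing, launch, and embedding calls below are based on my understanding of the RESTful client and may need adjusting for your version:

```python
from xinference.client import RESTfulClient

client = RESTfulClient("http://127.0.0.1:9997")

# List the registered custom embedding specs to see whether model_id was stored.
for registration in client.list_model_registrations(model_type="embedding"):
    print(registration)

# Launch the custom model; if model_id is missing from the spec, the automatic
# Hugging Face download described in the docs is expected to fail here.
model_uid = client.launch_model(
    model_name="custom-bge-base-zh",
    model_type="embedding",
)

# Query the launched model once to confirm it is actually serving embeddings.
result = client.get_model(model_uid).create_embedding("test text")
print(result["data"][0]["embedding"][:5])
```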

Expected behavior

A clear and concise description of what you expected to happen.

Additional context

Add any other context about the problem here.

github-actions[bot] commented 1 month ago

This issue is stale because it has been open for 7 days with no activity.