xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

When I use the modelscope source to launch jina, I hit this bug: "…'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like jinaai/jina-bert-implementation is not the path to a directory containing a file named configuration_bert.py" #1842

Closed · likenamehaojie closed 2 months ago

likenamehaojie commented 2 months ago

System Info / 系統信息

cuda 12 ubuntu

Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?

Version info / 版本信息

latest

The command used to start Xinference / 用以启动 xinference 的命令

docker launch

Reproduction / 复现过程

Install Xinference via Docker.

Launch jina-embeddings-v2-base-zh.

Expected behavior / 期待表现

Launch succeeds.

likenamehaojie commented 2 months ago

I'm using the modelscope source, so why does it still download from huggingface.co? I also downloaded the model mentioned in the error into the corresponding hub directory, but that didn't work either.

github-actions[bot] commented 2 months ago

This issue is stale because it has been open for 7 days with no activity.

github-actions[bot] commented 2 months ago

This issue was closed because it has been inactive for 5 days since being marked as stale.

DankerMu commented 1 week ago

Has this been resolved?

DankerMu commented 1 week ago

秦续业: You should update your local config.json, changing:

```json
"auto_map": {
  "AutoConfig": "jinaai/jina-bert-implementation--configuration_bert.JinaBertConfig",
  "AutoModel": "jinaai/jina-bert-implementation--modeling_bert.JinaBertModel",
  "AutoModelForMaskedLM": "jinaai/jina-bert-implementation--modeling_bert.JinaBertForMaskedLM",
  "AutoModelForQuestionAnswering": "jinaai/jina-bert-implementation--modeling_bert.JinaBertForQuestionAnswering",
  "AutoModelForSequenceClassification": "jinaai/jina-bert-implementation--modeling_bert.JinaBertForSequenceClassification",
  "AutoModelForTokenClassification": "jinaai/jina-bert-implementation--modeling_bert.JinaBertForTokenClassification"
},
```

to

```json
"auto_map": {
  "AutoConfig": "configuration_bert.JinaBertConfig",
  "AutoModel": "modeling_bert.JinaBertModel",
  "AutoModelForMaskedLM": "modeling_bert.JinaBertForMaskedLM",
  "AutoModelForQuestionAnswering": "modeling_bert.JinaBertForQuestionAnswering",
  "AutoModelForSequenceClassification": "modeling_bert.JinaBertForSequenceClassification",
  "AutoModelForTokenClassification": "modeling_bert.JinaBertForTokenClassification"
},
```

and add the two Python files, configuration_bert.py and modeling_bert.py, to the model directory.
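The config.json edit above can also be scripted instead of done by hand. A minimal sketch (not part of Xinference; the function name and file path are hypothetical) that strips the remote `repo--` prefix from every `auto_map` entry, so `transformers` resolves the classes from local module files rather than trying to fetch jinaai/jina-bert-implementation from huggingface.co:

```python
import json


def localize_auto_map(config_path):
    """Rewrite auto_map entries in a Hugging Face config.json so that
    references of the form '<repo>--<module>.<Class>' become just
    '<module>.<Class>', pointing at .py files placed next to the config."""
    with open(config_path, encoding="utf-8") as f:
        cfg = json.load(f)

    auto_map = cfg.get("auto_map", {})
    for key, ref in auto_map.items():
        # e.g. "jinaai/jina-bert-implementation--modeling_bert.JinaBertModel"
        #   -> "modeling_bert.JinaBertModel"
        if "--" in ref:
            auto_map[key] = ref.split("--", 1)[1]

    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(cfg, f, indent=2, ensure_ascii=False)
```

Run it once against the downloaded model's config.json, then drop configuration_bert.py and modeling_bert.py into the same directory.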