xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

BUG: Error occurs when using xinference v0.5.6 on a dify chat app #598

Closed — JiaYaobo closed this issue 1 week ago

JiaYaobo commented 10 months ago

Describe the bug

After deploying dify@HEAD locally with Docker and starting xinference with `xinference -p 9997 -H 0.0.0.0`, using a chatglm2 model, I encounter the following error:

(Screenshot 2023-11-01 08:21:57: error message from the dify chat app)

Xinference itself works fine in the Gradio web app.
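Since the model responds correctly through the Gradio UI, a useful way to narrow the problem down is to call the xinference server's HTTP API directly, bypassing dify. The sketch below builds an OpenAI-style chat-completion payload for the server started with `xinference -p 9997 -H 0.0.0.0`; the model UID (`"chatglm2"`) and the base URL are assumptions for illustration, not values confirmed in this thread.

```python
import json
import urllib.request

# Assumed address of the xinference server from the bug report
# (started with `xinference -p 9997 -H 0.0.0.0`).
BASE_URL = "http://127.0.0.1:9997"


def build_chat_request(model_uid: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload.

    `model_uid` is the UID assigned when the model was launched;
    "chatglm2" below is a placeholder.
    """
    return {
        "model": model_uid,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_chat_request("chatglm2", "Hello")

# Uncomment to actually send the request against a running server.
# req = urllib.request.Request(
#     BASE_URL + "/v1/chat/completions",
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))

print(json.dumps(payload))
```

If this direct call succeeds while the dify chat app still fails, the error most likely lies in dify's integration (e.g. the model UID or endpoint it is configured with) rather than in xinference itself.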

To Reproduce

To help us reproduce this bug, please provide the information below:

  1. Python version: 3.10
  2. xinference version: v0.5.6
  3. Versions of crucial packages: dify@HEAD; xinference installed with `pip install "xinference[all]"`

Additional context

This is the same issue as https://github.com/xorbitsai/inference/issues/574, which appeared to have been fixed in xinference v0.5.4.

UranusSeven commented 10 months ago

Thank you for reporting this issue! We will attempt to reproduce it and work on a fix.

UranusSeven commented 10 months ago

@JiaYaobo Hello, I was unable to reproduce this issue. Could you confirm whether it is still reproducible in your environment?

github-actions[bot] commented 2 weeks ago

This issue is stale because it has been open for 7 days with no activity.

github-actions[bot] commented 1 week ago

This issue was closed because it has been inactive for 5 days since being marked as stale.