xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

docker image from hub.docker.com cannot load qwen2-vl #2480

Closed: Tint0ri closed this issue 7 hours ago

Tint0ri commented 2 weeks ago

System Info

ubuntu 20.04

Running Xinference with Docker?

Version info

0.16

The command used to start Xinference

xinference-local -H 0.0.0.0
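
Since the report concerns the Docker image, the launch presumably wraps that command in `docker run`. A sketch of such an invocation (the image tag and port mapping are assumptions; check the Xinference docs for the exact form):

```shell
# Sketch: launching Xinference from the official Docker image.
# Image name/tag and the default port 9997 are assumptions to verify
# against https://inference.readthedocs.io
docker run -d \
  -p 9997:9997 \
  xprobe/xinference:latest \
  xinference-local -H 0.0.0.0
```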

Reproduction

  1. Load the qwen2-vl-instruct model.
  2. ImportError: [address=0.0.0.0:35113, pid=104] cannot import name 'Qwen2VLForConditionalGeneration' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
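
This ImportError typically means the `transformers` build bundled in the image predates Qwen2-VL support (the `Qwen2VLForConditionalGeneration` class is believed to have landed in transformers 4.45.0; verify against the transformers changelog). A minimal preflight check, with that minimum version as an assumption:

```python
# Preflight sketch: check whether the installed transformers release is
# new enough to provide Qwen2VLForConditionalGeneration.
# REQUIRED = 4.45.0 is an assumed minimum, worth verifying upstream.
from importlib.metadata import PackageNotFoundError, version

REQUIRED = (4, 45, 0)  # assumed first transformers release with Qwen2-VL

def supports_qwen2_vl() -> bool:
    """Return True if the installed transformers meets the assumed minimum."""
    try:
        installed = tuple(int(p) for p in version("transformers").split(".")[:3])
    except (PackageNotFoundError, ValueError):
        # transformers missing, or a non-numeric version string we can't parse
        return False
    return installed >= REQUIRED

if not supports_qwen2_vl():
    print("transformers missing or too old; "
          "try: pip install -U 'transformers>=4.45.0'")
```

Running this inside the container (before loading the model) distinguishes an outdated image from a genuine Xinference bug.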

Expected behavior

No error; the model should load and work like the other supported models.

github-actions[bot] commented 6 days ago

This issue is stale because it has been open for 7 days with no activity.

github-actions[bot] commented 7 hours ago

This issue was closed because it has been inactive for 5 days since being marked as stale.