bentoml / OpenLLM

Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0

feat: support Qwen1.5 #948

Closed · sudazzk closed this issue 5 days ago

sudazzk commented 3 months ago

Feature request

https://github.com/QwenLM/Qwen1.5 https://huggingface.co/collections/Qwen/qwen15-65c0a2f577b1ecb76d786524

Motivation

No response

Other

No response

yufeng1684 commented 3 months ago

You can refer to configuration_qwen.py and write a configuration file for Qwen1.5; I have verified that this approach works in my own testing. You also need to modify configuration_auto.py so the new config is registered.
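For anyone looking for the general shape of such a change, below is a minimal sketch of the two edits. The class and field names (`Qwen15Config`, `__config__`, `GenerationConfig`, the mapping entry) are approximations modeled on the existing per-model config files in openllm-core at the time, not a verified diff; check configuration_qwen.py and configuration_auto.py in your installed version for the exact attribute set.

```python
# configuration_qwen15.py -- rough sketch only, NOT the exact OpenLLM API.
# Names approximate the existing configuration_*.py files in openllm-core.
import openllm_core


class Qwen15Config(openllm_core.LLMConfig):
    """Config for Qwen1.5, modeled on the existing configuration_qwen.py."""

    __config__ = {
        # HF architecture string used by Qwen1.5 checkpoints
        "architecture": "Qwen2ForCausalLM",
        # Default checkpoint to load when none is specified (illustrative choice)
        "default_id": "Qwen/Qwen1.5-7B-Chat",
        "model_ids": [
            "Qwen/Qwen1.5-0.5B-Chat",
            "Qwen/Qwen1.5-7B-Chat",
            "Qwen/Qwen1.5-14B-Chat",
            "Qwen/Qwen1.5-72B-Chat",
        ],
    }

    class GenerationConfig:
        max_new_tokens: int = 512
        temperature: float = 0.7
        top_p: float = 0.8


# In configuration_auto.py, the new class would also need to be added to the
# model-name -> config-class mapping kept there, e.g. an entry along the lines
# of ("qwen15", "Qwen15Config"), so the model can be resolved by name.
```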

bojiang commented 5 days ago

Qwen2 is supported in OpenLLM 0.6. Please try it!
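Once a Qwen2 model is being served with OpenLLM 0.6 (for example something like `openllm serve qwen2:7b`; `openllm model list` shows the tags available in your install), it exposes an OpenAI-compatible endpoint, so a standard client call can be used against it. The base URL, port, and model name below are assumptions to adapt to your deployment:

```python
# Query an OpenLLM 0.6 server through its OpenAI-compatible API.
# Assumes the server is running locally on port 3000 (OpenLLM's usual default)
# and that the served model is addressable as "qwen2:7b" -- adjust as needed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")

resp = client.chat.completions.create(
    model="qwen2:7b",
    messages=[{"role": "user", "content": "Hello, who are you?"}],
)
print(resp.choices[0].message.content)
```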