bentoml / OpenLLM

Run any open-source LLMs, such as Llama and Mistral, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0

feat: Include starcoder2 #1025

Open kobiche opened 5 months ago

kobiche commented 5 months ago

Feature request

The new version of StarCoder, StarCoder2, was released about four months ago. Could you add it to the list of supported models? I get the following error when I try to start the server with this model:

RuntimeError: Failed to determine config class for bigcode/starcoder2-15b. Got ['Starcoder2ForCausalLM'], which is not yet supported (Supported: ['BaichuanForCausalLM', 'ChatGLMModel', 'CohereForCausalLM', 'DbrxForCausalLM', 'GPTNeoXForCausalLM', 'FalconForCausalLM', 'GemmaForCausalLM', 'LlamaForCausalLM', 'MistralForCausalLM', 'MixtralForCausalLM', 'MPTForCausalLM', 'OPTForCausalLM', 'Phi3ForCausalLM', 'QWenLMHeadModel', 'GPTBigCodeForCausalLM', 'YiForCausalLM'])

This is surprising, since vLLM itself already supports the new model: https://docs.vllm.ai/en/latest/models/supported_models.html
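From the error message, the failure seems to come from matching the model's declared architecture name against a hard-coded allow-list. A minimal sketch of that kind of lookup (hypothetical names, not OpenLLM's actual implementation) shows why `Starcoder2ForCausalLM` is rejected even though the original StarCoder's `GPTBigCodeForCausalLM` passes:

```python
# Hypothetical sketch of an architecture allow-list check, illustrating the
# RuntimeError above. Not OpenLLM's actual code.
SUPPORTED_ARCHITECTURES = {
    "LlamaForCausalLM",
    "MistralForCausalLM",
    "MixtralForCausalLM",
    "GPTBigCodeForCausalLM",  # original StarCoder, but not StarCoder2
    # ... remaining entries from the error message ...
}


def resolve_config_class(model_id: str, architectures: list[str]) -> str:
    """Return the first supported architecture, or raise like the error above."""
    for arch in architectures:
        if arch in SUPPORTED_ARCHITECTURES:
            return arch
    raise RuntimeError(
        f"Failed to determine config class for {model_id}. "
        f"Got {architectures}, which is not yet supported"
    )


# StarCoder2 declares a new architecture name, so the lookup fails:
try:
    resolve_config_class("bigcode/starcoder2-15b", ["Starcoder2ForCausalLM"])
except RuntimeError as e:
    print(e)
```

Supporting StarCoder2 would presumably just mean adding its architecture name to this mapping, since the vLLM backend can already load it.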

Motivation

No response

Other

No response