Open vincent-pli opened 1 month ago
While trying to make the vLLM example work with the latest vLLM version (v0.4.3), I followed the current tutorial at https://docs.ray.io/en/master/serve/tutorials/vllm-example.html and got this exception:

`AttributeError: 'list' object has no attribute 'max_model_len'`
It is caused by a missing parameter here: https://github.com/ray-project/ray/blob/c4a87ee474041ab7286a41378f3f6db904e0e3c5/doc/source/serve/doc_code/vllm_openai_example.py#L53

`OpenAIServingChat` now requires a `ModelConfig` as its second parameter. I will make a PR to fix it later.
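A minimal sketch of the corrected call pattern described above. The `Stub*` classes below are hypothetical stand-ins so the pattern is visible without a GPU or a vLLM install; in real code they would be `vllm.entrypoints.openai.serving_chat.OpenAIServingChat` and the `ModelConfig` returned by the engine, and the exact constructor signature in v0.4.3 is an assumption here.

```python
# Sketch of the fix: pass a ModelConfig (fetched from the engine) as the
# SECOND positional argument, instead of a list. StubEngine /
# StubModelConfig / StubOpenAIServingChat are hypothetical stand-ins for
# the vLLM classes, used only to illustrate the argument order.
import asyncio
from dataclasses import dataclass


@dataclass
class StubModelConfig:
    # the attribute the reported AttributeError complains about
    max_model_len: int = 4096


class StubEngine:
    async def get_model_config(self) -> StubModelConfig:
        return StubModelConfig()


class StubOpenAIServingChat:
    # mirrors the assumed v0.4.3 shape: (engine, model_config, served_model_names, response_role)
    def __init__(self, engine, model_config, served_model_names, response_role):
        # If a list were passed here instead of a ModelConfig, this line
        # would raise: AttributeError: 'list' object has no attribute 'max_model_len'
        self.max_model_len = model_config.max_model_len
        self.served_model_names = served_model_names
        self.response_role = response_role


async def build_serving_chat():
    engine = StubEngine()
    model_config = await engine.get_model_config()  # fetch config from the engine
    return StubOpenAIServingChat(engine, model_config, ["my-model"], "assistant")


chat = asyncio.run(build_serving_chat())
print(chat.max_model_len)  # 4096
```

The key change is fetching the model config from the engine (an async call) and threading it through as the second constructor argument, rather than letting the served-model-names list land in that position.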
Any updates on the fix?