vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Doc]: Does vllm support function call #10066

Closed leoneyar closed 5 hours ago

leoneyar commented 5 hours ago

📚 The doc issue

Does vLLM support function calling (tool calling) when serving a custom LLM?

Suggest a potential alternative/fix

No response


DarkLight1337 commented 5 hours ago

Yes, please read the docs.
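For reference, vLLM's OpenAI-compatible server accepts tool definitions on `/v1/chat/completions` when the server is launched with tool calling enabled (e.g. `--enable-auto-tool-choice` and a `--tool-call-parser` matching the model). The sketch below builds such a request with the standard library; the model name, the `get_weather` tool, and the localhost URL are illustrative assumptions, not part of this issue.

```python
import json
import urllib.request

def build_payload(user_message):
    """Build an OpenAI-style chat-completions body with one tool definition.

    The `get_weather` tool and the model name are hypothetical examples.
    """
    return {
        "model": "NousResearch/Hermes-2-Pro-Llama-3-8B",
        "messages": [{"role": "user", "content": user_message}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

def chat(user_message, base_url="http://localhost:8000/v1"):
    """POST the request to a running vLLM server and return the parsed JSON.

    Requires a vLLM server started with tool calling enabled, e.g.:
      vllm serve NousResearch/Hermes-2-Pro-Llama-3-8B \
          --enable-auto-tool-choice --tool-call-parser hermes
    """
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(build_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If the model decides to invoke the tool, the response's `choices[0].message.tool_calls` carries the function name and JSON arguments, which the client executes and feeds back as a `tool`-role message.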