vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Doc]: Has the offline chat inference function been updated? #7623

Open waylonli opened 3 weeks ago

waylonli commented 3 weeks ago

📚 The doc issue

I see that the official documentation describes an offline chat inference function, but I still get an AttributeError saying the LLM object has no chat() attribute. Has this been added to the latest package?
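For reference, this is roughly the pattern from the docs that triggers the error for me (a minimal sketch; the model name is just an illustrative assumption):

```python
from vllm import LLM, SamplingParams

# Any chat-tuned model you have access to should work the same way;
# this particular name is only an example.
llm = LLM(model="meta-llama/Meta-Llama-3-8B-Instruct")

sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

# Offline chat inference: messages use the OpenAI-style role/content format.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# This is the call that raises AttributeError on my installed version.
outputs = llm.chat(messages, sampling_params)
print(outputs[0].outputs[0].text)
```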

Suggest a potential alternative/fix

No response

DarkLight1337 commented 3 weeks ago

The latest documentation refers to the current state of the main branch, not the most recent release (that is covered by the stable version of the docs). You have to build vLLM from source to access brand-new features that haven't been released on PyPI yet.
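A quick way to confirm whether the build you currently have installed already exposes the feature (a minimal sketch, not an official check):

```python
# Print the installed vLLM version and whether the LLM class has a chat()
# method. If it doesn't, the installed release predates the feature and you
# need to install from source (or wait for the next PyPI release).
import vllm
from vllm import LLM

print("vLLM version:", vllm.__version__)
print("LLM.chat available:", hasattr(LLM, "chat"))
```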