Open waylonli opened 3 weeks ago
📚 The doc issue
I see that the official documentation includes the offline inference `chat` function, but I still get an `AttributeError` saying the `LLM` object has no `chat()` attribute. Has this been added to the latest released package?
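For context, a quick way to tell whether this is a docs/version mismatch is to check the installed vLLM release for the attribute directly, without loading any model. This is a hedged sketch, not from the original report; it only assumes that `vllm.LLM` and `vllm.__version__` exist in whatever build is installed:

```python
# Sketch: check whether the installed vLLM build exposes LLM.chat.
# chat() is documented on the main branch, so an AttributeError on an
# older PyPI wheel may just mean the release predates the feature.
import importlib.util

if importlib.util.find_spec("vllm") is None:
    print("vLLM is not installed in this environment")
else:
    import vllm

    # Inspect the class itself; no model weights are loaded.
    has_chat = hasattr(vllm.LLM, "chat")
    print(f"vLLM {vllm.__version__}: LLM.chat is "
          f"{'present' if has_chat else 'missing'}")
```

If `chat` is reported missing, upgrading the package (`pip install -U vllm`) or building from source should pick up the version the latest docs describe.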
Suggest a potential alternative/fix
No response