AryanSakhala opened this issue 1 day ago
Correct, vllm does not implement FunctionCallingLLM.
Although, if I'm reading this right, vllm does support function calling: https://github.com/vllm-project/vllm/issues/7912
But only for certain LLMs (like llama3.1).
I'd suggest using llama3.2 + the openai server + the OpenAILike class.
There's a small example here that you can follow for setup https://colab.research.google.com/drive/1h-eHhoKUnkJQ2TTsw4HxOd0f1zVd97Ol?usp=sharing
(note, I haven't actually tried this yet with vllm + openai like, but in theory, should work)
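A rough sketch of what that might look like, under the same caveat that I haven't run it against vllm myself. The model name, port, and server flags below are assumptions for a Llama 3.x deployment; adjust them to whatever your server is actually running:

```python
# 1) Launch vLLM's OpenAI-compatible server first, e.g.:
#    vllm serve meta-llama/Llama-3.2-3B-Instruct \
#        --enable-auto-tool-choice --tool-call-parser llama3_json
#    (the tool-call parser flag assumes a Llama 3.x model; check the vLLM docs for yours)

from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="meta-llama/Llama-3.2-3B-Instruct",  # must match the model the server loaded
    api_base="http://localhost:8000/v1",       # vLLM's OpenAI-compatible endpoint (assumed port)
    api_key="fake",                            # vLLM doesn't validate keys by default
    is_chat_model=True,
    is_function_calling_model=True,            # tells LlamaIndex it can emit tool calls
)
```

With `is_function_calling_model=True` set, the "LLM must be a function calling model!" check should pass, and the actual tool-call formatting is handled by the vLLM server.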
The other option is re-writing the workflow to use a ReAct loop instead of a function calling loop, but it will be less reliable imo: https://docs.llamaindex.ai/en/stable/examples/workflow/react_agent/
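For that route, here's a minimal sketch using the prebuilt ReActAgent rather than a hand-rolled workflow, assuming the classic `ReActAgent.from_tools` API (newer releases may expose this differently) and with the model name and tool as placeholders:

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.vllm import Vllm  # local vLLM wrapper; ReAct doesn't need function calling support

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Placeholder model; swap in whatever checkpoint you're serving.
llm = Vllm(model="meta-llama/Llama-3.1-8B-Instruct")

agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
)
print(agent.chat("What is 7 times 6?"))
```

Because ReAct drives tool use through prompting and output parsing rather than native tool-call APIs, it works with LLM classes that don't implement FunctionCallingLLM, at the cost of reliability.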
I'm having trouble implementing this with vLLM for all the LLM functions.

I'm getting this specific error:

    raise ValueError("LLM must be a function calling model!")
    ValueError: LLM must be a function calling model!

It happens for all of the LLM calling functions like