tituslhy opened 1 month ago
@tituslhy it would just have to extend the FunctionCallingLLM
base class and implement a few extra functions -- I welcome a PR to add this :)
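For reference, here is a rough sketch of what that subclass could look like. The two hook methods below mirror what other function-calling integrations in llama-index implement; the class name is hypothetical and the exact signatures may differ by version, so treat this as an assumption rather than a spec:

```python
from typing import Any, Dict, List, Optional, Sequence

from llama_index.core.base.llms.types import ChatMessage, ChatResponse
from llama_index.core.llms.function_calling import FunctionCallingLLM
from llama_index.core.tools.types import BaseTool


class LiteLLMFunctionCalling(FunctionCallingLLM):  # hypothetical class name
    """Sketch only: the usual LLM methods (chat, complete, metadata, ...)
    would come from the existing LiteLLM implementation."""

    def _prepare_chat_with_tools(
        self,
        tools: Sequence[BaseTool],
        user_msg: Optional[ChatMessage] = None,
        chat_history: Optional[List[ChatMessage]] = None,
        verbose: bool = False,
        allow_parallel_tool_calls: bool = False,
        **kwargs: Any,
    ) -> Dict[str, Any]:
        # Translate llama-index tools into the OpenAI-style tool dicts that
        # litellm.completion() accepts, and assemble the message list.
        ...

    def get_tool_calls_from_response(
        self,
        response: ChatResponse,
        error_on_no_tool_call: bool = True,
        **kwargs: Any,
    ) -> List[Any]:  # ToolSelection objects in the real API
        # Parse the tool_calls field out of the OpenAI-format response.
        ...
```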
Oops, I cheated. I just used LlamaIndex's ReAct workflow cookbook recipe (https://docs.llamaindex.ai/en/stable/examples/workflow/react_agent/) with LiteLLM as my LLM of interest, HAHA! I also just instantiated a ReAct agent from tools and used LiteLLM. Heh.
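Roughly this, for the second workaround (a minimal sketch; model name and tool are assumed):

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.litellm import LiteLLM


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


# ReActAgent drives tool use through prompting rather than the function
# calling API, so it works with any LLM -- including LiteLLM.
llm = LiteLLM(model="gpt-4o")
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)], llm=llm, verbose=True
)
print(agent.chat("What is 3 times 4?"))
```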
I did experiment a little with inheriting from FunctionCallingLLM, but I found this way harder - when I called Gemini through LiteLLM it also had tool-calling JSON errors even though it fit LiteLLM's schema perfectly. I think it's really difficult to cater to every single LLM service provider.
Feature Description
LiteLLM is a wrapper over LLMs from non-OpenAI providers that harmonizes their APIs to OpenAI's API, making it easy to switch between LLMs.
Most LLM providers (like Bedrock) offer function-calling capabilities, and LiteLLM already handles this on their end. Would it be too difficult to add function-calling capabilities to LlamaIndex's LiteLLM class?
Here's an example of a function-calling failure once we wrap any LLM (including OpenAI's) in LlamaIndex's LiteLLM class.
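Something like the following (a minimal sketch, not the original snippet - the model name, tool, and agent entry point are assumed):

```python
from llama_index.core.agent import AgentRunner, FunctionCallingAgentWorker
from llama_index.core.tools import FunctionTool
from llama_index.llms.litellm import LiteLLM


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


llm = LiteLLM(model="gpt-4o")  # fails even for OpenAI models via LiteLLM
worker = FunctionCallingAgentWorker.from_tools(
    [FunctionTool.from_defaults(fn=multiply)], llm=llm
)
agent = AgentRunner(worker)
# Fails at the tool-calling step: the worker expects FunctionCallingLLM
# methods (e.g. chat_with_tools) that LiteLLM, a plain LLM subclass,
# does not provide.
print(agent.chat("What is 3 times 4?"))
```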
This throws an error at the tool-calling step.
Funnily enough, LlamaIndex already has code that identifies the LLM wrapped by LiteLLM as a function-calling LLM, in lines 160-169 of llama_index/llama
Printing `llm.metadata` returns the model's metadata and, as we can see, `is_function_calling_model = True`. But the reason LiteLLM function calling still returns an error is that the LiteLLM class inherits from `LLM` instead of `FunctionCallingLLM`. I sense that this is not a trivial fix.
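A quick way to see the mismatch (model name assumed):

```python
from llama_index.core.llms.function_calling import FunctionCallingLLM
from llama_index.llms.litellm import LiteLLM

llm = LiteLLM(model="gpt-4o")
print(llm.metadata.is_function_calling_model)  # True: metadata looks right
print(isinstance(llm, FunctionCallingLLM))     # False: wrong base class, so
# every FunctionCallingLLM-only code path is unavailable
```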
Reason
I could use LiteLLM's library directly, but it would be great to use LlamaIndex's capabilities!
Value of Feature
LiteLLM is one of the most commonly used libraries for harmonizing LLM provider APIs to the OpenAI API, and it remains widely used even as more providers ship OpenAI-compatible APIs of their own. LiteLLM also continues to update its offerings to stay current across LLM providers.