Open vkameswaran opened 6 months ago
To implement function-calling support for providers such as Cohere, Anthropic, and Fireworks (e.g. routed through LiteLLM), you'll need to adapt the existing infrastructure for LLM integration and custom tool creation. Here's a high-level approach based on the current codebase:
Extend LLM Provider Support: In `libs/superagent/app/agents/langchain.py`, add new classes for each of the new LLM providers that support function calling. Use `ChatOpenAI` and `AzureChatOpenAI` as references for how to structure these classes.
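To make the shape of that step concrete, here is a minimal sketch of what such provider wrapper classes could look like. The class names, attributes, and stubbed `invoke` bodies are all illustrative assumptions, not the actual langchain or Superagent interfaces:

```python
# Hypothetical sketch: provider wrapper classes mirroring the rough shape of
# ChatOpenAI / AzureChatOpenAI. Names and fields are illustrative only.
from dataclasses import dataclass


@dataclass
class ChatProviderBase:
    """Minimal stand-in for a chat-model wrapper class."""
    model: str
    supports_function_calling: bool = False

    def invoke(self, messages: list) -> dict:
        raise NotImplementedError


@dataclass
class ChatAnthropicStub(ChatProviderBase):
    supports_function_calling: bool = True

    def invoke(self, messages: list) -> dict:
        # A real implementation would call the Anthropic API here.
        return {"role": "assistant", "content": "(stubbed response)"}


@dataclass
class ChatCohereStub(ChatProviderBase):
    supports_function_calling: bool = True

    def invoke(self, messages: list) -> dict:
        # A real implementation would call the Cohere API here.
        return {"role": "assistant", "content": "(stubbed response)"}
```

The `supports_function_calling` flag is one possible way to let the initialization logic later pick a function-calling code path per provider.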
Adapt LLM Initialization: Modify the `_get_llm` method to initialize the correct LLM provider class based on the `provider` attribute of the `LLM` model. This might involve checking for function-calling support and choosing the appropriate class.
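The dispatch inside `_get_llm` could follow a mapping pattern like the sketch below. The `PROVIDER_CLASSES` table, provider keys, and returned dicts are assumptions standing in for real configured chat-model instances:

```python
# Hypothetical sketch of provider dispatch for a _get_llm-style method.
# Factories return plain dicts here; the real code would return configured
# langchain chat-model objects.
class UnsupportedProviderError(ValueError):
    pass


PROVIDER_CLASSES = {
    "OPENAI": lambda model: {"cls": "ChatOpenAI", "model": model},
    "AZURE_OPENAI": lambda model: {"cls": "AzureChatOpenAI", "model": model},
    "ANTHROPIC": lambda model: {"cls": "ChatAnthropic", "model": model},
    "COHERE": lambda model: {"cls": "ChatCohere", "model": model},
}


def get_llm(provider: str, model: str) -> dict:
    # Look up the wrapper class by the LLM model's `provider` attribute.
    try:
        factory = PROVIDER_CLASSES[provider]
    except KeyError:
        raise UnsupportedProviderError(f"No LLM class for provider {provider!r}")
    return factory(model)
```

Keeping the mapping in one table means adding a provider is a one-line change rather than another `if/elif` branch.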
Custom Tool Integration: Follow the guide in `/fern/mdx/sdk/local_tools.mdx` for creating custom tools. For models that support function calling, ensure each tool's metadata includes the information needed for function invocation (typically a name, a description, and a JSON-schema definition of its parameters).
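As an illustration of what that tool metadata might carry, here is a sketch that builds an OpenAI-style function spec. The `get_weather` tool and its parameters are invented for the example; the JSON-schema `parameters` shape is the part function-calling models generally expect:

```python
# Sketch of tool metadata carrying an OpenAI-style function spec.
# The tool name and fields below are purely illustrative.
def build_tool_spec(name: str, description: str, parameters: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,  # JSON schema for the arguments
        },
    }


weather_tool = build_tool_spec(
    name="get_weather",
    description="Look up the current weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
)
```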
Invoke Function Calls: In the agent's invocation logic (specifically, the method where the agent processes input and decides on actions), add logic to detect when the LLM response contains a function call and execute the matching tool accordingly.
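That detect-and-execute step could be sketched as below. The response shape mimics OpenAI-style `tool_calls`, and the `add` tool is a made-up example, not an actual Superagent tool:

```python
import json

# Sketch of the agent-side invocation step: if the model response carries a
# tool call, look up and execute the matching local tool; otherwise return
# the plain text content. Response shape and tool names are illustrative.
def run_step(response: dict, tools: dict) -> str:
    tool_calls = response.get("tool_calls") or []
    if not tool_calls:
        return response.get("content", "")
    call = tool_calls[0]
    fn = tools[call["function"]["name"]]
    args = json.loads(call["function"]["arguments"])
    return str(fn(**args))


tools = {"add": lambda a, b: a + b}
response = {
    "tool_calls": [
        {"function": {"name": "add", "arguments": '{"a": 2, "b": 3}'}}
    ]
}
# run_step(response, tools) -> "5"
```

A real implementation would also feed the tool result back to the model as a follow-up message rather than returning it directly.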
This approach leverages the existing infrastructure for LLM integration and custom tool functionality, extending it to accommodate the specific requirements of function calling with the mentioned models.
⚠️ Please check that this feature request hasn't been suggested before.
🔖 Feature description
I want us to use the LiteLLM function-calling spec instead of our hybrid GPT approach for models that support function calling: Cohere, Anthropic, Fireworks, etc.
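For reference, the LiteLLM spec follows the OpenAI-style request shape, so one `tools` payload can be reused across providers. The sketch below only builds such a payload; the model name is illustrative, and an actual call would pass it to `litellm.completion(**payload)` (litellm is deliberately not imported here):

```python
# Sketch of an OpenAI-style function-calling payload as LiteLLM accepts it.
# The model string and tool definition are illustrative assumptions.
payload = {
    "model": "anthropic/claude-3-haiku-20240307",
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}
```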
Acknowledgements