Closed kumar19a closed 2 days ago
I tested it with AzureOpenAI, and it works fine.
Some llama-index features are only supported with OpenAI models, so the code may rely on some of those features.
It should work with any LLM that extends the FunctionCallingLLM class, or whose llm.metadata.is_function_calling_model is True.
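The gating described above can be sketched as follows. This is a minimal illustrative stand-in, not the real llama-index code: the actual `LLMMetadata` class lives in `llama_index.core.llms` with more fields, and the class/function names below are simplified assumptions used only to show the check.

```python
from dataclasses import dataclass

# Illustrative stand-in for llama-index's LLMMetadata (assumption: the
# real class carries an is_function_calling_model flag like this one).
@dataclass
class LLMMetadata:
    model_name: str
    is_function_calling_model: bool = False  # HuggingFaceLLM defaults to False

@dataclass
class LLM:
    metadata: LLMMetadata

def assert_supports_function_calling(llm: LLM) -> None:
    # Mirrors the kind of guard that produces the
    # "model does not support function calling api" error.
    if not llm.metadata.is_function_calling_model:
        raise ValueError(
            f"{llm.metadata.model_name} does not support the function calling API"
        )

# An OpenAI/AzureOpenAI-backed LLM advertises function-calling support:
azure = LLM(LLMMetadata("azure-gpt-4o", is_function_calling_model=True))
assert_supports_function_calling(azure)  # passes silently

# A HuggingFaceLLM whose metadata leaves the flag at its default fails:
hf_llama = LLM(LLMMetadata("meta-llama/Llama-3.1-8B-Instruct"))
try:
    assert_supports_function_calling(hf_llama)
except ValueError as exc:
    print(exc)
```

Note that flipping the metadata flag to True only silences the check; the underlying model still has to emit tool calls in a format the agent worker can parse, which local HuggingFace-served models often do not do out of the box.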
Whenever I load Llama 3.1 via HuggingFaceLLM into the FunctionCallingAgentWorker, it reports that the model does not support the function calling API. Can the multi-agent concierge workload only be executed via OpenAI API calls?