Open martinjaco opened 5 months ago
As far as I know, most of the open-source models supported by Ollama do not support tool usage. But if you find one that does, you can modify the OpenAI API settings as described in this doc.
The agent functionality leans heavily on LLM capabilities such as function calling. If you want to try Ollama, check the model's function-calling support first. Most of the models are not good enough at it.
There are a few models on Ollama that support tool usage now, such as Llama 3.1, Llama 3.2, Mixtral, and Command-R. The list can be found here.
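One way to check whether a given Ollama model actually emits tool calls is to send a request carrying a `tools` definition through Ollama's OpenAI-compatible endpoint and look for `tool_calls` in the reply. A minimal sketch, assuming Ollama is serving on its default port 11434; the `get_weather` tool and its schema are made-up probe values, not part of any real API:

```python
import json
import urllib.request


def build_tool_request(model: str) -> dict:
    """Build an OpenAI-style chat request that carries one tool definition."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
        "tools": [
            {
                "type": "function",
                "function": {
                    # Hypothetical tool used only to probe tool-call support.
                    "name": "get_weather",
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }


def probe_tool_support(model: str, base_url: str = "http://127.0.0.1:11434/v1") -> bool:
    """Return True if the model answered the probe with a tool call."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_tool_request(model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        message = json.load(resp)["choices"][0]["message"]
    # Models without tool support typically answer in plain text instead.
    return bool(message.get("tool_calls"))


# Example (requires a running Ollama server, so not executed here):
# probe_tool_support("llama3.1")
```

Note that `probe_tool_support` needs a live Ollama server; a single probe also only shows that the model can emit a tool call, not that it does so reliably.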
You can replace the OpenAI config snippet with the following to use Llama models:
```python
config_list = [
    {
        "model": "llama3.1",
        "base_url": "http://127.0.0.1:11434/v1",
        "api_key": "ollama",
    }
]
llm_config = {"config_list": config_list, "timeout": 120, "temperature": 0}
```
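A common failure mode with this setup is a config entry missing one of the keys the OpenAI-style client expects. A small sanity check, repeating the config entry so the sketch is self-contained; `validate_config` is a hypothetical helper for illustration, not an AutoGen or Ollama API:

```python
# Config entry as in the snippet above. Ollama ignores the API key,
# but OpenAI-compatible clients require one to be present.
config_list = [
    {
        "model": "llama3.1",
        "base_url": "http://127.0.0.1:11434/v1",
        "api_key": "ollama",
    }
]


def validate_config(entries: list) -> bool:
    """Check that each entry carries the keys an OpenAI-style client needs."""
    required = {"model", "base_url", "api_key"}
    for entry in entries:
        missing = required - entry.keys()
        if missing:
            raise ValueError(f"config entry missing keys: {sorted(missing)}")
    return True


validate_config(config_list)
```

This only validates the shape of the config; it does not confirm that the server at `base_url` is actually running or that the model has been pulled.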
I'll echo what @zhengr wrote, though. In my limited experience, the OpenAI models were much smoother. In the forecaster specifically, the llama and mixtral models did not run successfully on a consistent basis.
How can I make use of Ollama models instead of OpenAI models?