Closed: andthattoo closed this 5 months ago
@pepperoni21 Implemented this already; it works with chat mode. Added a `Tool` trait and two tool implementations: DuckDuckGo search and web scraping. It has nous-hermes2 and LangChain-compatible function calls.
```rust
let mut ollama = Ollama::new_default_with_history(30);
let scraper_tool = Arc::new(Scraper {});
let ddg_search_tool = Arc::new(DDGSearcher::new());
```
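Roughly, the trait both tools implement looks something like this (sketch only; the exact names and signatures may differ in the PR):

```rust
use std::error::Error;

use async_trait::async_trait;
use serde_json::Value;

// Sketch of a Tool trait: something every tool (Scraper, DDGSearcher, ...)
// implements so it can be advertised to the model and invoked with the
// arguments the model produces.
#[async_trait]
pub trait Tool: Send + Sync {
    /// Name the model uses to refer to the tool.
    fn name(&self) -> String;
    /// Description injected into the system prompt so the model knows when to call it.
    fn description(&self) -> String;
    /// Run the tool with the model-provided arguments and return its output as text.
    async fn run(&self, input: Value) -> Result<String, Box<dyn Error + Send + Sync>>;
}
```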
After that, you can create a `FunctionCallRequest`:
```rust
let parser = Arc::new(NousFunctionCall {});
let result = ollama
    .send_function_call(
        FunctionCallRequest::new(
            "adrienbrault/nous-hermes2pro:Q8_0".to_string(),
            vec![scraper_tool.clone(), ddg_search_tool.clone()],
            vec![user_message.clone()],
        ),
        parser.clone(),
    )
    .await?;
```
You can chat or send a single chat request; it manages the system prompts based on the pipeline (currently OpenAI and Nous are available). It's all behind the "function-calling" feature. Will create a PR after some formatting and testing.
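A pipeline here is essentially a parser that knows how to build the system prompt for a given format and how to interpret the model's reply. A rough sketch (illustrative names only, reusing the `Tool` trait sketched above):

```rust
use std::error::Error;
use std::sync::Arc;

use async_trait::async_trait;

// Sketch of a pipeline abstraction: one implementation per prompt format
// (e.g. Nous, OpenAI), each building its own system prompt and parsing the
// model's tool-call output.
#[async_trait]
pub trait FunctionCallParser: Send + Sync {
    /// Build the system prompt that advertises the available tools.
    fn system_prompt(&self, tools: &[Arc<dyn Tool>]) -> String;
    /// Parse the model's reply, run the requested tool, and return its output.
    async fn parse(
        &self,
        model_reply: &str,
        tools: &[Arc<dyn Tool>],
    ) -> Result<String, Box<dyn Error + Send + Sync>>;
}
```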
It would be great for ollama-rs to have function calling abilities compatible with OpenAI, given that Rust provides an excellent environment for agentic frameworks with its lightweight overhead. Something like this would provide great flexibility:
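For illustration, a rough sketch of how a tool could be described OpenAI-style (placeholder types; nothing below exists in ollama-rs yet):

```rust
use serde_json::{json, Value};

// Placeholder type: describes a callable function the way OpenAI's tools API
// does, with a JSON Schema for its arguments.
struct ToolDefinition {
    name: String,
    description: String,
    parameters: Value,
}

fn get_current_weather_tool() -> ToolDefinition {
    ToolDefinition {
        name: "get_current_weather".to_string(),
        description: "Get the current weather in a given location".to_string(),
        parameters: json!({
            "type": "object",
            "properties": {
                "location": { "type": "string", "description": "City name, e.g. San Francisco" },
                "unit": { "type": "string", "enum": ["celsius", "fahrenheit"] }
            },
            "required": ["location"]
        }),
    }
}
```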
And get back something like what OpenAI would return:
```
ChatCompletionMessageToolCall(id='call_zA2XyVu7JHRSmUk5Q995AKFP', function=Function(arguments='{"location": "San Francisco", "unit": "celsius"}', name='get_current_weather'), type='function')
```
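On the Rust side, a compatible return type could be something like this (again only a sketch with placeholder names):

```rust
// Sketch of a response type mirroring OpenAI's ChatCompletionMessageToolCall.
#[derive(Debug, serde::Deserialize)]
struct ToolCall {
    id: String,
    // "function" for function tools in OpenAI's current API
    r#type: String,
    function: FunctionCall,
}

#[derive(Debug, serde::Deserialize)]
struct FunctionCall {
    name: String,
    /// JSON-encoded arguments, e.g. {"location": "San Francisco", "unit": "celsius"}
    arguments: String,
}
```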
We can provide a trait-based tool implementation, and the whole thing could be made generic for further compatibility.
Let me know if you are interested; I can implement this.