pepperoni21 / ollama-rs

A Rust library for interacting with the Ollama API.

OpenAI compatible function calling #50

Closed: andthattoo closed this issue 5 months ago

andthattoo commented 6 months ago

It would be great for ollama-rs to have function-calling abilities compatible with OpenAI, given that Rust's lightweight overhead makes it an excellent environment for agentic frameworks.

Something like this would provide great flexibility:

let model = "llama3:latest".to_string();
let prompt = "What's the weather like in San Francisco, Tokyo, and Paris?".to_string();
let weather_tool = WeatherTool::default();

// Proposed API: the library forwards the prompt and tool definitions to the
// model and resolves any tool calls it emits.
let res = ollama.tool_call(ToolCallRequest::new(model, prompt, &[Arc::new(weather_tool)])).await;

if let Ok(res) = res {
    println!("{}", res.response);
}

And get back something like what OpenAI would return:

ChatCompletionMessageToolCall(id='call_zA2XyVu7JHRSmUk5Q995AKFP', function=Function(arguments='{"location": "San Francisco", "unit": "celsius"}', name='get_current_weather'), type='function')
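Purely as an illustration of the shape being targeted (these types are assumptions for discussion, not an existing ollama-rs API), the payload above could be mirrored in Rust roughly like this:

use serde::Deserialize;

// Illustrative mirror of the OpenAI tool-call payload above; an assumption for
// discussion, not part of ollama-rs.
#[derive(Debug, Deserialize)]
struct ToolCall {
    id: String,
    #[serde(rename = "type")]
    call_type: String, // always "function" for function calls
    function: FunctionCall,
}

#[derive(Debug, Deserialize)]
struct FunctionCall {
    name: String,      // e.g. "get_current_weather"
    arguments: String, // JSON-encoded arguments, e.g. {"location": "San Francisco", "unit": "celsius"}
}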

We could provide a Tool trait for users to implement:

#[async_trait]
impl Tool for WeatherTool {}
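As a rough sketch of what that trait could look like (the method names and signatures here are assumptions, not a finished design):

use async_trait::async_trait;
use serde_json::Value;
use std::error::Error;

// Hypothetical shape of the proposed Tool trait; names and signatures are
// assumptions for discussion.
#[async_trait]
pub trait Tool: Send + Sync {
    /// Name the model uses to call the tool, e.g. "get_current_weather".
    fn name(&self) -> String;
    /// Natural-language description handed to the model.
    fn description(&self) -> String;
    /// JSON schema describing the tool's parameters.
    fn parameters(&self) -> Value;
    /// Execute the tool with the arguments produced by the model.
    async fn run(&self, input: Value) -> Result<String, Box<dyn Error>>;
}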

The whole thing could be made generic for broader compatibility.

Let me know if you are interested; I can implement this.

andthattoo commented 6 months ago

@pepperoni21 I have implemented this already; it works with chat mode. I added a Tool trait and two tool implementations: DuckDuckGo search and web scraping. It supports Nous-Hermes2 and LangChain-compatible function calls.

// Ollama client that keeps a chat history of up to 30 messages
let mut ollama = Ollama::new_default_with_history(30);
// Built-in tools: web scraping and DuckDuckGo search
let scraper_tool = Arc::new(Scraper {});
let ddg_search_tool = Arc::new(DDGSearcher::new());
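Beyond the built-in Scraper and DDGSearcher, a custom tool can be plugged in by implementing the same Tool trait. A minimal sketch, assuming a trait shape like the one sketched in the first comment (the tool itself is purely hypothetical):

use async_trait::async_trait;
use serde_json::{json, Value};
use std::error::Error;

// Hypothetical example tool, not one of the built-in implementations.
struct EchoTool;

#[async_trait]
impl Tool for EchoTool {
    fn name(&self) -> String {
        "echo".to_string()
    }
    fn description(&self) -> String {
        "Returns its input text unchanged; useful for testing the pipeline.".to_string()
    }
    fn parameters(&self) -> Value {
        json!({
            "type": "object",
            "properties": {
                "text": { "type": "string", "description": "Text to echo back" }
            },
            "required": ["text"]
        })
    }
    async fn run(&self, input: Value) -> Result<String, Box<dyn Error>> {
        Ok(input["text"].as_str().unwrap_or_default().to_string())
    }
}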

Afterwards, you can create a FunctionCallRequest:

// Parser for the Nous-Hermes2 Pro function-calling format
let parser = Arc::new(NousFunctionCall {});
// `user_message` is a ChatMessage created earlier, e.g. with ChatMessage::user(...)
let result = ollama.send_function_call(
    FunctionCallRequest::new(
        "adrienbrault/nous-hermes2pro:Q8_0".to_string(),
        vec![scraper_tool.clone(), ddg_search_tool.clone()],
        vec![user_message.clone()],
    ),
    parser.clone(),
).await?;
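The parser argument is what keeps the pipeline pluggable: each format supplies its own system prompt and its own way of extracting tool calls from the raw model output. One way such an abstraction could be shaped (names and signatures are assumptions, not the actual code):

use serde_json::Value;
use std::sync::Arc;

// Illustrative sketch of a pluggable function-call pipeline; not the actual
// ollama-rs API. NousFunctionCall and an OpenAI-style parser would each
// implement this differently.
pub trait FunctionCallPipeline: Send + Sync {
    /// System prompt that teaches the model the expected tool-call syntax
    /// for the given set of tools.
    fn system_prompt(&self, tools: &[Arc<dyn Tool>]) -> String;
    /// Extract (tool name, JSON arguments) pairs from the raw model output.
    fn parse_tool_calls(&self, model_output: &str) -> Vec<(String, Value)>;
}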

You can chat or send a single chat request; it manages the system prompts based on the pipeline (currently OpenAI and Nous are available). Everything lives behind the "function-calling" feature. I will create a PR after some formatting and testing.