pepperoni21 / ollama-rs

A Rust library for interacting with the Ollama API.
MIT License

Multiple function calls aren't supported #75

Open Madoshakalaka opened 2 months ago

Madoshakalaka commented 2 months ago

If the model decides to call multiple functions, ollama-rs only exposes the first function call in its response.

The relevant code is here:

https://github.com/pepperoni21/ollama-rs/blob/d1a60302b33133e63bf11e99177929264bd411a6/src/generation/functions/pipelines/meta_llama/request.rs#L67-L84

I have been using mixtral-large, and I can see a message like:

Response: <function....></function><function...></function>

while on the user side, I receive only the first function call in the response.

andthattoo commented 1 month ago

Good catch, this should have been an issue from day one. Can you work on this? Otherwise I can fix it sometime.

Madoshakalaka commented 1 month ago

Please go ahead with the fix.

I forked and made a quick fix, so maybe you can use it as a reference:

fn parse_llama_tool_response(response: &str) -> Option<Vec<LlamaFunctionCallSignature>> {
    // Match every <function=name>args</function> tag in the response.
    let function_regex = Regex::new(r"<function=(\w+)>(.*?)</function>").unwrap();

    let mut signatures = Vec::new();

    // captures_iter yields all matches, so every tool call is collected
    // instead of only the first one.
    for caps in function_regex.captures_iter(response) {
        let function_name = caps.get(1).unwrap().as_str().to_string();
        let args_string = caps.get(2).unwrap().as_str();

        match serde_json::from_str(args_string) {
            Ok(arguments) => {
                signatures.push(LlamaFunctionCallSignature {
                    function: function_name,
                    arguments,
                });
            }
            Err(error) => {
                // Skip calls whose arguments are not valid JSON.
                eprintln!("Error parsing function arguments: {}", error);
            }
        }
    }

    if signatures.is_empty() {
        None
    } else {
        Some(signatures)
    }
}
andthattoo commented 1 month ago

@Madoshakalaka added a PR