Madoshakalaka opened this issue 2 months ago

If the model decides to call multiple functions, ollama-rs will only expose the first function call in the response. The relevant code is here:

https://github.com/pepperoni21/ollama-rs/blob/d1a60302b33133e63bf11e99177929264bd411a6/src/generation/functions/pipelines/meta_llama/request.rs#L67-L84

I have been using mixtral-large, and I can see the model return multiple function calls in its message, while on the user side I only receive the first function call as a response.
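To illustrate the shape of the problem, here is a minimal sketch of "first match only" parsing. This function is illustrative, not the actual ollama-rs implementation (see the linked lines for the real code):

```rust
use regex::Regex;

// Hypothetical sketch: `captures` returns only the first
// <function=...> tag, so any later calls in the same response are
// silently dropped. `LlamaFunctionCallSignature` is the crate's type.
fn parse_first_call_only(response: &str) -> Option<LlamaFunctionCallSignature> {
    let function_regex = Regex::new(r"<function=(\w+)>(.*?)</function>").unwrap();
    let caps = function_regex.captures(response)?; // only the first match
    let function = caps.get(1)?.as_str().to_string();
    let arguments = serde_json::from_str(caps.get(2)?.as_str()).ok()?;
    Some(LlamaFunctionCallSignature { function, arguments })
}
```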
Good catch, this should have been an issue from day one. Can you work on this? Otherwise I can fix it sometime.

Please go ahead with the fix.
I forked and made a quick fix, so maybe you can use this as a reference:
```rust
use regex::Regex;

/// Parse every `<function=name>{...}</function>` tag in the model's
/// response instead of stopping at the first one.
fn parse_llama_tool_response(response: &str) -> Option<Vec<LlamaFunctionCallSignature>> {
    let function_regex = Regex::new(r"<function=(\w+)>(.*?)</function>").unwrap();
    println!("Response: {}", response); // debug output

    let mut signatures = Vec::new();
    for caps in function_regex.captures_iter(response) {
        let function_name = caps.get(1).unwrap().as_str().to_string();
        let args_string = caps.get(2).unwrap().as_str();
        match serde_json::from_str(args_string) {
            Ok(arguments) => {
                signatures.push(LlamaFunctionCallSignature {
                    function: function_name,
                    arguments,
                });
            }
            Err(error) => {
                // Skip calls whose arguments are not valid JSON; a proper
                // fix should probably surface this error to the caller.
                println!("Error parsing function arguments: {}", error);
            }
        }
    }

    if signatures.is_empty() {
        None
    } else {
        Some(signatures)
    }
}
```
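A quick sanity check for the multi-call path (the tool names and JSON arguments below are made up for the test):

```rust
#[test]
fn parses_multiple_function_calls() {
    // Two back-to-back tool calls, as a model might emit them.
    let response = concat!(
        r#"<function=get_weather>{"city": "Paris"}</function>"#,
        r#"<function=get_time>{"timezone": "CET"}</function>"#,
    );
    let signatures = parse_llama_tool_response(response).unwrap();
    assert_eq!(signatures.len(), 2);
    assert_eq!(signatures[0].function, "get_weather");
    assert_eq!(signatures[1].function, "get_time");
}
```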
@Madoshakalaka added a PR