eli6 opened this issue 3 weeks ago
Hey there, I ran into the same problem. It doesn't work with Functionary v2.5, presumably because there is a change in the formatting. Try Functionary v2 instead.
Yes, the format has changed; Functionary v2.5 support is being added here: https://github.com/abetlen/llama-cpp-python/pull/1509
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
I am following the functions guide and running the first example, the "basic demo", from this notebook: https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb
I expect to get the reply and result of the function call like in the notebook.
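For reference, here is a condensed sketch of what that basic demo does, assuming the OpenAI v1 Python client pointed at the local llama-cpp-python server (the base_url, api_key, and tool schema are paraphrased from the notebook, not copied verbatim):

```python
# Sketch of the notebook's "basic demo": OpenAI v1 client against the local server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "The city, e.g. San Francisco"},
            },
            "required": ["location"],
        },
    },
}]

response = client.chat.completions.create(
    model="local-model",  # with a single loaded model the server serves it regardless of name
    messages=[{"role": "user",
               "content": "What's the weather like in San Francisco, Tokyo and Paris?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
```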
Current Behavior
Instead I get an error when running it, because the "name" of the function is not in the expected format and contains much more than just the name (the part starting with "functions.get_current..."). I expect the function name to be only
get_current_weather
but when running it, the name is: functions.get_current_weather\n{"location": "San Francisco"}<|reserved_special_token_249|>get_current_weather\n{"location": "Tokyo"}<|reserved_special_token_249|>get_current_weather\n{"location": "Paris"}'), type='function').
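To make the expectation concrete, this is the shape a correct tool call should have (a hedged check, reusing the ChatCompletionMessageToolCall type that appears in the log below):

```python
import json
from openai.types.chat import ChatCompletionMessageToolCall

def check_tool_call(tool_call: ChatCompletionMessageToolCall) -> None:
    # With a working chat format, `name` should be the bare function name and
    # `arguments` a small JSON object such as {"location": "San Francisco"}.
    assert tool_call.function.name == "get_current_weather", tool_call.function.name
    print(json.loads(tool_call.function.arguments))
```

With the response I actually get, check_tool_call(response.choices[0].message.tool_calls[0]) fails, because name carries the whole raw completion and arguments is empty.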
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
Physical (or virtual) hardware you are using, e.g. for Linux:
A Mac with an Apple M1 chip
Operating System, e.g. for Linux:
macOS 14.13.1 (Sonoma)
Python 3.12.4
Failure Information (for bugs)
The tool_call response from the server looks like this; the name is in a strange format and there are no arguments:
ChatCompletionMessageToolCall(id='call_3VnQzKr1UZsaG2IchpR3Y7ai', function=Function(arguments='{}', name='functions.get_current_weather\n{"location": "San Francisco"}<|reserved_special_token_249|>get_current_weather\n{"location": "Tokyo"}<|reserved_special_token_249|>get_current_weather\n{"location": "Paris"}'), type='function')
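A hedged workaround sketch, not a fix: the malformed name appears to pack the model's raw Functionary v2.5 output, with individual calls separated by the <|reserved_special_token_249|> marker. The snippet below splits it back into (name, arguments) pairs; the format is inferred purely from the log above.

```python
import json

# Raw `name` string copied from the tool_call log above.
RAW = ('functions.get_current_weather\n{"location": "San Francisco"}'
       '<|reserved_special_token_249|>get_current_weather\n{"location": "Tokyo"}'
       '<|reserved_special_token_249|>get_current_weather\n{"location": "Paris"}')

def split_raw_tool_calls(raw: str) -> list[tuple[str, dict]]:
    calls = []
    for chunk in raw.split("<|reserved_special_token_249|>"):
        name, _, args = chunk.partition("\n")
        name = name.removeprefix("functions.")  # drop the "functions." namespace
        calls.append((name, json.loads(args or "{}")))
    return calls

print(split_raw_tool_calls(RAW))
# [('get_current_weather', {'location': 'San Francisco'}),
#  ('get_current_weather', {'location': 'Tokyo'}),
#  ('get_current_weather', {'location': 'Paris'})]
```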
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
Step 1: start the server like this:
python -m llama_cpp.server --model /some-path/functionary-small-v2.5.Q4_0.gguf --chat_format functionary-v2 --hf_pretrained_model_name_or_path meetkai/functionary-small-v2.5-GGUF
Step 2: run the first example from the functions notebook against that server: https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb
The output is the malformed tool_call response shown above under Failure Information (a rough in-process equivalent of this setup is sketched below).
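For what it's worth, the same setup in-process instead of via llama_cpp.server would look roughly like this, assuming llama-cpp-python's documented LlamaHFTokenizer helper (the model path is the same placeholder used in the command above):

```python
# Hedged sketch mirroring the --model / --chat_format /
# --hf_pretrained_model_name_or_path flags of the server command above.
from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer

llm = Llama(
    model_path="/some-path/functionary-small-v2.5.Q4_0.gguf",  # placeholder path
    chat_format="functionary-v2",
    # functionary chat formats rely on the HF tokenizer for prompt templating
    tokenizer=LlamaHFTokenizer.from_pretrained("meetkai/functionary-small-v2.5-GGUF"),
    n_ctx=4096,
)
```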
I am not sure if the following about llama.cpp applies to this case. Please tell me if it does.
Note: Many issues seem to be regarding functional or performance issues / differences with llama.cpp. In these cases we need to confirm that you're comparing against the version of llama.cpp that was built with your python package, and which parameters you're passing to the context. Try the following:
git clone https://github.com/abetlen/llama-cpp-python
cd llama-cpp-python
rm -rf _skbuild/ # delete any old builds
python -m pip install .
cd ./vendor/llama.cpp
Follow llama.cpp's instructions to cmake llama.cpp
Run llama.cpp's ./main with the same arguments you previously passed to llama-cpp-python and see if you can reproduce the issue. If you can, log an issue with llama.cpp.
Failure Logs
Please include any relevant log snippets or files. If it works under one configuration but not under another, please provide logs for both configurations and their corresponding outputs so it is easy to see where behavior changes.
Also, please try to avoid using screenshots if at all possible. Instead, copy/paste the console output and use GitHub's markdown to cleanly format your logs for easy readability.
Example environment info: