sigoden / llm-functions

Easily create LLM tools and agents using Bash/JavaScript/Python, also a library of commonly used LLM tools and agents.

Unable to use functions #92

Closed: grizzlycode closed this issue 2 months ago

grizzlycode commented 2 months ago

Describe the bug

I should probably break this into three issues, but they may be related.

1. The first issue was the script not finding Python. Python was on my PATH, but the script wouldn't find it. I changed python to python3 in the script and it worked instantly. Not sure if I messed something up in my WSL setup for it to not see the PATH, but my fix worked and the script ran, with some errors as described in issue 2 next. (A sketch of the workaround follows this list.)
2. The second issue: after following your instructions I got the error below. I noticed web-search.sh doesn't exist in the tools folder, nor did I have it in my tools.txt file, so I'm not sure why this error happened. You'll also see an error for coder; it exists in the file structure and has files, but I still get the invalid error. (See the Issue 2 error below.)
3. I proceeded to try other functions, since I didn't get errors on those. None of the functions work, no matter which model I use. I tried Ollama with llama3.1 and Gemini, and neither worked (see the Issue 3 errors below). Both models work if I don't use functions in the prompt.
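A portable sketch of that issue-1 workaround (illustrative only, not the project's actual script):

# Prefer python3, fall back to python, so the script works either way
python_bin="$(command -v python3 || command -v python)" || { echo "python not found" >&2; exit 1; }
"$python_bin" tool.py "$@"   # tool.py is a placeholder for the tool's Python entry point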

Issue 2 error:

WARNING: no found web_search tool, please run argc link-web-search to set one.
error: not found tools: web_search.sh
Build agents/todo/functions.json
error: invalid agents: coder

Issue 3 error w/Ollama:llama3.1:

Note: I still get an LLM response just w/o the function after the warning

WARNING: This LLM or client does not support function calling, despite the context requiring it.

Issue 3 error w/Gemini API

Note: I don't get any response just this error

Failed to call chat-completions api

Caused by:

  • GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[array].items: missing field.
  • GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[array_optional].items: missing field.
  • GenerateContentRequest.tools[0].function_declarations[5].parameters.properties[array].items: missing field.
  • GenerateContentRequest.tools[0].function_declarations[5].parameters.properties[array_optional].items: missing field.
  • GenerateContentRequest.tools[0].function_declarations[10].parameters.properties[array_optional].items: missing field.
  • GenerateContentRequest.tools[0].function_declarations[10].parameters.properties[array].items: missing field. (status: INVALID_ARGUMENT)
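For context, Gemini's function-declaration schema requires every array-typed parameter to carry an items field. A declaration it accepts looks roughly like this (property name taken from the errors above; the string item type is an assumption):

"array": {
  "type": "array",
  "items": { "type": "string" }
}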

To Reproduce

For issue 2 I followed the install guide as stated (except for the script change I made in issue 1) and I still get the error.

For issue 3 I use the example prompts from GitHub on both models, with errors. Ollama llama3.1 is set as the default, and I change the model when using Gemini. If I omit %functions% I get a response from either model, just without the function, obviously.

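# Get current weather function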
aichat -r %functions% what is the weather in Paris and Berlin
aichat -m gemini -r %functions% what is the weather in Paris and Berlin

# Search the web function
aichat -r %functions% latest version of node.js
aichat -m gemini -r %functions% latest version of node.js

# Execute command function
aichat -r %functions% what is my cpu arch
aichat -m gemini -r %functions% what is my cpu arch

Expected behavior

Install without errors, and all functions work correctly when called.

Screenshots/Logs

Environment

Linux test 5.15.153.1-microsoft-standard-WSL2 #1 SMP Fri Mar 29 23:14:13 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
argc 1.20.0
jq-1.7
/usr/bin/bash: GNU bash, version 5.2.21(1)-release (x86_64-pc-linux-gnu)
/home/linuxbrew/.linuxbrew/bin/node: v22.5.1
/usr/bin/python3: Python 3.12.3

Additional context

The project doesn't state whether certain models are required to run functions. I understand that llama3.1 does support functions, and I assume you include Gemini and a few other providers because they do as well. But for some reason I can't get them to work, so I was wondering whether only certain providers/local LLMs support these functions. If so, I think they should be called out.

My config only has the two LLM providers in it, nothing else. Should I have other config options enabled?

sigoden commented 2 months ago

Let me answer them one by one.

  1. The 'not finding Python' problem should not be blamed on llm-functions.

  2. Run argc link-web-search to link to a web search tool. (screenshot)

  3. Ollama doesn't support streaming tool use (it does support non-streaming tool use). Gemma has no problem. (screenshot)

    Use aichat -S for the non-streaming API. (See the examples after this list.)
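For example (a sketch; the web-search tool name below is hypothetical, pick one of the scripts the link command lists):

argc link-web-search web_search_cohere.sh
aichat -S -m ollama:llama3.1 -r %functions% what is the weather in Paris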

sigoden commented 2 months ago

If you confirm that gemini has a problem, please submit an issue in aichat.

grizzlycode commented 2 months ago

> Let me answer them one by one.
>
> 1. The 'not finding Python' problem should not be blamed on llm-functions.
> 2. Run argc link-web-search to link to a web search tool.
> 3. Ollama doesn't support streaming tool use (it does support non-streaming tool use). Gemma has no problem.
>
> Use aichat -S for the non-streaming API.

I agree with issue 1; I'm sure it was a problem on my end.

For issue 2, thanks. I didn't interpret the link help output correctly, so I didn't realize I had to pick one of the options in the tools section.

For issue 3 I'm still having problems.

I switched to gemma2:9b since you said Gemma models work, and I still get the warning:

aichat -S -m ollama:gemma2:9b -r %functions% what is the weather in Paris?

For Gemini I still get errors. I'll open an issue in aichat as requested.

sigoden commented 2 months ago

@grizzlycode

WARNING: This LLM or client does not support function calling, despite the context requiring it.

If you received the above warning, you might be missing the model configuration supports_function_calling: true.

https://github.com/sigoden/aichat/blob/5c559f616a79b5bcc86b080a9cc914cace8cedef/config.example.yaml#L175
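A minimal sketch of the relevant client config (client name, URL, and model are illustrative; the key point is the supports_function_calling flag):

clients:
  - type: openai-compatible
    name: ollama
    api_base: http://localhost:11434/v1
    models:
      - name: llama3.1
        supports_function_calling: true   # without this, aichat emits the warning above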

grizzlycode commented 2 months ago

> WARNING: This LLM or client does not support function calling, despite the context requiring it.
>
> If you received the above warning, you might be missing the model configuration supports_function_calling: true.

Ok, I think that did it; llama3.1 works now. It seems to be a mixed bag, though.

If I just ask for Paris it works; however, if I ask for both Paris and Berlin it breaks:

aichat -S -m ollama:llama3.1 -r %functions% what is the weather in Paris and Berlin

Call get_current_weather {"query":"Paris"}
error: unexpected argument --query found
Tool call exit with 1

Also, it doesn't look like Gemma2 supports functions. Running aichat -S -m ollama:gemma2:9b -r %functions% what is the weather in Paris? gives:

Failed to call chat-completions api

Caused by: gemma2:9b does not support tools

sigoden commented 2 months ago

@grizzlycode

The correct parameter is location, not query. This is clearly a problem with your LLM, although I don't know why it used query.

(screenshot)
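For reference, a rough sketch of the weather tool's declaration (llm-functions generates the schema from the tool script's argc comments; the description text here is illustrative):

{
  "name": "get_current_weather",
  "parameters": {
    "type": "object",
    "properties": {
      "location": {
        "type": "string",
        "description": "The city, e.g. Paris"
      }
    },
    "required": ["location"]
  }
}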

Not all models support tools; Gemma2 does not (see https://ollama.com/library/gemma2).

grizzlycode commented 2 months ago

This is a fresh clone of the repo from today, but I rebuilt it as instructed. The local LLM issue seems to be resolved now. I'll continue troubleshooting Gemini tomorrow. Thanks for the help. By the way, this looks like a great project, and I look forward to using all the functions. Thanks for the speedy support and your time!