ollama / ollama-python

tool calls integration broken when empty #223

Closed: aksep closed this issue 1 month ago

aksep commented 1 month ago

With ollama-python 0.3.0 and the latest ollama server, I consistently get an exception, even with the basic chat example provided in the repo (e.g. examples/chat/main.py).

Code to reproduce

from ollama import Client
client = Client(host='http://localhost:11434')
response = client.chat(model='llama3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])

Error message

Traceback (most recent call last):
  File "/home/aksep/src/experimental/ex.py", line 5, in <module>
    response = client.chat(model='llama3', messages=[
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/aksep/src/pyenv/ai/lib/python3.12/site-packages/ollama/_client.py", line 235, in chat
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/aks/src/pyenv/ai/lib/python3.12/site-packages/ollama/_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/aksep/src/pyenv/ai/lib/python3.12/site-packages/ollama/_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: llama3 does not support tools

Possible fix

Changing ollama/_client.py (around line 241) to remove the `or []` fixes it in my tests. I haven't checked the other call sites that pass tools, but I'd guess they would fail too.

      json={
        'model': model,
        'messages': messages,
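        # the fix: pass tools through unchanged (previously: 'tools': tools or [])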
        'tools': tools,
        'stream': stream,
        'format': format,
        'options': options or {},
        'keep_alive': keep_alive,
      },
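
As a stopgap on the affected version, you can also bypass the client entirely and POST to the REST endpoint yourself, omitting the tools field. A minimal sketch, assuming a default local server (httpx is already a dependency of ollama-python):

import httpx

# Stopgap sketch: call /api/chat directly and leave the 'tools' key out of
# the request body entirely, so the server never hits its tools-support check.
# Assumes a default local Ollama server at http://localhost:11434.
resp = httpx.post(
    'http://localhost:11434/api/chat',
    json={
        'model': 'llama3',
        'messages': [{'role': 'user', 'content': 'Why is the sky blue?'}],
        'stream': False,  # ask for a single JSON object instead of a stream
    },
    timeout=60.0,
)
resp.raise_for_status()
print(resp.json()['message']['content'])
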
darth-kcaj commented 1 month ago

+1 I am observing this bug as well

dancininmyheart commented 1 month ago

I hit the same problem. Then I downgraded ollama to 0.2.1 and it works.

anthonywu commented 1 month ago

I think this is due to using an older model without tool-calling support. Updating to llama3.1 should work. #237 fixes the docs and should be the resolution.
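
For reference, a sketch of what a tool call looks like against a tools-capable model. The get_current_weather schema here is a made-up placeholder in the OpenAI-style function format that the 0.3.x examples use:

from ollama import Client

client = Client(host='http://localhost:11434')

# Sketch of a tool-call request against a tools-capable model (llama3.1).
# The tool definition is a hypothetical placeholder, not a real API.
response = client.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'What is the weather in Toronto?'}],
    tools=[
        {
            'type': 'function',
            'function': {
                'name': 'get_current_weather',
                'description': 'Get the current weather for a city',
                'parameters': {
                    'type': 'object',
                    'properties': {
                        'city': {'type': 'string', 'description': 'The city name'},
                    },
                    'required': ['city'],
                },
            },
        }
    ],
)

# A tools-capable model replies with tool_calls instead of plain content.
print(response['message'].get('tool_calls'))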

aksep commented 1 month ago

@anthonywu: I don't think it should fail with this specific error if no tools have been passed as an argument, regardless of whether the model supports them. There may also be good reasons to use models that don't support tools.

It's quite possible that the actual bug is in the upstream server, which should treat an empty list as no tools. I would guess replacing `[]` with `None` works, provided the JSON field is then omitted altogether from the request to the server.
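
A sketch of that idea, reusing the same fields as the payload above (illustrative only, not the actual _client.py code):

# Illustrative sketch of the proposed behaviour (not the actual _client.py
# code): drop any None-valued fields so 'tools' is omitted from the JSON
# body when the caller passes no tools.
def build_chat_payload(model, messages, tools=None, stream=False,
                       format='', options=None, keep_alive=None):
    payload = {
        'model': model,
        'messages': messages,
        'tools': tools,  # None when the caller passes no tools
        'stream': stream,
        'format': format,
        'options': options or {},
        'keep_alive': keep_alive,
    }
    return {k: v for k, v in payload.items() if v is not None}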

anthonywu commented 1 month ago

Unable to reproduce your exception. I think this problem might go away if you `ollama pull llama3:latest`, `git pull origin main` on this repo, and maybe re-run `pip install .` in your venv (if it wasn't an `install -e` install).

$ date && ollama --version && ollama list | grep llama3
Wed Jul 31 09:26:21 PDT 2024
ollama version is 0.3.0
llama3:latest               365c0bd3c000    4.7 GB  40 seconds ago

Then running your snippet as-is, with no modifications, returns a valid JSON response.

FWIW I ran this on macOS 14.5 (M1), but IMO your problem is unlikely to be a machine- or OS-specific issue.

aksep commented 1 month ago

Yup, it doesn't happen anymore, even though I haven't really updated anything since I opened this issue. I'm still on ollama-python 0.3.0, ollama 0.2.7 (Docker), and llama3 365c0bd3c000. The ollama host has been rebooted a couple of times since, but it's running the same image.

Funny thing is, if I pass a tools argument, it (correctly) fails with the same exception, as I would expect with this version of llama3.

Weird. Thanks for checking though!