ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

ResponseError (without error information) when running with Python #83

Open Kaleemullahqasim opened 3 months ago

Kaleemullahqasim commented 3 months ago

Simple code like the following

ollama.chat(model='mistral:instruct', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])

OR

import ollama
response = ollama.chat(model='mistral:instruct', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])

An error is being thrown, but no specific message is printed. It works perfectly fine in the command-line interface (CLI) when I run ollama run mistral:instruct

Here is the error

ResponseError                             Traceback (most recent call last)
Cell In[2], line 1
----> 1 ollama.chat(model='mistral:instruct', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])

File ~/Documents/GitHub/upwork_Sentiment_job/.venv/lib/python3.12/site-packages/ollama/_client.py:177, in Client.chat(self, model, messages, stream, format, options, keep_alive)
    174   if images := message.get('images'):
    175     message['images'] = [_encode_image(image) for image in images]
--> 177 return self._request_stream(
    178   'POST',
    179   '/api/chat',
    180   json={
    181     'model': model,
    182     'messages': messages,
    183     'stream': stream,
    184     'format': format,
    185     'options': options or {},
    186     'keep_alive': keep_alive,
    187   },
    188   stream=stream,
    189 )

File ~/Documents/GitHub/upwork_Sentiment_job/.venv/lib/python3.12/site-packages/ollama/_client.py:97, in Client._request_stream(self, stream, *args, **kwargs)
     91 def _request_stream(
     92   self,
     93   *args,
     94   stream: bool = False,
     95   **kwargs,
     96 ) -> Union[Mapping[str, Any], Iterator[Mapping[str, Any]]]:
---> 97   return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()

File ~/Documents/GitHub/upwork_Sentiment_job/.venv/lib/python3.12/site-packages/ollama/_client.py:73, in Client._request(self, method, url, **kwargs)
     71   response.raise_for_status()
     72 except httpx.HTTPStatusError as e:
---> 73   raise ResponseError(e.response.text, e.response.status_code) from None
     75 return response

ResponseError:
AboidoLiven commented 1 month ago

I hit the same issue. I debugged the code and found that the error code is 502. I tried to find the place where the response is generated, but the code is wrapped too deep to locate it precisely.

So I tried using the Python requests package directly and it succeeded. Here is the code:

import requests
import json

url = "http://localhost:11434/api/chat"
data = {
    "model": "phi3",
    "messages": [
        {
            'role': 'user',
            'content': 'Why is the sky blue?',
        }
    ],
    "stream": False
}

response = requests.post(url, json=data)

if response.status_code == 200:
    response_data = response.json()
    print(json.dumps(response_data, indent=4))
else:
    print(f"status_code: {response.status_code}")
    print(response.text)

Here is the result:

{
    "model": "phi3",
    "created_at": "2024-05-25T14:55:39.889174Z",
    "message": {
        "role": "assistant",
        "content": " The sky appears blue to the human eye because of a phenomenon called Rayleigh scattering. As sunlight travels through Earth's atmosphere, it encounters molecules and small particles which scatter the light in all directions. Blue light has a shorter wavelength and is scattered more than other colors since it travels as shorter, smaller waves. This selective scattering causes the sky to look blue in the daytime. However, at sunrise and sunset, the light passes through more of Earth's atmosphere, which scatters away most of the blue light, allowing reds, oranges, and yellows to dominate our view."
    },
    "done_reason": "stop",
    "done": true,
    "total_duration": 5341346800,
    "load_duration": 2678940100,
    "prompt_eval_count": 13,
    "prompt_eval_duration": 61936000,
    "eval_count": 139,
    "eval_duration": 2598396000
}
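A 502 from a server on localhost often means the request was routed through an HTTP proxy rather than reaching Ollama directly; httpx (which ollama-python uses under the hood) honors proxy environment variables by default, while a direct requests call may behave differently depending on local settings. A stdlib-only sketch to check for that situation:

```python
import os
import urllib.request

# A 502 for a localhost URL is often an HTTP proxy intercepting the request.
# Print any proxy settings an HTTP client would pick up from the environment:
for var in ('HTTP_PROXY', 'HTTPS_PROXY', 'ALL_PROXY', 'NO_PROXY'):
    for name in (var, var.lower()):
        if name in os.environ:
            print(f'{name}={os.environ[name]}')

print('detected proxies:', urllib.request.getproxies())

# Possible workaround: exempt localhost from proxying before creating the client
os.environ.setdefault('NO_PROXY', 'localhost,127.0.0.1')
```

If a proxy variable is set, unsetting it (or adding localhost to NO_PROXY as above) before importing ollama may resolve the 502 without bypassing the library.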