Hi,
I started ollama serve without issue.
Then I tried ollama.list(), which returned the 3 models I have pulled, with a 200 code on /api/tags. One of these models is 'mistral:latest'.
Then I tried ollama.show('mistral'), and it returned an object with a license, a modelfile, ... and a 200 code on /api/show.
Up to now, everything fine...
Then I tried the chat example code:
import ollama

response = ollama.chat(model='mistral', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
and here I get a 404 code on /api/chat:
$ python ollama-test.py
Traceback (most recent call last):
File "/home/olivi/devt/ollama/ollama-test.py", line 9, in <module>
response = ollama.chat(model='mistral', messages=[
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/olivi/miniconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 158, in chat
return self._request_stream(
^^^^^^^^^^^^^^^^^^^^^
File "/home/olivi/miniconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 81, in _request_stream
return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/olivi/miniconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 57, in _request
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: 404 page not found
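One thing I want to rule out (my own guess, not something the traceback shows) is a version mismatch between the Python lib and the server binary: if I remember correctly, /api/chat was added to the ollama server later than /api/generate, so an older ollama serve build would 404 on the chat route while generate still works. A minimal sketch to print the installed client version (assuming the pip package is named ollama):

```python
# Print the version of the installed ollama Python package, or None
# if it is not installed. This only inspects local package metadata;
# it does not contact the server.
import importlib.metadata

def pkg_version(name):
    """Return the installed version string of `name`, or None if absent."""
    try:
        return importlib.metadata.version(name)
    except importlib.metadata.PackageNotFoundError:
        return None

print("ollama client:", pkg_version("ollama"))
```

If the client is recent but the server binary is old, that mismatch alone would explain a 404 on an endpoint the client knows about but the server does not.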
In the ollama server terminal:
Then I tried:
which worked fine (code 200 on /api/generate).
So why a 404 on /api/chat? Is it because of an error in the lib, or because mistral doesn't provide a chat API?
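In case it helps anyone reproduce this, here is a minimal probe (a sketch, assuming the default server address http://localhost:11434) that bypasses the Python lib entirely and compares the raw status codes of the two routes:

```python
# POST a minimal body to each endpoint and report the HTTP status.
# We only care whether the route exists (404 vs anything else), so the
# exact body shape does not matter much for this check.
import json
import urllib.request
import urllib.error

def probe(path, host="http://localhost:11434"):
    """POST a minimal JSON body to host+path; return the HTTP status
    code, or None if the server is unreachable."""
    body = json.dumps({"model": "mistral", "prompt": "hi",
                       "stream": False}).encode()
    req = urllib.request.Request(host + path, data=body,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # e.g. 404 if the server does not know the route
    except OSError:
        return None     # server not running / unreachable

for path in ("/api/generate", "/api/chat"):
    print(path, "->", probe(path))
```

If /api/generate answers but /api/chat comes back 404 even with a raw request, the problem is on the server side rather than in the Python lib.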