ollama / ollama-python

Ollama Python library
https://ollama.com

`ollama.create` fails with `ollama._types.ResponseError: unexpected EOF` #171

Closed · dormant-user closed this 3 weeks ago

dormant-user commented 4 weeks ago

Hello,

I'm trying to customize the prompt using a Modelfile, following the instructions in the ollama repo.

The CLI commands work just as they should, but when I use the Python method to do the same thing I keep running into a ResponseError.
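For reference, the CLI route that works for me is along these lines (the create command with -f, as documented in the ollama repo, run from the directory containing the Modelfile below):

ollama create mario -f ./Modelfile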

Code block

import os

import ollama

model_file = os.path.join(os.path.dirname(__file__), "Modelfile")
assert os.path.isfile(model_file)
# pass the path to the Modelfile via the modelfile argument
response = ollama.create(
    model="mario",
    modelfile=model_file,
    stream=False
)

Traceback

Traceback (most recent call last):
  File "/Users/vicky/Desktop/git/private-ai/main.py", line 27, in <module>
    response = ollama.create(
               ^^^^^^^^^^^^^^
  File "/Users/vicky/Desktop/git/private-ai/venv/lib/python3.11/site-packages/ollama/_client.py", line 272, in create
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vicky/Desktop/git/private-ai/venv/lib/python3.11/site-packages/ollama/_client.py", line 97, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vicky/Desktop/git/private-ai/venv/lib/python3.11/site-packages/ollama/_client.py", line 73, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: unexpected EOF

Modelfile

FROM llama3

# set the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1

# set the system message
SYSTEM """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""

I do have llama3 pulled already:

ollama.list().get('models', [])
[{'name': 'llama3:latest', 'model': 'llama3:latest', 'modified_at': '2024-05-31T08:03:46.90811335-05:00', 'size': 4661224676, 'digest': '365c0bd3c000a25d28ddbf732fe1c6add414de7275464c4e4d1c3b5fcb5d8ad1', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'llama', 'families': ['llama'], 'parameter_size': '8.0B', 'quantization_level': 'Q4_0'}, 'expires_at': '0001-01-01T00:00:00Z'},
 {'name': 'llama2:latest', 'model': 'llama2:latest', 'modified_at': '2024-03-28T11:18:59.51974178-05:00', 'size': 3826793677, 'digest': '78e26419b4469263f75331927a00a0284ef6544c1975b826b15abdaef17bb962', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'llama', 'families': ['llama'], 'parameter_size': '7B', 'quantization_level': 'Q4_0'}, 'expires_at': '0001-01-01T00:00:00Z'}]

Any help in this regard would be much appreciated.

TheEpic-dev commented 3 weeks ago

https://github.com/ollama/ollama-python/blob/main/ollama/_client.py#L266

It appears that you either need to provide the Modelfile contents as a string, or provide a path argument if you want Ollama to read the file for you. Try:

model_file = os.path.join(os.path.dirname(__file__), "Modelfile")
assert os.path.isfile(model_file)
response = ollama.create(
    model="mario",
-   modelfile=model_file,
+   path=model_file,
    stream=False
)
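Alternatively, if you'd rather keep the modelfile argument, read the file yourself and pass its contents as a string (a minimal sketch, assuming modelfile= expects the Modelfile text rather than a path, as the linked _client.py code suggests):

import os

import ollama

model_file = os.path.join(os.path.dirname(__file__), "Modelfile")
# read the Modelfile text and hand it to the client as a string
with open(model_file) as f:
    response = ollama.create(
        model="mario",
        modelfile=f.read(),
        stream=False
    )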
dormant-user commented 3 weeks ago

For some reason I kept trying path with just the parent directory, without including Modelfile in it. Guess it was just force of habit. Thanks for the help, this works!
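In other words, the difference came down to this (using the same paths as in my snippet above):

# what I had been trying: only the parent directory, which did not work for me
response = ollama.create(model="mario", path=os.path.dirname(__file__), stream=False)

# what works: the full path including the Modelfile itself
response = ollama.create(model="mario", path=os.path.join(os.path.dirname(__file__), "Modelfile"), stream=False)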