ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

`ollama.create` fails with `ollama._types.ResponseError: name and path are required` #192

Closed. DavidSouther closed this issue 6 days ago.

DavidSouther commented 1 week ago

I'm lazy-initializing a bunch of models (mostly mistral, just varying system prompts).

from dataclasses import dataclass
from typing import Optional

import ollama

# CONFIG is a pathlib.Path pointing at a local config directory, and logger is
# a standard logging.Logger; both are defined elsewhere in my code.

@dataclass
class Model:
    name: str
    base: str
    system: Optional[str] = None
    temperature: Optional[float] = None
    num_predict: Optional[int] = None
    stop: Optional[list[str]] = None

def load_model(model: Model):
    # Build the Modelfile text from the base model plus any optional settings.
    modelfile = [f"FROM {model.base}"]
    if model.num_predict is not None:
        modelfile.append(f"PARAMETER num_predict {model.num_predict}")
    if model.temperature is not None:
        modelfile.append(f"PARAMETER temperature {model.temperature}")
    if model.stop is not None:
        for stop in model.stop:
            modelfile.append(f"PARAMETER stop {stop}")
    if model.system is not None:
        modelfile.append(f'SYSTEM """{model.system}"""')

    modelfile = "\n".join(modelfile)

    logger.info(f"Creating model {model.name}\n{modelfile}")

    # Write the Modelfile to disk, create the model from that path, then clean up.
    path = CONFIG / model.name
    with open(path, "w") as file:
        file.write(modelfile)
    ollama.create(model.name, path=path)
    path.unlink()

    # The alternative form, passing the Modelfile contents directly:
    # ollama.create(model.name, modelfile=modelfile)
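
For context, a call site looks roughly like this (the values here are illustrative, not my real config):

# Hypothetical example of how load_model is invoked; the Model fields are made up.
summarizer = Model(
    name="summarizer",
    base="mistral",
    system="Summarize the user's text in three bullet points.",
    temperature=0.2,
)
load_model(summarizer)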

Running either form of `ollama.create` causes the same error:

  File ".../llm.py", line 36, in load_model
    ollama.create(model.name, path=path)
  File ".../.venv/lib/python3.12/site-packages/ollama/_client.py", line 276, in create
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../.venv/lib/python3.12/site-packages/ollama/_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dpsouth/devel/wayward_mime/kurtstock/.venv/lib/python3.12/site-packages/ollama/_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: name and path are required
python --version: Python 3.12.3
pip --version: pip 24.0 from .../.venv/lib/python3.12/site-packages/pip (python 3.12)
ollama==0.2.1

Am I doing anything obviously wrong?

DavidSouther commented 6 days ago

Looks like this is an error in the client (or, at least, a mismatch between the client and the server): the server requires a path field in the body of the create request. See #195
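
One way to see what the server itself expects is to POST to /api/create directly and skip the Python client. This is only a sketch, assuming the default localhost:11434 endpoint and the `requests` package; the `modelfile` field name comes from the newer REST API, and an older server that only understands `path` rejects it, which matches the error above:

# Sketch: probe the server's /api/create endpoint directly. On success the
# endpoint streams newline-delimited JSON status objects.
import json
import requests

body = {
    "name": "probe-model",  # hypothetical model name
    "modelfile": 'FROM mistral\nSYSTEM """You are terse."""',
}
resp = requests.post("http://localhost:11434/api/create", json=body, stream=True)
if resp.status_code != 200:
    # An older server that only accepts "path" answers here with an error
    # such as "name and path are required".
    print(resp.status_code, resp.text)
else:
    for line in resp.iter_lines():
        if line:
            print(json.loads(line))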

DavidSouther commented 6 days ago

Never mind; this was an issue with a mismatch between client and server versions. `brew upgrade ollama` seems to have resolved it.
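
For anyone else hitting this, a quick way to confirm whether the client and server are out of sync is to compare the installed package version with what the running server reports. A minimal sketch, assuming the server exposes GET /api/version (recent builds do) and listens on the default port:

# Sketch: compare the installed ollama-python version with the version the
# local Ollama server reports.
import json
import urllib.request
from importlib.metadata import version

client_version = version("ollama")  # version of the installed ollama-python package
with urllib.request.urlopen("http://localhost:11434/api/version") as resp:
    server_version = json.load(resp)["version"]

print(f"client: {client_version}, server: {server_version}")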