Closed: dormant-user closed this issue 3 weeks ago
https://github.com/ollama/ollama-python/blob/main/ollama/_client.py#L266
It appears that you either need to provide the Modelfile contents as a string, or provide a `path` argument if you want Ollama to read the file for you. Try:
```python
import os
import ollama

model_file = os.path.join(os.path.dirname(__file__), "Modelfile")
assert os.path.isfile(model_file)

response = ollama.create(
    model="mario",
    path=model_file,  # was: modelfile=model_file
    stream=False,
)
```
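The other option mentioned above is to pass the Modelfile *contents* (rather than a path) via the `modelfile` parameter. A minimal sketch, assuming a Modelfile along the lines of the Mario example from the ollama repo (the `FROM`/`SYSTEM` lines are illustrative, not taken from this issue):

```python
# Build the Modelfile contents as a plain string; this is what the
# `modelfile=` parameter of ollama.create() expects.
modelfile_contents = "\n".join([
    "FROM llama3",
    "SYSTEM You are Mario from Super Mario Bros.",
])

# With a running Ollama server, the call would then be (not executed here):
# import ollama
# response = ollama.create(model="mario", modelfile=modelfile_contents, stream=False)

print(modelfile_contents)
```

This avoids any filesystem path handling entirely, which sidesteps the parent-directory mistake described below.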
For some reason I kept trying `path` with just the parent directory, without including `Modelfile` in it. Guess it was just force of habit. Thanks for the help, this works!
Hello,
I'm trying to customize the prompt using a Modelfile, following the instructions in the ollama repo. The CLI commands work just as they should; however, when I use the Python method to do the same, I keep running into a `ResponseError`.
Code block
Traceback
Modelfile
I do have llama3 pulled already. Any help in this regard would be much appreciated.
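For comparison, the CLI flow that works here can be sketched as follows. This assumes the Mario example Modelfile from the ollama repo; the `ollama` commands themselves are shown commented out since they need a running server:

```shell
# Write an example Modelfile (contents assumed, mirroring the repo's Mario example).
cat > Modelfile <<'EOF'
FROM llama3
SYSTEM You are Mario from Super Mario Bros.
EOF

# With the Ollama server running, the CLI equivalent of ollama.create() is:
# ollama create mario -f ./Modelfile
# ollama run mario

cat Modelfile
```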