Closed weygoldt closed 7 months ago
Hey @weygoldt , just curious: how did you test this? Based on the docs you linked, it looks like the `instruct` variation is the default when you don't explicitly pass one. I tested this on my local install by running `ollama run mistral:instruct` and `ollama run mistral`, and they produce similar results (any other made-up variation I try to pass besides `instruct` or `text` produces an error, so the tag seems valid).
Hi @weygoldt ,
`mistral:instruct` works for me, but I think it's better to use `mistral` anyway.
Thank you for this PR and best regards, David
@Mawdac I am running Ollama v0.1.10. On Ollama, the instructions for Mistral are to simply run `ollama run mistral`, which pulls the model. But if you then try to use gen.nvim, nothing will be returned, because the model is installed as `mistral` only. Only if you explicitly run `ollama run mistral:instruct` does it pull another manifest and a few megabytes of data; after that, `mistral:instruct` works, and I assume the package without my pull request will work as well. But I think it should work out of the box for anybody following the Ollama docs, without having to debug it this way.
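To reproduce this, the installed tags can be inspected with `ollama list` (a sketch assuming a local Ollama install; the comments describe the behavior reported above, not guaranteed output):

```shell
# Pull the model the way the Ollama docs suggest;
# this installs only the "mistral" tag
ollama run mistral

# List installed models -- after the command above, only "mistral"
# (i.e. mistral:latest) is present, so a client that requests
# "mistral:instruct" gets nothing back
ollama list

# Explicitly pulling the instruct tag fetches an extra manifest
# and a few megabytes of data, after which "mistral:instruct" resolves
ollama pull mistral:instruct
```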
Makes perfect sense! I didn't understand why I didn't have the same issue, but now I assume the output from grabbing the new manifest was obscured to me - thanks for explaining, @weygoldt
I changed the default model in `init.lua` and `README.md` from `mistral:instruct` to `mistral`, as `mistral:instruct` did not work and is also not in the docs (and fixed a missing comma in a code block in README.md).
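For anyone who wants the old behavior back, or who has pulled a different tag, a minimal sketch of the user-side override (assuming gen.nvim's `setup` accepts a `model` option, as its README describes):

```lua
-- Point gen.nvim at the tag that `ollama run mistral` actually installs.
-- Change the value if you explicitly pulled another tag,
-- e.g. 'mistral:instruct'.
require('gen').setup({
  model = 'mistral',
})
```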