David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
992 stars 64 forks

From mistral:instruct to mistral #45

Closed weygoldt closed 7 months ago

weygoldt commented 7 months ago

I changed the default model in init.lua and README.md from mistral:instruct to mistral as mistral:instruct did not work and is also not in the docs (and fixed a missing comma in a code block in README.md).
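For context, the user-facing effect of the change is just a different default model name in the plugin setup; a minimal sketch of the equivalent override (option name as documented in the gen.nvim README):

```lua
-- Minimal gen.nvim setup overriding the default model.
-- "mistral" is the tag Ollama installs when you follow its docs
-- (`ollama run mistral`); "mistral:instruct" requires a separate pull.
require('gen').setup({
  model = "mistral", -- previously "mistral:instruct"
})
```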

maddawik commented 7 months ago

Hey @weygoldt , just curious how did you test this? It looks like based on the docs you linked the instruct variation is the default when you don't explicitly pass it. I tested this on my local running ollama run mistral:instruct and ollama run mistral and they produce similar results (any other made-up variation I try to pass besides instruct or text produces an error, so it seems valid)

David-Kunz commented 7 months ago

Hi @weygoldt ,

mistral:instruct works for me, but I think it's better to use mistral anyway.

Thank you for this PR. Best regards, David

weygoldt commented 7 months ago

@Mawdac I am running Ollama v0.1.10. Ollama's instructions for Mistral are to simply run `ollama run mistral`, which pulls the model. But if you then try to use gen.nvim, nothing will be returned, because the model is installed under the tag `mistral` only. Only if you explicitly run `ollama run mistral:instruct` does it pull another manifest and a few megabytes of data; after that, `mistral:instruct` works, and I assume the plugin without my pull request will work as well. But I think it should work out of the box for anybody following the Ollama docs, without having to debug it this way.
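The sequence described above can be reproduced from the shell (a sketch of the steps from this thread; output omitted):

```shell
$ ollama run mistral           # pulls the model; installed as "mistral" only
$ ollama list                  # shows mistral:latest, no mistral:instruct tag
$ ollama run mistral:instruct  # pulls a separate manifest before this tag works
```

Until that last step, a plugin configured with `mistral:instruct` finds no matching local model and returns nothing.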

maddawik commented 7 months ago

Makes perfect sense! I didn't understand why I didn't have the same issue, but now I assume the output from grabbing the new manifest was obscured from me - thanks for explaining @weygoldt