nanvenomous opened this issue 2 weeks ago
I could use a little help with the setup.

### The Issue

`:OllamaModel` only shows my `llama3` model as an option.

### My config

neovim version: v0.10.0

```lua
return require('packer').startup(function(use)
  use 'nomnivore/ollama.nvim'
  use 'nvim-lua/plenary.nvim'
end)
```

```lua
require('ollama').setup({
  cmd = { "Ollama", "OllamaModel", "OllamaServe", "OllamaServeStop" },
  opts = {
    model = "codestral",
    url = "http://127.0.0.1:4005",
  },
})
```
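(For comparison, not part of my setup: the `cmd`/`opts` wrapper above resembles a lazy.nvim plugin spec rather than an options table, so if `require('ollama').setup()` expects its options directly, `model` and `url` may never reach the plugin. A minimal flat form, assuming that signature, would be:)

```lua
-- Hedged sketch: assumes setup() takes the options table directly,
-- with model/url as top-level keys rather than nested under `opts`.
require('ollama').setup({
  model = "codestral",
  url = "http://127.0.0.1:4005",
})
```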
I have run:

```sh
export OLLAMA_HOST=127.0.0.1:4005
ollama serve
```
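(To rule out a connectivity problem, the server can also be probed directly on the custom port; `/api/tags` is Ollama's REST endpoint for listing local models:)

```shell
# Hedged check: if the server is listening on 127.0.0.1:4005,
# this should print JSON listing codestral and llama3.
curl http://127.0.0.1:4005/api/tags
```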
The output of `ollama list`:

```
NAME              ID            SIZE    MODIFIED
codestral:latest  fcc0019dcee9  12 GB   4 minutes ago
llama3:latest     365c0bd3c000  4.7 GB  3 weeks ago
```
I have confirmed that `ollama run codestral` works great.