huggingface / llm.nvim

LLM powered development for Neovim

README suggests Ollama should work but it does not #73

Closed Amzd closed 7 months ago

Amzd commented 7 months ago

I feel like I have tried every combination of setup configuration that's in the README, but I did not manage to get it working with 0.5.0+. I get many different "missing field *" errors. Is there a full example config somewhere that works with Ollama?

I have it working with a fork from before Ollama support was supposedly added to this repo, but I would like to switch back to this plugin if possible. https://github.com/Amzd/nvim.config/blob/main/lua/plugins/llm.lua

McPatate commented 7 months ago
            query_params = {
                maxNewTokens = 60,
                temperature = 0.2,
                doSample = true,
                topP = 0.95,
            },

`query_params` does not exist anymore; you can remove those values or pass them in the `request_body` section instead. Check the README for your backend again, and if there is no info there, consult your backend API's documentation for the request body format.
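For the `ollama` backend, those values would roughly translate to the sketch below. The option names (`num_predict`, `temperature`, `top_p`) come from Ollama's API documentation rather than from this plugin, so double-check them there:

```lua
-- Rough request_body equivalent of the old query_params for the
-- ollama backend; option names follow Ollama's /api/generate docs.
request_body = {
    options = {
        num_predict = 60,  -- roughly maxNewTokens
        temperature = 0.2,
        top_p = 0.95,
        -- doSample has no direct equivalent; a temperature > 0
        -- already enables sampling
    },
},
```

There is also one rename to make in your config: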

-            adaptor = "ollama",
+            backend = "ollama",

The rest of your config looks valid to me.
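For reference, here is a minimal end-to-end sketch with both changes applied. The model name and URL are assumptions for a default local Ollama install, so adjust them to your setup:

```lua
local llm = require("llm")

llm.setup({
    backend = "ollama",
    model = "codellama:7b",          -- assumption: any model you have pulled locally
    url = "http://localhost:11434",  -- default Ollama address; some versions of
                                     -- llm-ls expect the full "/api/generate" path
    request_body = {
        options = {                  -- Ollama Modelfile/API options
            num_predict = 60,
            temperature = 0.2,
            top_p = 0.95,
        },
    },
})
```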