Closed · Amzd closed this 7 months ago
```lua
query_params = {
  maxNewTokens = 60,
  temperature = 0.2,
  doSample = true,
  topP = 0.95,
},
```
`query_params` does not exist anymore; you can remove it, or pass those values in the `request_body` section instead. Check the README for your backend again, and if there is no information there, consult the documentation of the backend's API for the request body format.
```diff
- adaptor = "ollama",
+ backend = "ollama",
```
The rest of your config looks valid to me.
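Putting those two fixes together, a minimal sketch of what a post-0.5.0 setup might look like, assuming the old `query_params` values map onto Ollama's `options` fields (`num_predict`, `temperature`, `top_p`) inside `request_body` — the `url` and `model` values below are placeholders, so double-check the field names against your backend's README and the Ollama API docs:

```lua
require("llm").setup({
  backend = "ollama",              -- was `adaptor` in older versions
  url = "http://localhost:11434",  -- assumed default local Ollama endpoint
  model = "codellama:7b",          -- hypothetical model name, replace with yours
  -- The old query_params values go through the request body now;
  -- Ollama expects generation settings under an `options` table:
  request_body = {
    options = {
      num_predict = 60,  -- was maxNewTokens
      temperature = 0.2,
      top_p = 0.95,      -- was topP
    },
  },
})
```

This is an unverified sketch, not a confirmed working config for 0.5.0+; treat it as a starting point.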
I feel like I have tried every setup combination in the README, but I could not get it working with 0.5.0+; I keep getting various "missing field *" errors. Is there a full example config somewhere that works with Ollama?
I have it working with a fork from before Ollama support was supposedly added to this repo, but I would like to switch back to this plugin if possible: https://github.com/Amzd/nvim.config/blob/main/lua/plugins/llm.lua