gsuuon / model.nvim

Neovim plugin for interacting with LLMs and building editor-integrated prompts.
MIT License

LLM error ['stop'] #6

Closed: gdnaesver closed this issue 10 months ago

gdnaesver commented 10 months ago

Across multiple models (for example WizardLM/WizardCoder-1B-V1.0), both when running inference by model name and with my own deployment, I get the error:

[LLM] The following 'model_kwargs' are not used by the model: ['stop'] (note typos in the generate argument will also show up in this list)

I can't work out how to fix this...
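For reference, this message matches the keyword-argument validation in Hugging Face transformers' `generate()`, which rejects kwargs the model doesn't accept. A minimal sketch that triggers the same error (the `stop` argument and call shape here are assumptions for illustration, not taken from the reporter's setup):

```python
# Sketch: passing an unsupported 'stop' kwarg to generate() raises
# ValueError: The following `model_kwargs` are not used by the model: ['stop']
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WizardLM/WizardCoder-1B-V1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("def hello():", return_tensors="pt")

# 'stop' is not a recognized generate() keyword, so transformers reports it
# as an unused model_kwarg instead of treating it as a stop sequence.
output = model.generate(**inputs, max_new_tokens=20, stop=["\n\n"])
```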

gdnaesver commented 10 months ago

wrong repo... sorry