huggingface / llm.nvim

LLM powered development for Neovim

ollama not working #93


nfwyst commented 3 months ago

config:

return {
  "huggingface/llm.nvim",
  opts = {
    model = "rouge/autocoder-s-6.7b:latest",
    backend = "ollama",
    url = "http://localhost:11434/api/generate",
    request_body = {
      parameters = {
        max_new_tokens = 100000, -- maximum number of tokens to generate, regardless of prompt length
        temperature = 0.2, -- higher values make the output more creative
        top_p = 0.95, -- higher values make the output more diverse
      },
    },
    tokens_to_clear = { "<|endoftext|>" },
    fim = {
      enabled = true,
      prefix = "<fim_prefix>",
      middle = "<fim_middle>",
      suffix = "<fim_suffix>",
    },
    accept_keymap = "<Tab>",
    dismiss_keymap = "<S-Tab>",
    lsp = {
      bin_path = vim.api.nvim_call_function("stdpath", { "data" })
        .. "/mason/bin/llm-ls",
    },
    context_window = 100000, -- max number of tokens for the context window
    tokenizer = {
      repository = "Bin12345/AutoCoder_S_6.7B",
    },
  },
}

Error message: [LLM] serde json error: EOF while parsing a value at line 1 column 0

McPatate commented 3 months ago

Your request body is incorrect; refer to https://github.com/huggingface/llm.nvim?tab=readme-ov-file#ollama and https://github.com/ollama/ollama/blob/main/docs/api.md#request-6 for more info.
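
For reference, a minimal sketch of what the corrected opts table might look like, based on the two links above rather than a verified fix: the llm.nvim README's Ollama example nests generation settings under an options key (Ollama's Modelfile options) instead of parameters, and num_predict is assumed here as Ollama's counterpart to max_new_tokens. The remaining fields from the original config are unchanged and omitted for brevity.

return {
  "huggingface/llm.nvim",
  opts = {
    model = "rouge/autocoder-s-6.7b:latest",
    backend = "ollama",
    url = "http://localhost:11434/api/generate",
    request_body = {
      -- Ollama's /api/generate expects model settings under `options`,
      -- not `parameters` (see the Ollama API docs linked above)
      options = {
        num_predict = 100000, -- assumed Ollama equivalent of max_new_tokens
        temperature = 0.2,
        top_p = 0.95,
      },
    },
    -- tokens_to_clear, fim, accept_keymap, dismiss_keymap, lsp,
    -- context_window, and tokenizer stay the same as in the original config
  },
}

With parameters in the body, Ollama presumably returns an error or empty response, and llm-ls then fails to deserialize it, which would match the "EOF while parsing a value at line 1 column 0" serde error.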