SilasMarvin / lsp-ai

LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.

Errors while running using nvim-lspconfig -- invalid length 0, expected struct FileStore with 1 element #80

Closed · mergemoveagree closed 3 weeks ago

mergemoveagree commented 3 weeks ago

I am attempting to use lsp-ai with nvim-lspconfig, but whenever I enter a buffer with a matching filetype, lsp-ai logs the following error:

2024-09-24T23:10:33.316442Z DEBUG lsp_server::msg: > {"jsonrpc":"2.0","id":1,"result":{"capabilities":{"codeActionProvider":{"resolveProvider":true},"completionProvider":{},"textDocumentSync":2}}}    
2024-09-24T23:10:33.371434Z DEBUG lsp_server::msg: < {"params":{},"jsonrpc":"2.0","method":"initialized"}    
2024-09-24T23:10:33.371473Z DEBUG lsp_server::stdio: sending message Notification(
    Notification {
        method: "initialized",
        params: Object {},
    },
)    
2024-09-24T23:10:33.371604Z ERROR lsp_ai: invalid length 0, expected struct FileStore with 1 element

It seems the server is still running, though, since it is still sending and receiving messages. I've tried both a local model with llama_cpp and the "deepseek-coder" model from Ollama. Furthermore, it doesn't look like Ollama is receiving the API calls (I've double-checked the port and everything).

I'm using nixvim (a nixpkgs wrapper for Neovim), though that shouldn't matter much since the relevant part of my config is written in raw Lua.

LSP-AI log file: https://hastebin.com/share/tedozacagi.css

My nvim-lspconfig setup:

require('lspconfig.configs').lsp_ai = {
  default_config = {
    cmd = {
      '${lsp-ai-llama-cpp}',  -- Use outPath from custom flake derivation (NixOS)
      '--use-seperate-log-file',
    },
    cmd_env = {
      LSP_AI_LOG = "DEBUG",
    },
    filetypes = { 'html' },
    root_dir = vim.loop.cwd,
    init_options = {
      memory = {
        file_store = {}
      },
      models = {
        model1 = {
          type = "llama_cpp",
          file_path = "/home/user/Meta-Llama-3.1-8B-Instruct-F16.gguf",
          n_ctx = 2048,
          n_gpu_layers = 0,
        },
        model2 = {
          type = "ollama",
          model = "deepseek-coder",
        },
      },
      completion = {
        model = "model2",
        parameters = {
          fim = {
            start = "<|fim_prefix|>",
            middle = "<|fim_suffix|>",
            ["end"] = "<|fim_middle|>",
          },
          max_context = 2000,
          max_new_tokens = 32,
        }
      }
    },
  },
}

local capabilities = require('cmp_nvim_lsp').default_capabilities()

require('lspconfig').lsp_ai.setup({
  capabilities = capabilities,
})
SilasMarvin commented 3 weeks ago

This is an issue with how Neovim converts Lua tables into JSON. By setting file_store to {} you are setting it to an empty list in Lua, when it should actually be an empty object; that mismatch causes the deserialization error on the server.
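To see the difference, here is a quick sketch using Neovim's built-in vim.fn.json_encode; vim.empty_dict() is the standard way to force an empty table to serialize as an object:

-- An empty Lua table is ambiguous between a list and a map,
-- and Neovim encodes it as a JSON list by default:
print(vim.fn.json_encode({ file_store = {} }))
-- {"file_store": []}  <- the server cannot deserialize this as FileStore

-- vim.empty_dict() tags the table so it encodes as a JSON object:
print(vim.fn.json_encode({ file_store = vim.empty_dict() }))
-- {"file_store": {}}  <- what the server expects

Here is an example Neovim config I have used and confirmed works: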

-- Set leader
vim.g.mapleader = " "
vim.g.maplocalleader = "\\"

-- The init_options
local lsp_ai_init_options_json = [[
{
  "memory": {
    "file_store": {}
  },
  "models": {
    "model1": {
      "type": "anthropic",
      "chat_endpoint": "https://api.anthropic.com/v1/messages",
      "model": "claude-3-5-sonnet-20240620",
      "auth_token_env_var_name": "ANTHROPIC_API_KEY"
    }
  },
  "actions": [
    {
      "trigger": "!C",
      "action_display_name": "Chat",
      "model": "model1",
      "parameters": {
        "max_context": 4096,
        "max_tokens": 4096,
        "system": "You are an AI coding assistant. Your task is to complete code snippets. The user's cursor position is marked by \"<CURSOR>\". Follow these steps:\n\n1. Analyze the code context and the cursor position.\n2. Provide your chain of thought reasoning, wrapped in <reasoning> tags. Include thoughts about the cursor position, what needs to be completed, and any necessary formatting.\n3. Determine the appropriate code to complete the current thought, including finishing partial words or lines.\n4. Replace \"<CURSOR>\" with the necessary code, ensuring proper formatting and line breaks.\n5. Wrap your code solution in <answer> tags.\n\nYour response should always include both the reasoning and the answer. Pay special attention to completing partial words or lines before adding new lines of code.\n\n<examples>\n<example>\nUser input:\n--main.py--\n# A function that reads in user inpu<CURSOR>\n\nResponse:\n<reasoning>\n1. The cursor is positioned after \"inpu\" in a comment describing a function that reads user input.\n2. We need to complete the word \"input\" in the comment first.\n3. After completing the comment, we should add a new line before defining the function.\n4. The function should use Python's built-in `input()` function to read user input.\n5. We'll name the function descriptively and include a return statement.\n</reasoning>\n\n<answer>t\ndef read_user_input():\n    user_input = input(\"Enter your input: \")\n    return user_input\n</answer>\n</example>\n\n<example>\nUser input:\n--main.py--\ndef fibonacci(n):\n    if n <= 1:\n        return n\n    else:\n        re<CURSOR>\n\n\nResponse:\n<reasoning>\n1. The cursor is positioned after \"re\" in the 'else' clause of a recursive Fibonacci function.\n2. We need to complete the return statement for the recursive case.\n3. The \"re\" already present likely stands for \"return\", so we'll continue from there.\n4. The Fibonacci sequence is the sum of the two preceding numbers.\n5. We should return the sum of fibonacci(n-1) and fibonacci(n-2).\n</reasoning>\n\n<answer>turn fibonacci(n-1) + fibonacci(n-2)</answer>\n</example>\n</examples>\n",
        "messages": [
          {
            "role": "user",
            "content": "{CODE}"
          }
        ]
      },
      "post_process": {
        "extractor": "(?s)<answer>(.*?)</answer>"
      }
    }
  ]
}
]]

-- The configuration
local lsp_ai_config = {
  cmd = { 'lsp-ai', '--use-seperate-log-file' },
  root_dir = vim.loop.cwd(),
  init_options = vim.fn.json_decode(lsp_ai_init_options_json),
}

-- Start lsp-ai when opening a buffer
vim.api.nvim_create_autocmd("BufEnter", {
  callback = function(args)
    local bufnr = args.buf
    local client = vim.lsp.get_active_clients({bufnr = bufnr, name = "lsp-ai"})
    if #client == 0 then
      vim.lsp.start(lsp_ai_config, {bufnr = bufnr})
    end
  end,
})

-- Key mapping for code actions
vim.api.nvim_set_keymap('n', '<leader>c', '<cmd>lua vim.lsp.buf.code_action()<CR>', {noremap = true, silent = true})
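
Alternatively, if you prefer to keep your original nvim-lspconfig setup, a minimal sketch of the same fix (everything else stays as you had it) is to replace the empty table with vim.empty_dict():

require('lspconfig.configs').lsp_ai = {
  default_config = {
    -- ... cmd, cmd_env, filetypes, and root_dir unchanged ...
    init_options = {
      memory = {
        -- vim.empty_dict() serializes to {} (an object) instead of [] (a list)
        file_store = vim.empty_dict(),
      },
      -- ... models and completion unchanged ...
    },
  },
}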