zbirenbaum / copilot-cmp

Lua plugin to turn github copilot into a cmp source
MIT License

Suggestions appearing only randomly #3

Closed Ganitzsh closed 2 years ago

Ganitzsh commented 2 years ago

Hello, so the suggestions always appear properly in the buffer (as grey ghost text), but they only show up very sporadically in the completion popup in insert mode:

Screenshot 2022-04-06 at 12 04 13

This happens more often than not, and I'm not sure what it's related to. Any suggestions on how to debug it?

Also, :LspInfo reports this state for copilot.lua:

Screenshot 2022-04-06 at 12 09 17

It seems it isn't attaching to buffers consistently?
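For reference, a minimal way to check attachment from inside a buffer (a generic LSP check using only core Neovim 0.6 APIs, nothing copilot-cmp-specific):

```lua
-- Minimal sketch: list the LSP clients attached to the current buffer.
-- Run with :lua in a buffer where suggestions are expected; "copilot"
-- should appear in the output whenever the client is attached.
for _, client in pairs(vim.lsp.buf_get_clients(0)) do
  print(client.name)
end
```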

Neovim version: 0.6.1

nvim-cmp config:

local cmp = require "cmp"

local default = {
   snippet = {
      expand = function(args)
         require("luasnip").lsp_expand(args.body)
      end,
   },
   formatting = {
      format = function(entry, vim_item)
         local icons = require "plugins.configs.lspkind_icons"
         vim_item.kind = string.format("%s %s", icons[vim_item.kind], vim_item.kind)

         vim_item.menu = ({
            nvim_lsp = "[LSP]",
            nvim_lua = "[Lua]",
            buffer = "[BUF]",
         })[entry.source.name]

         return vim_item
      end,
   },
   mapping = {
      ["<C-p>"] = cmp.mapping.select_prev_item(),
      ["<C-n>"] = cmp.mapping.select_next_item(),
      ["<C-d>"] = cmp.mapping.scroll_docs(-4),
      ["<C-f>"] = cmp.mapping.scroll_docs(4),
      ["<C-Space>"] = cmp.mapping.complete(),
      ["<C-e>"] = cmp.mapping.close(),
      ["<CR>"] = cmp.mapping.confirm {
         behavior = cmp.ConfirmBehavior.Replace,
         select = true,
      },
      ["<Tab>"] = cmp.mapping(function(fallback)
         if cmp.visible() then
            cmp.select_next_item()
         elseif require("luasnip").expand_or_jumpable() then
            vim.fn.feedkeys(vim.api.nvim_replace_termcodes("<Plug>luasnip-expand-or-jump", true, true, true), "")
         else
            fallback()
         end
      end, { "i", "s" }),
      ["<S-Tab>"] = cmp.mapping(function(fallback)
         if cmp.visible() then
            cmp.select_prev_item()
         elseif require("luasnip").jumpable(-1) then
            vim.fn.feedkeys(vim.api.nvim_replace_termcodes("<Plug>luasnip-jump-prev", true, true, true), "")
         else
            fallback()
         end
      end, { "i", "s" }),
   },
   sources = {
      { name = "copilot" },
      { name = "nvim_lsp" },
      -- { name = "luasnip" },
      -- { name = "buffer" },
      -- { name = "nvim_lua" },
      -- { name = "path" },
   },
}

Packer config:

     {
         "zbirenbaum/copilot.lua",
         event = "InsertEnter",
         config = function()
            vim.schedule(function()
               require("copilot").setup()
            end)
         end,
      },
      {
         "zbirenbaum/copilot-cmp",
         after = { "copilot.lua", "nvim-cmp" },
      },
zbirenbaum commented 2 years ago

This is an issue in copilot.lua on Neovim 0.6. I believe I have a fix that should be coming out soon; I'll let you know once it's in.

It's a shame I'll just end up taking all the 0.6-specific code out after the full release of 0.7 this month, but I'll do my best to support it until then.

zbirenbaum commented 2 years ago

I just pushed an update to copilot.lua that should fix the attach issue. Can you test it out?

zbirenbaum commented 2 years ago

Additionally, I changed the plugin to call getCompletionsCycling so that cmp displays all available copilot completions, but this seems to come with some server response delay. I'm working on a solution, but I think the trade-off here is as follows:

Either (1) we make the plugin as responsive as copilot.vim by fetching completions every time a new character is inserted (which has CPU usage ramifications), or (2) we let cmp debounce as it does now, which causes a bit more pop-in but keeps everything lag-free and running smoothly.

I'll likely have to make a poll for this, or just implement both when I have time and let people choose.
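(For readers who want to lean toward the second option themselves: newer versions of nvim-cmp expose a `performance` table for this kind of tuning. The field names below are an assumption about those newer builds, not something this plugin provides; verify against `:help cmp-config` for your installed version.)

```lua
-- Sketch, not the plugin's own mechanism: tune how aggressively cmp
-- re-queries and filters sources. These fields are assumed to exist in
-- newer nvim-cmp builds; check :help cmp-config before relying on them.
require("cmp").setup {
  performance = {
    debounce = 150, -- ms of idle typing before sources are re-queried
    throttle = 60,  -- ms between matching/filtering passes
  },
}
```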

Ganitzsh commented 2 years ago

Thanks for the quick update; unfortunately it's still not attaching properly. There is indeed a choice to be made, and I personally prefer smoothness: I don't mind waiting a second or two for suggestions. Don't bother too much with 0.6, though; since 0.7 is coming out soon it would be unnecessary. I'll check that my config doesn't break with the new version and make the switch.

zbirenbaum commented 2 years ago

I think this may have been addressed by my latest commit. Would you mind trying it out? I fixed a number of attach-related issues, and just now fixed the randomness of the completions, which was caused by a bug with start and end points for users whose plugin was attaching. I am going to close this for now, but if the problem remains in its current form, please reopen it. If you are still having issues of a different kind, please post them in a new issue and link this one.

The problem with 0.6 should hypothetically be resolved; I even downloaded a 0.6 AppImage to test, and it worked fine, so any outstanding problems are likely relevant to all versions.

Ganitzsh commented 2 years ago

It seems to work more reliably indeed. I still have some cases where it won't suggest the greyed code shown in the buffer, but I'm not sure it's your plugin directly; I think it's cmp.

Thanks for the quick turnaround; it's nice to finally be able to use it consistently! Keep up the good work.

EDIT: From what LspInfo and CmpStatus report, it's attaching properly every time now.

zbirenbaum commented 2 years ago

> It seems to work more reliably indeed. I still have some cases where it won't suggest the greyed code shown in the buffer, but I'm not sure it's your plugin directly; I think it's cmp.
>
> Thanks for the quick turnaround; it's nice to finally be able to use it consistently! Keep up the good work.
>
> EDIT: From what LspInfo and CmpStatus report, it's attaching properly every time now.

Super glad to hear this!

Could you expand on what you mean by it not suggesting the greyed-out code? I'm trying to figure out whether you're describing a behavior I'm already aware of, and it's not entirely clear.