milanglacier / minuet-ai.nvim

💃 Dance with Intelligence in Your Code. Minuet AI integrates with nvim-cmp, offering AI completion from popular LLMs including OpenAI, Gemini, Claude, and more.
GNU General Public License v3.0
97 stars 3 forks

Not completing and not sure how to debug #14

Open OneOfOne opened 1 day ago

OneOfOne commented 1 day ago

My current lazyvim config:

    {
        'milanglacier/minuet-ai.nvim',
        lazy = false,
        config = function()
            require('minuet').setup({
                provider_options = {
                    openai_compatible = {
                        model = 'llama3.2:3b',
                        end_point = 'http://localhost:11434/v1/chat/completions',
                        name = 'Ollama',
                        stream = false,
                    },
                },
            })
        end,
    },

....

-- nvim-cmp
            opts.sources = cmp.config.sources({
                { name = 'copilot' },
                { name = 'minuet' },

                { name = 'nvim_lsp' },
                { name = 'crates' },
            }, {
                { name = 'buffer' },
                { name = 'path' },
            })

It doesn't suggest anything, and judging by the journalctl log, as far as I can tell it doesn't call the Ollama server at all.

milanglacier commented 1 day ago
  1. You need a fake API key (set in your environment variables) even if ollama does not require an API key. This is mandatory.

  2. I would suggest you take a look at the config suggested by README.

https://github.com/milanglacier/minuet-ai.nvim?tab=readme-ov-file#integration-with-lazyvim

Besides, I would recommend stream = true, because if Ollama cannot complete the chat completion request within the timeout, minuet won't receive anything.
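To rule out the server side first, you can hit the endpoint directly with curl (this assumes the default Ollama port 11434 and that the llama3.2:3b model has already been pulled):

```shell
# sanity check: does Ollama answer on its OpenAI-compatible chat endpoint?
curl -s http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model": "llama3.2:3b", "messages": [{"role": "user", "content": "hi"}]}'
```

If this returns a JSON response with a `choices` array, the server is fine and the problem is on the plugin side.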

OneOfOne commented 23 hours ago

> The provider specified as OpenAI compatible is not properly configured.

    {
        'milanglacier/minuet-ai.nvim',
        lazy = false,
        config = function()
            require('minuet').setup({
                provider = 'openai_compatible',
                provider_options = {
                    openai_compatible = {
                        model = 'llama3.2:3b',
                        end_point = 'http://localhost:11434/v1/chat/completions',
                        name = 'Ollama',
                        api_key = 'DEEPSEEK_API_KEY',
                        stream = true,
                        optional = {
                            stop = nil,
                            max_tokens = nil,
                        },
                    },
                },
            })
        end,
    },
    {
        'nvim-cmp',
        opts = function(_, opts)
            -- if you wish to use autocomplete
            table.insert(opts.sources, 1, {
                name = 'minuet',
                group_index = 1,
                priority = 100,
            })

            opts.performance = {
                -- It is recommended to increase the timeout duration due to
                -- the typically slower response speed of LLMs compared to
                -- other completion sources. This is not needed when you only
                -- need manual completion.
                fetching_timeout = 2000,
            }

            opts.mapping = vim.tbl_deep_extend('force', opts.mapping or {}, {
                -- if you wish to use manual complete
                ['<c-y>'] = require('minuet').make_cmp_map(),
                -- You don't need to worry about <CR> delay because lazyvim handles this situation for you.
                ['<CR>'] = nil,
            })
        end,
    },
milanglacier commented 17 hours ago

Do you have your DEEPSEEK_API_KEY configured in your environment variables?

It needs to be set to an arbitrary value; it can be whatever, like aaa or bbb.

OneOfOne commented 4 hours ago

I'm sorry, how do you set that? I thought you set it in the config as api_key?

milanglacier commented 1 hour ago

> I'm sorry, how do you set that? I thought you set it in the config as api_key?

In your shell configuration file (.profile, .bash_profile, or .zprofile, depending on which shell you are using), add a line like this:

    export DEEPSEEK_API_KEY=xxx
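
A quick way to confirm that Neovim actually sees the variable (this assumes you started Neovim from a shell session where the profile was sourced, e.g. after logging out and back in) is:

```lua
-- run inside Neovim, e.g. :lua print(vim.env.DEEPSEEK_API_KEY)
-- if this prints nil, the environment variable is not visible to Neovim
-- and minuet will not be able to use the provider
print(vim.env.DEEPSEEK_API_KEY)
```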