continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
https://docs.continue.dev/
Apache License 2.0
18.77k stars · 1.59k forks

Custom commands need to be written twice. #1272

Closed · eladamittai closed 5 months ago

eladamittai commented 5 months ago


Relevant environment info

- OS: windows 10
- Continue: 0.9.126
- IDE: vs code 1.86.2
- model: deepseek 33b

Description

I've written a few slash commands, like /logs and /ut, and none of them work: the model just describes the code. In the same chat, the first time I use a command I receive a code explanation, and only when I send it again does it work. The same prompt and model work in JetBrains.

To reproduce

Here is the config:

{
    "models": [
        {
            "title": "Deepseek",
            "provider": "openai",
            "model": "deepseek-33b-instruct:,
            "apiBase": "http://my-api-base/v1",
            "systemMessage": "you are an AI programming code-completion assistent, utilizing the deepseek coder model. you answer programming related questions.",
            "apikey": "my-apikey",
            "contextLength": 16000
        }
    ]
    "completeionOptions": {
        "temperature": 0.3
    }
    "customCommands": [
        {
            "name": "ut",
            "prompt": "{{{ input }}}\n\nAdd logs for thr selected code. Give the new code just as chat output, don't edit any file.",
            "description": "Add logs to highlighted code"
        }
    ]
}
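
For reference, a corrected version of the config above might look like the following. This assumes the intended model name is `deepseek-33b-instruct` (the unterminated string is discussed below), and it also fixes the missing commas between the top-level sections, the misspelled `completeionOptions` key, and the `apikey` capitalization; the model name, API base, and key are the reporter's own placeholders:

```json
{
    "models": [
        {
            "title": "Deepseek",
            "provider": "openai",
            "model": "deepseek-33b-instruct",
            "apiBase": "http://my-api-base/v1",
            "systemMessage": "you are an AI programming code-completion assistant, utilizing the deepseek coder model. you answer programming related questions.",
            "apiKey": "my-apikey",
            "contextLength": 16000
        }
    ],
    "completionOptions": {
        "temperature": 0.3
    },
    "customCommands": [
        {
            "name": "ut",
            "prompt": "{{{ input }}}\n\nAdd logs for the selected code. Give the new code just as chat output, don't edit any file.",
            "description": "Add logs to highlighted code"
        }
    ]
}
```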

Log output

No response

sestinj commented 5 months ago

@eladamittai I notice that there is a colon rather than a quote in your model name ("model": "deepseek-33b-instruct:,). Was this just a typo when copying over to GitHub issues?

Assuming that's not causing the config to not load, the first thing I would check is the prompt logs: https://docs.continue.dev/troubleshooting#llm-prompt-logs. This will tell us whether the instructions are being entirely left out, or if the problem is something else.

I would also recommend removing your system prompt. This is a detail particular to DeepSeek models: they automatically include a system prompt very similar to yours, and using a different one might be causing the model to act strangely.

eladamittai commented 5 months ago

@sestinj hey, thanks for the quick reply! 🙏 I removed the system prompt and checked the logs, and the entire command is being left out of the context. Instead of the command, it just puts /log before the context. I also tried again with just writing the command in the chat instead of using the slash command, and it worked fine. Same with the /edit command. So it's only for the custom commands.
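
The behavior described above (the literal `/log` prefix reaching the model instead of the rendered prompt) is consistent with the template-substitution step for custom commands being skipped. A minimal sketch of what that substitution involves, with `renderPrompt` as a hypothetical helper and not Continue's actual implementation:

```typescript
// Hypothetical sketch: a custom command's prompt template contains a
// {{{ input }}} placeholder that should be replaced with the user's chat
// input before the prompt is sent to the model. If this step is skipped,
// only the raw slash-command text reaches the model.
function renderPrompt(template: string, input: string): string {
  // String.replace with a string pattern replaces the first occurrence.
  return template.replace("{{{ input }}}", input);
}

const template =
  "{{{ input }}}\n\nAdd logs for the selected code. Give the new code just as chat output, don't edit any file.";
console.log(renderPrompt(template, "function add(a, b) { return a + b; }"));
```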

sestinj commented 5 months ago

Got it! I think this is related to a fix I made just yesterday. I believe it should be available in the latest pre-release, 0.9.130

eladamittai commented 5 months ago

@sestinj great! I'll check it out. Thanks!

eladamittai commented 5 months ago

It works!