sestinj closed this issue 1 year ago
@meganoob1337, I've made some progress here in the latest version. I'm seeing less English in the responses, but it's not perfect yet.
More importantly, you can now edit the prompt template yourself if you're interested in experimenting with it. For now there's an example here; I'll write a more complete docs page once this is solid.
I'll also be playing around with prompts for Ollama myself, so I'm keeping this issue open until you, I, or someone else finds a reliable one.
This is solved in the latest version by making sure to use the correct chat template. I would still recommend using instruct models over raw Code Llama, though.
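For reference, the chat template that the Llama 2 / Code Llama instruct models were trained with wraps each request in `[INST] ... [/INST]` markers, with an optional `<<SYS>>` block for the system message. A minimal sketch of that format (this is an illustration of the template itself, not Continue's internal implementation):

```python
def llama2_instruct_prompt(system: str, user: str) -> str:
    """Wrap a request in the Llama 2 instruct chat template.

    The [INST]/[/INST] markers and the <<SYS>> block are part of the
    format the instruct models were fine-tuned on; omitting them is a
    common cause of the model drifting into conversational English.
    """
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = llama2_instruct_prompt(
    "Output only code. Do not include any explanation.",
    "Rewrite this function to use a list comprehension.",
)
```

Sending an edit request through this template, with a system message that explicitly forbids explanation, is the kind of fix described above.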
Describe the bug
With the current prompt, Code Llama likes to prefix its /edit responses with an English explanation. It should use the prompt from the paper.
To Reproduce
Select Code Llama as your model and attempt any edit.
Expected behavior
The response should contain only the code, with no English explanation.
Environment
All
Additional context
Consider adding the ability to pass custom prompt-construction callbacks or template strings in the config file, so users can experiment with the prompt themselves more easily.
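One possible shape for that config option is a plain template string with named placeholders that the user can override. This is a hypothetical sketch — the names `edit_prompt_template`, `$user_request`, and `$code_to_edit` are illustrative, not an existing Continue API:

```python
from string import Template

# Hypothetical user-overridable config value: a template string for
# /edit prompts. $user_request and $code_to_edit are placeholder names
# invented for this sketch.
edit_prompt_template = Template(
    "[INST] Rewrite the code below according to the request.\n"
    "Respond with only the rewritten code, no explanation.\n\n"
    "Request: $user_request\n\nCode:\n$code_to_edit [/INST]"
)

prompt = edit_prompt_template.substitute(
    user_request="add type hints",
    code_to_edit="def add(a, b):\n    return a + b",
)
```

A template string like this is easier to expose in a config file than a callback, though a callback would allow model-specific logic (e.g. switching chat formats per model).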
CON-210