CambridgeComputing opened 5 months ago
@CambridgeComputing thanks for sharing this. It looks like our template definitions got a bit twisted because of the different contexts in which codestral is used. A quick fix is possible, but to avoid playing a game of whack-a-mole, and because you have a temp solution, I'm going to take an extra second and try to do this correctly (it needs a little refactor/cleaning).
### Relevant environment info
### Description
I have the following model definition for Codestral-22B running locally on a llama.cpp server:
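The original config block wasn't captured above. For context, a minimal sketch of what such a `config.json` model entry might look like (the title, model name, and `apiBase` values here are assumptions, not the reporter's actual settings):

```json
{
  "models": [
    {
      "title": "Codestral-22B",
      "provider": "llama.cpp",
      "model": "codestral-22b",
      "apiBase": "http://localhost:8080"
    }
  ]
}
```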
Code editing works, but not chat. In order for regular chat to work, I have to specify the llama2 template. Without it, I get the following error popup when using chat.
When I do specify the template as llama2, code editing (Ctrl+i) for highlighting code and requesting changes no longer works and gives this different error:
Looking through the code in `autodetect.ts`, it appears that the edit template is supposed to be `osModelsEditPrompt`, but the `else if` catches it earlier and assigns it `llama2`.
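The branch ordering being described might look something like the following. This is a hypothetical sketch to illustrate the issue, not Continue's actual `autodetect.ts` source; the function and prompt names are made up:

```typescript
// Illustrative sketch: when a model is configured with the llama2 template,
// the broad templateType check matches first, so a codestral-specific branch
// placed after it is never reached.
function autodetectEditPrompt(templateType: string, model: string): string {
  if (templateType === "llama2") {
    // A codestral model with template "llama2" lands here first...
    return "llama2EditPrompt";
  } else if (model.toLowerCase().includes("codestral")) {
    // ...so this branch, which would assign the intended edit prompt,
    // is skipped for that model.
    return "osModelsEditPrompt";
  }
  return "defaultEditPrompt";
}
```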
I can't make sense of the errors I got, but either moving ... up before `} else if (templateType === "llama2") {` (line 282) or changing the autodetect logic would "make" it work, but I'm not sure if that fixes the root issue.

### Edit - partial solution
I have found the following partial solution that has gotten me up and running, but it is a bit of a pain since I'm swapping models often for testing and benchmarking. To fix it, I've added a second `model` definition with no `template`, which will be used for editing, and added `modelRoles` with `inlineEdit` defined to point to it. My new config:

If there is a better way to do this, please let me know. Thanks!
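The new config itself wasn't captured above. A sketch of the setup described, assuming a local llama.cpp endpoint; the titles, model name, and `apiBase` are illustrative, and I believe `modelRoles` lives under the `experimental` section of Continue's `config.json`:

```json
{
  "models": [
    {
      "title": "Codestral Chat",
      "provider": "llama.cpp",
      "model": "codestral-22b",
      "apiBase": "http://localhost:8080",
      "template": "llama2"
    },
    {
      "title": "Codestral Edit",
      "provider": "llama.cpp",
      "model": "codestral-22b",
      "apiBase": "http://localhost:8080"
    }
  ],
  "experimental": {
    "modelRoles": {
      "inlineEdit": "Codestral Edit"
    }
  }
}
```

With this, chat requests go through the llama2-templated entry while inline edits use the untemplated one, matching the workaround described above.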