cliffordh opened 5 months ago
"This model doesn't support messages from the role system." I will see what I can do with it later.
It seems that it's not about the system message. I only know that it's complaining about the prompt template but I don't know why yet.
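Even though the system role turned out not to be the culprit here, for reference: the usual workaround for models that reject the "system" role is to fold system prompts into the first user message before sending. A minimal sketch (the `ChatMessage` type is a hypothetical stand-in, not Copilot for Xcode's actual one):

```swift
// Fold "system" messages into the first user message, for models that
// reject the system role. ChatMessage is a hypothetical stand-in type.
struct ChatMessage { var role: String; var content: String }

func foldSystemIntoUser(_ messages: [ChatMessage]) -> [ChatMessage] {
    let systemText = messages
        .filter { $0.role == "system" }
        .map(\.content)
        .joined(separator: "\n\n")
    var rest = messages.filter { $0.role != "system" }
    if !systemText.isEmpty {
        if let i = rest.firstIndex(where: { $0.role == "user" }) {
            // Prepend the system text to the first user message.
            rest[i].content = systemText + "\n\n" + rest[i].content
        } else {
            rest.insert(ChatMessage(role: "user", content: systemText), at: 0)
        }
    }
    return rest
}
```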
Here is what I saw from ollama:
ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported, falling back to chatml. This may cause the model to output suboptimal responses | tid="0x1f9f70c00" timestamp=1718876553
Here is what the stream response contains:
{\"error\":\"an unknown error was encountered while running the model \"}
The maintainer of Ollama said that the "template not yet supported" error is irrelevant.
╮(╯▽╰)╭
I did some experiments but couldn't tell what was wrong. Let's see if Ollama 0.1.45 changes anything.
I found that it fails whenever a message is long enough. If that's the case, there is nothing I can do: Copilot for Xcode includes part of the code from the editor in the prompt, so you would see the error.
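A quick way to check the length theory, assuming a default local Ollama install and its documented /api/chat endpoint, is to keep doubling a dummy message until the error appears (Swift 5.7+ script; the model tag and the probe bounds are assumptions):

```swift
import Foundation

// Probe for the failure threshold: double the message length until
// Ollama starts returning {"error": ...}. Assumes a default local
// install; run as a script with Swift 5.7+ (top-level await).
func probeFails(length: Int) async throws -> Bool {
    var request = URLRequest(url: URL(string: "http://localhost:11434/api/chat")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body: [String: Any] = [
        "model": "deepseek-coder-v2",
        "stream": false,
        "messages": [["role": "user", "content": String(repeating: "a ", count: length)]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)
    let (data, _) = try await URLSession.shared.data(for: request)
    return String(data: data, encoding: .utf8)?.contains("\"error\"") == true
}

var length = 256
while length <= 65_536 {
    let failed = try await probeFails(length: length)
    print("\(length) words:", failed ? "error" : "ok")
    if failed { break }
    length *= 2
}
```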
Ok, I will try to adjust the scope and message history I am sending, but I believe this model may be unusable as is.
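If trimming does help, here is a minimal sketch of the kind of capping meant here, assuming a plain character budget (a real implementation would count model tokens instead):

```swift
// Cap the chat history to a character budget, newest messages first.
// ChatMessage is a hypothetical stand-in type.
struct ChatMessage { var role: String; var content: String }

func trimmed(_ history: [ChatMessage], budget: Int) -> [ChatMessage] {
    var remaining = budget
    var kept: [ChatMessage] = []
    // Walk newest-to-oldest, dropping older messages once the budget runs out.
    for original in history.reversed() {
        if remaining <= 0 { break }
        var message = original
        if message.content.count > remaining {
            // Keep the tail (most recent part) of an oversized message.
            message.content = String(message.content.suffix(remaining))
        }
        remaining -= message.content.count
        kept.append(message)
    }
    return Array(kept.reversed())
}
```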
Before Reporting
What happened?
Using Ollama to run deepseek-coder-v2:
ollama run deepseek-coder-v2
I've set up a model in Copilot for Xcode, but when I try to chat with it I get "The data couldn't be read because it is missing."
How to reproduce the bug.
Install deepseek-coder-v2 and configure a chat model to talk to Ollama.
Relevant log output
macOS version
15
Xcode version
15
Copilot for Xcode version
0.33.4