intitni / CopilotForXcode

The first GitHub Copilot, Codeium and ChatGPT Xcode Source Editor Extension
https://copilotforxcode.intii.com
MIT License
7.84k stars 386 forks

[Bug]: DeepSeek Coder V2 Not Working #541

Open cliffordh opened 5 months ago

cliffordh commented 5 months ago

Before Reporting

What happened?

Using Ollama to run deepseek-coder-v2:

ollama run deepseek-coder-v2

I've set up a model in CopilotForXcode, but when I try to chat with it I get "The data couldn’t be read because it is missing."

How to reproduce the bug.

Install deepseek-coder-v2 and configure a chat model to talk to Ollama.
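To isolate whether the failure comes from Ollama itself or from Copilot for Xcode, the model can be queried directly over Ollama's local chat API (a sketch, assuming the default port 11434 and the model tag `deepseek-coder-v2`):

```shell
# Build a minimal chat request for Ollama's /api/chat endpoint.
PAYLOAD='{
  "model": "deepseek-coder-v2",
  "messages": [
    {"role": "user", "content": "Write a Swift function that reverses a string."}
  ],
  "stream": false
}'
echo "$PAYLOAD"
# With an Ollama server running locally, uncomment to send the request:
# curl -s http://localhost:11434/api/chat -d "$PAYLOAD"
```

If the same error comes back from `curl`, the problem is on the Ollama/model side rather than in the extension.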

Relevant log output

n/a

macOS version

15

Xcode version

15

Copilot for Xcode version

0.33.4

intitni commented 5 months ago

This model doesn't support messages with the system role. I will see what I can do about it later.
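One way to probe the system-role hypothesis is to send the same request with and without a system message and compare the responses (a hypothetical check against Ollama's `/api/chat` endpoint, assuming the default local port):

```shell
# Request including a system-role message, to compare against a user-only request.
WITH_SYSTEM='{
  "model": "deepseek-coder-v2",
  "messages": [
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "Hello"}
  ],
  "stream": false
}'
echo "$WITH_SYSTEM"
# With a local Ollama server running, uncomment to send:
# curl -s http://localhost:11434/api/chat -d "$WITH_SYSTEM"
```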

intitni commented 5 months ago

It seems that it's not about the system message. I only know that it's complaining about the prompt template but I don't know why yet.

Here is what I saw from ollama:

ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported, falling back to chatml. This may cause the model to output suboptimal responses | tid="0x1f9f70c00" timestamp=1718876553

Here is what the stream response contains:

{\"error\":\"an unknown error was encountered while running the model \"}
intitni commented 5 months ago

The maintainer of Ollama said that the "template not yet supported" error is irrelevant.
╮(╯▽╰)╭

I did some experiments but couldn't tell what was wrong. Let's see if Ollama 0.1.45 changes anything.

intitni commented 5 months ago

I found that it fails whenever a message is long enough. If that's the case, there is nothing I can do: Copilot for Xcode includes part of the code from the editor in the prompt, so you will see the error.
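The length hypothesis can be sketched as a direct test: build an artificially long user message of the kind the extension would produce (editor code plus a question) and send it to Ollama. The payload construction below is a hypothetical repro; the filler content and size are assumptions, not taken from the actual failing request.

```shell
# Repeat a short code-like snippet ~2000 times to simulate a large editor context.
LONG=$(printf 'let x = 1; %.0s' $(seq 1 2000))

# Wrap it in an Ollama /api/chat request body.
PAYLOAD=$(printf '{"model": "deepseek-coder-v2", "messages": [{"role": "user", "content": "Explain this: %s"}], "stream": false}' "$LONG")

# Show how large the request body is.
echo "${#PAYLOAD}"

# With a local Ollama server running, uncomment to send; per the report, a short
# prompt succeeds while a sufficiently long one streams back
# {"error":"an unknown error was encountered while running the model "}.
# curl -s http://localhost:11434/api/chat -d "$PAYLOAD"
```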

[Screenshot 2024-06-20 at 21 46 29]
cliffordh commented 5 months ago

OK, I will try to adjust the scope and message history I am sending, but I believe this model may be unusable as-is.