intitni / CopilotForXcode

The missing GitHub Copilot, Codeium and ChatGPT Xcode Source Editor Extension

[Help Wanted]: OpenAI Compatible Chat using Tabby + Chat Model #525

Open DevMobileAS opened 1 month ago

DevMobileAS commented 1 month ago


Describe your issue

Hey there,

First of all, thanks for all the hard work. Installation was easy, and the usability is already at a really good stage!

We're trying out Copilot here to see if it fits our needs.

One issue so far, though. Unfortunately, I could not find anything in the docs telling me whether I'm doing something wrong, so maybe someone can help here:

I'm running Tabby on a Mac Studio M1 using this command:

```sh
tabby serve --device metal --chat-device metal --model DeepseekCoder-6.7B --chat-model WizardCoder-3B --host xxx
```

Using "Custom Connection Service" the integration into Xcode already works fine.

Now I'm trying to connect the chat too, since Tabby already provides an OpenAI-compatible chat API.

I've added a new "OpenAI Compatible" chat model to Copilot, but when I hit "Test" I get the error: "The data couldn't be read because it isn't in the correct format."
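
For context, I assume the "Test" button sends a single non-streaming request with a body roughly like the following (this is my guess at the payload, not taken from the extension):

```json
{
  "model": "WizardCoder-3B",
  "stream": false,
  "messages": [
    {"role": "user", "content": "Respond with \"Test succeeded\""}
  ]
}
```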

Tabby's event log shows that it received the request and responded:

{"user":null,"ts":1716537441123,"event":{"chat_completion":{"completion_id":"chatcmpl-7cd8b4e3-9698-4c37-a116-b1edfaa62f1e","input":[{"role":"user","content":"Respond with \"Test succeeded\""}],"output":{"role":"assistant","content":"Test succeeded"}}}}

I've also checked what Tabby actually responds with using Insomnia, and it looks fine to me (seems fairly OpenAI-compatible):

data: {"id":"chatcmpl-e250a769-96d1-4475-a939-2e4ea700cd33","created":1716536402,"system_fingerprint":"unused-system-fingerprint","object":"chat.completion.chunk","model":"unused-model","choices":[{"index":0,"delta":{"content":"Test"}}]}
data: {"id":"chatcmpl-e250a769-96d1-4475-a939-2e4ea700cd33","created":1716536402,"system_fingerprint":"unused-system-fingerprint","object":"chat.completion.chunk","model":"unused-model","choices":[{"index":0,"delta":{"content":" succeeded"}}]}
data: {"id":"chatcmpl-e250a769-96d1-4475-a939-2e4ea700cd33","created":1716536402,"system_fingerprint":"unused-system-fingerprint","object":"chat.completion.chunk","model":"unused-model","choices":[{"index":0,"finish_reason":"stop","delta":{"content":""}}]}

I'm not sure whether Tabby or Copilot is at fault here. Is there anything I could do to find out?

intitni commented 1 month ago

Here are 2 things I noticed:

  1. The service returns a streamed response even when stream is set to false, which causes the test to fail (the test expects a single, non-streamed JSON response; see the sketch after this list).
  2. The API requires that the conversation alternate user/assistant/user/assistant/..., so it can't be used in the chat panel right now.
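
For illustration, when stream is false an OpenAI-compatible endpoint is expected to return a single chat.completion JSON object rather than SSE chunks, roughly like this (the values here are made up):

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "created": 1716537441,
  "model": "unused-model",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Test succeeded"},
      "finish_reason": "stop"
    }
  ]
}
```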

I will release a new beta of 0.33.1 later tonight (UTC+8) to work around them.

intitni commented 1 month ago

Released in 0.33.1 beta 2. You have to turn on "Enforce message order" for the model.
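
To illustrate the constraint from point 2 above: a common way to satisfy a strict user/assistant alternation is to fold the system prompt and any consecutive same-role messages into a single message before sending, so the history ends up roughly like this (a sketch of the idea, not the exact transformation the extension applies):

```json
{
  "messages": [
    {"role": "user", "content": "<system prompt>\n\nFirst user message"},
    {"role": "assistant", "content": "First reply"},
    {"role": "user", "content": "Follow-up question"}
  ]
}
```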

intitni commented 1 month ago

Released in 0.33.1