DevMobileAS opened 1 month ago
Here are 2 things I noticed: `stream` is set to false, which causes the test to fail. I will release a new beta of 0.33.1 later tonight (UTC+8) to work around them.
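For context, here is a minimal sketch of why the `stream` flag matters for the format check. The response shapes below are the usual OpenAI-style ones (a plain JSON body with `message` when not streaming, Server-Sent Events lines carrying `delta` chunks when streaming); whether Copilot's "Test" button expects one or the other is an assumption here, not something stated in this thread:

```python
import json

# Non-streaming response: a single JSON body, one "message" per choice.
non_streaming = json.loads(
    '{"choices": [{"message": {"role": "assistant", "content": "hi"}}]}'
)

# Streaming response: Server-Sent Events, each line "data: <json chunk>"
# carrying a "delta" per choice instead of a "message".
sse_line = 'data: {"choices": [{"delta": {"content": "hi"}}]}'

def parse_sse_chunk(line: str):
    """Parse one SSE data line into that chunk's delta content."""
    payload = json.loads(line.removeprefix("data: "))
    return payload["choices"][0]["delta"].get("content")

# A client that expects one framing and receives the other fails to
# decode the body: a plain JSON body has "message" where an SSE parser
# looks for "delta", and vice versa -- which would surface as a generic
# "data couldn't be read" style error.
print(non_streaming["choices"][0]["message"]["content"])  # hi
print(parse_sse_chunk(sse_line))                          # hi
```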
Released in 0.33.1 beta 2. You have to turn on "Enforce message order" for the model.
Released in 0.33.1
Before Reporting
Describe your issue
Hey there,
first of all, thanks for all the hard work. Installation was easy, and usability is already at a really good stage!
We're trying out Copilot here to see if it fits our needs.
One issue, though. Unfortunately, I could not find anything in the docs indicating whether I'm doing something wrong, so maybe someone can help here:
I'm running Tabby on a Mac Studio M1 using this command: `$ tabby serve --device metal --chat-device metal --model DeepseekCoder-6.7B --chat-model WizardCoder-3B --host xxx`
Using "Custom Connection Service" the integration into Xcode already works fine.
Now I'm trying to connect the chat too, since Tabby already provides an OpenAI-compatible chat API.
I've added a new "OpenAI Compatible" chat model to Copilot but when hitting "Test" I get the error: "The data couldn't be read because it isn't in the correct format."
Tabby's event log shows that it got the request and responded:
I've also tried to check what Tabby is really responding with using Insomnia, and it looks fine to me (seems fairly OpenAI-compatible):
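To compare the exact request Copilot would send against what works in Insomnia, a small script like the following can help. Assumptions I'm making here: Tabby serves the chat endpoint at the usual OpenAI-compatible path `/v1/chat/completions`, and the base URL below is a placeholder for your actual host and port; some servers also require a `model` field in the payload:

```python
import json
import urllib.request

# Placeholder -- substitute your actual Tabby host and port.
BASE_URL = "http://localhost:8080"

# The kind of payload an OpenAI-compatible chat client sends.
# Toggling "stream" between True and False is a quick way to see
# whether the response framing changes (SSE chunks vs. one JSON body).
payload = {
    "messages": [{"role": "user", "content": "Say hello."}],
    "stream": True,
}
body = json.dumps(payload).encode()

def build_request(base_url: str, body: bytes) -> urllib.request.Request:
    """Build the chat-completions request without sending it."""
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(BASE_URL, body)
print(req.get_method(), req.full_url)
# Actually sending it (commented out so the sketch runs without a live server):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

Comparing the raw body this prints with what Insomnia sends may show whether the two clients differ in the `stream` flag or headers.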
I'm not sure which side (Tabby or Copilot) is the issue here. Anything I could do to find out?