Hangover3832 closed this issue 5 months ago.
The chat completion API of oobabooga seems to return the wrong field. I made a patch locally, but I don't know how, or whether, to make a pull request.
--- a/src/extension/utils.ts
+++ b/src/extension/utils.ts
@@ -302,8 +302,8 @@ export const getChatDataFromProvider = (
return data?.choices[0].delta?.content
? data?.choices[0].delta.content
: ''
- case ApiProviders.Oobabooga:
- return data?.choices[0].text
+ //case ApiProviders.Oobabooga:
+ // return data?.choices[0].delta.content
case ApiProviders.LlamaCpp:
return data?.content
case ApiProviders.LiteLLM:
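The diff above removes a case that read the completion-style field from a chat-style stream. A minimal sketch of the two shapes (the interface names here are illustrative assumptions; the field names follow the OpenAI spec) shows why reading choices[0].text from a chat chunk yields undefined:

```typescript
// Hypothetical stand-in types for the two OpenAI streaming formats.
interface CompletionChunk {
  choices: { text?: string }[]
}

interface ChatChunk {
  choices: { delta?: { content?: string } }[]
}

// A chat-completions stream carries each token in delta.content...
const chatChunk: ChatChunk = { choices: [{ delta: { content: 'Hello' } }] }

// ...so reading choices[0].text (the completions shape) yields undefined.
const wrong = (chatChunk as unknown as CompletionChunk).choices[0].text
const right = chatChunk.choices[0].delta?.content ?? ''

console.log(wrong) // undefined
console.log(right) // "Hello"
```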
@RGSS3 thank you, this seems to be an oobabooga responsibility. There are some open API-related issues. For now, I am sticking with ollama, which works without any issues. And @rjmacarthy, thank you for your great work!
I just mean the API provider part in utils.ts; I didn't know which part is wrong, this extension or Oobabooga. @Hangover3832
On my computer I used text-generation-webui-snapshot-2024-03-31.zip. My modification makes Oobabooga fall through to the default: branch, which works, so I simply commented the case out as a temporary workaround.
export const getChatDataFromProvider = (
  provider: string,
  data: StreamResponse | undefined
) => {
  switch (provider) {
    case ApiProviders.Ollama:
    case ApiProviders.OllamaWebUi:
      return data?.choices[0].delta?.content
        ? data?.choices[0].delta.content
        : ''
    //case ApiProviders.Oobabooga:
    //  return data?.choices[0].delta.content
    case ApiProviders.LlamaCpp:
      return data?.content
    case ApiProviders.LiteLLM:
    default:
      // Some streams emit the literal string 'undefined'; drop it.
      if (data?.choices[0].delta?.content === 'undefined') {
        return ''
      }
      return data?.choices[0].delta?.content
        ? data?.choices[0].delta.content
        : ''
  }
}
@RGSS3, does the default statement also apply to the OpenAI API, meaning that oobabooga adheres to the OpenAI standard? Does model switching work correctly? Unfortunately I don't know how to use the code you provided to make it work in vscodium. Do I need to build/compile from source? How? So long
@Hangover3832 I cloned this repo, modified utils.ts as above, and then ran npm run vscode:package, which generates a .vsix file; I installed it via "Install from VSIX" in the extension manager. I didn't try OpenAI because I'm currently without a key. I usually use Oobabooga, or OpenRouter coupled with SillyTavern.
That worked, thank you very much @RGSS3.
Hey both, let's reopen this issue and I'll fix it officially for the next release. Many thanks.
I just released version 3.10.16, which switches the ooba chat to the OpenAI spec; please report back.
Many thanks,
Describe the bug Nothing appears in the code window. The chat responds only with "undefinedundefinedundefinedundefinedundefinedundefinedundefinedundefinedund"
Expected behavior Code appears in the code window, and the chat gives a human-understandable response.
Additional context The API seems to work fine and Oobabooga generates text. It happens with any model.
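The repeated "undefined" output described above is what happens when the extractor reads a field that is absent from each stream chunk and the result is concatenated into the response string. A minimal sketch (the chunk shape is an assumption for illustration):

```typescript
// If the extractor reads the wrong field, every chunk contributes the
// string 'undefined' once the missing value is stringified and appended.
let output = ''
const chunks: { text?: string }[] = [{}, {}, {}]
for (const chunk of chunks) {
  output += String(chunk.text) // String(undefined) === 'undefined'
}
console.log(output) // "undefinedundefinedundefined"
```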