twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Outputs only "undefined" #202

Closed: Hangover3832 closed this issue 5 months ago

Hangover3832 commented 6 months ago

Describe the bug: Nothing appears in the code window. The chat responds only with "undefinedundefinedundefinedundefinedundefinedundefinedundefinedundefinedund".


Expected behavior: Code appears in the code window, and the chat gives a human-understandable response.

Screenshots: (screenshot attached, not reproduced here)


Additional context: The API itself seems to work fine; Oobabooga generates text. The problem occurs with every model.

RGSS3 commented 6 months ago

Oobabooga's chat completion API seems to return its content in a different field than the one twinny reads. I made a patch locally, but I don't know how (or whether) to open a pull request.

--- a/src/extension/utils.ts
+++ b/src/extension/utils.ts
@@ -302,8 +302,8 @@ export const getChatDataFromProvider = (
       return data?.choices[0].delta?.content
         ? data?.choices[0].delta.content
         : ''
-    case ApiProviders.Oobabooga:
-      return data?.choices[0].text
+    //case ApiProviders.Oobabooga:
+    //  return data?.choices[0].delta.content
     case ApiProviders.LlamaCpp:
       return data?.content
     case ApiProviders.LiteLLM:
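
For context, the two OpenAI-style streaming formats put the text in different places: the legacy completions stream uses choices[0].text, while the chat completions stream uses choices[0].delta.content. A minimal sketch of the two chunk shapes (values illustrative):

// A legacy /v1/completions stream chunk carries text in choices[0].text:
const completionChunk = {
  choices: [{ text: 'Hello', index: 0, finish_reason: null }]
}

// A /v1/chat/completions stream chunk carries it in choices[0].delta.content:
const chatChunk = {
  choices: [{ delta: { content: 'Hello' }, index: 0, finish_reason: null }]
}

Reading choices[0].text from a chat-style stream therefore yields undefined.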
Hangover3832 commented 6 months ago

@RGSS3 thank you, this seems to be an Oobabooga responsibility; there are some open API-related issues there. For now I'm sticking with Ollama, which works without any issues. And @rjmacarthy, thank you for your great work!

RGSS3 commented 6 months ago

I just meant that I didn't know which part was wrong: the API-provider handling in utils.ts, or Oobabooga itself. @Hangover3832 On my machine I used text-generation-webui-snapshot-2024-03-31.zip. My modification falls through to exactly the default: branch, and that works, so I simply commented the case out as a temporary workaround.


export const getChatDataFromProvider = (
  provider: string,
  data: StreamResponse | undefined
) => {
  switch (provider) {
    case ApiProviders.Ollama:
    case ApiProviders.OllamaWebUi:
      return data?.choices[0].delta?.content
        ? data?.choices[0].delta.content
        : ''
    //case ApiProviders.Oobabooga:
    //  return data?.choices[0].delta.content
    case ApiProviders.LlamaCpp:
      return data?.content
    case ApiProviders.LiteLLM:
    default:
      // Oobabooga now falls through to here; skip chunks whose content
      // is the literal string 'undefined'
      if (data?.choices[0].delta?.content === 'undefined') {
        return ''
      }
      return data?.choices[0].delta?.content
        ? data?.choices[0].delta.content
        : ''
  }
}
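
This also explains the literal "undefinedundefined…" output in the original report: reading a missing field yields undefined, and JavaScript stringifies undefined when it is concatenated into a string, once per streamed chunk. A quick illustration (chunk shape hypothetical):

// Reading a field that is not present yields undefined rather than an error:
const chunk: any = { choices: [{ delta: { content: 'Hi' } }] }
const text = chunk.choices[0].text // undefined

// Concatenating it into the accumulated response stringifies it each time:
let output = ''
output += text // 'undefined'
output += text // 'undefinedundefined'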
Hangover3832 commented 6 months ago

@RGSS3, does the default branch also apply to the OpenAI API, meaning that Oobabooga adheres to the OpenAI standard? Does model switching work correctly? Unfortunately I don't know what to do with the code you provided to make it work in VSCodium. Do I need to build/compile from source? How? So long

RGSS3 commented 6 months ago

@Hangover3832 I cloned this repo, modified utils.ts as above, and then ran npm run vscode:package, which generates a .vsix file; I installed that via "Install from VSIX" in the extension manager. I haven't tried OpenAI because I'm currently without a key. I usually use Oobabooga, or OpenRouter coupled with SillyTavern.
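
For anyone wanting to try the workaround before an official release, those steps amount to roughly the following (the vscode:package script name comes from the comment above; the npm install step and the exact .vsix filename are assumptions):

git clone https://github.com/twinnydotdev/twinny
cd twinny
# edit src/extension/utils.ts as shown above
npm install
npm run vscode:package
# install the generated .vsix from the Extensions view:
# "..." menu -> "Install from VSIX..."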

Hangover3832 commented 6 months ago

That worked, thank you very much @RGSS3.

rjmacarthy commented 6 months ago

Hey both, let's reopen this issue and I'll fix it officially for the next release. Many thanks.

rjmacarthy commented 6 months ago

I just released version 3.10.16, which updates the Oobabooga chat handler to use the OpenAI spec. Please report back.

Many thanks,
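
For reference, the official fix presumably points the Oobabooga case at the chat-style delta field instead of choices[0].text. A sketch of what that branch might look like (not the actual released code):

    case ApiProviders.Oobabooga:
      return data?.choices[0].delta?.content
        ? data?.choices[0].delta.content
        : ''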