rubberduck-ai / rubberduck-vscode

Use AI-powered code edits, explanations, code generation, error diagnosis, and chat in Visual Studio Code with the official OpenAI API.
https://marketplace.visualstudio.com/items?itemName=Rubberduck.rubberduck-vscode
MIT License

Setting my own base URL causes an unknown error and the answer never completes. #78

Open · Jinghao-Tu opened this issue 1 year ago

Jinghao-Tu commented 1 year ago

Describe the bug

After setting my own base URL, an unknown error appears and the answer never completes.

How to reproduce

Set the base URL to something like 'https://api.xxx.com/v1/' and ask any question.

Expected behavior

No "Error: Unknown Error" appears, and the answer completes.

Screenshots

(Screenshot: 2023-03-19 125618)

Additional information


I built a Cloudflare Worker to proxy requests to the base URL. The Worker's code is below.

// Upstream OpenAI API host that the Worker proxies to.
const TELEGRAPH_URL = 'https://api.openai.com';

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  // Rewrite the incoming request's host to point at the OpenAI API.
  const url = new URL(request.url);
  url.host = TELEGRAPH_URL.replace(/^https?:\/\//, '');

  // Forward the original method, headers, and body unchanged.
  const modifiedRequest = new Request(url.toString(), {
    headers: request.headers,
    method: request.method,
    body: request.body,
    redirect: 'follow'
  });

  const response = await fetch(modifiedRequest);

  // Re-wrap the response so headers can be modified, and allow CORS.
  const modifiedResponse = new Response(response.body, response);

  modifiedResponse.headers.set('Access-Control-Allow-Origin', '*');

  return modifiedResponse;
}
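To check whether the proxy changes the stream, here is a minimal sketch (not part of Rubberduck) that prints each raw chunk of a streaming chat completion, so chunk boundaries through the proxied URL can be compared with a direct call to api.openai.com. The base URL is the hypothetical one from the reproduction step, and an OPENAI_API_KEY environment variable plus Node 18+ (for built-in fetch) are assumed.

// Sketch: print every raw chunk received from a streaming chat completion,
// so "data:" events that get split across chunk boundaries become visible.
// Assumes Node 18+ (built-in fetch/TextDecoder) and an OPENAI_API_KEY env var.
const BASE_URL = 'https://api.xxx.com/v1'; // hypothetical proxied base URL

async function inspectStream() {
  const response = await fetch(`${BASE_URL}/chat/completions`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`
    },
    body: JSON.stringify({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: 'hi' }],
      stream: true
    })
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log('--- chunk ---');
    console.log(decoder.decode(value, { stream: true }));
  }
}

inspectStream();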
lgrammel commented 1 year ago

Can you check the vscode dev console for errors (and the rubberduck output)?

Rubberduck uses the OpenAI streaming API - maybe that causes the issue.
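For context, the streaming API returns Server-Sent Events: each piece of the completion arrives as a "data:" line holding a JSON chunk, events are separated by a blank line, and the stream ends with "data: [DONE]". Abbreviated example (fields shortened), matching the chunks visible in the logs below:

data: {"object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

data: {"object":"chat.completion.chunk","choices":[{"delta":{"content":"Hello"},"index":0,"finish_reason":null}]}

data: [DONE]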

Jinghao-Tu commented 1 year ago

> Can you check the vscode dev console for errors (and the rubberduck output)?
>
> Rubberduck uses the OpenAI streaming API - maybe that causes the issue.

Here is the Rubberduck output.

[INFO] --- Start OpenAI prompt ---
[INFO] [{"role":"user","content":"## Instructions\nContinue the conversation below.\nPay special attention to the current developer request.\n\n## Current Request\nDeveloper: hi\n\n\n## Conversation\nDeveloper: hi\n\n## Task\nWrite a response that continues the conversation.\nStay focused on current developer request.\nConsider the possibility that there might not be a solution.\nAsk for clarification if the message does not make sense or more input is needed.\nUse the style of a documentation article.\nOmit any links.\nInclude code snippets (using Markdown) and examples where appropriate.\n\n## Response\nBot:"}]
[INFO] --- End OpenAI prompt ---
[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpCmy1eGOt3egUjnkOvVEkIL9S8n","object":"chat.completion.chunk","created":1679239548,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-6vpCmy1eGOt3egUjnkOvVEkIL9S8n","object":"chat.completion.chunk","cr
[ERROR] Something went wrong with OpenAI
[ERROR] Unknown error
[ERROR] Failed to process chunk
[ERROR] eated":1679239548,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"Hello"},"index":0,"finish_reason":null}]}

More output

[INFO] --- Start OpenAI prompt ---
[INFO] [{"role":"user","content":"## Instructions\nContinue the conversation below.\nPay special attention to the current developer request.\n\n## Current Request\nDeveloper: hi\n\n\n## Conversation\nDeveloper: hi\n\n## Task\nWrite a response that continues the conversation.\nStay focused on current developer request.\nConsider the possibility that there might not be a solution.\nAsk for clarification if the message does not make sense or more input is needed.\nUse the style of a documentation article.\nOmit any links.\nInclude code snippets (using Markdown) and examples where appropriate.\n\n## Response\nBot:"}]
[INFO] --- End OpenAI prompt ---
[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpGO5ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":1679239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-6vpGO5ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":167
[ERROR] Something went wrong with OpenAI
[ERROR] Unknown error
[ERROR] Failed to process chunk
[ERROR] 9239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"Hello"},"index":0,"finish_reason":null}]}

[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpGO5
[ERROR] Failed to process chunk
[ERROR] ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":1679239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":" I"},"index":0,"finish_reason":null}]}

[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpGO5
[ERROR] Failed to process chunk
[ERROR] ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":1679239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"?"},"index":0,"finish_reason":null}]}

[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpGO5
[ERROR] Failed to process chunk
[ERROR] ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":1679239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":" your"},"index":0,"finish_reason":null}]}

[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6vpGO5
[ERROR] Failed to process chunk
[ERROR] ndX9uA2dDfGJ0uAooBWUCqC","object":"chat.completion.chunk","created":1679239772,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"!"},"index":0,"finish_reason":null}]}

VS Code dev console output

There is nothing about Rubberduck in it.

lgrammel commented 1 year ago

It seems like something is wrong with the chunks. My guess is that they somehow get modified in the proxy.

Can you set the log level to debug and try again? This should show the full chunks for inspection.

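For illustration, here is a minimal sketch (not Rubberduck's actual implementation) of a parser that tolerates re-chunking: incoming text is buffered and a "data:" event is only parsed once its terminating blank line has arrived, so an event split across two chunks, like the "cr" / "eated" split in the log above, still parses cleanly.

// Sketch: buffer incoming chunk text and only JSON.parse complete
// "data: ..." events, tolerating events that span chunk boundaries.
function createSSEParser(onDelta) {
  let buffer = '';

  return function processChunk(chunkText) {
    buffer += chunkText;

    // Server-Sent Events are separated by a blank line ("\n\n").
    let separatorIndex;
    while ((separatorIndex = buffer.indexOf('\n\n')) !== -1) {
      const rawEvent = buffer.slice(0, separatorIndex);
      buffer = buffer.slice(separatorIndex + 2);

      for (const line of rawEvent.split('\n')) {
        if (!line.startsWith('data: ')) continue;
        const payload = line.slice('data: '.length);
        if (payload === '[DONE]') return;
        const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
        if (delta != null) onDelta(delta);
      }
    }
  };
}

// Usage: feed decoded chunks as they arrive; the second half of a split
// event is parsed only after a later chunk completes it.
const processChunk = createSSEParser(text => process.stdout.write(text));
processChunk('data: {"choices":[{"delta":{"content":"Hel"},"index":0}]}\n\ndata: {"choices":[{"del');
processChunk('ta":{"content":"lo"},"index":0}]}\n\n');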

Jinghao-Tu commented 1 year ago

Here is the debug output:

[INFO] --- Start OpenAI prompt ---
[INFO] [{"role":"user","content":"## Instructions\nContinue the conversation below.\nPay special attention to the current developer request.\n\n## Current Request\nDeveloper: hi\n\n\n## Conversation\nDeveloper: hi\n\n## Task\nWrite a response that continues the conversation.\nStay focused on current developer request.\nConsider the possibility that there might not be a solution.\nAsk for clarification if the message does not make sense or more input is needed.\nUse the style of a documentation article.\nOmit any links.\nInclude code snippets (using Markdown) and examples where appropriate.\n\n## Response\nBot:"}]
[INFO] --- End OpenAI prompt ---
[DEBUG] Fetch OpenAI API key
[DEBUG] OpenAI API key retrieved
[DEBUG] Execute POST request to OpenAI (url=https://openai-api. (MyDomain) .com/v1, max_tokens=1024, temperature=0)
[DEBUG] Streaming the response
[DEBUG] Streaming data, process chunk (chunk size=303)
[DEBUG] Process next line of chunk
[DEBUG] Process next line of chunk
[ERROR] Failed to process chunk
[ERROR] data: {"id":"chatcmpl-6w1ir7RSse76rSGzogdFpR8f96BfM","object":"chat.completion.chunk","created":1679287665,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-6w1ir7RSse76rSGzogdFpR8f96BfM","object":"chat.completion.chunk","crea
[ERROR] Something went wrong with OpenAI
[ERROR] Unknown error
[DEBUG] Streaming data, process chunk (chunk size=120)
[DEBUG] Process next line of chunk
[ERROR] Failed to process chunk
[ERROR] ted":1679287665,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"Hello"},"index":0,"finish_reason":null}]}

[DEBUG] Streaming data, process chunk (chunk size=212)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=216)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=208)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=213)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=212)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=209)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=212)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=208)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=215)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=209)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=212)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=213)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=214)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=218)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=209)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=208)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=213)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=217)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=211)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=214)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=214)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=218)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=215)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=212)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=207)
[DEBUG] Process next line of chunk
[DEBUG] Streaming data, process chunk (chunk size=210)
[DEBUG] Process next line of chunk
[DEBUG] Processed last line of chunk
[DEBUG] Stream ended but was already resolved. Do nothing.
vchauhan1 commented 10 months ago

Hello guys, is there any update on this issue?

lgrammel commented 10 months ago

@vchauhan1 Sadly, I won't have time in the foreseeable future to work on Rubberduck. Fortunately, there are now many alternative extensions with similar functionality that you could use instead, and Copilot has chat functionality as well.