twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Type error fetch failed t.streamResponse #221

Closed — FrozzDay closed this 5 months ago

FrozzDay commented 5 months ago

Describe the bug
I got this error when trying an OpenAI-compatible API with twinny:

Fetch error: TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async t.streamResponse (/home/user/.vscode-oss/extensions/rjmacarthy.twinny-3.11.20-universal/out/index.js:2:135485)

Additional context
Using other providers still results in the same error (I don't use a LiteLLM proxy server).

FrozzDay commented 5 months ago

Turns out I really need to use a LiteLLM proxy to convert an OpenAI-compatible API to an OpenAI-compatible API.

rjmacarthy commented 5 months ago

Hey, could you please provide more information about the request you are attempting? Perhaps the API you are calling has CORS rules in place, and using LiteLLM as a proxy is working around them?

FrozzDay commented 5 months ago

No, it has no CORS restrictions. I can use it directly with other tools (like the sgpt CLI).

FrozzDay commented 5 months ago

My bad, guys. I realized I should not put the protocol in the "Hostname" field — it should be the domain or IP only.
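
For anyone hitting the same error: a minimal sketch of why this fails, assuming the extension assembles the request URL from the separate protocol, hostname, port, and path settings (the `buildUrl` helper below is hypothetical, not twinny's actual code). If the Hostname field already contains a protocol, the concatenated URL is malformed, and Node's undici `fetch` rejects it with `TypeError: fetch failed`.

```javascript
// Hypothetical sketch of how a provider URL might be assembled from
// separate settings fields (assumption — not twinny's actual source).
const buildUrl = (protocol, hostname, port, path) =>
  `${protocol}://${hostname}:${port}${path}`;

// Correct: Hostname is a bare domain or IP.
const ok = buildUrl("https", "api.example.com", 443, "/v1/chat/completions");
console.log(ok); // → https://api.example.com:443/v1/chat/completions

// Wrong: putting the protocol in the Hostname field duplicates the scheme,
// producing an invalid URL that undici's fetch cannot connect to.
const bad = buildUrl("https", "https://api.example.com", 443, "/v1/chat/completions");
console.log(bad); // → https://https://api.example.com:443/v1/chat/completions
```

So the fix is simply to set Hostname to `api.example.com` (or an IP address) and let the protocol and port fields supply the rest.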