twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Twinny tries ollama URL for oobabooga embeddings #336

Open allo- opened 2 weeks ago

allo- commented 2 weeks ago

Describe the bug
I configured oobabooga as the embedding provider.

When I then select the provider and click "Embed workspace documents", VS Code still requests http://0.0.0.0:11434/api/embed
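
To illustrate the mismatch (hosts, ports, and payload fields below are typical defaults, not my exact configuration): twinny keeps sending the Ollama-style request, while the configured oobabooga provider would expect its OpenAI-compatible /v1/embeddings route.

```typescript
// What twinny actually requests: Ollama's embedding endpoint.
await fetch("http://0.0.0.0:11434/api/embed", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model: "all-mpnet-base-v2", input: "example text" }),
});

// What the configured provider should receive instead: oobabooga's
// OpenAI-compatible embeddings route (5000 is oobabooga's usual default
// API port; the real host/port/path come from the provider settings).
await fetch("http://127.0.0.1:5000/v1/embeddings", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ input: "example text" }),
});
```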

To Reproduce
Try to use oobabooga as the embedding provider.

Expected behavior
The configured provider, URL, port, and path should be used.

API Provider: Oobabooga

Chat or Auto Complete? Embedding

Model Name: all-mpnet-base-v2

Additional context
Chat works as expected with the oobabooga (chat) provider.

rjmacarthy commented 1 week ago

Hey, thanks for the report. I'm sorry, but I cannot replicate this.

allo- commented 1 week ago

I can test more next week. Anything I should start with?

What I did: I configured the embedding provider, then opened the panel and pressed the button for embedding the workspace. It didn't do anything and only showed a notification that didn't go away. Then I saw an error in the VS Code developer tools (Electron console), patched a console.log for the URLs into the js file, and saw that it seems to use the ollama port. By listening there with netcat I was able to verify that it indeed tries port 11434 instead of the configured port.
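
The patch was roughly the following: wrap the global fetch and log every outgoing URL before the request is sent. This is a sketch of the idea rather than my exact lines; the [twinny-debug] tag is just a marker I picked.

```typescript
// Wrap global fetch so every outgoing request URL is printed to the
// Electron console before the request goes out.
const originalFetch = globalThis.fetch;
globalThis.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
  const url =
    typeof input === "string"
      ? input
      : input instanceof URL
        ? input.toString()
        : input.url;
  console.log(`[twinny-debug] fetch -> ${url}`);
  return originalFetch(input, init);
};
```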

I use ooba for chat and it works fine; I do not have a FIM model configured yet.

rjmacarthy commented 1 week ago

Thanks for the detailed response. Please could you let me know what version you are using?

allo- commented 1 week ago

twinny-3.17.20-linux-x64

vkx86 commented 1 week ago

Had the same issue embedding with twinny <-> LM Studio. In the end I forked the repo, commented out lines 157-160 in src/extension/provider-manager.ts, and now use a local extension build - document ingestion and embedding requests run perfectly with LM Studio. I'm still not sure it works in chat, though... @rjmacarthy - please take note that the commented-out code overwrites the user-defined path.
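
I don't have the exact lines in front of me anymore, but from the behaviour it looked like an unconditional reset to the Ollama defaults. Everything in this sketch is a hypothetical reconstruction (names, shape, and values are assumptions, not quoted from provider-manager.ts):

```typescript
// Hypothetical reconstruction of the buggy pattern: the stored provider
// is unconditionally reset to the Ollama defaults, clobbering user config.
interface TwinnyProvider {
  apiHostname: string;
  apiPort: number;
  apiPath: string;
}

function applyDefaultsBuggy(stored: TwinnyProvider): TwinnyProvider {
  stored.apiHostname = "0.0.0.0";
  stored.apiPort = 11434;
  stored.apiPath = "/api/embed";
  return stored;
}

// A guarded alternative: the Ollama values act only as fallbacks for
// fields the user never set, so a user-defined provider stays intact.
function applyDefaultsGuarded(stored: Partial<TwinnyProvider>): TwinnyProvider {
  return {
    apiHostname: stored.apiHostname ?? "0.0.0.0",
    apiPort: stored.apiPort ?? 11434,
    apiPath: stored.apiPath ?? "/api/embed",
  };
}
```

With the guarded version, defaults only fill in missing fields - which is also the point raised in the next comment.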

allo- commented 1 week ago

The question is whether the default settings shouldn't be overridden anyway when I configure my own provider.

rjmacarthy commented 6 days ago

Hey, sorry about this bug. I thought I'd removed that code in a previous version. I've just released v3.17.24, which should address it.

Many thanks,

allo- commented 6 days ago

I still have the problem with v3.17.24.