Open allo- opened 2 weeks ago
Hey, thanks for the report. However, I'm sorry but I cannot replicate this.
I can do more testing next week. Anything I should start with?
What I did: I configured the embedding provider, opened the panel, and pressed the button to embed the workspace. Nothing happened except a notification that didn't go away. I then saw an error in the VS Code developer tools (Electron console), patched a console.log for the URLs into the JS file, and saw that it appears to use the Ollama port. By listening there with netcat I was able to verify that it indeed tries port 11434 instead of the configured port.
I use ooba for chat and it works fine, and I don't have a FIM model configured yet.
Thanks for the detailed response. Please could you let me know what version you are using?
twinny-3.17.20-linux-x64
Had the same issue with embedding between twinny and LM Studio.
In the end I forked the repo, commented out lines 157-160 in src/extension/provider-manager.ts,
and used a local extension build; now document ingestion and embedding requests run perfectly with LM Studio. I'm still not sure it works in chat, though...
@rjmacarthy - please note that the commented-out code overwrites the user-defined path.
The question is whether the defaults shouldn't be overridden anyway once I configure my own provider.
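A minimal sketch of what such a bug typically looks like: if built-in defaults are spread *after* the user's provider config, they win. All names here are illustrative, not twinny's actual code.

```typescript
interface EmbeddingProvider {
  apiHostname: string;
  apiPort: number;
  apiPath: string;
}

// Hypothetical built-in defaults (matching the observed request target).
const OLLAMA_DEFAULTS: EmbeddingProvider = {
  apiHostname: "0.0.0.0",
  apiPort: 11434,
  apiPath: "/api/embed",
};

// Buggy merge: defaults spread last, so they overwrite the user's values.
function resolveBuggy(user: Partial<EmbeddingProvider>): EmbeddingProvider {
  return { ...user, ...OLLAMA_DEFAULTS };
}

// Fixed merge: defaults spread first, user's values take precedence.
function resolveFixed(user: Partial<EmbeddingProvider>): EmbeddingProvider {
  return { ...OLLAMA_DEFAULTS, ...user };
}

const userConfig = {
  apiHostname: "127.0.0.1",
  apiPort: 5000,
  apiPath: "/v1/embeddings",
};

console.log(resolveBuggy(userConfig).apiPort); // 11434 (the reported symptom)
console.log(resolveFixed(userConfig).apiPort); // 5000
```

In object spreads, later properties overwrite earlier ones, so the spread order alone decides whether user config or defaults win.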
Hey, sorry about this bug. I thought I'd removed that code in a previous version. I just released version v3.17.24, which should address it.
Many thanks,
I still have the problem with v3.17.24.
Describe the bug
I configured oobabooga as the embedding provider:
- Hostname: 127.0.0.1
- Port: 5000
- Path: /v1/embeddings
- Model: all-mpnet-base-v2
When I now select the provider and click "Embed workspace documents", VS Code still requests
http://0.0.0.0:11434/api/embed
To Reproduce Try to use oobabooga as the embedding provider.
Expected behavior The configured provider, URL, port, and path should be used.
API Provider: Oobabooga
Chat or Auto Complete?: Embedding
Model Name: all-mpnet-base-v2
Additional context Chat works as expected with the oobabooga (chat) provider.