twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
2.3k stars, 126 forks

Not showing all models after update to latest version #216

Closed: askareija closed this 2 months ago

askareija commented 2 months ago

Describe the bug: After updating, twinny doesn't show all my models.

twinny before update: [image]

twinny after update: [image]

Expected behavior: Twinny shows all models. The previous version worked, but after I updated to the latest version it stopped working.


orkutmuratyilmaz commented 2 months ago

Same here, with Manjaro Linux.

rjmacarthy commented 2 months ago

Hello, the model selector and provider manager were updated in this release:

https://github.com/rjmacarthy/twinny/releases/tag/v3.11.0

Different providers/models can now be set here without opening the settings menu every time.

image

Hope that helps.

askareija commented 2 months ago

I'm using Ollama, and only Ollama is missing the model name field.

ollama: [image]

llamacpp/other providers: [image]

Also, I don't know the API path for Ollama. I've tried /api/chat using the llamacpp provider type, but it returns 400 and twinny loads endlessly.

[image]

[image]

[image]

I'm now using the latest version, 3.11.19.

rjmacarthy commented 2 months ago

Hello, please make sure that your Ollama settings in the extension settings are correct, i.e. hostname and port.

image

In the provider config, the model dropdown is populated from the /api/tags endpoint of the Ollama API. For other providers I am lazy and just put a text box there.

I will make it clearer in a future release. The endpoint for Ollama should be /v1/chat/completions.

Hope it helps,
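The two endpoints above can be sanity-checked by hand. A minimal sketch, assuming Ollama's default host and port (localhost:11434) and an example model name:

```python
import json

# Assumed default Ollama address; adjust host/port to your setup.
BASE = "http://localhost:11434"

# GET this URL to list installed models; twinny populates its
# provider dropdown from the "models" array in the response.
tags_url = f"{BASE}/api/tags"

# POST a request like this to the OpenAI-compatible chat endpoint
# that the extension expects for Ollama chat.
chat_url = f"{BASE}/v1/chat/completions"
chat_payload = {
    "model": "codellama:7b",  # hypothetical model tag; use one from /api/tags
    "messages": [{"role": "user", "content": "Hello"}],
}

print(tags_url)
print(chat_url)
print(json.dumps(chat_payload))
```

Sending these with curl (GET for /api/tags, POST for the chat URL) quickly shows whether the hostname and port are reachable before debugging the extension itself.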

askareija commented 2 months ago

Okay, it's working, thanks. But if I'm using a dev container, I have to change the host to host.docker.internal, and if I'm working locally, it's localhost. A little complicated. My workaround was to create two providers with the same model name but different hostnames.

[image]

[image]
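The dual-provider workaround boils down to one variable: which hostname resolves to the machine running Ollama. A small sketch of that decision, assuming the common (not foolproof) /.dockerenv check to detect a container:

```python
import os

def ollama_host() -> str:
    """Pick the Ollama hostname based on where the editor runs.

    Inside a dev container, Docker exposes the host machine as
    host.docker.internal; on a local machine, Ollama listens on
    localhost. The /.dockerenv file is one heuristic for detecting
    that we are inside a container.
    """
    if os.path.exists("/.dockerenv"):
        return "host.docker.internal"
    return "localhost"

# Default Ollama port assumed.
print(f"http://{ollama_host()}:11434")
```

twinny itself has no such auto-detection (hence the two manually configured providers), but the logic explains why each environment needs a different hostname.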

rjmacarthy commented 2 months ago

Everyone has a different setup; I just made it flexible enough for everyone, some setups being more complicated than others.

Many thanks!

askareija commented 2 months ago

Thank you. Last question: I'm using stable-code for FIM, but it always returns 400. Is there anything wrong?

[image]

[image]

askareija commented 2 months ago


Never mind, it looks like the API path for FIM is /api/generate.
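For reference, a fill-in-the-middle request against /api/generate can be sketched as below. The <fim_prefix>/<fim_suffix>/<fim_middle> tokens are the FIM prompt template documented for stable-code; other models use different tokens, and the model tag here is only an example, so treat both as assumptions:

```python
import json

# Code around the cursor: the model is asked to fill the gap.
prefix = "def add(a, b):\n    return "
suffix = "\n"

# Hedged sketch of the request body; stable-code's FIM template is
# assumed, and the exact model tag depends on what `ollama list` shows.
payload = {
    "model": "stable-code:3b",
    "prompt": f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>",
    "stream": False,
}

url = "http://localhost:11434/api/generate"
print(url)
print(json.dumps(payload))
```

POSTing this body to the /api/generate URL (rather than the chat endpoint) matches the resolution above; using the chat path for FIM is a plausible source of the 400 responses.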