twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Unable to select model when using Ollama #226

Closed: dchansen closed this issue 2 months ago

dchansen commented 2 months ago

**Describe the bug**
When modifying or adding a provider, the "model" field disappears when "ollama" is selected. It is available for all other providers, such as lmstudio or llamacpp.

**To Reproduce**
Steps to reproduce the behavior:

  1. Click "Manage twinny providers"
  2. Go to 'Add Provider'
  3. Set 'Provider' to 'ollamawebui'.
  4. Watch model field appear.
  5. Set 'Provider' to 'ollama'.
  6. Watch model field disappear.

**Expected behavior**
The model field should be available to allow the use of models other than codellama7b.

**Screenshots**
[screenshots: the ollamawebui and ollama provider dialogs]

rjmacarthy commented 2 months ago

Hello, are you running the Ollama API? Does the /api/tags endpoint return models?
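
For anyone hitting this later, here is a minimal sketch of how to check that yourself, assuming Ollama is running on its default port 11434 (the `listOllamaModels` helper is just a name for this snippet, not part of twinny):

```typescript
// Check that the Ollama API is reachable and actually lists models.
// Assumes the default Ollama host and port (http://localhost:11434).
async function listOllamaModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Ollama /api/tags returned HTTP ${res.status}`);
  }
  // /api/tags responds with { models: [{ name: "codellama:7b", ... }, ...] }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

listOllamaModels()
  .then((names) => console.log("Models:", names))
  .catch((err) => console.error("Could not fetch models:", err));
```

If this prints an empty list or fails to connect, the extension has no model names to populate the dropdown with.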

dchansen commented 2 months ago

Reinstalled the plugin, and now it correctly lists the models. Unsure what went wrong. Thanks for the quick reply and for making a great plugin.

rjmacarthy commented 2 months ago

Thanks. I just applied a fix that falls back to a plain text box for the model name if the models cannot be fetched from the API correctly. Many thanks.
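
In case it helps anyone reading along, the fallback behaves roughly like the sketch below. The types and helper names are illustrative only, not twinny's actual code:

```typescript
// Illustrative sketch: try to fetch model names from the provider's API,
// and fall back to a free-text model-name field if that fails, instead of
// hiding the field entirely.
type ModelField =
  | { kind: "select"; options: string[] }
  | { kind: "text"; placeholder: string };

async function resolveModelField(apiUrl: string): Promise<ModelField> {
  try {
    const res = await fetch(`${apiUrl}/api/tags`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = (await res.json()) as { models?: { name: string }[] };
    const options = (data.models ?? []).map((m) => m.name);
    if (options.length === 0) throw new Error("no models returned");
    return { kind: "select", options };
  } catch {
    // Models could not be fetched: render a text box so the user can
    // still type a model name by hand.
    return { kind: "text", placeholder: "Enter a model name, e.g. codellama:7b" };
  }
}
```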