twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Cannot read long model names when configuring provider #249

Closed. fbl100 closed this issue 1 month ago

fbl100 commented 1 month ago

Describe the bug: I'm trying to use both codeqwen:7b-code-v1.5-fp16 and codeqwen:7b-chat-v1.5-fp16. When I go to configure my providers, the dropdown shows codeqwen:7...-v1.5-fp16 listed twice. When I try to widen the panel, the text hidden behind the ... is not revealed.

To Reproduce: Attempt to load both of these models and view them in the provider combo box.

Expected behavior: At a bare minimum, I would want a tooltip that shows the full model name when I hover over the dropdown. Ideally, the text area of the dropdown would scale with the panel, so making the panel wider reveals more of the name.
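For illustration only (this is not twinny's actual component), here is a minimal React/TypeScript sketch of the requested behaviour using a plain `<select>`: a `title` attribute on the control and on each option gives a native tooltip with the full, untruncated model name, and `width: 100%` lets the control grow as the panel is widened. The names `ModelSelect` and `ModelSelectProps` are hypothetical.

```tsx
import React from "react"

// Hypothetical sketch only, not twinny's actual component.
interface ModelSelectProps {
  models: string[]                      // e.g. ["codeqwen:7b-code-v1.5-fp16", ...]
  selected: string
  onChange: (model: string) => void
}

export const ModelSelect: React.FC<ModelSelectProps> = ({ models, selected, onChange }) => (
  // `title` gives a native tooltip with the full (untruncated) name,
  // and width: 100% lets the control grow as the panel is made wider.
  <select
    title={selected}
    style={{ width: "100%" }}
    value={selected}
    onChange={(e) => onChange(e.target.value)}
  >
    {models.map((name) => (
      // Each option also carries a `title`, so hovering an entry in the
      // expanded list shows the full model name even when the text is cut off.
      <option key={name} value={name} title={name}>
        {name}
      </option>
    ))}
  </select>
)
```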

Screenshots: (screenshot of the provider dropdown attached)

Note: I can't get a screenshot with the dropdown list expanded, but the same truncated text appears twice in the list.

This is the output of `ollama list` (screenshot attached).

Logging N/A

API Provider Ollama

Chat or Auto Complete? Both; the truncation makes it impossible to tell which model is configured for Chat and which for Auto-Complete.

Model Name codeqwen:7b-code-v1.5-fp16 codeqwen:7b-chat-v1.5-fp16


Additional context N/A

rjmacarthy commented 1 month ago

Thanks for the report. I've made a small improvement for this.
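The change itself isn't shown in this thread, so purely as an illustration of the kind of improvement that helps here (a hypothetical sketch, not necessarily what was committed): if a long name must be shortened, dropping the middle rather than the end keeps the part that distinguishes otherwise-similar models, e.g. "-code-" vs "-chat-". The helper name below is made up.

```typescript
// Hypothetical helper, not the actual fix: shorten long model names by
// dropping the middle, so the distinguishing suffix stays visible.
function truncateModelName(name: string, maxLength: number): string {
  if (name.length <= maxLength) return name
  const head = Math.floor((maxLength - 1) / 3)      // keep a short prefix
  const tail = maxLength - 1 - head                 // keep a longer suffix
  return `${name.slice(0, head)}…${name.slice(name.length - tail)}`
}

// With the two models from this issue and maxLength = 20, the results stay
// distinguishable:
//   truncateModelName("codeqwen:7b-code-v1.5-fp16", 20) // "codeqw…ode-v1.5-fp16"
//   truncateModelName("codeqwen:7b-chat-v1.5-fp16", 20) // "codeqw…hat-v1.5-fp16"
```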