Describe the need of your request
Currently, when using the Ollama provider, users are limited to a single model for both code-completion and chat functionalities. This feature request proposes the addition of an option to select different models for these two distinct tasks. By allowing users to choose specialized models for each function, we can enhance the performance and accuracy of both code-completion and conversational interactions for local models.
Proposed solution
Introduce a settings panel where users can select their preferred models for code-completion and chat independently. It seems that there are some options available in the screenshot above; however, I am unable to pick a specific model for code completion while using a different one for chat. I can only switch the model for both at once.
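As a purely illustrative sketch of what such a setting could look like (the key names below are hypothetical, not an existing configuration schema), the provider config could accept two separate model fields:

```json
{
  "provider": "ollama",
  "chatModel": "llama3:8b",
  "completionModel": "codellama:7b-code"
}
```

With something like this, a smaller, faster model could serve inline completions while a larger model handles chat, which is the main motivation behind this request.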
Additional context
No response