Open rafael1856 opened 2 weeks ago
@rafael1856 thanks for the bug report.
With non-OpenAI APIs, the Chatgpt > Gpt3: Model settings dropdown won't work.
The problem with the settings dropdown is that a specific list of model names must be hard-coded into that setting ahead of time. The extension can't query Ollama for its available models and show them dynamically.
For example, I'm connected to Ollama, but it only shows OpenAI models because the model list had to be hard-coded into the extension.
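To illustrate why a dynamic list is at least technically possible: Ollama's REST API exposes `GET /api/tags`, which returns the locally installed models. A minimal sketch of how an extension could populate the dropdown from that endpoint (the helper names here are hypothetical, not part of this extension, and the default port assumes a standard local Ollama install):

```typescript
// Hypothetical helpers for querying a running Ollama server for its
// installed models, instead of relying on a hard-coded model list.

interface OllamaTagsResponse {
  models: { name: string }[];
}

// Pure parsing step, kept separate so it can be tested without a server.
function extractModelNames(body: OllamaTagsResponse): string[] {
  return body.models.map((m) => m.name);
}

// Network step; assumes Ollama's default endpoint http://localhost:11434.
async function listOllamaModels(
  baseUrl = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  return extractModelNames((await res.json()) as OllamaTagsResponse);
}
```

VS Code settings dropdowns are declared statically in `package.json`, though, so a list fetched at runtime like this can only feed an in-editor quick pick (like the model picker mentioned below), not the Settings UI dropdown.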
To change the model with a non-OpenAI API, you need to use the model picker in the UI. It'll show either below your question input or in the "More Actions" menu, depending on the screen size.
That said, I agree this is confusing. In a future release, I think we'll replace the dropdown with a text input for the "Gpt3: Model" setting to avoid this confusion.
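For illustration, the proposed change would amount to declaring the setting as a free-form `string` instead of an `enum` in the extension's `package.json` configuration contribution. The setting key and default below are assumptions based on how the setting is displayed in this thread, not the extension's actual manifest:

```json
{
  "chatgpt.gpt3.model": {
    "type": "string",
    "default": "gpt-3.5-turbo",
    "markdownDescription": "Model name to use. For non-OpenAI APIs (such as Ollama), enter any model name your server provides."
  }
}
```

With a plain `string` type, VS Code renders a text input in the Settings UI, so users on non-OpenAI backends can type any model name rather than being limited to a hard-coded list.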
Thank you, that works!
Glad to hear it Rafael, I'll keep this issue open to track the progress on changing the "Chatgpt > Gpt3:Model" setting from a dropdown to a text input.
Describe the Bug
In the settings, the list for the Gpt3: Model setting does not show the Ollama models.
Where are you running VSCode? (Optional)
Linux
What kind of LLM are you using? (Optional)
A local LLM (such as ollama)
Additional context (Optional)
No response