Closed tyf2018 closed 5 days ago
I tried with 2.6.11 and it works for me.
What's your
ollama list
result? How did you add this custom model to the plugin? Can you show me a screenshot?
Best regards.
Hmm, this should work 🤔
Just to be sure, can you try without the Base URL? When you pick Ollama as the provider, you can omit it unless your Ollama server is not at localhost.
After removing the Base URL, it is running normally now. Thanks.
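The fix above is consistent with how a custom Base URL can produce a "404 page not found": Ollama's chat endpoint lives at /api/chat on http://localhost:11434 by default, so a Base URL that already carries a path (or points somewhere else) makes the client request a route that does not exist. A minimal sketch of that endpoint construction, with a hypothetical helper name (`chat_endpoint` is not part of the plugin):

```python
from typing import Optional

# Ollama's default local server address (port 11434).
DEFAULT_OLLAMA_URL = "http://localhost:11434"

def chat_endpoint(base_url: Optional[str]) -> str:
    """Return the URL a client would POST chat requests to.

    Hypothetical sketch: when no Base URL is configured, fall back to
    the default local Ollama server; otherwise build on the given base.
    """
    base = (base_url or DEFAULT_OLLAMA_URL).rstrip("/")
    return base + "/api/chat"

# With no Base URL, the default local endpoint is used:
print(chat_endpoint(None))  # http://localhost:11434/api/chat

# A Base URL that already includes a path yields a nonexistent route,
# a likely source of "404 page not found":
print(chat_endpoint("http://localhost:11434/api"))  # http://localhost:11434/api/api/chat
```

This is why omitting the Base URL for a localhost Ollama server is the safe default: the client then targets the known-good endpoint instead of a user-typed one.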
I am using the qwen2.5:7b local model through Ollama, which works fine in version 2.6.0 of the Copilot plugin. However, I have been encountering errors in the subsequent plugin versions. For example, in version 2.6.11, it reports an error: "Error: Model request failed: 404 page not found."
Describe how to reproduce: Start any conversation.
Expected behavior: I hope to be able to use the qwen2.5:7b local model smoothly through Ollama in Copilot.
Screenshots