logancyang / obsidian-copilot

THE Copilot in Obsidian
GNU Affero General Public License v3.0

Using the qwen2.5:7b local model through Ollama: "Error: Model request failed: 404 page not found" #700

Closed tyf2018 closed 5 days ago

tyf2018 commented 1 week ago

I am using the qwen2.5:7b local model through Ollama, which works fine in version 2.6.0 of the Copilot plugin. However, I have been encountering errors in subsequent plugin versions. For example, version 2.6.11 reports: "Error: Model request failed: 404 page not found." (screenshot attached)

How to reproduce: start any conversation.

Expected behavior: being able to use the qwen2.5:7b local model smoothly through Ollama in Copilot.

Screenshots (attached)

logancyang commented 1 week ago

I tried with 2.6.11 and it works for me.

(screenshot attached)
  1. What's your `ollama list` result?
  2. How did you add this custom model to the plugin? Can you show me a screenshot?
tyf2018 commented 1 week ago
  1. What's your `ollama list` result? (screenshot attached)

  2. How did you add this custom model to the plugin? Can you show me a screenshot? (screenshot attached)

Best regards.

logancyang commented 1 week ago

Hmm, this should work 🤔

Just to be sure, can you try without the Base URL? When you pick Ollama as the provider, you can omit it unless your Ollama server is not at localhost.
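The likely failure mode here can be sketched as follows: if the custom Base URL carries an extra path segment, the plugin ends up requesting a route the Ollama server does not expose, and Ollama replies with its plain-text "404 page not found". The endpoint path and the joining logic below are illustrative assumptions, not the plugin's actual code:

```python
# Sketch: how a misconfigured Base URL can produce "404 page not found".
# Assumption for illustration: the plugin appends Ollama's chat route
# (/api/chat) to whatever Base URL is configured.

def chat_endpoint(base_url: str = "http://localhost:11434") -> str:
    """Build the chat URL that would be requested for a given Base URL."""
    return base_url.rstrip("/") + "/api/chat"

# Default (no Base URL set): hits the real Ollama route.
print(chat_endpoint())  # http://localhost:11434/api/chat

# A Base URL with an extra path segment yields a route Ollama does not
# serve, so the server answers "404 page not found".
print(chat_endpoint("http://localhost:11434/v1"))  # http://localhost:11434/v1/api/chat
```

This is why leaving the Base URL blank (falling back to the default localhost address) is the safest setting when Ollama runs locally.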

tyf2018 commented 1 week ago

After removing the Base URL, it is running normally now. Thanks.