j0rd1smit / obsidian-copilot-auto-completion

MIT License
94 stars 10 forks

Local ollama/mistral fails due to cors #39

Open brimwats opened 2 months ago

brimwats commented 2 months ago

I got the server set up in the settings and get a success indicator when testing the connection.

But when I try to use Copilot I get no reply, and this appears in the errors:

Access to fetch at 'http://127.0.0.1:11434/api/chat' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

j0rd1smit commented 2 months ago

> Access to fetch at 'http://127.0.0.1:11434/api/chat' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

Where do you see the above error? In the console? It is not entirely clear from your message to me. Can you please include that screenshot as well?

For me it works with this config: API URL: `http://localhost:11434/api/chat`, model: `mistral` or `llama3`.

It works fine (although a bit slow).
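As a side note, the endpoint can be sanity-checked outside Obsidian with a plain `curl` request. This is a sketch assuming Ollama is running on the default port 11434 and that the `mistral` model has already been pulled:

```shell
# Hit Ollama's chat endpoint directly, bypassing Obsidian (and CORS entirely,
# since curl sends no Origin header). "stream": false returns one JSON object.
curl http://localhost:11434/api/chat \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": false
  }'
```

If this works but the plugin does not, the problem is almost certainly CORS rather than the model or server.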

Optionally, you can also enable debug mode in the settings to view the message and responses.

Additionally, I'm also using the Ollama desktop app for macOS. That could also be a difference.

Furthermore, I also found this GitHub issue on Ollama, which might be of interest. https://github.com/ollama/ollama/issues/3827
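For reference, the discussion in that issue comes down to Ollama's `OLLAMA_ORIGINS` environment variable, which controls which origins pass its CORS check. A hedged sketch of setting it on macOS/Linux; the origin string is taken from the error message above, and the `launchctl` line is an assumption about how the macOS desktop app picks up environment variables:

```shell
# Linux / macOS shell session: allow Obsidian's app:// origin, then start Ollama.
export OLLAMA_ORIGINS="app://obsidian.md*"
ollama serve

# macOS desktop app (assumption): set the variable system-wide and restart the app.
launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*"
```

Using `"*"` instead would allow all origins, which is simpler but less safe.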

brimwats commented 2 months ago

> Where do you see the above error? In the console? It is not entirely clear from your message to me. Can you please include that screenshot as well?

It was in the console, yes. Debugging was not enabled, so it was just in the standard error messages.

llama3 and mistral are both nearly-instant on my desktop/in the shell.

I can try debug mode and the suggestions in the Ollama issue when I'm on my desktop again later this week, but it looks like the Windows terminal command might be `SETX /M OLLAMA_ORIGINS "app://obsidian.md*"`.

brimwats commented 1 month ago

Sorry for the delay! It seems I'm making some progress. I can confirm that `SETX /M OLLAMA_ORIGINS "app://obsidian.md*"` helps with the issue.
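For anyone landing here later, a sketch of the Windows fix plus one way to check it took effect. The preflight probe is an assumption about how Ollama answers CORS `OPTIONS` requests for an allowed origin:

```shell
# Windows (elevated prompt): set the variable machine-wide, then restart Ollama
# so the new environment is picked up.
SETX /M OLLAMA_ORIGINS "app://obsidian.md*"

# Any platform: simulate the browser's preflight request. If the origin is
# allowed, the response headers should include Access-Control-Allow-Origin.
curl -i -X OPTIONS http://127.0.0.1:11434/api/chat \
  -H "Origin: app://obsidian.md" \
  -H "Access-Control-Request-Method: POST"
```

If the `Access-Control-Allow-Origin` header is missing from the reply, the variable likely hasn't reached the Ollama process yet (e.g. the service was not restarted).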

I tried phi3 this time and get this error:

[screenshot]

Here's what I get in the dev console when I use ;; (completion command) on this note:

[screenshot]

Any thoughts?

brimwats commented 1 month ago

Llama3 seems to work, at least! Very slowly, tragically, but it's progress!

[screenshot]