longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, OpenAI, Mistral AI, and more for Obsidian.

Ollama API has been blocked by CORS policy macOS #73

Closed praveenprem closed 1 month ago

praveenprem commented 1 month ago

The Obsidian plugin is unable to make REST API calls to the Ollama API because the requests are blocked by Ollama's CORS policy.

Access to fetch at 'http://localhost:11434/api/chat' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
VM196 plugin:bmo-chatbot:2510 TypeError: Failed to fetch
    at fetchOllamaResponseStream (plugin:bmo-chatbot:2421:28)
    at async BMOView.BMOchatbot (plugin:bmo-chatbot:4414:11)
fetchOllamaResponseStream @ VM196 plugin:bmo-chatbot:2510

OS: macOS Sonoma
Obsidian: v1.5.12
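
For reference, the failing preflight can be reproduced from the terminal; this is a quick check (assuming Ollama is listening on the default port 11434) that distinguishes a CORS rejection from a connectivity problem:

curl -i -X OPTIONS http://localhost:11434/api/chat \
  -H "Origin: app://obsidian.md" \
  -H "Access-Control-Request-Method: POST"

If the response carries no Access-Control-Allow-Origin header for that origin, the browser-side fetch in the plugin is blocked exactly as in the log above.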

twalderman commented 1 month ago

To support CORS, run the following command from the terminal and point BMO to the alternate port. This will permit both CORS and streaming. Use whatever port you like other than 11434.

OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
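
If you run the Ollama macOS app rather than launching ollama serve from a shell, one possible way to make the same settings persistent (this assumes the app picks up user-session environment variables set with launchctl, as the Ollama docs describe; it has not been verified in this thread) is:

launchctl setenv OLLAMA_ORIGINS "app://obsidian.md"
launchctl setenv OLLAMA_HOST "127.0.0.1:11435"

Then restart the Ollama app so it re-reads the environment. Using the exact Obsidian origin instead of * keeps the server from accepting requests from arbitrary origins.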

praveenprem commented 1 month ago

Thanks @twalderman, that worked. I didn't realise the allowed origins could be set like that.