logancyang / obsidian-copilot

THE Copilot in Obsidian
GNU Affero General Public License v3.0

OpenAI Proxy base url CORS errors #360

Closed pperanich closed 4 weeks ago

pperanich commented 5 months ago

Describe the bug
When attempting to fetch data from an OpenAI proxy URL (https://xxxxx.xxx/chat/completions) within Obsidian, the request is blocked due to a CORS policy error. The console displays an error indicating that the response to the preflight request doesn't pass the access control check because no 'Access-Control-Allow-Origin' header is present on the requested resource.

To Reproduce

Expected behavior
The proxy server should handle CORS appropriately by including the necessary Access-Control-Allow-Origin header in the response, allowing requests from app://obsidian.md. This would enable the application to communicate with the OpenAI API without encountering CORS policy errors.
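To make the expectation concrete, here is a minimal sketch (not the plugin's actual code) of the header logic a CORS-aware proxy would need so that requests from Obsidian's app://obsidian.md origin pass the browser's preflight check. The allowed methods and headers listed are assumptions about a typical OpenAI-style endpoint:

```python
ALLOWED_ORIGIN = "app://obsidian.md"  # the origin Obsidian requests come from

def cors_headers(request_method: str, origin: str) -> dict:
    """Return the CORS response headers a proxy should send for this request."""
    headers = {}
    if origin == ALLOWED_ORIGIN:
        # This is the header missing in the error above.
        headers["Access-Control-Allow-Origin"] = origin
        if request_method == "OPTIONS":  # preflight request
            headers["Access-Control-Allow-Methods"] = "POST, OPTIONS"
            headers["Access-Control-Allow-Headers"] = "Authorization, Content-Type"
    return headers
```

With these headers present on the preflight response, the subsequent POST from Obsidian is no longer blocked; requests from any other origin get no CORS headers and are rejected by the browser as before.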

Proposed Fix
If an OpenAI proxy base URL is provided, I think a proxy server should be started, similar to what is done for the Claude model, but pointed at the OpenAI proxy base URL.
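A rough sketch of that idea: a small local forwarding proxy that answers the CORS preflight itself and relays POST bodies to the user-configured base URL. This is an illustration only, not the plugin's implementation; the port, base URL, and forwarded headers are placeholders:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

OPENAI_PROXY_BASE = "https://example-proxy.invalid"  # placeholder for the user's proxy base URL
LOCAL_PORT = 8317  # arbitrary local port, not a plugin default

class ForwardingHandler(BaseHTTPRequestHandler):
    def _send_cors(self):
        # The headers whose absence causes the error in this issue.
        self.send_header("Access-Control-Allow-Origin", "app://obsidian.md")
        self.send_header("Access-Control-Allow-Methods", "POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Authorization, Content-Type")

    def do_OPTIONS(self):
        # Answer the preflight locally so Obsidian's request is never blocked.
        self.send_response(204)
        self._send_cors()
        self.end_headers()

    def do_POST(self):
        # Relay the request body to the configured upstream proxy.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        upstream = Request(
            OPENAI_PROXY_BASE + self.path,
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": self.headers.get("Authorization", ""),
            },
        )
        with urlopen(upstream) as resp:
            self.send_response(resp.status)
            self._send_cors()
            self.end_headers()
            self.wfile.write(resp.read())

# To run: HTTPServer(("127.0.0.1", LOCAL_PORT), ForwardingHandler).serve_forever()
```

Obsidian would then be pointed at http://127.0.0.1:<port> instead of the remote proxy, and the browser-side CORS check is satisfied by the local server.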

Additional context
To enhance user experience and flexibility, I would suggest supporting multiple proxy configurations selectable from the models drop-down. That way, the stock OpenAI models could still be used alongside a proxy, rather than the request simply being overwritten whenever a proxy address is provided. This would cater to a wider variety of use cases and preferences, enabling a more seamless and efficient workflow within Obsidian.

kteppris commented 5 months ago

First, thanks for your amazing work!

I actually ran into the same issue, but with a different use case:

We run our own Hugging Face Text Generation Inference API with several models, some of which expose the same API as OpenAI, so we can use the openai package against our inference server. But when trying to use the proxy, I ran into the same CORS error. A pity, since this would be amazing to have not only for the chat model but also for the embeddings, where our models can likewise be used the same way as OpenAI's.
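For illustration, an OpenAI-compatible endpoint like this can be targeted simply by swapping the base URL in an otherwise standard chat-completions request. This sketch builds the request with the standard library only; the base URL and model name are placeholders, not real endpoints:

```python
import json
from urllib.request import Request

BASE_URL = "https://tgi.example.invalid/v1"  # placeholder for a self-hosted, OpenAI-compatible API

def chat_request(messages, model="local-model"):
    """Build an OpenAI-style chat completions request against a custom base URL."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return Request(
        BASE_URL + "/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request([{"role": "user", "content": "hello"}])
```

The request shape is identical to one sent to api.openai.com, which is why such servers work with the openai package; only the host changes, and that is exactly the point where the CORS problem appears inside Obsidian.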

Hope this is fixed soon :)

etlweather commented 5 months ago

I had this issue with LocalAI. The solution was to start LocalAI with `--cors`.