Closed struanb closed 8 months ago
Unfortunately I'm not sure there's anything to be done about this — I don't know of a way to allow mixed content on an HTTPS website. This issue has come up before, and the workaround is to either use the desktop app or run prompta yourself locally.
Since you're already running the LLMs locally, maybe running prompta locally as well is a workable option. That way it isn't served over HTTPS, so the mixed-content restriction doesn't apply. The desktop app should work with localhost out of the box.
If you're aware of any way to allow mixed content on the HTTPS site, please let me know. Closing for now, but feel free to reopen if a solution turns up.
I'm running a local LLM at http://127.0.0.1:1234/v1/, using LM Studio (I don't believe the details of the LLM I'm running matter).
After entering the API URL in Prompta settings, I get the "Error in stream. Failed to fetch" snackbar message, and in the dev console this error:
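As a first diagnostic step, it can help to confirm the LM Studio server is actually reachable outside the browser — `curl` isn't subject to the mixed-content restriction, so if this works while the HTTPS-served page fails, that points at mixed-content blocking rather than the server. This is a sketch assuming LM Studio's default port of 1234 and its OpenAI-compatible `/v1/models` endpoint; adjust to your setup.

```shell
# Probe the local LM Studio OpenAI-compatible server.
# If the server is up, this prints a JSON list of loaded models;
# otherwise it falls through to the echo.
curl -s http://127.0.0.1:1234/v1/models || echo "server not reachable"
```

If `curl` succeeds but the browser still shows "Failed to fetch", the browser is almost certainly blocking the plain-HTTP request from the HTTPS page, which is what the workarounds above (desktop app or a local prompta instance) avoid.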