Closed: Madvulcan closed this issue 4 months ago
Hi, I changed how the providers were structured and forgot to add this provider. It should now be fixed by https://github.com/ItzCrazyKns/Perplexica/commit/3b4b8a8b0227cee590014f99e10aefd9e1176791; you just need to reinstall and the patch will apply:
docker compose up -d --build
I updated and rebuilt, but there is still no field for the OpenAI endpoint URL. Now 'Custom OpenAI' is the only option in the provider dropdown, and Ollama is missing.
I am trying to use LM Studio as a server for Perplexica.
I looked at closed issues here, like this one, and saw that the instructions are to specify a custom OpenAI endpoint in the settings. However, I do not see that option in the settings. This is on a fresh install.
As you can see from the screenshot, the option for a custom OpenAI endpoint seems to be missing.
Was this option removed recently? I just installed Perplexica about an hour or so ago.
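For reference, this is the kind of endpoint I mean: LM Studio runs an OpenAI-compatible local server, so any OpenAI-style client can talk to it if you can override the base URL. A minimal sketch, assuming LM Studio's default address of http://localhost:1234/v1 and a placeholder model name (whatever model is loaded in LM Studio):

```typescript
import OpenAI from "openai";

// Point an OpenAI-compatible client at LM Studio's local server.
// http://localhost:1234/v1 is LM Studio's default; adjust if you changed the port.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio", // LM Studio does not validate the key; any placeholder works
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "local-model", // placeholder: use the model identifier shown in LM Studio
    messages: [{ role: "user", content: "Hello from Perplexica" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

That base URL is exactly the value I was expecting to be able to enter somewhere in Perplexica's settings.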