ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
MIT License

Invalid URL #96

Closed: VINHTRAN-VICIDEV closed this issue 3 months ago

VINHTRAN-VICIDEV commented 4 months ago

Describe the bug
I got an error when running Perplexica with Docker.

[Screenshot: error output]

This is my config.toml file

[Screenshot: config.toml contents]
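
For reference, a config.toml wired for Ollama over Docker usually looks roughly like the sketch below; the exact keys depend on the Perplexica version, and the SEARXNG value here is illustrative:

```toml
[GENERAL]
PORT = 3001
SIMILARITY_MEASURE = "cosine"

[API_KEYS]
OPENAI = ""
GROQ = ""

[API_ENDPOINTS]
# Ollama runs on the host, so the container reaches it via host.docker.internal
OLLAMA = "http://host.docker.internal:11434"
SEARXNG = "http://searxng:8080"
```

The important line for this setup is OLLAMA pointing at http://host.docker.internal:11434, which matches the endpoint shown in the logs further down.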

and I ran Ollama successfully

[Screenshot: Ollama running]

Did I miss any steps?

ItzCrazyKns commented 4 months ago

It seems like you built the images before configuring the file, or Perplexica is not able to read it. I would recommend cloning a fresh copy of Perplexica, deleting all the current images (built by Perplexica), and building them again.
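
Assuming the repo's docker-compose.yaml and Docker Compose v2, a clean rebuild would look roughly like this:

```sh
# stop the stack and remove the locally built images (those without a custom tag)
docker compose down --rmi local

# rebuild the images from scratch and start again
docker compose up -d --build
```

Older setups using the standalone docker-compose binary would run docker-compose instead of docker compose.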

SimoMay commented 4 months ago

I rebuilt the container with extra logging; that doesn't seem to be the issue:

yarn run v1.22.19
$ node dist/app.js
info: WebSocket server started on port 3001
info: Server is running on port 3001
info: Using Ollama API endpoint: http://host.docker.internal:11434
info: Loading Ollama models from http://host.docker.internal:11434
info: Using Ollama API endpoint: http://host.docker.internal:11434
node:internal/process/promises:289
            triggerUncaughtException(err, true /* fromPromise */);
            ^

TypeError: Invalid URL
    at new URL (node:internal/url:775:36)
    at OpenAI.buildURL (/home/perplexica/node_modules/openai/core.js:318:15)
    at OpenAI.buildRequest (/home/perplexica/node_modules/openai/core.js:199:26)
    at OpenAI.makeRequest (/home/perplexica/node_modules/openai/core.js:274:44)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /home/perplexica/node_modules/@langchain/openai/dist/chat_models.cjs:684:29
    at async RetryOperation._fn (/home/perplexica/node_modules/p-retry/index.js:50:12) {
  code: 'ERR_INVALID_URL',
  input: 'null/chat/completions'
}

Node.js v20.8.1
error Command failed with exit code 1.

SimoMay commented 4 months ago

I found that the issue is actually in the frontend: if you click on "Settings" you will see it defaults to "CustomOpenAi" with an "undefined" model. Once you fix the settings with the correct values, everything works correctly.
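
That explanation matches the stack trace above: with the custom OpenAI endpoint unset, the request URL is built from a null base, producing the invalid string 'null/chat/completions'. A minimal TypeScript sketch of the failure mode (illustrative only, not Perplexica's actual code):

```ts
// When the configured base URL is missing, string interpolation
// turns null into the literal text "null".
const baseURL: string | null = null; // e.g. an unset CustomOpenAi endpoint

const endpoint = `${baseURL}/chat/completions`; // "null/chat/completions"

// Not a valid absolute URL, so Node throws
// TypeError [ERR_INVALID_URL]: Invalid URL, the same error as in the log.
new URL(endpoint);
```

Selecting a real provider and model in Settings gives the backend a non-null base URL, which is why fixing the settings resolves it.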

ItzCrazyKns commented 3 months ago

Hi @VINHTRAN-VICIDEV, is the issue resolved so I can close it, or do you still need support?

VINHTRAN-VICIDEV commented 3 months ago

Okay, thanks a lot @ItzCrazyKns.