VINHTRAN-VICIDEV closed this issue 3 months ago.
It seems you built the images before configuring the file, or Perplexica is unable to read it. I recommend cloning a fresh copy of Perplexica, deleting all the current images (built by Perplexica), and building them again.
I rebuilt the container with extra logs; that doesn't seem to be the issue:
```
yarn run v1.22.19
$ node dist/app.js
info: WebSocket server started on port 3001
info: Server is running on port 3001
info: Using Ollama API endpoint: http://host.docker.internal:11434
info: Loading Ollama models from http://host.docker.internal:11434
info: Using Ollama API endpoint: http://host.docker.internal:11434
node:internal/process/promises:289
          triggerUncaughtException(err, true /* fromPromise */);
          ^
TypeError: Invalid URL
    at new URL (node:internal/url:775:36)
    at OpenAI.buildURL (/home/perplexica/node_modules/openai/core.js:318:15)
    at OpenAI.buildRequest (/home/perplexica/node_modules/openai/core.js:199:26)
    at OpenAI.makeRequest (/home/perplexica/node_modules/openai/core.js:274:44)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /home/perplexica/node_modules/@langchain/openai/dist/chat_models.cjs:684:29
    at async RetryOperation._fn (/home/perplexica/node_modules/p-retry/index.js:50:12) {
  code: 'ERR_INVALID_URL',
  input: 'null/chat/completions'
}
Node.js v20.8.1
error Command failed with exit code 1.
```
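The `input: 'null/chat/completions'` line is the telltale detail: a base URL that was never set (so it is `null`) gets string-interpolated into the literal text `"null"` before `new URL` is called. A minimal sketch of that failure mode, assuming the OpenAI client concatenates its configured base URL with the request path (the `buildURL` name here is illustrative, not the library's actual internals):

```javascript
// Minimal reproduction of the crash: a null base URL becomes the literal
// string "null" under template interpolation, producing an invalid URL.
function buildURL(baseURL, path) {
  return new URL(`${baseURL}${path}`);
}

try {
  buildURL(null, "/chat/completions");
} catch (err) {
  // Node tags the failure with code 'ERR_INVALID_URL' and records the
  // offending input string, matching the stack trace above.
  console.error(err.code, err.input);
}
```

This is why the trace points at `openai/core.js` even though the real problem is an unset configuration value upstream.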
I found that the issue is actually in the frontend: if you click on "Settings", you will see it defaults to "CustomOpenAI" with an "undefined" model. Once you fix the settings with the correct values, everything works correctly.
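That "CustomOpenAI / undefined" combination is consistent with frontend settings that were never saved. A hypothetical sketch (all property names here are invented for illustration, not Perplexica's actual code) of how an empty settings object can surface exactly those values:

```javascript
// Hypothetical sketch of unset settings propagating as bad values:
// an empty saved-settings object yields the "custom_openai" fallback,
// the string "undefined" as the model name, and a null base URL.
function resolveChatSettings(saved) {
  return {
    provider: saved.chatModelProvider ?? "custom_openai", // default when unset
    model: String(saved.chatModel),                       // "undefined" when unset
    baseURL: saved.customOpenAIBaseURL ?? null,           // null → "null/..." request URLs
  };
}
```

Under this reading, filling in the Settings dialog replaces all three bad values at once, which matches the reported fix.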
Hi @VINHTRAN-VICIDEV, is the issue resolved so I can close it, or do you still need support?
Okay, thanks a lot @ItzCrazyKns!
**Describe the bug**
I got an error when running Perplexica using Docker.
This is my config.toml file
and I ran Ollama successfully.
Did I miss any steps?
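For reference, the Ollama endpoint the backend logs ("Using Ollama API endpoint: ...") is typically set in `config.toml`. This is only a sketch of the relevant section, assuming the layout of the project's `sample.config.toml`; the exact keys may differ between Perplexica versions:

```toml
[API_ENDPOINTS]
# Endpoint echoed by the backend at startup
OLLAMA = "http://host.docker.internal:11434"
```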