I think you didn't fill in the environment file properly. Make sure the .env file is filled out correctly and contains the SearXNG URL. Other users are able to set up the Ollama branch with Docker; you just need to follow the installation instructions (which are the same) and fill in a few extra fields in the environment file.
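For reference, a minimal sketch of what such a .env might look like. OLLAMA_URL is the value that appears later in this thread; the SearXNG key name and port are assumptions, so check the repository's sample env for the exact field names:

```env
# Sketch only — key names other than OLLAMA_URL are assumed, not taken from the repo
SEARXNG_API_URL=http://localhost:32768        # assumed key: URL of your SearXNG instance
OLLAMA_URL=http://host.docker.internal:11434  # extra field needed for the Ollama branch
```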
Hello, could you please respond to this issue within the next 24 hours? If I don't hear back from you by then, I'll assume that everything is resolved and will mark it as completed and closed.
You mentioned that users are able to run the Ollama branch using Docker. But when I try, I get this error and the backend crashes:
perplexica-perplexica-backend-1 | node:internal/process/promises:289
perplexica-perplexica-backend-1 | triggerUncaughtException(err, true /* fromPromise */);
perplexica-perplexica-backend-1 | ^
perplexica-perplexica-backend-1 |
perplexica-perplexica-backend-1 | TypeError: fetch failed
perplexica-perplexica-backend-1 | at node:internal/deps/undici/undici:12500:13
perplexica-perplexica-backend-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
perplexica-perplexica-backend-1 | at async createOllamaStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
perplexica-perplexica-backend-1 | at async createOllamaGenerateStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:57:5)
perplexica-perplexica-backend-1 | at async Ollama._streamResponseChunks (/home/perplexica/node_modules/@langchain/community/dist/llms/ollama.cjs:346:26)
perplexica-perplexica-backend-1 | at async Ollama._streamIterator (/home/perplexica/node_modules/@langchain/core/dist/language_models/llms.cjs:65:34)
perplexica-perplexica-backend-1 | at async Ollama.transform (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:369:9)
perplexica-perplexica-backend-1 | at async wrapInputForTracing (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:246:30)
perplexica-perplexica-backend-1 | at async pipeGeneratorWithSetup (/home/perplexica/node_modules/@langchain/core/dist/utils/stream.cjs:230:19)
perplexica-perplexica-backend-1 | at async StringOutputParser._transformStreamWithConfig (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:267:26) {
perplexica-perplexica-backend-1 | [cause]: Error: connect ECONNREFUSED 127.0.0.1:11434
perplexica-perplexica-backend-1 | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1605:16) {
perplexica-perplexica-backend-1 | errno: -111,
perplexica-perplexica-backend-1 | code: 'ECONNREFUSED',
perplexica-perplexica-backend-1 | syscall: 'connect',
perplexica-perplexica-backend-1 | address: '127.0.0.1',
perplexica-perplexica-backend-1 | port: 11434
perplexica-perplexica-backend-1 | }
perplexica-perplexica-backend-1 | }
I think the Docker container is unable to connect to the Ollama instance running on my main machine: inside a container, 127.0.0.1 refers to the container itself, not the host, so the backend's connection to 127.0.0.1:11434 is refused.
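A quick way to confirm this is a small connectivity probe run from inside the backend container. This is a sketch, not part of Perplexica; the filename and the fallback URL are assumptions. It relies on Node 18+'s global fetch, and /api/tags is Ollama's model-list endpoint:

```js
// probe.mjs — hypothetical helper script; run inside the container with `node probe.mjs`
const url = process.env.OLLAMA_URL ?? "http://host.docker.internal:11434";
try {
  // /api/tags lists locally available models; any HTTP response means Ollama is reachable
  const res = await fetch(`${url}/api/tags`);
  console.log("Ollama reachable, HTTP", res.status);
} catch (err) {
  // an ECONNREFUSED cause here reproduces the backend crash above in isolation
  console.error("Cannot reach Ollama:", err.cause ?? err);
}
```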
Well! Seems like setting the Ollama URL to

OLLAMA_URL=http://host.docker.internal:11434

did the trick. Can't wait to start playing around! Cheers!
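One caveat for other readers: host.docker.internal resolves automatically on Docker Desktop (Windows/macOS), but on plain Linux you typically have to map it yourself. A sketch of the extra lines for the backend service in the compose file (the service name here is assumed; adapt it to Perplexica's actual docker-compose.yaml):

```yaml
services:
  perplexica-backend:   # assumed service name
    extra_hosts:
      - "host.docker.internal:host-gateway"  # maps the name to the host on Linux (Docker 20.10+)
```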
Describe the bug
When running the ollama branch for the first time (Docker didn't work, and probably won't work, so I'm running it manually), I get the error shown above.
To Reproduce
Steps to reproduce the behavior (see the shell sketch below):
1. Run npm i in the root directory and in the ui directory.
2. Run npm run dev in both the root directory and the ui directory.
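As a sketch, the steps above as shell commands, assuming the UI lives under ui/ as described; each npm run dev blocks, so the two dev servers in step 2 need separate terminals:

```sh
# step 1: install dependencies
npm i              # in the repository root
cd ui && npm i     # in the ui directory
cd ..

# step 2: start the dev servers (run each in its own terminal)
npm run dev            # backend, from the root
cd ui && npm run dev   # frontend, from ui/
```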