ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
MIT License
13.81k stars 1.31k forks

TypeError: Invalid URL #6

Closed agi-dude closed 5 months ago

agi-dude commented 5 months ago

Describe the bug When running the ollama branch for the first time (Docker didn't work, and probably won't, so I'm running it manually), I get the following error:

[nodemon] 3.1.0
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: ts,json
[nodemon] starting `ts-node -r dotenv/config src/app.ts`
WebSocket server started on port 3001
API server started on port 3001
Connection closed
TypeError: Invalid URL
    at new URL (node:internal/url:796:36)
    at searchSearxng (D:\Perplexica\src\core\searxng.ts:23:15)
    at RunnableLambda.func (D:\Perplexica\src\agents\youtubeSearchAgent.ts:176:36)
    at D:\Perplexica\node_modules\@langchain\core\dist\runnables\base.cjs:1385:44
    at MockAsyncLocalStorage.run (D:\Perplexica\node_modules\@langchain\core\dist\singletons\index.cjs:10:9)
    at output (D:\Perplexica\node_modules\@langchain\core\dist\runnables\base.cjs:1383:78)
    at new Promise (<anonymous>)
    at RunnableLambda._transform (D:\Perplexica\node_modules\@langchain\core\dist\runnables\base.cjs:1382:30)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async LogStreamCallbackHandler.tapOutputIterable (D:\Perplexica\node_modules\@langchain\core\dist\tracers\log_stream.cjs:266:26) {
  code: 'ERR_INVALID_URL',
  input: '/search?format=json'
}
[nodemon] app crashed - waiting for file changes before starting...

To Reproduce Steps to reproduce the behavior:

  1. Clone the ollama branch
  2. Run npm i in the root directory and in the ui directory
  3. Run npm run dev both in the root dir and in the UI dir
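The `ERR_INVALID_URL` with `input: '/search?format=json'` suggests the SearxNG base URL was empty when the path was handed to `new URL()`: a bare path is not an absolute URL, so the constructor throws. A minimal sketch of the failure mode (the base URL value here is illustrative, not taken from the repo):

```typescript
// new URL() requires an absolute URL, or a relative input plus a base.
const path = "/search?format=json";

try {
  new URL(path); // throws: a bare path alone is not a valid URL
} catch (err) {
  console.log((err as Error).name); // TypeError
}

// With a base URL present (hypothetical value), the same path parses fine:
const url = new URL(path, "http://localhost:4000");
console.log(url.href); // http://localhost:4000/search?format=json
```

This is consistent with a missing or empty SearxNG URL in the environment file, as suggested below.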
ItzCrazyKns commented 5 months ago

I think you didn't fill in the environment file properly. Make sure the .env file is filled out correctly and contains the SearxNG URL. Other users are able to set up the Ollama branch with Docker; just follow the installation instructions (which are the same), filling in a few extra fields in the environment file.
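For illustration, the relevant entries would look something like the following (the variable names and ports here are assumptions for the sketch, not copied from the repo's sample file — check the project's `.env.example` for the actual keys):

```
# Illustrative .env entries — names/ports are assumptions, see the repo's sample file.
# The SearxNG value must be an absolute URL (scheme + host), or new URL() will throw.
SEARXNG_API_URL=http://localhost:32768
OLLAMA_API_URL=http://localhost:11434
```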

ItzCrazyKns commented 5 months ago

Hello, could you please respond to this issue within the next 24 hours? If I don't hear back from you by then, I'll assume that everything is resolved and will mark it as completed and closed.

agi-dude commented 5 months ago

You mentioned that users are able to run the Ollama branch using Docker. But when I try, I get this error and the backend crashes:

perplexica-perplexica-backend-1   | node:internal/process/promises:289
perplexica-perplexica-backend-1   |             triggerUncaughtException(err, true /* fromPromise */);
perplexica-perplexica-backend-1   |             ^
perplexica-perplexica-backend-1   |
perplexica-perplexica-backend-1   | TypeError: fetch failed
perplexica-perplexica-backend-1   |     at node:internal/deps/undici/undici:12500:13
perplexica-perplexica-backend-1   |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
perplexica-perplexica-backend-1   |     at async createOllamaStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
perplexica-perplexica-backend-1   |     at async createOllamaGenerateStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:57:5)
perplexica-perplexica-backend-1   |     at async Ollama._streamResponseChunks (/home/perplexica/node_modules/@langchain/community/dist/llms/ollama.cjs:346:26)
perplexica-perplexica-backend-1   |     at async Ollama._streamIterator (/home/perplexica/node_modules/@langchain/core/dist/language_models/llms.cjs:65:34)
perplexica-perplexica-backend-1   |     at async Ollama.transform (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:369:9)
perplexica-perplexica-backend-1   |     at async wrapInputForTracing (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:246:30)
perplexica-perplexica-backend-1   |     at async pipeGeneratorWithSetup (/home/perplexica/node_modules/@langchain/core/dist/utils/stream.cjs:230:19)
perplexica-perplexica-backend-1   |     at async StringOutputParser._transformStreamWithConfig (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:267:26) {
perplexica-perplexica-backend-1   |   [cause]: Error: connect ECONNREFUSED 127.0.0.1:11434
perplexica-perplexica-backend-1   |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1605:16) {
perplexica-perplexica-backend-1   |     errno: -111,
perplexica-perplexica-backend-1   |     code: 'ECONNREFUSED',
perplexica-perplexica-backend-1   |     syscall: 'connect',
perplexica-perplexica-backend-1   |     address: '127.0.0.1',
perplexica-perplexica-backend-1   |     port: 11434
perplexica-perplexica-backend-1   |   }
perplexica-perplexica-backend-1   | }

I think Docker is unable to connect to the Ollama instance running on my main machine.

agi-dude commented 5 months ago

Well! Setting OLLAMA_URL to

OLLAMA_URL=http://host.docker.internal:11434

seemed to do the trick. Can't wait to start playing around! Cheers!
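For context on why this works: inside a container, `127.0.0.1` refers to the container itself, not the host, so the connection to the host's Ollama on port 11434 is refused. `host.docker.internal` resolves to the host machine on Docker Desktop (Windows/macOS) automatically; on plain Docker for Linux it needs to be mapped explicitly. A sketch of that mapping in a compose file (the service name is illustrative):

```yaml
services:
  perplexica-backend:
    # On Linux, map host.docker.internal to the host's gateway so the
    # container can reach services (e.g. Ollama on :11434) running on the host.
    # Docker Desktop on Windows/macOS provides this hostname automatically.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```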