ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
MIT License

Add Ollama support #3

Closed · arsaboo closed this issue 7 months ago

arsaboo commented 7 months ago

Any timeline for Ollama support? Thanks!

ItzCrazyKns commented 7 months ago

Hey, it's on the to-do list and will be added pretty soon.

ItzCrazyKns commented 7 months ago

I created another branch just for Ollama. You can try cloning and testing it out:

git clone -b feat/ollama-support https://github.com/ItzCrazyKns/Perplexica.git
arsaboo commented 7 months ago

@ItzCrazyKns Thanks for the addition. Ollama is working with this branch.

One feature that would be useful (maybe for a subsequent PR) is to get the list of models from Ollama and use a dropdown to let users pick the model dynamically instead of configuring one model in the .env.
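
For reference, a minimal sketch of what that dropdown's data source could look like, assuming Ollama's documented GET /api/tags endpoint (its "list local models" API); the listOllamaModels helper itself is hypothetical, not Perplexica code:

```ts
// Hypothetical helper: list the models a local Ollama instance has pulled,
// so a UI dropdown can be populated dynamically instead of hardcoding one
// model in .env. GET /api/tags is Ollama's "list local models" endpoint.
interface OllamaTagsResponse {
  models: { name: string }[];
}

async function listOllamaModels(
  baseUrl = "http://localhost:11434",
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  const data = (await res.json()) as OllamaTagsResponse;
  return data.models.map((m) => m.name); // e.g. ["llama2:latest", "mistral:7b"]
}
```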

ItzCrazyKns commented 7 months ago

I understand the problems and am working on all of them; the settings page is in development and will be added pretty soon.

syddharth commented 7 months ago

Unable to use Ollama on Windows. Am I doing something wrong?

2024-04-18 13:37:34 yarn run v1.22.19
2024-04-18 13:37:34 $ node --env-file=.env dist/app.js
2024-04-18 13:37:34 WebSocket server started on port 3001
2024-04-18 13:37:34 API server started on port 3001
2024-04-18 13:37:55 node:internal/process/promises:289
2024-04-18 13:37:55             triggerUncaughtException(err, true /* fromPromise */);
2024-04-18 13:37:55             ^
2024-04-18 13:37:55 
2024-04-18 13:37:55 TypeError: fetch failed
2024-04-18 13:37:55     at node:internal/deps/undici/undici:12500:13
2024-04-18 13:37:55     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2024-04-18 13:37:55     at async createOllamaStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
2024-04-18 13:37:55     at async createOllamaGenerateStream (/home/perplexica/node_modules/@langchain/community/dist/utils/ollama.cjs:57:5)
2024-04-18 13:37:55     at async Ollama._streamResponseChunks (/home/perplexica/node_modules/@langchain/community/dist/llms/ollama.cjs:346:26)
2024-04-18 13:37:55     at async Ollama._streamIterator (/home/perplexica/node_modules/@langchain/core/dist/language_models/llms.cjs:65:34)
2024-04-18 13:37:55     at async Ollama.transform (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:369:9)
2024-04-18 13:37:55     at async wrapInputForTracing (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:246:30)
2024-04-18 13:37:55     at async pipeGeneratorWithSetup (/home/perplexica/node_modules/@langchain/core/dist/utils/stream.cjs:230:19)
2024-04-18 13:37:55     at async StringOutputParser._transformStreamWithConfig (/home/perplexica/node_modules/@langchain/core/dist/runnables/base.cjs:267:26) {
2024-04-18 13:37:55   [cause]: Error: connect ECONNREFUSED 127.0.0.1:11434
2024-04-18 13:37:55       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1605:16) {
2024-04-18 13:37:55     errno: -111,
2024-04-18 13:37:55     code: 'ECONNREFUSED',
2024-04-18 13:37:55     syscall: 'connect',
2024-04-18 13:37:55     address: '127.0.0.1',
2024-04-18 13:37:55     port: 11434
2024-04-18 13:37:55   }
2024-04-18 13:37:55 }
2024-04-18 13:37:55 
2024-04-18 13:37:55 Node.js v21.7.3
2024-04-18 13:37:55 error Command failed with exit code 1.
2024-04-18 13:37:55 info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
ItzCrazyKns commented 7 months ago

Hi @syddharth, please make sure Ollama is running on that port (11434); if not, update the Ollama URL in the .env file to the URL where Ollama is running and rebuild the images. I am working on a config page, so in the future changes will take effect instantly and images won't need to be rebuilt.
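
One quick way to confirm the server is reachable before rebuilding anything: a minimal sketch, assuming Node 18+ (where fetch is global) and Ollama's behaviour of answering its root endpoint with the plain text "Ollama is running":

```ts
// Hypothetical connectivity check: if Ollama is up at the configured URL,
// GET / returns HTTP 200 with the body "Ollama is running".
const url = process.env.OLLAMA_URL ?? "http://localhost:11434";

fetch(url)
  .then(async (res) => console.log(res.status, await res.text()))
  .catch((err) => console.error(`Cannot reach Ollama at ${url}:`, err));
```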

syddharth commented 7 months ago

I am running Ollama on Windows on port 11434, and I can access it.

[screenshot: Ollama responding at 127.0.0.1:11434]

From what I understand, the Docker container is unable to reach that port on the Windows host.

ItzCrazyKns commented 7 months ago

From the screenshot, you are accessing Ollama at 127.0.0.1:11434, and the 127.0.0.1 IP is only reachable from that particular machine, so it won't be accessible inside Docker. Replace it with http://localhost:11434 and it should work as expected.

syddharth commented 7 months ago

localhost resolves to 127.0.0.1; they are the same thing.

[screenshot: localhost resolving to 127.0.0.1]

ItzCrazyKns commented 7 months ago

Please read my message carefully: Docker runs its containers in an isolated environment, so 127.0.0.1 inside Docker would refer to something else. But the localhost URL I suggested would stay the same.

syddharth commented 7 months ago

Apologies, I did that too. Here is the .env file:

[screenshot: .env file]

Here is the error from Docker:

[screenshot: Docker error output]

ItzCrazyKns commented 7 months ago

You need to delete the previous images and build new ones. As stated above, a settings page is under development that will prevent rebuilding multiple times just to change configuration.

syddharth commented 7 months ago

I did delete the container and all the images, but I'm still getting the same error! :/

ItzCrazyKns commented 7 months ago

It should be OLLAMA_URL=http://localhost:11434 instead of OLLAMA_HOST. A lot of people are facing configuration problems, so I'll prioritise development of the settings page over everything else and try to release it soon.

syddharth commented 7 months ago

Finally got it to work with this:

OLLAMA_URL=http://host.docker.internal:11434 # url of the ollama server
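
For anyone else hitting this: inside a container, 127.0.0.1 and localhost both point at the container itself, while host.docker.internal is a special DNS name that Docker Desktop (on Windows and macOS) resolves to the host machine, where Ollama is actually listening. Below is a hypothetical sketch of probing the usual candidates, assuming Node 18+ fetch; resolveOllamaUrl is illustrative, not part of Perplexica:

```ts
// Hypothetical helper: try the usual Ollama locations in order and return
// the first URL that answers. Inside Docker, localhost is the container
// itself, so the Docker Desktop host alias is tried as a fallback.
async function resolveOllamaUrl(): Promise<string> {
  const candidates = [
    process.env.OLLAMA_URL,              // explicit configuration wins
    "http://host.docker.internal:11434", // Docker Desktop: the host machine
    "http://localhost:11434",            // running outside Docker
  ].filter((u): u is string => Boolean(u));

  for (const url of candidates) {
    try {
      const res = await fetch(url);
      if (res.ok) return url; // Ollama answers 200 "Ollama is running"
    } catch {
      // candidate unreachable; try the next one
    }
  }
  throw new Error("No reachable Ollama instance found");
}
```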

ItzCrazyKns commented 7 months ago

Hey everyone, Perplexica now fully supports Ollama. You can read more in the readme file: https://github.com/ItzCrazyKns/Perplexica. The settings page is nearly done and will be released pretty soon, so changing configuration at runtime will be possible as well.