ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine and an open-source alternative to Perplexity AI.
MIT License
16.24k stars · 1.52k forks

Using docker with ollama all I get is a spinning wheel #396

Open andreascschmidt opened 1 month ago

andreascschmidt commented 1 month ago

Describe the bug
Under localhost:3000 I'm only getting a spinning wheel but no search results.

To Reproduce
I used the following docker-compose.yml, including Ollama, which I linked in config.toml:

services:
  searxng:
    image: docker.io/searxng/searxng:latest
    volumes:
      - ./searxng:/etc/searxng:rw
    ports:
      - 4000:8080
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-backend:
    build:
      context: .
      dockerfile: backend.dockerfile
    image: itzcrazykns1337/perplexica-backend:main
    environment:
      - SEARXNG_API_URL=http://searxng:8080
    depends_on:
      - searxng
    ports:
      - 3001:3001
    volumes:
      - backend-dbstore:/home/perplexica/data
      - ./config.toml:/home/perplexica/config.toml
    extra_hosts:
      - 'host.docker.internal:host-gateway'
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-frontend:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
    image: itzcrazykns1337/perplexica-frontend:main
    depends_on:
      - perplexica-backend
    ports:
      - 3000:3000
    networks:
      - perplexica-network
    restart: unless-stopped

  ollama:
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434
    container_name: ollama
    image: ollama/ollama

networks:
  perplexica-network:

volumes:
  backend-dbstore:

config.toml:

    OLLAMA = "http://host.docker.internal:11434"
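For context, the relevant fragment of config.toml would look roughly like this; the section name is an assumption based on Perplexica's sample config, so check your own file:

```toml
[API_ENDPOINTS]
# host.docker.internal resolves to the Docker host from inside the backend
# container via the 'host.docker.internal:host-gateway' extra_hosts mapping
OLLAMA = "http://host.docker.internal:11434"
```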

[Screenshot from 2024-10-06 15-06-54]

ItzCrazyKns commented 1 month ago

What Ollama address are you using?

andreascschmidt commented 1 month ago

In the docker compose:

    ollama:
      volumes:
        - ./ollama:/root/.ollama
      ports:
        - 11434:11434
      container_name: ollama
      image: ollama/ollama

and in the config file: OLLAMA = "http://host.docker.internal:11434/" (I also tried localhost).

I've checked with netstat that the port is open.

BeNeDeLuX commented 1 month ago

Confirm! I have the exact same error.

Ollama is running on another host in the network in my case, so I set the IP for that host like this: OLLAMA = "http://172.16.17.28:11434" (instead of the default http://host.docker.internal:11434). I receive an "Ollama is running" response if I open that URL with the port in the browser.

Is there any (debug) log that I can provide or generate? Thanks for any help.

taoi11 commented 1 month ago

The cause of this issue for me was:

      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001

I was deploying the stack on a remote server and using Tailscale to make the connections.

Building the front end with the following args solved the issue:

      args:
        - NEXT_PUBLIC_API_URL=http://host-name.ts-subdomain.ts.net:3001/api
        - NEXT_PUBLIC_WS_URL=ws://host-name.ts-subdomain.ts.net:3001

I don't have the logs anymore, but inspecting the traffic in the browser, I noticed that the UI was hanging because a call was being made to http://127.0.0.1:3001/api/models.

Hope this helps.
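As a sketch of that fix without editing the main compose file, a docker-compose.override.yml (which Compose merges into the main file automatically) could carry the public address; the Tailscale hostname here is a placeholder:

```yaml
# docker-compose.override.yml -- picked up automatically by `docker compose up`
services:
  perplexica-frontend:
    build:
      args:
        # Replace with the address clients actually use to reach the backend
        NEXT_PUBLIC_API_URL: "http://host-name.ts-subdomain.ts.net:3001/api"
        NEXT_PUBLIC_WS_URL: "ws://host-name.ts-subdomain.ts.net:3001"
```

After changing the args, rebuild and recreate the frontend, e.g. with `docker compose build perplexica-frontend && docker compose up -d perplexica-frontend`, since NEXT_PUBLIC vars are baked in at build time.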

ItzCrazyKns commented 1 month ago

In the docker compose:

    ollama:
      volumes:
        - ./ollama:/root/.ollama
      ports:
        - 11434:11434
      container_name: ollama
      image: ollama/ollama

and in the config file: OLLAMA = "http://host.docker.internal:11434/" (I also tried localhost).

I've checked with netstat that the port is open.

Since you're running Ollama alongside the other images, you can just attach it to the Perplexica network and then access it at http://ollama:11434.
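Concretely, that suggestion amounts to attaching the ollama service to the same Compose network as the backend, as a sketch based on the compose file above:

```yaml
  ollama:
    image: ollama/ollama
    container_name: ollama
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434
    networks:
      - perplexica-network   # same network as perplexica-backend
    restart: unless-stopped
```

with OLLAMA = "http://ollama:11434" in config.toml, since Docker's embedded DNS resolves service names between containers on a shared network.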

bmad-221B commented 1 month ago

Try pulling llama3, I think, then restart; it should work.

waltermo commented 1 month ago

I had the same issue using Ollama and SearXNG in two other Docker stacks. Then I decided to build the images locally (perplexica-backend:main and perplexica-frontend:main) instead of pulling from docker.io, and then it worked. Maybe there is an issue with the images on Docker Hub?

ItzCrazyKns commented 1 month ago

There is no issue with the images on Docker Hub. They are hardcoded to reference localhost because the NEXT_PUBLIC vars get bundled into the JavaScript when the images are built, so there is no way to change them afterwards. I've mentioned this in the update guide as well: you need to build your own images if you wish to use an IP other than localhost. There's nothing I can do.

raydoomed commented 1 week ago

I encountered the same issue because I had installed Skybox, which also uses port 3001. Perplexica's model calls also go through port 3001, which conflicts.

andreascschmidt commented 1 week ago

@ItzCrazyKns I haven't been able to get this going, and a few more people are opening new issues about what sounds like the same problem:

https://github.com/ItzCrazyKns/Perplexica/issues/437 https://github.com/ItzCrazyKns/Perplexica/issues/467

Would you mind providing a foolproof step-by-step guide that includes Ollama, or maybe even better Open WebUI, assuming you have it running well?