hoarder-app / hoarder

A self-hostable bookmark-everything app (links, notes and images) with AI-based automatic tagging and full text search
https://hoarder.app
GNU Affero General Public License v3.0
6.4k stars · 231 forks

How to verify hoarder app is working with the local ollama #185

Open lihw opened 5 months ago

lihw commented 5 months ago

I modified the docker-compose.yml a bit to make hoarder use a local ollama instance for inference. Here is the modified yml file.

version: "3.8"
services:
  web:
    image: ghcr.io/hoarder-app/hoarder-web:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    ports:
      - 3000:3000
    env_file:
      - .env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      DATA_DIR: /data
  redis:
    image: redis:7.2-alpine
    restart: unless-stopped
    volumes:
      - redis:/data
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
  meilisearch:
    image: getmeili/meilisearch:v1.6
    restart: unless-stopped
    env_file:
      - .env
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - meilisearch:/meili_data
  ollama:
    hostname: ollama-container
    image: ollama/ollama:0.1.39-rocm
    restart: unless-stopped
    volumes:
      - ollama:/data/ollama
    ports:
      - 11434:11434
    env_file:
      - .env
  workers:
    image: ghcr.io/hoarder-app/hoarder-workers:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    env_file:
      - .env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      OLLAMA_BASE_URL: http://ollama-container:11434
      INFERENCE_TEXT_MODEL: llama3
      # OPENAI_API_KEY: ...
    depends_on:
      web:
        condition: service_started

volumes:
  redis:
  meilisearch:
  ollama:
  data:

I entered the hoarder worker container and checked whether ollama-container:11434 is reachable. It is. But when I open the hoarder web page, tagging does not seem to work. I am wondering where the hoarder logs are. How can I verify that the hoarder app is actually connecting to the local inference service?
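For reference, one quick way to check reachability is to exec into the workers container and hit ollama's /api/tags endpoint. This is a sketch: it assumes the stack was started with the compose file above (service name "workers") and that the worker image ships wget or curl, which may not be the case.

```shell
# From the docker host: open a shell in the workers container
docker compose exec workers /bin/sh

# Inside the container: query ollama's model-list endpoint.
# A JSON response listing your pulled models means the URL is reachable
# and ollama is up; a connection error means a networking problem.
wget -qO- http://ollama-container:11434/api/tags
# or, if curl is available in the image:
curl -s http://ollama-container:11434/api/tags
```

If this succeeds but tagging still fails, the model named in INFERENCE_TEXT_MODEL may simply not be pulled yet.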

kamtschatka commented 5 months ago

Check the worker logs: after navigating, downloading, and potentially taking a screenshot, inference is run. If there is an error, it is also logged in the worker. Maybe you missed it; in that case you can trigger a refresh of the bookmark to re-run the download and inference.

MohamedBassem commented 5 months ago

@lihw sorry for the late reply, somehow I missed this issue. As @kamtschatka said, please provide us with the workers container logs so that we can help.

lihw commented 5 months ago

> Check the worker logs: after navigating, downloading, and potentially taking a screenshot, inference is run. If there is an error, it is also logged in the worker. Maybe you missed it; in that case you can trigger a refresh of the bookmark to re-run the download and inference.

Thanks for the reply. A follow-up question: where are the logs in the worker container? Could you let me know the file path? Thanks.

kamtschatka commented 5 months ago

AFAIK they are not stored in a file; they are simply logged to the stdout of the docker container.
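Concretely, assuming the stack was started with the compose file above (where the service is named "workers"), the stdout can be followed like this:

```shell
# Follow the workers service's stdout; crawler and inference
# messages (including errors) show up here
docker compose logs -f workers

# Or with plain docker; "hoarder-workers-1" is the typical
# auto-generated name, replace it with your actual container name
docker logs -f hoarder-workers-1
```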

DmacMcgreg commented 5 months ago

@MohamedBassem

2024-06-19T19:05:39.718Z info: [Crawler][15] Successfully navigated to "https://github.com/hoarder-app/hoarder/issues/8". Waiting for the page to load ...
2024-06-19T19:05:40.742Z info: [Crawler][15] Finished waiting for the page to load.
2024-06-19T19:05:40.845Z info: [Crawler][15] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-06-19T19:05:40.847Z info: [Crawler][15] Will attempt to extract metadata from page ...
2024-06-19T19:05:41.317Z info: [Crawler][15] Will attempt to extract readable content ...
2024-06-19T19:05:41.608Z info: [Crawler][15] Done extracting readable content.
2024-06-19T19:05:41.614Z info: [Crawler][15] Stored the screenshot as assetId: c4f28878-3e9b-4871-aca5-3087b1f341cb
2024-06-19T19:05:41.693Z info: [Crawler][15] Done extracting metadata from the page.
2024-06-19T19:05:41.693Z info: [Crawler][15] Downloading image from "https://opengraph.githubassets.com/2db47c3e515ccae7e42ad7a70dfd9142fa8cf5d34d8ad9ae05b5f316e51f174a/hoarder-app/hoarder/issues/8"
2024-06-19T19:05:42.388Z info: [Crawler][15] Downloaded the image as assetId: 3402bea0-9b2b-4a69-a710-004a586be8da
2024-06-19T19:05:42.404Z info: [Crawler][15] Completed successfully
2024-06-19T19:05:42.412Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:42.414Z info: [search][38] Attempting to index bookmark with id duclhshurqvnoqbtzuijxeve ...
2024-06-19T19:05:42.420Z error: [inference][15] inference job failed: TypeError: fetch failed
2024-06-19T19:05:42.490Z info: [search][38] Completed successfully
2024-06-19T19:05:42.975Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:42.982Z error: [inference][15] inference job failed: TypeError: fetch failed
2024-06-19T19:05:44.023Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:44.027Z error: [inference][15] inference job failed: TypeError: fetch failed

MohamedBassem commented 5 months ago

@DmacMcgreg this is usually an indication that your ollama URL is incorrect or unreachable from the worker container

DmacMcgreg commented 5 months ago

[screenshot: CleanShot 2024-06-19 at 15 39 14@2x]

[screenshot: CleanShot 2024-06-19 at 15 39 36@2x]

@MohamedBassem I've also confirmed both inference models are working as expected locally.

MohamedBassem commented 5 months ago

From within the worker container, "127.0.0.1" refers to the localhost of the worker container itself, not the docker host. Either add ollama to the same network as the worker and refer to it by its container name, or replace 127.0.0.1 with 'host.docker.internal' to point at the IP of the docker host.
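For the 'host.docker.internal' route on Linux, the name is not defined inside containers by default; a sketch of the extra mapping you would add to the workers service (the "host-gateway" value is supported in Docker 20.10 and later):

```yaml
  workers:
    # ... existing settings from the compose file above ...
    extra_hosts:
      # Linux only: map host.docker.internal to the docker host's gateway IP
      - "host.docker.internal:host-gateway"
    environment:
      OLLAMA_BASE_URL: http://host.docker.internal:11434
```

On Docker Desktop (macOS/Windows) the name already resolves, so no extra_hosts entry is needed.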

DmacMcgreg commented 5 months ago

That works, thanks!

kamtschatka commented 1 month ago

this is finished, right?

debsidian commented 1 month ago

How would you integrate with ollama running on bare metal, not in a docker container?

I.e. hoarder is running in docker but ollama is on bare metal.

kamtschatka commented 1 month ago

There is no difference: ollama is accessed via an IP address anyway, so simply put the IP address (or domain, if you happen to have one) of the bare-metal ollama host. I am running ollama on my PC with a 4070 Ti, and I simply put 10.0.0.<don't know exactly> in the config and that was it.
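In other words, the only hoarder-side change is pointing OLLAMA_BASE_URL at the bare-metal host. A sketch of the relevant .env entries (the 10.0.0.42 address is a placeholder for your machine's LAN IP; note that ollama binds to 127.0.0.1 by default, so it must be started with OLLAMA_HOST=0.0.0.0 or similar to be reachable from containers):

```ini
# .env for the hoarder stack -- replace 10.0.0.42 with your ollama host's IP
OLLAMA_BASE_URL=http://10.0.0.42:11434
INFERENCE_TEXT_MODEL=llama3
```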

debsidian commented 1 month ago

I simply put 10.0.0.<don't know exactly> in the config and that was it

In your docker-compose, what is your specified network? Is your container just on the host network?

I'm getting this error:

2024-10-04T19:31:39.746Z error: [inference][8] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:12500:13
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.0/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.0/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:183:22)
    at async OllamaInferenceClient.runModel (/app/apps/workers/inference.ts:2:3086)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/inference.ts:2:3726)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:32:158)
    at async inferTags (/app/apps/workers/openaiWorker.ts:32:375)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:32:3805)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/@hoarder+queue@file+packages+queue/node_modules/@hoarder/queue/runner.ts:2:2567)

kamtschatka commented 1 month ago

I have not changed anything network-related in the compose file.

debsidian commented 2 weeks ago

My issue has been sorted. The problem was with my ollama config and had nothing to do with the hoarder app. Sorry for the bother!