hoarder-app / hoarder

A self-hostable bookmark-everything app (links, notes and images) with AI-based automatic tagging and full text search
https://hoarder.app
GNU Affero General Public License v3.0

ollama error #585

Closed divemasterjm closed 1 month ago

divemasterjm commented 1 month ago

Describe the Bug

I get an error when Hoarder calls Ollama running on an external machine. My compose file:

```yaml
version: "3.8"
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
```
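For reference, Hoarder is pointed at a remote Ollama instance through environment variables on the `web` service. The sketch below uses variable names from the Hoarder configuration docs (verify against your version); the address and model names are placeholders, not values from this report:

```yaml
services:
  web:
    environment:
      # Placeholder address: replace with the machine actually running Ollama
      OLLAMA_BASE_URL: http://192.168.1.50:11434
      # Example models for text and image tagging; pick ones you have pulled
      INFERENCE_TEXT_MODEL: llama3.1
      INFERENCE_IMAGE_MODEL: llava
```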

Steps to Reproduce

When creating or re-creating tags.

Expected Behaviour

```
2024-10-24T06:14:30.189Z error: [inference][1016] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:13185:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/inference.ts:2:3086)
    at async OllamaInferenceClient.inferFromImage (/app/apps/workers/inference.ts:2:3915)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3014)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6316)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/@hoarder+queue@file+packages+queue/node_modules/@hoarder/queue/runner.ts:2:2567)
2024-10-24T06:14:30.234Z info: [inference][1017] Starting an inference job for bookmark with id "kfj9neum3ki0u9hzgfmga4u9"
```

Screenshots or Additional Context

Same error log as under "Expected Behaviour".

Device Details

linux debian 12

Exact Hoarder Version

v0.18.0

MohamedBassem commented 1 month ago

This probably means that Ollama is not reachable from Hoarder's container. Are there any network policies preventing the call? Can you exec into the container and check whether you can hit the Ollama address from inside it?
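One way to run that check (a sketch: the service name `web` and the Ollama address are placeholders for your setup, and this assumes `curl` is available in the container image):

```shell
# check_ollama: succeeds if an Ollama API answers at the given base URL.
check_ollama() {
  # /api/tags is a cheap, unauthenticated Ollama endpoint; -f makes curl
  # fail on HTTP errors, --max-time bounds how long we wait (default 5s).
  curl -sf --max-time "${2:-5}" "$1/api/tags" > /dev/null
}

# Run this from inside the Hoarder container, e.g.:
#   docker compose exec web sh
# (service name "web" and the address below are placeholders; adjust them)
if check_ollama "http://192.168.1.50:11434"; then
  echo "ollama reachable"
else
  echo "ollama NOT reachable"
fi
```

Note that `ping` succeeding is not enough: ICMP can get through while TCP port 11434 is still blocked or not listening.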

kamtschatka commented 1 month ago

additionally: by default ollama does not allow access from external hosts, you have to set that up: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network
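For a native Linux install managed by systemd, the fix described in that FAQ is an override on the Ollama service (config sketch; unit name per the Ollama docs):

```ini
# Run: systemctl edit ollama.service
# and add the following override:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"

# Then apply it:
#   systemctl daemon-reload
#   systemctl restart ollama
```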

divemasterjm commented 1 month ago

> This probably means that Ollama is not reachable from Hoarder's container. Are there any network policies preventing the call? Can you exec into the container and check whether you can hit the Ollama address from inside it?

I can ping Ollama. I use Open WebUI in a Proxmox LXC; with Open WebUI on Docker it works, so it may be a problem with my Ollama setup in the Proxmox LXC.