hoarder-app / hoarder

A self-hostable bookmark-everything app (links, notes and images) with AI-based automatic tagging and full text search
https://hoarder.app
GNU Affero General Public License v3.0

Secret not found #470

Closed phoniclynx closed 1 month ago

phoniclynx commented 1 month ago

Using Portainer to install Hoarder, but for some reason it doesn't pick up the environment variables:

s6-rc: info: service s6rc-oneshot-runner: starting
s6-rc: info: service s6rc-oneshot-runner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service init-db-migration: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service legacy-cont-init: starting
Running db migration script
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service init-db-migration successfully started
s6-rc: info: service svc-workers: starting
s6-rc: info: service svc-web: starting
s6-rc: info: service svc-web successfully started
s6-rc: info: service svc-workers successfully started
s6-rc: info: service legacy-services: starting
s6-rc: info: service legacy-services successfully started
  ▲ Next.js 14.2.13
  - Local:        http://localhost:3000
  - Network:      http://0.0.0.0:3000
 ✓ Starting...
 ✓ Ready in 148ms
Corepack is about to download https://registry.npmjs.org/pnpm/-/pnpm-9.0.0-alpha.8.tgz.
> @hoarder/workers@0.1.0 start:prod /app/apps/workers
> tsx index.ts
2024-10-04T07:38:23.208Z info: Workers version: 0.17.1
2024-10-04T07:38:23.215Z info: [Crawler] Connecting to existing browser instance: http://chrome:9222
(node:121) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
2024-10-04T07:38:23.221Z info: [Crawler] Successfully resolved IP address, new address: http://172.18.0.4:9222/
2024-10-04T07:38:27.218Z info: Starting crawler worker ...
2024-10-04T07:38:27.219Z info: Starting inference worker ...
2024-10-04T07:38:27.219Z info: Starting search indexing worker ...
[next-auth][error][NO_SECRET]
https://next-auth.js.org/errors#no_secret Please define a `secret` in production. r [MissingSecretError]: Please define a `secret` in production.
    at t.assertConfig (/app/apps/web/.next/server/chunks/8109.js:1:51161)
    at m (/app/apps/web/.next/server/chunks/8109.js:1:44525)
    at async o (/app/apps/web/.next/server/chunks/8109.js:25:20595)
    at async i (/app/apps/web/.next/server/app/page.js:1:2845) {
  code: 'NO_SECRET'
}
[next-auth][error][NO_SECRET]
https://next-auth.js.org/errors#no_secret Please define a `secret` in production. r [MissingSecretError]: Please define a `secret` in production.
    at t.assertConfig (/app/apps/web/.next/server/chunks/8109.js:1:51161)
    at m (/app/apps/web/.next/server/chunks/8109.js:1:44525)
    at async o (/app/apps/web/.next/server/chunks/8109.js:25:20595)
    at async g (/app/apps/web/.next/server/chunks/4685.js:1:8184) {
  code: 'NO_SECRET'
}
Error: There is a problem with the server configuration. Check the server logs for more information.
    at o (/app/apps/web/.next/server/chunks/8109.js:25:20843)
    at async i (/app/apps/web/.next/server/app/page.js:1:2845) {
  digest: '2936418077'
}
Error: There is a problem with the server configuration. Check the server logs for more information.
    at o (/app/apps/web/.next/server/chunks/8109.js:25:20843)
    at async g (/app/apps/web/.next/server/chunks/4685.js:1:8184) {
  digest: '3467703528'
}
Error: There is a problem with the server configuration. Check the server logs for more information.
    at o (/app/apps/web/.next/server/chunks/8109.js:25:20843)
    at async g (/app/apps/web/.next/server/chunks/4685.js:1:8184) {
  digest: '3467703528'
}
MohamedBassem commented 1 month ago

Did you correctly set up all the required env variables, without any typos in their names?

phoniclynx commented 1 month ago

    HOARDER_VERSION=release
    NEXTAUTH_SECRET=Ztp2XdD....wE5WE9r....icV
    MEILI_MASTER_KEY=Nyk4l6....3Pki9....xvM
    NEXTAUTH_URL=http://192.168.1.5:3000
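
For reference, secrets like `NEXTAUTH_SECRET` and `MEILI_MASTER_KEY` are typically generated with openssl (the 36-byte length is a common convention, not a requirement):

```shell
# Print a random base64 string suitable for use as a secret.
openssl rand -base64 36
```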

MohamedBassem commented 1 month ago

Not an expert in Portainer, but I found this: https://www.portainer.io/blog/using-env-files-in-stacks-with-portainer

Did you rename the env file in docker compose to be stack.env?

phoniclynx commented 1 month ago

Yes, the above translates to this: [screenshot]

MohamedBassem commented 1 month ago

No, I mean in the stack definition itself. You'll find references to '.env'; you should change those to 'stack.env' if you haven't done so already.

phoniclynx commented 1 month ago

Oh, you have to remove those, otherwise it won't build... I had it working and tried to add the OpenAI key, so I deleted the stack and recreated it, not knowing that I could have just added it. I deleted the volumes and stuff, cleared the browser caches, etc., and now I only get that error.

MohamedBassem commented 1 month ago

If I understand correctly, you shouldn't remove them but instead rename them to "stack.env".

phoniclynx commented 1 month ago

so the fix was to change:

    env_file:
      - .env

to

    env_file:
      - stack.env
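
For anyone else deploying through Portainer's stack editor, the surrounding service definition ends up looking roughly like this (a sketch only; the image reference and port mapping follow the typical Hoarder compose file and may differ in your setup):

```yaml
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    restart: unless-stopped
    ports:
      - 3000:3000
    env_file:
      - stack.env   # Portainer exposes the stack's env vars under this name, not .env
```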
MohamedBassem commented 1 month ago

yes that's what I'm suggesting. Did it work?

phoniclynx commented 1 month ago

It worked... Now I can't seem to get OpenAI to work. It doesn't seem to hit my OpenAI account at all; the OpenAI API page says the key has never been accessed.

MohamedBassem commented 1 month ago

I assume the key is correct? Can you share the error logs from the container?

phoniclynx commented 1 month ago

I don't even think it's trying?

 ✓ Starting...
 ✓ Ready in 159ms
Corepack is about to download https://registry.npmjs.org/pnpm/-/pnpm-9.0.0-alpha.8.tgz.
> @hoarder/workers@0.1.0 start:prod /app/apps/workers
> tsx index.ts
2024-10-04T08:10:35.601Z info: Workers version: 0.17.1
2024-10-04T08:10:35.608Z info: [Crawler] Connecting to existing browser instance: http://chrome:9222
(node:121) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
2024-10-04T08:10:35.617Z info: [Crawler] Successfully resolved IP address, new address: http://172.18.0.4:9222/
2024-10-04T08:10:39.691Z info: Starting crawler worker ...
2024-10-04T08:10:39.692Z info: Starting inference worker ...
2024-10-04T08:10:39.692Z info: Starting search indexing worker ...
2024-10-04T08:14:01.059Z info: [Crawler][1] Will crawl "https://tteck.github.io/Proxmox/" for link with id "y7tv5hn2q8w6uh4kyjz1uwsn"
2024-10-04T08:14:01.059Z info: [Crawler][1] Attempting to determine the content-type for the url https://tteck.github.io/Proxmox/
[meilisearch] Updating desired filterable attributes to id,userId from 
[meilisearch] Updating desired sortable attributes to createdAt from 
2024-10-04T08:14:01.267Z info: [search][2] Attempting to index bookmark with id y7tv5hn2q8w6uh4kyjz1uwsn ...
2024-10-04T08:14:01.321Z info: [Crawler][1] Content-type for the url https://tteck.github.io/Proxmox/ is "text/html; charset=utf-8"
2024-10-04T08:14:01.332Z info: [search][2] Completed successfully
2024-10-04T08:14:02.603Z info: [Crawler][1] Successfully navigated to "https://tteck.github.io/Proxmox/". Waiting for the page to load ...
2024-10-04T08:14:04.647Z info: [Crawler][1] Finished waiting for the page to load.
2024-10-04T08:14:04.713Z info: [Crawler][1] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-10-04T08:14:04.718Z info: [Crawler][1] Will attempt to extract metadata from page ...
2024-10-04T08:14:04.929Z info: [Crawler][1] Will attempt to extract readable content ...
2024-10-04T08:14:05.076Z info: [Crawler][1] Done extracting readable content.
2024-10-04T08:14:05.115Z info: [Crawler][1] Stored the screenshot as assetId: 1c629f89-6d67-4d72-8758-cc8501bab6d2
2024-10-04T08:14:05.436Z info: [Crawler][1] Done extracting metadata from the page.
2024-10-04T08:14:05.473Z info: [Crawler][1] Completed successfully
2024-10-04T08:14:06.121Z debug: [inference][3] No inference client configured, nothing to do now
2024-10-04T08:14:06.121Z info: [inference][3] Completed successfully
2024-10-04T08:14:06.361Z info: [search][4] Attempting to index bookmark with id y7tv5hn2q8w6uh4kyjz1uwsn ...
2024-10-04T08:14:06.424Z info: [search][4] Completed successfully
2024-10-04T08:14:18.447Z info: [search][6] Attempting to index bookmark with id svp5jqofzfistjdd4nxi162x ...
2024-10-04T08:14:18.504Z info: [Crawler][5] Will crawl "https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html" for link with id "svp5jqofzfistjdd4nxi162x"
2024-10-04T08:14:18.504Z info: [Crawler][5] Attempting to determine the content-type for the url https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html
2024-10-04T08:14:18.509Z info: [search][6] Completed successfully
2024-10-04T08:14:18.775Z info: [Crawler][5] Content-type for the url https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html is "text/html; charset=UTF-8"
2024-10-04T08:14:19.728Z info: [Crawler][5] Successfully navigated to "https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html". Waiting for the page to load ...
2024-10-04T08:14:22.734Z info: [Crawler][5] Finished waiting for the page to load.
2024-10-04T08:14:22.823Z info: [Crawler][5] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-10-04T08:14:22.828Z info: [Crawler][5] Will attempt to extract metadata from page ...
2024-10-04T08:14:24.189Z info: [Crawler][5] Will attempt to extract readable content ...
2024-10-04T08:14:25.204Z info: [Crawler][5] Done extracting readable content.
2024-10-04T08:14:25.209Z info: [Crawler][5] Stored the screenshot as assetId: 66b95c1d-8251-490b-880f-74129d1797a9
2024-10-04T08:14:25.314Z info: [Crawler][5] Done extracting metadata from the page.
2024-10-04T08:14:25.314Z info: [Crawler][5] Downloading image from "https://www.derekseaman.com/wp-content/uploads/proxmox-logo2.jpg"
2024-10-04T08:14:26.143Z info: [Crawler][5] Downloaded image as assetId: a0ea5f70-b203-4167-ae57-22699fbb425d
2024-10-04T08:14:26.184Z info: [Crawler][5] Completed successfully
2024-10-04T08:14:26.216Z debug: [inference][7] No inference client configured, nothing to do now
2024-10-04T08:14:26.216Z info: [inference][7] Completed successfully
2024-10-04T08:14:26.238Z info: [search][8] Attempting to index bookmark with id svp5jqofzfistjdd4nxi162x ...
2024-10-04T08:14:26.304Z info: [search][8] Completed successfully
2024-10-04T08:15:05.273Z debug: [inference][9] No inference client configured, nothing to do now
2024-10-04T08:15:05.273Z info: [inference][9] Completed successfully
2024-10-04T08:15:05.347Z info: [search][10] Attempting to index bookmark with id voaxsubvby0p0118tcetbqga ...
2024-10-04T08:15:05.410Z info: [search][10] Completed successfully
2024-10-04T08:15:12.431Z info: [search][11] Attempting to index bookmark with id voaxsubvby0p0118tcetbqga ...
2024-10-04T08:15:12.495Z info: [search][11] Completed successfully
2024-10-04T08:15:32.308Z debug: [inference][12] No inference client configured, nothing to do now
2024-10-04T08:15:32.308Z info: [inference][12] Completed successfully
2024-10-04T08:15:32.538Z info: [search][13] Attempting to index bookmark with id zhe353988fdrjz9ogfjzfxgq ...
2024-10-04T08:15:32.599Z info: [search][13] Completed successfully
2024-10-04T08:17:36.499Z info: [Crawler][14] Will crawl "https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html" for link with id "svp5jqofzfistjdd4nxi162x"
2024-10-04T08:17:36.499Z info: [Crawler][14] Attempting to determine the content-type for the url https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html
2024-10-04T08:17:36.619Z info: [Crawler][14] Content-type for the url https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html is "text/html; charset=UTF-8"
2024-10-04T08:17:37.559Z info: [Crawler][14] Successfully navigated to "https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html". Waiting for the page to load ...
2024-10-04T08:17:40.538Z info: [Crawler][14] Finished waiting for the page to load.
2024-10-04T08:17:40.630Z info: [Crawler][14] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-10-04T08:17:40.640Z info: [Crawler][14] Will attempt to extract metadata from page ...
2024-10-04T08:17:41.976Z info: [Crawler][14] Will attempt to extract readable content ...
2024-10-04T08:17:43.018Z info: [Crawler][14] Done extracting readable content.
2024-10-04T08:17:43.023Z info: [Crawler][14] Stored the screenshot as assetId: 46b9fb89-f755-4665-b827-a179da496f80
2024-10-04T08:17:43.127Z info: [Crawler][14] Done extracting metadata from the page.
2024-10-04T08:17:43.127Z info: [Crawler][14] Downloading image from "https://www.derekseaman.com/wp-content/uploads/proxmox-logo2.jpg"
2024-10-04T08:17:43.954Z info: [Crawler][14] Downloaded image as assetId: c26b8a7a-6bad-47a8-9748-a9c05976d290
2024-10-04T08:17:44.007Z info: [Crawler][14] Completed successfully
2024-10-04T08:17:44.024Z info: [Crawler][15] Will crawl "https://tteck.github.io/Proxmox/" for link with id "y7tv5hn2q8w6uh4kyjz1uwsn"
2024-10-04T08:17:44.024Z info: [Crawler][15] Attempting to determine the content-type for the url https://tteck.github.io/Proxmox/
2024-10-04T08:17:44.031Z info: [search][17] Attempting to index bookmark with id svp5jqofzfistjdd4nxi162x ...
2024-10-04T08:17:44.041Z debug: [inference][16] No inference client configured, nothing to do now
2024-10-04T08:17:44.041Z info: [inference][16] Completed successfully
2024-10-04T08:17:44.112Z info: [search][17] Completed successfully
2024-10-04T08:17:44.357Z info: [Crawler][15] Content-type for the url https://tteck.github.io/Proxmox/ is "text/html; charset=utf-8"
2024-10-04T08:17:45.439Z info: [Crawler][15] Successfully navigated to "https://tteck.github.io/Proxmox/". Waiting for the page to load ...
2024-10-04T08:17:47.443Z info: [Crawler][15] Finished waiting for the page to load.
2024-10-04T08:17:47.502Z info: [Crawler][15] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-10-04T08:17:47.507Z info: [Crawler][15] Will attempt to extract metadata from page ...
2024-10-04T08:17:47.564Z info: [Crawler][15] Will attempt to extract readable content ...
2024-10-04T08:17:47.620Z info: [Crawler][15] Done extracting readable content.
2024-10-04T08:17:47.626Z info: [Crawler][15] Stored the screenshot as assetId: edabe378-ecdc-4b95-8a6b-b31031342936
2024-10-04T08:17:47.645Z info: [Crawler][15] Done extracting metadata from the page.
2024-10-04T08:17:47.678Z info: [Crawler][15] Completed successfully
2024-10-04T08:17:48.069Z debug: [inference][18] No inference client configured, nothing to do now
2024-10-04T08:17:48.069Z info: [inference][18] Completed successfully
2024-10-04T08:17:48.129Z info: [search][19] Attempting to index bookmark with id y7tv5hn2q8w6uh4kyjz1uwsn ...
2024-10-04T08:17:48.191Z info: [search][19] Completed successfully
MohamedBassem commented 1 month ago

This means that you didn't set your OpenAI API key in the env variables (that's what the "No inference client configured" lines in your log indicate).
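
Concretely, that usually means adding the key to the same env file as the other variables (variable name per Hoarder's configuration docs; the placeholder value is obviously not a real key):

```shell
# stack.env — add alongside the existing variables
OPENAI_API_KEY=sk-...
```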

phoniclynx commented 1 month ago

OK, so I got it working in the sense that it now tells me my quota has been exceeded. However, there are funds in there, $15 in total. Not sure if I needed to set something else up?

2024-10-04T08:42:01.288Z info: [inference][48] Starting an inference job for bookmark with id "y7tv5hn2q8w6uh4kyjz1uwsn"
2024-10-04T08:42:03.256Z error: [inference][48] inference job failed: Error: 429 You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.
Error: 429 You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.29.0/node_modules/openai/error.js:62:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.29.0/node_modules/openai/core.js:263:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.29.0/node_modules/openai/core.js:306:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/inference.ts:2:1934)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:32:158)
    at async inferTags (/app/apps/workers/openaiWorker.ts:32:375)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:32:3805)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/@hoarder+queue@file+packages+queue/node_modules/@hoarder/queue/runner.ts:2:2567)

[screenshot]

Capsup commented 1 month ago

Are you using a project API key that points to a project with no quota?

I had the same problem initially and learned that I needed to use the legacy user keys instead: [screenshot]

phoniclynx commented 1 month ago

It just started working out of the blue. I didn't touch anything

MohamedBassem commented 1 month ago

Seems resolved then.

fpatrick commented 1 month ago

The solution worked for @phoniclynx because he likely wasn't deploying via a repository. I just want to share the solution that worked for me (deploying via GitHub on Portainer) in case someone finds this through a search.

Add the variables in the Docker Compose environment and pass them through the Portainer UI.

My compose: https://github.com/fpatrick/compose/blob/main/hoarder/docker-compose.yml

On the Portainer UI, set:

    HOARDER_VERSION=release
    NEXTAUTH_SECRET=YOURLONGSTRING
    MEILI_MASTER_KEY=YOURANOTHERLONGSTRING
    NEXTAUTH_URL=http://YOURURL:PORT
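
For Git-based stack deployments, the usual pattern is to reference the UI-defined variables from the compose file's `environment:` section instead of an env file, using compose variable substitution (a sketch; the service name and image follow the thread, everything else is an assumption):

```yaml
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    environment:
      # Substituted from the environment variables set in the Portainer UI
      NEXTAUTH_SECRET: ${NEXTAUTH_SECRET}
      NEXTAUTH_URL: ${NEXTAUTH_URL}
      MEILI_MASTER_KEY: ${MEILI_MASTER_KEY}
```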