Aetherinox closed this 2 months ago
I found the commands on Discord
DO
$$
DECLARE
    r RECORD;
BEGIN
    -- Loop through all table names in the public schema
    FOR r IN (SELECT tablename FROM pg_tables WHERE schemaname = 'public') LOOP
        -- Drop each table
        EXECUTE 'DROP TABLE IF EXISTS ' || quote_ident(r.tablename) || ' CASCADE';
    END LOOP;
END
$$;
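For anyone else following along: a script like this can be piped into `psql` inside the database container. The container, user, and database names below are taken from the `databaseUrl` shown in the logs (`postgresql://postgres:postgres@ziplinepostgres:5432/postgres`); adjust them to match your own compose setup.

```shell
# Feed the drop-all-tables script to psql running inside the
# Postgres container. Container/user/database names are taken from
# the databaseUrl in the logs; change them for your compose file.
docker exec -i ziplinepostgres psql -U postgres -d postgres <<'SQL'
DO
$$
DECLARE
    r RECORD;
BEGIN
    FOR r IN (SELECT tablename FROM pg_tables WHERE schemaname = 'public') LOOP
        EXECUTE 'DROP TABLE IF EXISTS ' || quote_ident(r.tablename) || ' CASCADE';
    END LOOP;
END
$$;
SQL
```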
and then I went into the shell for the container, ran the pnpm command, and restarted. The console shows zero errors now, but the site is still unreachable:
[2024-09-02T16:14:34 INFO db] connecting to database postgresql://postgres:postgres@ziplinepostgres:5432/postgres
[2024-09-02T16:14:34 DEBUG config::validate] environment validated: {"core":{"port":3000,"hostname":"localhost","secret":"12345678901234567890","databaseUrl":"postgresql://postgres:postgres@ziplinepostgres:5432/postgres","returnHttpsUrls":false,"defaultDomain":null,"tempDirectory":"/tmp/zipline"},"chunks":{"max":99614720,"size":26214400,"enabled":true},"tasks":{"deleteInterval":1800000,"clearInvitesInterval":1800000,"maxViewsInterval":1800000,"thumbnailsInterval":1800000,"metricsInterval":1800000},"files":{"route":"/u","length":6,"defaultFormat":"random","disabledExtensions":[],"maxFileSize":104857600,"defaultExpiration":null,"assumeMimetypes":false,"defaultDateFormat":"YYYY-MM-DD_HH:mm:ss","removeGpsMetadata":false},"urls":{"route":"/go","length":6},"datasource":{"type":"local","local":{"directory":"/zipline/uploads"}},"features":{"imageCompression":true,"robotsTxt":false,"healthcheck":true,"userRegistration":false,"oauthRegistration":false,"deleteOnMaxViews":true,"thumbnails":{"enabled":true,"num_threads":4},"metrics":{"enabled":true,"adminOnly":false,"showUserSpecific":true}},"invites":{"enabled":true,"length":8},"website":{"title":"Zipline","titleLogo":null,"externalLinks":[{"name":"GitHub","url":"https://github.com/diced/zipline"},{"name":"Documentation","url":"https://zipline.diced.tech"}],"loginBackground":null,"defaultAvatar":null,"theme":{"default":"system","dark":"builtin:dark_gray","light":"builtin:light_gray"},"tos":null},"mfa":{"totp":{"enabled":false,"issuer":"Zipline"},"passkeys":true},"oauth":{"bypassLocalLogin":false,"loginOnly":false,"discord":{},"github":{},"google":{},"authentik":{}},"discord":null,"ratelimit":{"enabled":true,"max":10,"window":null,"adminBypass":true,"allowList":[]},"httpWebhook":{"onUpload":null,"onShorten":null},"ssl":{"key":null,"cert":null}}
[2024-09-02T16:14:34 INFO server] starting zipline mode="production" version="4.0.0-dev+1"
[2024-09-02T16:14:34 INFO server] reading environment for configuration
[2024-09-02T16:14:34 DEBUG config::validate] environment validated: {"core":{"port":3000,"hostname":"localhost","secret":"12345678901234567890","databaseUrl":"postgresql://postgres:postgres@ziplinepostgres:5432/postgres","returnHttpsUrls":false,"defaultDomain":null,"tempDirectory":"/tmp/zipline"},"chunks":{"max":99614720,"size":26214400,"enabled":true},"tasks":{"deleteInterval":1800000,"clearInvitesInterval":1800000,"maxViewsInterval":1800000,"thumbnailsInterval":1800000,"metricsInterval":1800000},"files":{"route":"/u","length":6,"defaultFormat":"random","disabledExtensions":[],"maxFileSize":104857600,"defaultExpiration":null,"assumeMimetypes":false,"defaultDateFormat":"YYYY-MM-DD_HH:mm:ss","removeGpsMetadata":false},"urls":{"route":"/go","length":6},"datasource":{"type":"local","local":{"directory":"/zipline/uploads"}},"features":{"imageCompression":true,"robotsTxt":false,"healthcheck":true,"userRegistration":false,"oauthRegistration":false,"deleteOnMaxViews":true,"thumbnails":{"enabled":true,"num_threads":4},"metrics":{"enabled":true,"adminOnly":false,"showUserSpecific":true}},"invites":{"enabled":true,"length":8},"website":{"title":"Zipline","titleLogo":null,"externalLinks":[{"name":"GitHub","url":"https://github.com/diced/zipline"},{"name":"Documentation","url":"https://zipline.diced.tech"}],"loginBackground":null,"defaultAvatar":null,"theme":{"default":"system","dark":"builtin:dark_gray","light":"builtin:light_gray"},"tos":null},"mfa":{"totp":{"enabled":false,"issuer":"Zipline"},"passkeys":true},"oauth":{"bypassLocalLogin":false,"loginOnly":false,"discord":{},"github":{},"google":{},"authentik":{}},"discord":null,"ratelimit":{"enabled":true,"max":10,"window":null,"adminBypass":true,"allowList":[]},"httpWebhook":{"onUpload":null,"onShorten":null},"ssl":{"key":null,"cert":null}}
[2024-09-02T16:14:34 DEBUG migrations] running migrations...
[2024-09-02T16:14:34 DEBUG migrations] ensuring database exists...
[2024-09-02T16:14:35 DEBUG migrations] applying migrations...
[2024-09-02T16:14:35 DEBUG migrations] no migrations applied
[2024-09-02T16:14:37 INFO server] server started hostname="localhost" port=3000
[2024-09-02T16:14:37 DEBUG tasks] starting tasks tasks=9
[2024-09-02T16:14:37 DEBUG tasks] running first run id="deletefiles"
[2024-09-02T16:14:37 DEBUG tasks] started interval task id="deletefiles" interval=1800000
[2024-09-02T16:14:37 DEBUG tasks] running first run id="maxviews"
[2024-09-02T16:14:37 DEBUG tasks] started interval task id="maxviews" interval=1800000
[2024-09-02T16:14:37 DEBUG tasks] running first run id="metrics"
[2024-09-02T16:14:37 DEBUG tasks] started interval task id="metrics" interval=1800000
[2024-09-02T16:14:37 DEBUG tasks] started worker id="thumbnail-0"
[2024-09-02T16:14:37 DEBUG tasks] started worker id="thumbnail-1"
[2024-09-02T16:14:37 DEBUG tasks] started worker id="thumbnail-2"
[2024-09-02T16:14:37 DEBUG tasks] started worker id="thumbnail-3"
[2024-09-02T16:14:37 DEBUG tasks] running first run id="thumbnails"
[2024-09-02T16:14:37 DEBUG tasks] started interval task id="thumbnails" interval=1800000
[2024-09-02T16:14:37 DEBUG tasks] running first run id="clearinvites"
[2024-09-02T16:14:37 DEBUG tasks] started interval task id="clearinvites" interval=1800000
[2024-09-02T16:14:41 DEBUG tasks::thumbnail-0] started thumbnail worker
[2024-09-02T16:14:40 DEBUG tasks::thumbnail-1] started thumbnail worker
[2024-09-02T16:14:41 DEBUG tasks::thumbnail-2] started thumbnail worker
[2024-09-02T16:14:40 DEBUG tasks::thumbnail-3] started thumbnail worker
[2024-09-02T16:14:43 DEBUG tasks::deletefiles] found 0 expired files files=[]
[2024-09-02T16:14:43 DEBUG tasks::maxviews] found 0 expired files files=[]
[2024-09-02T16:14:43 DEBUG tasks::clearinvites] found 0 expired invites files=[]
[2024-09-02T16:14:43 DEBUG tasks::maxviews] found 0 expired urls dests=[]
[2024-09-02T16:14:43 DEBUG tasks::clearinvites] found 0 max used invites files=[]
[2024-09-02T16:14:43 DEBUG tasks::metrics] created metric id="cm0l7baa10000899a48y3pyeh" metric={"files":0,"urls":0,"users":0,"storage":0,"fileViews":0,"urlViews":0,"filesUsers":[],"urlsUsers":[],"types":[]}
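One line in the log above is worth a second look: `server started hostname="localhost" port=3000`. If that hostname is used as the bind address (an assumption based on the name, not something the log confirms), the process would listen only on loopback inside the container, and "Unable to connect" with an otherwise clean log is exactly what that looks like, since Docker's published ports cannot reach a loopback-only listener. Two quick checks from inside the container (the `zipline` container name and the `ss`/`wget` tooling are assumptions; substitute what your image actually has):

```shell
# Which address did the process bind?
#   127.0.0.1:3000 -> loopback only, unreachable through -p mappings
#   0.0.0.0:3000   -> all interfaces, reachable
docker exec zipline sh -c 'ss -ltn 2>/dev/null || netstat -ltn'

# Can the app be reached from inside its own network namespace?
# If this succeeds while the browser fails, the bind address or the
# port mapping is the problem, not the app itself.
docker exec zipline sh -c 'wget -qO- http://localhost:3000/ >/dev/null && echo app-reachable-internally'
```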
Tried with and without https, and about a dozen other ways. NO access.
What happened?
Just a heads-up: I saw that v4 was a possible tag, so I decided to try it for a bit just to see what the differences were like.
When I install v4, there's absolutely no connection at all, and no way to access the server. When I access it locally via browser, I just get "Unable to connect".
Version
other (provide version in additional info)
What browser(s) are you seeing the problem on?
Firefox, Chromium-based (Chrome, Edge, Brave, Opera, mobile chrome/chromium based, etc)
Zipline Logs
However on restart without the entrypoint, I see:
I don't get why it's doing migrations, as I have nothing to migrate. I've completely wiped the db folder and started fresh. Nothing to convert.
The only other errors I noticed were on first run again; I found
And those settings are coming from your v4 branch example.
Browser Logs
No logs.
Only:
Additional Info
Docker Compose
v3 runs fine.
Tried about every combination of env vars you can think of, yet when I migrate back to v3 it works perfectly. Tried removing most of them from v4, thinking the config may have changed dramatically.
Yup. The error refuses to go away.
I tried both db names, `postgres` and `zipline`, the default password, and a whole slew of things. And the Postgres container is just throwing
Fresh install, no existing DB, etc. Got desperate and even assigned a static local IP just for whatever reason. Absolutely no amount of tinkering corrects it.
Opened up port `5432` and connected using DBeaver from an outside source, and all the tables are there. Connection is fine.
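For completeness, the same kind of reachability check can be run against the web port itself with a raw TCP probe, which separates "nothing is listening / not reachable" from an app-level error. The hosts and ports below are illustrative; point them at wherever your compose file publishes Postgres and Zipline.

```shell
# Probe a TCP port: prints "open" if something accepts the connection,
# "closed" if it is refused or times out. Uses bash's /dev/tcp device.
probe() {
  local host="$1" port="$2"
  if timeout 2 bash -c ">/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

probe localhost 5432   # the Postgres port that DBeaver reached
probe localhost 3000   # the Zipline port the browser cannot reach
```

If 5432 reports open while 3000 reports closed, the database is fine and the problem sits with the app process or its port mapping, which matches the DBeaver result above.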