coollabsio / coolify

An open-source & self-hostable Heroku / Netlify / Vercel alternative.
https://coolify.io
Apache License 2.0

[Bug]: Database Backup for postgres database in the supabase stack failing #2977

Open moby3012 opened 1 month ago

moby3012 commented 1 month ago

Description

The in-dashboard backup solution for the Supabase postgres database is not working.

Minimal Reproduction (if possible, example repository)

After setting up a fresh Supabase stack, creating a backup job for the supabase-postgres-db container via the Coolify dashboard always fails, presumably because the user executing the backup does not have sufficient rights. I tried backing up both locally and to S3; the problem occurs before the backup is saved anywhere.

Unfortunately I can neither see which user executes the backup command nor change its rights. I also had the idea of excluding the _analytics schema from the backup, but that is not possible from within the dashboard either.

Exception or Error

Database: postgres
Status: failed
Started At: 2024-07-30 10:52:34
Message: pg_dump: error: query failed: ERROR: permission denied for schema _analytics
pg_dump: detail: Query was: LOCK TABLE _analytics.schema_migrations IN ACCESS SHARE MODE
Size: 0 B / 0 kB / 0 MB
Location: N/A

Version

4.0.0-beta.319

Cloud?

fabioallexandr3 commented 1 month ago

Instead of using the postgres user, use supabase_admin; it should work.

moby3012 commented 1 month ago

I would if it were possible to change the user that executes this action, but there is no option to do so, or I haven't found it. I also don't know the supabase_admin password, but I think it can be found in the env variables.


moby3012 commented 1 month ago

Update:

Although the original error still exists and I cannot modify the backup process through the UI (I can set neither a user nor schema exclusions to make the backup work), I managed to create a backup of a decent size with this command:

PGPASSWORD=myVerySecurePassword pg_dump -h 127.0.0.1 -U supabase_admin -d postgres \
  --exclude-schema=_analytics --clean --if-exists --no-owner --no-acl --format=plain \
  | gzip > /tmp/backup-$(date +%Y%m%d-%H%M%S).sql.gz

It would be ideal to transfer this to my S3 backup bucket, but neither the aws CLI, wget, nor curl is installed within the container :(
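One possible workaround, since the tooling is missing inside the container, is to drive pg_dump from the Docker host and do the upload there. A minimal sketch, under assumptions not stated in the thread: the database container is named supabase-db, the password is available in $POSTGRES_PASSWORD on the host, the aws CLI is installed on the host, and my-backup-bucket is a hypothetical bucket name:

```shell
# Sketch: dump from inside the container, upload from the host.
# Assumptions (not from the thread): container "supabase-db", password in
# $POSTGRES_PASSWORD, and "my-backup-bucket" is a hypothetical bucket name.
STAMP=$(date +%Y%m%d-%H%M%S)
FILE="backup-${STAMP}.sql.gz"

# docker exec runs pg_dump inside the container; stdout streams to the host.
docker exec -e PGPASSWORD="$POSTGRES_PASSWORD" supabase-db \
  pg_dump -U supabase_admin -d postgres \
    --exclude-schema=_analytics --clean --if-exists --no-owner --no-acl --format=plain \
  | gzip > "/tmp/${FILE}"

# The aws CLI on the host (not in the container) performs the upload.
aws s3 cp "/tmp/${FILE}" "s3://my-backup-bucket/${FILE}"
```

This sidesteps the missing tools inside the container entirely, at the cost of needing host access rather than a dashboard-only fix.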

Timopheym commented 3 weeks ago

I am experiencing the same issue with v4.0.0-beta.323 and getting this error:

pg_dump: error: query failed: ERROR: permission denied for schema _analytics
pg_dump: detail: Query was: LOCK TABLE _analytics.schema_migrations IN ACCESS SHARE MODE

I don't see any way to change the pg credentials used by the backup process. Where is it @fabioallexandr3?

Thanks!

Timopheym commented 3 weeks ago

I fixed it by granting rights to the postgres user directly from the Supabase SQL interface:

GRANT USAGE ON SCHEMA _analytics TO postgres;
GRANT SELECT ON ALL TABLES IN SCHEMA _analytics TO postgres;

GRANT USAGE ON SCHEMA _realtime TO postgres;
GRANT SELECT ON ALL TABLES IN SCHEMA _realtime TO postgres;

GRANT USAGE, SELECT ON SEQUENCE _analytics.billing_accounts_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.billing_counts_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.endpoint_queries_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.oauth_access_grants_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.oauth_access_tokens_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.oauth_applications_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.partner_users_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.partners_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.payment_methods_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.plans_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.rules_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.saved_search_counters_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.saved_searches_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.source_backends_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.source_schemas_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.sources_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.system_metrics_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.team_users_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.teams_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.users_id_seq TO postgres;
GRANT USAGE, SELECT ON SEQUENCE _analytics.vercel_auths_id_seq TO postgres;

but this looks like a hack to me; I would be happy to have a proper solution, thanks!
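One caveat with the GRANT list above: it only covers tables and sequences that already exist, so a future migration in _analytics could break the backup again. As a sketch, and assuming new _analytics objects are created by supabase_admin (an assumption, not something confirmed in this thread), default privileges could make the grants apply to future objects too:

```
-- Hypothetical: assumes new _analytics objects are owned by supabase_admin.
ALTER DEFAULT PRIVILEGES FOR ROLE supabase_admin IN SCHEMA _analytics
  GRANT SELECT ON TABLES TO postgres;
ALTER DEFAULT PRIVILEGES FOR ROLE supabase_admin IN SCHEMA _analytics
  GRANT USAGE, SELECT ON SEQUENCES TO postgres;
```

If the objects are owned by a different role, the FOR ROLE clause would need to name that role instead.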

Netyson commented 2 weeks ago

Maybe it could work like this: (screenshot of a proposed dashboard field for a custom pg_dump command)

This way, you could customize the pg_dump command. However, I'm unsure about how to configure it to save directly to an S3 bucket. Does anyone have insights on this?
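On the S3 question: if the aws CLI is available wherever the custom command runs, the dump can be streamed straight to a bucket, since `aws s3 cp` accepts `-` to read the object body from stdin. A sketch, assuming AWS credentials are already configured and my-backup-bucket is a hypothetical bucket name:

```shell
# Stream the dump to S3 without a local temp file.
# "-" tells `aws s3 cp` to read the object body from stdin.
# Assumptions: $POSTGRES_PASSWORD is set, "my-backup-bucket" is hypothetical.
PGPASSWORD="$POSTGRES_PASSWORD" pg_dump -h 127.0.0.1 -U supabase_admin -d postgres \
  --exclude-schema=_analytics --clean --if-exists --no-owner --no-acl --format=plain \
  | gzip \
  | aws s3 cp - "s3://my-backup-bucket/backup-$(date +%Y%m%d-%H%M%S).sql.gz"
```

The pipeline never touches local disk, which also works around the container having no spare space or missing download tools.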