tiredofit / docker-db-backup

Backup multiple database types on a scheduled basis with many customizable options
MIT License
846 stars 124 forks

backup of Postgres fails if pg_hba.conf forces encryption #369

Open jkoopmann opened 19 hours ago

jkoopmann commented 19 hours ago

Summary

I am unable to back up a PostgreSQL database, probably due to SSL/encryption problems.

Steps to reproduce

Set up a PostgreSQL server that requires SSL-encrypted client sessions, then try to back up with the typical DB01_NAME / _USER / _HOST / _PASSWORD settings.

```
2024-09-19.10:26:14 [INFO] ** [db-backup04] Starting Manual Backup for db-backup04
2024-09-19.10:26:17 [NOTICE] ** [04-x.x.x.x__postgres] Dumping PostgresSQL database: 'postgres' and compressing with 'zstd'
pg_dump: error: connection to server at "x.x.x.x", port 5432 failed: could not open certificate file "/root/.postgresql/postgresql.crt": Permission denied
connection to server at "x.x.x.x", port 5432 failed: FATAL: pg_hba.conf rejects connection for host "x.x.x.x", user "maitadmin", database "postgres", no encryption
2024-09-19.10:26:17 [INFO] ** [04-x.x.x.x__postgres] DB Backup of 'pgsql_postgres_x.x.x.x_20240919-102617.sql.zst' completed successfully
```

If I exec into the container having the problem and use psql to log in to the server with the credentials provided in the compose.yml, the connection works like a charm. So I can only assume that the network and pg_hba.conf are correct. db-backup, however, fails with the above error.
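For context, the kind of compose fragment the report describes looks roughly like this (variable names as quoted in the report's DB01_ convention; host, user, and password are placeholders — check the project's README for the exact names):

```yaml
services:
  db-backup:
    image: tiredofit/db-backup:4.1.1
    environment:
      - DB01_TYPE=postgres
      - DB01_HOST=x.x.x.x
      - DB01_NAME=postgres
      - DB01_USER=maitadmin
      - DB01_PASSWORD=changeme
```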

What is the expected correct behavior?

Login and backup should work from db-backup.

Relevant logs and/or screenshots

See above.

jkoopmann commented 19 hours ago

Funny thing:

running pg_dump from the normal prompt (root I assume) works:

```
[tiredofit/db-backup:4.1.1 10:42:10 /] $ pg_dump -h x.x.x.x -p 5432 -U maitadmin postgres
Password:
pg_dump: error: query failed: ERROR: permission denied for table postgres_log
pg_dump: detail: Query was: LOCK TABLE public.postgres_log IN ACCESS SHARE MODE
[tiredofit/db-backup:4.1.1 10:42:26 /] $
```

That is a locking/permission error, but let's ignore it. If I run it the way the script does, as the dbbackup user, however:

```
[tiredofit/db-backup:4.1.1 10:41:41 /] $ sudo -Eu dbbackup pg_dump -h x.x.x.x -p 5432 -U maitadmin postgres
pg_dump: error: connection to server at "x.x.x.x", port 5432 failed: could not open certificate file "/root/.postgresql/postgresql.crt": Permission denied
connection to server at "x.x.x.x", port 5432 failed: FATAL: pg_hba.conf rejects connection for host "x.x.x.x", user "maitadmin", database "postgres", no encryption
```

This looks like a permission problem on the db-backup side.
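A likely mechanism, stated here as an assumption rather than a confirmed diagnosis: recent libpq versions derive the default client-certificate path from the HOME environment variable, not from the effective user's passwd entry, and `sudo -E` preserves the caller's environment. A root-invoked `sudo -Eu dbbackup pg_dump ...` therefore still carries HOME=/root, so the dbbackup process probes a directory it cannot read:

```shell
#!/bin/sh
# libpq builds the default sslcert path as $HOME/.postgresql/postgresql.crt.
# With the caller's HOME preserved (as `sudo -E` does), the lookup still
# targets root's home directory; with HOME reset, it would not.
HOME=/root sh -c 'echo "libpq will try: $HOME/.postgresql/postgresql.crt"'
HOME=/home/dbbackup sh -c 'echo "libpq will try: $HOME/.postgresql/postgresql.crt"'
```

This matches the observed symptoms: the path in the error is under /root even though the command runs as dbbackup.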

tiredofit commented 14 hours ago

I'm not sure I would qualify this as a bug; it's more an inability to back up over an encrypted connection.

You could try fiddling with the container options, specifically the USER_DBBACKUP options, or alternatively try mounting your certificate under /home/dbbackup/.... From a cursory look, there don't appear to be any pg_dump command-line arguments for setting a custom certificate location that we could work into the dump process. You could also get creative with the image's pre/post hooks and do some scripting to get the certificates into the right places; some of the base image "hooks" (custom scripts / custom assets) functionality could also help with scaffolding to support it.

After that, it would require a feature to be added to the image to support this sort of thing. I see it as beneficial.

EDIT: Based on a random search, it does seem there is a way to pass some sort of arguments regarding custom paths for certificates. There are some environment variables that could append those strings onto your psql command line, which may work in the short term.
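The variables in question are libpq's standard PGSSL* connection settings, which every libpq client (psql, pg_dump) reads at connect time. A sketch of using them to steer the dump away from the unreadable default paths — the file locations below are illustrative, not anything the image currently sets:

```shell
#!/bin/sh
# Standard libpq SSL settings; any libpq client honors them.
export PGSSLMODE=require    # insist on an encrypted session
export PGSSLCRL=/tmp        # override the default CRL path (as in the report)
# If you actually have a client certificate, point at a copy the backup
# user can read (paths are placeholders):
# export PGSSLCERT=/home/dbbackup/.postgresql/postgresql.crt
# export PGSSLKEY=/home/dbbackup/.postgresql/postgresql.key

env | grep '^PGSSL'
```

After the exports, an ordinary `pg_dump -h ... -U ... dbname` would pick them up with no extra flags.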

jkoopmann commented 10 hours ago

If it were not for my second comment, I would tend to agree. However: the pg_dump installation in the container obviously is able to communicate with the database in question (TLS-encrypted) without any fiddling around. I never installed any kind of certificate in that container. So the question is: why does this work as the root user in the container but not as the dbbackup user? Why would I need to do something with options/hooks or import a certificate into /home/dbbackup when the same container, network, database etc. work like a charm, just with the root user?

sudo -Eu dbbackup export PGSSLCRL=/tmp; psql -h x.x.x.x -U maitadmin -d postgres

actually does work. So the remaining question is: how can I set the environment variable PGSSLCRL (and, to make it complete, presumably the other PGSSL* variables as well) from within docker-compose.yml? That would indeed solve the issue and would be an improvement.
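Setting the variable from compose is just an ordinary environment entry; whether it reaches pg_dump then depends on the container propagating it into the dbbackup user's session (which an environment-preserving sudo would do). A sketch, assuming nothing in the entrypoint strips the variable:

```yaml
services:
  db-backup:
    image: tiredofit/db-backup:4.1.1
    environment:
      - PGSSLCRL=/tmp
      # presumably the other PGSSL* variables can be passed the same way:
      # - PGSSLMODE=require
```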

Thanks very much for your support!

tiredofit commented 10 hours ago

Interesting that you don't even have a certificate to begin with! My assumption was that you did, due to the initial error message.

So it seems the execution of pg_dump is trying to look inside the /root/ folder and can't, so it's throwing those errors, even though there is no certificate within /root to begin with.

I have a function that looks like this

run_as_user() {
    # run the given command as the backup user, preserving the caller's environment (-E)
    sudo -Eu "${DBBACKUP_USER}" "$@"
}

that is executing the pg_dump command, which looks very similar to yours above. While not optimal, it certainly solves the problem: you should be able to set DBBACKUP_USER to root, and this should allow the backup. Looking at the README, this is actually undocumented, but it was the way things were done in the 3.xx series, where everything ran as the root user.

Regarding a prefix to be able to insert those environment variables: it's not possible at the moment, but I believe it would be fairly easy to add to the code. Let's see how this user-context change works first, and then see if there is an easy way.
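One possible shape for such a change, purely as a sketch and not anything currently in the image: reset HOME for the target user with sudo's `-H` flag (so libpq stops probing /root/.postgresql) while explicitly whitelisting the libpq SSL variables instead of preserving the whole environment:

```shell
#!/bin/sh
# Hypothetical variant of run_as_user: -H sets HOME to the target user's
# home directory, and --preserve-env keeps only the listed PGSSL* settings.
run_as_user() {
    sudo -u "${DBBACKUP_USER}" -H \
         --preserve-env=PGSSLMODE,PGSSLCERT,PGSSLKEY,PGSSLROOTCERT,PGSSLCRL \
         "$@"
}
```

`--preserve-env=LIST` requires a reasonably recent sudo (1.8.21+); on older versions, individually exporting the variables via the sudoers env_keep list would be the fallback.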