prodrigestivill / docker-postgres-backup-local

Backup PostgreSQL to the local filesystem with periodic backups and rotation.
https://hub.docker.com/r/prodrigestivill/postgres-backup-local
MIT License

What do you use to save the backup off the database server? #79

Closed: hermesalvesbr closed this issue 2 years ago

hermesalvesbr commented 2 years ago

Thanks for existing, it's helping a lot and it works perfectly.

I would like to know if there is any other image that sends the backup files by email or to google drive, for example.

Saving externally is safer, isn't it? Thanks!

slhck commented 2 years ago

Just commenting here as a user of this script. Usually you'd store the generated files on Amazon S3 or Google Cloud Storage (or any other cloud storage provider). Sending them via email is a suboptimal choice (the files are too large), and Google Drive cannot easily be driven from the command line.

In my case I have a cron script that copies the latest daily backup to Google Cloud Storage once a day.

You need gsutil (part of the Google Cloud SDK command-line tools) installed and authenticated with a service account that can write to the target bucket.
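As an aside, the script below locates the newest backup with a find | sort | head pipeline; a quick, self-contained illustration of just that step (using a hypothetical temp directory, GNU find/touch assumed):

```shell
#!/usr/bin/env bash
# Sketch of the newest-file lookup (hypothetical temp directory).
dir="$(mktemp -d)"
touch -d '2 days ago' "$dir/db-old.sql.gz"  # older backup
touch "$dir/db-new.sql.gz"                  # most recent backup

# GNU find prints "mtime path"; sort newest first, keep the first
# line, strip the timestamp column.
newest="$(find "$dir" -type f -printf '%T+ %p\n' | sort -r | head -n 1 | cut -d' ' -f2)"
basename "$newest"  # prints "db-new.sql.gz"
```

Note this breaks if a path contains spaces, since cut splits on the first space; for real backup directories with predictable filenames that is usually acceptable.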

#!/usr/bin/env bash

GOOGLE_CLOUD_STORAGE_BUCKET="my-bucket-name" # replace here
BACKUP_DIR="$(realpath ./backup)"

# Newest file in the daily folder: GNU find prints "mtime path",
# sort newest first, keep the first line, strip the timestamp.
lastDailyBackup="$(find "$BACKUP_DIR/daily" -type f -printf '%T+ %p\n' | sort -r | head -n 1 | cut -d' ' -f2)"

if [[ ! -f "$lastDailyBackup" ]]; then
    echo "No latest daily backup file found in $BACKUP_DIR/daily!"
    exit 1
fi

# gcloud prints "(unset)" (to stderr) when no account is active.
if [[ $(gcloud config get-value account 2>&1) == *unset* ]]; then
    echo "'gcloud' was not authenticated. Use 'gcloud auth activate-service-account' to authenticate it!"
    exit 1
fi

# -n (no-clobber): a re-run will not overwrite an already uploaded file.
dailyFileName="$(basename "$lastDailyBackup")"
gsutil cp -n "$lastDailyBackup" "gs://$GOOGLE_CLOUD_STORAGE_BUCKET/$dailyFileName"
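To run this once a day, a crontab entry along these lines would do (the script path, schedule, and log file are hypothetical):

```
# Upload the latest daily backup at 03:30 every night
30 3 * * * /usr/local/bin/upload-latest-backup.sh >> /var/log/backup-upload.log 2>&1
```

Redirecting stdout and stderr to a log file keeps the script's error messages (missing backup, unauthenticated gcloud) inspectable after the fact.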