dergachev / vagrant_drupal

Deploy a local Drupal development stack with Vagrant.

db backup workflow #4

Open dergachev opened 12 years ago

dergachev commented 12 years ago

Document a good workflow for backing up the db. One idea: drush sql-dump | gzip > /vagrant/db/sitename-TIMESTAMP.sql.gz. Perhaps vagrant could create a script to do this with no args.
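A minimal sketch of such a zero-argument script (the /vagrant/db path and the site name are assumptions, not part of the repo):

```shell
#!/bin/sh
# Zero-argument backup sketch: dump the Drupal DB, gzip it, and write it
# to the shared folder with a timestamped name. BACKUP_DIR and SITE are
# assumptions; adjust to the actual vagrant layout.
BACKUP_DIR=/vagrant/db
SITE=sitename
mkdir -p "$BACKUP_DIR"
drush sql-dump | gzip > "$BACKUP_DIR/$SITE-$(date +%Y-%m-%d-%H.%M.%S).sql.gz"
```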

dergachev commented 12 years ago

FYI: drush sql-dump > /path/to/BACKUPS/site_name-$(date +%Y-%m-%d-%H.%M.%S).sql via http://snipplr.com/view/17046/

dergachev commented 11 years ago

Alternatively use drush sql-dump --result-file without a path argument:

$ drush sql-dump --result-file
Database dump saved to /home/vagrant/drush-backups/drupal/20130617155729/drupal_20130617_155729.sql
dergachev commented 11 years ago

The only limitation of drush sql-dump --result-file is that it puts it in the VM, not in the host. But perhaps we should mount /home/vagrant/drush-backups as a vagrant shared folder? Otherwise we can install a backup script that's a wrapper on drush sql-dump.

dergachev commented 11 years ago

For the record, here's an old-school "drupal-backup.sh" script we have lying around:

#!/bin/bash

BACKUPDIR=/var/shared/backup
if SQLCONF=`drush sql-conf --show-passwords 2>/dev/null`
then
  DBNAME=`echo "$SQLCONF" | grep database | sed 's/^.*=> //'`
  USER=`  echo "$SQLCONF" | grep username | sed 's/^.*=> //'`
  PASS=`  echo "$SQLCONF" | grep password | sed 's/^.*=> //'`

elif DATA=`grep -A 6 "production:" config/database.yml 2>/dev/null`
then
# TODO: Find a better way to test if we're really in a redmine site.
#
# TODO: Do we actually only ever want production databases? At present the
#       dev VM uses the same DB for prod and dev, so it doesn't matter,
#       but we shouldn't just assume that will always be so.

  DBNAME=`echo "$DATA" | grep database | sed 's/^.*: //'`
  USER=`  echo "$DATA" | grep username | sed 's/^.*: //'`
  PASS=`  echo "$DATA" | grep password | sed 's/^.*: //'`

  # The Drupal specific table search makes no sense here.
  ALL=0

else
  echo "Unable to dump database. This script will only run within a drupal or redmine site."
  exit 1
fi

TIMESTAMP=`date "+%Y-%m-%d:%T"`
TMPFILE=${BACKUPDIR}/${DBNAME}-${TIMESTAMP}.sql
DUMPFILE=${TMPFILE}.gz
LATESTFILE=${BACKUPDIR}/${DBNAME}-latest.sql.gz
STRUCTURE_ONLY='/^(cache(_.+)?|watchdog|sessions|search_dataset|search_index|search_total|search_node_links|boost_cache|boost_cache_relationships|history)$/'

# if we only want the path to the latest dump
if [[ "$1" = "DBHEAD" ]]
then
  echo "$LATESTFILE"
  exit 0
fi

#set -o pipefail #for the mysqldump error check

# if we need the cache, sessions and watchdog tables
if [[ "$1" = "ALL" || "$ALL" = 0 ]]
then
    /usr/bin/mysqldump --opt --skip-add-locks --user="${USER}" --password="${PASS}" "$DBNAME" | gzip > "$DUMPFILE"
#set +o pipefail
else
    # Get the tables from the database
    TABLES=`mysql -u"$USER" -p"$PASS" -B -N -e 'show tables;' "$DBNAME"`

    # Loop over the tables
    for t in $TABLES; do
        # Test if the table matches the 'structure only' regex
        RESULT=`echo "$t" | gawk "$STRUCTURE_ONLY"`

        if [ -n "$RESULT" ]
        then
            # if a match, dump structure only onto the end of the SQL file
            mysqldump --opt --no-data --user="$USER" --password="$PASS" "$DBNAME" "$t" >> "$TMPFILE"
        else
            # otherwise dump the full table onto the end of the SQL file
            mysqldump --opt --user="$USER" --password="$PASS" "$DBNAME" "$t" >> "$TMPFILE"
        fi
    done
    gzip "$TMPFILE"
fi

echo "backed up $DBNAME to $DUMPFILE"

ls -alh $DUMPFILE
zcat $DUMPFILE | head -n5

rm -f "$LATESTFILE"
ln -s "$DUMPFILE" "$LATESTFILE"
echo "DBHEAD: $LATESTFILE"
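The table-classification regex is the interesting part of that script, and it can be exercised on its own. A sketch using awk (the table names are just examples):

```shell
# Exercise the structure-only classifier from the script above: tables
# matching the regex get schema-only dumps, everything else a full dump.
STRUCTURE_ONLY='/^(cache(_.+)?|watchdog|sessions|search_dataset|search_index|search_total|search_node_links|boost_cache|boost_cache_relationships|history)$/'
for t in cache cache_page watchdog node users; do
  if [ -n "$(echo "$t" | awk "$STRUCTURE_ONLY")" ]; then
    echo "$t: structure only"
  else
    echo "$t: full dump"
  fi
done
# cache: structure only
# cache_page: structure only
# watchdog: structure only
# node: full dump
# users: full dump
```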

At this point, there's no excuse not to use drush sql-dump, but this has some interesting ideas...