albertas-jn opened 2 years ago
Perhaps this approach could be useful: Docker Backup. This might also be used to address #121.
@Evert-R Can this issue be closed?
I tried the following approaches:

1. `pg_dump`: since we don't have information on the migrations, we just dump every time `docker-compose up` is run. Problematic because a command in the Dockerfile doesn't wait for the start-up of Postgres to complete. Potential solution: use some wait-for-it entrypoint (i.e., a healthcheck and dump rolled into one bash script).
2. The `dumpdata` management command. Advantage: we can use something along these lines: `python manage.py migrate --check || python manage.py dumpdata -o wherever/we/want/the/backup.dmp` (`migrate --check` exits non-zero when there are unapplied migrations), so a backup only runs in that case. Disadvantage: `dumpdata` will fail as well if the migrations haven't been applied, as it relies on Django's model state.

So the solution is probably going to be an uninformed `pg_dump`
on every startup (potentially, this could be just a single `latest.dmp` that gets overwritten, to avoid bloating the backup volume over time), using a custom shell script that waits until Postgres is available.
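A minimal sketch of such an entrypoint, assuming `pg_isready`/`pg_dump` are on the `PATH`, the standard `PG*` connection variables plus a `POSTGRES_DB` variable are set, and `/backups` is the mounted backup volume (the `RUN_BACKUP` toggle and all paths are made-up names, not settled choices):

```shell
#!/usr/bin/env bash
# Sketch of a backup entrypoint: wait until Postgres accepts connections,
# then overwrite a single latest.dmp so the backup volume does not grow.
set -euo pipefail

wait_for_postgres() {
  # Poll pg_isready once per second, giving up after $1 attempts.
  local retries=${1:-30}
  until pg_isready -q; do
    retries=$((retries - 1))
    if [ "$retries" -le 0 ]; then
      echo "Postgres did not become ready in time" >&2
      return 1
    fi
    sleep 1
  done
}

# Guarded so the function above can be sourced without side effects;
# RUN_BACKUP, /backups and POSTGres_DB are assumed names.
if [ "${RUN_BACKUP:-0}" = "1" ]; then
  wait_for_postgres 30
  pg_dump --format=custom --file=/backups/latest.dmp "$POSTGRES_DB"
fi
```

The wait loop is the important part: it replaces the race between the Dockerfile command and the Postgres container coming up.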
After `migrate`, the local database might become incompatible with other branches. We should automatically back up the existing local database so that we can switch between databases when switching branches.

Related to / blocked by #893
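One way to make that switch possible is to keep one dump per branch rather than a single `latest.dmp`. A small sketch of a helper that derives a per-branch backup path (the helper name and the `backups/` directory are made up for illustration):

```shell
#!/usr/bin/env bash
# Hypothetical helper: derive a per-branch backup filename so dumps from
# different branches do not overwrite each other.
backup_path_for_branch() {
  local branch="$1"
  # Flatten "/" in branch names (e.g. feature/foo) into "_" so the
  # result is a single file name under backups/.
  echo "backups/${branch//\//_}.dmp"
}

# Intended use needs git and pg_dump, so it is shown as a comment:
#   pg_dump --format=custom "$POSTGRES_DB" \
#     --file="$(backup_path_for_branch "$(git rev-parse --abbrev-ref HEAD)")"
```

Restoring the matching dump after checking out a branch would then undo an incompatible `migrate` from another branch.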