We currently rely on the database dumps from crates.io. These are large .sql and .json files that are ingested into the Postgres database. Downloading these files and updating the database is a slow operation (~20 minutes). We do not want the site to be unavailable during that window, so how can we update the database while still serving production traffic from the old data?
Some approaches are:
Manage two databases, use a connection pool in the main app, and after an ingest completes, swap the pool over to the new backend (a rough sketch of this follows below).
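A minimal sketch of that swap, assuming a Rust app using sqlx for the Postgres pool and arc_swap for the atomic handoff; the `DbHandle` type, its methods, and the connection URLs are illustrative names, not existing code:

```rust
use std::sync::Arc;

use arc_swap::ArcSwap;
use sqlx::postgres::{PgPool, PgPoolOptions};

/// Hypothetical shared handle: request handlers keep reading from the
/// old pool while an ingest prepares the new database in parallel.
pub struct DbHandle {
    pool: ArcSwap<PgPool>,
}

impl DbHandle {
    /// Connect to whichever database currently holds the live data.
    pub async fn connect(url: &str) -> Result<Self, sqlx::Error> {
        let pool = PgPoolOptions::new()
            .max_connections(10)
            .connect(url)
            .await?;
        Ok(Self {
            pool: ArcSwap::from_pointee(pool),
        })
    }

    /// Called by request handlers; returns the current pool. Handlers
    /// that grabbed the old Arc keep a valid pool until they drop it.
    pub fn current(&self) -> Arc<PgPool> {
        self.pool.load_full()
    }

    /// Called once the new database is fully ingested: atomically point
    /// all future requests at the new backend, then drain the old pool.
    pub async fn swap_to(&self, new_url: &str) -> Result<(), sqlx::Error> {
        let new_pool = PgPoolOptions::new()
            .max_connections(10)
            .connect(new_url)
            .await?;
        let old_pool = self.pool.swap(Arc::new(new_pool));
        // In-flight queries still hold references to the old pool;
        // close() waits for checked-out connections to be released.
        old_pool.close().await;
        Ok(())
    }
}
```

With this shape, the ingest job can load the dump into the idle database at its own pace, and the ~20-minute update costs readers nothing: the only visible change is a single pointer swap once the new data is ready.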