radical-data / queering-the-map

Queering the Map is a community-based platform where individuals anonymously pin their queer experiences and stories on a global map.
https://queeringthemap.com

Prepare the data migration #49

Closed · jokroese closed this 3 months ago

jokroese commented 4 months ago

Here's my sketch of how we'll do the migration. Let me know what you think!

Preparation

  1. Prepare Freeze Announcement
    • Write code to announce the temporary freeze on new submissions and updates on the old website.
  2. Write Translation Scripts
    • Develop scripts to translate data from the old Django schema to the new Supabase schema (see the sketch after this list).
  3. Practice Run
    • Perform a practice run of the data extraction, translation, and import processes without the freeze and domain switch to identify any issues.
  4. Set Low TTL on DNS Records
    • Reduce the TTL on Cloudflare DNS records a day before the migration to ensure quick propagation of the change.
  5. Prepare Supabase
    • Adjust the statement timeout settings for Supabase to prevent timeouts during large data import operations.
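
For step 2, here's a rough sketch of what a translation script could look like. Everything schema-specific is a placeholder: the `moments.moment` model label and the field names will need to match the actual Django schema and the new Supabase table.

```python
import csv
import json

# Placeholders: adjust to the real Django model label and the real
# old-field -> new-column mapping once both schemas are pinned down.
DJANGO_MODEL = "moments.moment"
FIELD_MAP = {
    "description": "description",
    "latitude": "latitude",
    "longitude": "longitude",
    "created_at": "created_at",
}

def translate(dump_path: str, csv_path: str) -> None:
    """Turn a `manage.py dumpdata` JSON export into a CSV whose columns
    match the new Supabase table."""
    with open(dump_path, encoding="utf-8") as f:
        records = json.load(f)

    # dumpdata exports every model in the project; keep only the moments.
    moments = [r for r in records if r["model"] == DJANGO_MODEL]

    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["id", *FIELD_MAP.values()])
        for record in moments:
            fields = record["fields"]
            writer.writerow([record["pk"], *(fields[name] for name in FIELD_MAP)])

if __name__ == "__main__":
    translate("data.json", "moments.csv")
```

Writing to an intermediate CSV (rather than straight into Postgres) keeps the translation step decoupled from the import step, so we can eyeball the file during the practice run.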

Migration

  1. Freeze and Backup
    1. Announce the temporary freeze on submissions on the old website.
    2. Back up the old Django database.
  2. Data Extraction

    1. Export data from the Django database:

      python manage.py dumpdata > data.json
    2. Verify the export to ensure all data is included.

  3. Data Translation
    1. Translate the data from the old schema to the new schema using the prepared scripts.
  4. Data Import
    1. Import the translated data into the new Supabase database, probably using pgloader (https://supabase.com/docs/guides/database/import-data#option-2-bulk-import-using-pgloader)
    2. Verify the import to ensure all data made it across (see the verification sketch after this list).
  5. Domain Switch
    1. Update the DNS records on Cloudflare to point to the new site on Netlify.
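
For the two verification steps, a minimal count check could look like the following. The DSN and the `moments` table name are placeholders, and it assumes `psycopg2` for connecting to Supabase's underlying Postgres.

```python
import json

import psycopg2  # assumed driver; any Postgres client would do

SUPABASE_DSN = "postgresql://postgres:<password>@db.<project>.supabase.co:5432/postgres"

def count_dump_records(dump_path: str, model: str = "moments.moment") -> int:
    """Count the moments records in the Django dumpdata export."""
    with open(dump_path, encoding="utf-8") as f:
        return sum(1 for r in json.load(f) if r["model"] == model)

def count_supabase_rows(table: str = "moments") -> int:
    """Count the rows that made it into the new Supabase table."""
    with psycopg2.connect(SUPABASE_DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(f"SELECT count(*) FROM {table}")  # table name is a trusted placeholder
            return cur.fetchone()[0]

if __name__ == "__main__":
    exported = count_dump_records("data.json")
    imported = count_supabase_rows()
    assert exported == imported, f"{exported} exported vs {imported} imported"
    print(f"OK: {imported} records on both sides")
```

A count match doesn't prove the contents survived intact, but it's a cheap first gate before any deeper checks.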

Post-Migration

  1. Monitor
    1. Monitor the site post-transition to ensure everything is working.
jokroese commented 4 months ago

@narcode, for the Data Import part, do you think we should do it with pgloader or one of the other methods Supabase mentions?

narcode commented 4 months ago

> @narcode, for the Data Import part, do you think we should do it with pgloader or one of the other methods Supabase mentions?

I think that if we can pull it off by importing a CSV, we should do that. As I understand it, pgloader can import a whole database, but we just want the moments records?

jokroese commented 4 months ago

That's right. I was just worried that a CSV import might be more prone to issues?

narcode commented 4 months ago

I'm sure we can find ways to verify the integrity of the data.
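
For example (just a sketch, assuming both sides can give us `(id, description)` pairs): hash the rows in a stable order on each side and compare digests, which would catch truncated or re-encoded values that a plain row count misses.

```python
import hashlib

def fingerprint(rows):
    """Hash (id, description) pairs in sorted order so the same data
    yields the same digest regardless of row order."""
    digest = hashlib.sha256()
    for row_id, description in sorted(rows):
        digest.update(f"{row_id}\x1f{description}\x1e".encode("utf-8"))
    return digest.hexdigest()

# Run fingerprint() over rows pulled from the old Django database and
# again over rows pulled from Supabase; equal digests mean the pairs
# survived the migration intact.
```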