GabeIsman opened this issue 8 years ago
@gabeisman to be honest, I always just blow away the local sqlite db when I change the schema. I then update the tests / test data generation to reflect the schema change. I'm sure @ryansb has some thoughts on this but my take is if it isn't production, just blow it away.
@thequbit That's totally fine (if less than ideal; sometimes test data takes a long time to set up), but we still need a story for production.
Definitely will need a story for production. I'm not super familiar with the flow for bridging schemas in PostgreSQL, but it's certainly a well-documented thing. For the time being, I think the extras field that will take JSON will keep us moving forward in development (see #88).
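For reference, a minimal sketch of what that extras field could look like on a SQLAlchemy model (the model and column names here are placeholders, not our actual schema):

```python
import json

from sqlalchemy import Column, Integer, Text
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class Record(Base):
    """Hypothetical model with a free-form 'extras' field stored as JSON text."""
    __tablename__ = 'records'

    id = Column(Integer, primary_key=True)
    # Serialized JSON blob; works on SQLite today and can move to
    # PostgreSQL's JSON/JSONB type later without touching calling code.
    extras = Column(Text, nullable=True)

    def set_extras(self, data):
        self.extras = json.dumps(data)

    def get_extras(self):
        return json.loads(self.extras) if self.extras else {}
```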
As for setting up test data, I'll create a ticket for that so we can just run a script and have it load it in.
There's an app for that (tm)!
In OpenStack we use Alembic to generate migrations between SQLAlchemy schema versions.
It pretty well resolves the migration story, but I don't feel like that's something we need to do "right now" as we don't have a production environment. I'd rather put the work into a loader for test data for now, and then build migrations when we have a "real" deployment. #JustInTimeDevOps
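For reference when we do get there, the basic flow is: `pip install alembic`, run `alembic init migrations`, set `sqlalchemy.url` in the generated `alembic.ini`, point `migrations/env.py` at our model metadata, then `alembic revision --autogenerate -m "..."` and `alembic upgrade head`. A sketch of the env.py piece (the `myapp.models` import path is just an assumption about our layout):

```python
# migrations/env.py (excerpt) -- tell autogenerate what the models look like.
# "myapp.models" is a placeholder for wherever our declarative Base lives.
from myapp.models import Base

# Alembic diffs this metadata against the live database when you run
# `alembic revision --autogenerate` to produce a migration script.
target_metadata = Base.metadata
```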
I just ran into an issue where I wanted to make some fields nullable. I updated the models to have the appropriate columns, but couldn't figure out how to migrate the DB schema. Looking into it a little, it seems like Alembic is probably the preferred tool for this. I tried to get it set up but stumbled on some configuration issues that I think would be easy to resolve if I were a little more familiar with the Python module system. It would be great if someone with a little more familiarity could take this on and save me the headache!
In the meantime I'll just be blowing away my database and re-running initializedb to 'migrate' my schema.
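For whoever picks this up: once Alembic is configured, the migration itself should be short. A sketch of the nullable change, with placeholder table/column names (not our actual schema):

```python
"""Example Alembic migration: relax a NOT NULL constraint.

Table and column names here are placeholders.
"""
from alembic import op
import sqlalchemy as sa

# Revision identifiers (Alembic fills these in when it generates the file).
revision = 'abc123def456'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # batch_alter_table works around SQLite's lack of ALTER COLUMN by
    # copying the table; on PostgreSQL it issues a plain ALTER TABLE.
    with op.batch_alter_table('records') as batch_op:
        batch_op.alter_column('description',
                              existing_type=sa.Text(),
                              nullable=True)


def downgrade():
    # Restore the NOT NULL constraint.
    with op.batch_alter_table('records') as batch_op:
        batch_op.alter_column('description',
                              existing_type=sa.Text(),
                              nullable=False)
```

Running `alembic revision --autogenerate` after changing the model will usually produce most of this for you.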