alangmaid opened 2 months ago
2.0 branches of the APIs, but it must be enabled in the CAMPD release branches.

Create the new service instance using the xlarge-gp-psql plan in Staging and the xlarge-gp-psql-redundant plan in Production:

cf create-service aws-rds xlarge-gp-psql[-redundant] camd-pg-db-v15 -c '{ "storage": 1000, "enable_pg_cron": true }'
cf create-service-key camd-pg-db-v15 camd-pg-db-key
Open SSH tunnels to the source and destination RDS instances through the auth-api app:

cf ssh -NL <source-tunnel-port>:<source-rds-url>:5432 auth-api
cf ssh -NL <destination-tunnel-port>:<destination-rds-url>:5432 auth-api
Dump the source database and pipe it directly into a restore against the destination:

pg_dump -Fc --no-acl --no-owner -p <source-tunnel-port> -U <source-username> -h localhost <source-database> \
| pg_restore --clean --no-owner --no-acl -p <destination-tunnel-port> -U <destination-username> -h localhost -d <destination-database>
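As a quick sanity check after the restore (using the same tunnels and placeholders as above; counting user tables is just one possible comparison):

```shell
# Count user tables on each side; the two numbers should match after a clean
# restore. Any errors emitted by pg_restore should also be reviewed.
psql -p <source-tunnel-port> -U <source-username> -h localhost -d <source-database> \
  -tc "SELECT count(*) FROM pg_stat_user_tables;"
psql -p <destination-tunnel-port> -U <destination-username> -h localhost -d <destination-database> \
  -tc "SELECT count(*) FROM pg_stat_user_tables;"
```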
Connect to the postgres database in the new RDS instance with psql or another client tool, then follow the instructions in section 10.2 of the runbook to finish setting up pg_cron and schedule the necessary jobs.
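The runbook is authoritative for the actual job definitions; purely as an illustration of the pg_cron API (the job name, schedule, and command below are made up):

```shell
# Connect over the destination tunnel to the default "postgres" database,
# confirm the extension is available, and schedule an example job.
# Illustration only -- use the real jobs from section 10.2 of the runbook.
psql -p <destination-tunnel-port> -U <destination-username> -h localhost -d postgres <<'SQL'
CREATE EXTENSION IF NOT EXISTS pg_cron;
-- cron.schedule(job_name, schedule, command); the schedule uses standard
-- cron syntax (minute hour day-of-month month day-of-week).
SELECT cron.schedule('example-nightly-vacuum', '0 3 * * *', 'VACUUM ANALYZE');
SQL
```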
Coordinate with the Informatica team to disable CDC jobs before moving on to the next step.
Update each application to use the new service. For each Cloud.gov application, run the following commands:
cf bind-service <app> camd-pg-db-v15
cf unbind-service <app> camd-pg-db
cf restage <app>
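The three commands can be scripted across the full application list. A sketch, assuming hypothetical app names (replace APPS with the real list); DRY_RUN defaults to 1 here so the loop only prints the commands it would run:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical application names -- substitute the actual CAMD app list.
APPS=(campd-api account-api emissions-api)

# DRY_RUN=1 (the default here) prints each command instead of executing it;
# set DRY_RUN=0 to run the real cf commands.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "$*"; else "$@"; fi; }

for app in "${APPS[@]}"; do
  run cf bind-service "$app" camd-pg-db-v15   # attach the new v15 instance
  run cf unbind-service "$app" camd-pg-db     # detach the old v12 instance
  run cf restage "$app"                       # restage to pick up new credentials
done
```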
Once all applications are migrated, rename the service instances so the original name points to the new database:

cf rename-service camd-pg-db camd-pg-db-v12
cf rename-service camd-pg-db-v15 camd-pg-db
Coordinate with the Informatica team to enable CDC jobs on the new RDS instance.
Email Cloud.gov support to verify that automated backups are enabled on the new service.
After thorough testing against the new database, delete the old RDS instance:
cf delete-service-key camd-pg-db-v12 camd-pg-db-key
cf delete-service camd-pg-db-v12
@maxdiebold-erg Looks good. Here are a few comments:
@maheese Thanks, I will make those updates. I get the error "This service does not support fetching service instance parameters" when trying to view the service parameters. I know the Staging DB will need pg_cron, but should it have 1TB of storage, too?
@maxdiebold-erg I've never been able to get viewing the service parameters to work either. I just discussed the size of the staging DB with @mark-hayward-erg, and we arrived at 1TB for staging too.
@maxdiebold-erg I had another thought about this after talking to @mark-hayward-erg. Running this on a client machine might take a while over the SSH tunnel. We've exported data to an S3 bucket by using the apt-buildpack to load the Postgres client tools into a Cloud Foundry app and then running the export as a Cloud Foundry task, which keeps everything in the Cloud.gov environment. I think this approach could also be used with what you've written to pipe the output of the export into the import. Here's the project we developed to do this: https://github.com/USEPA/cf-pg-db-tasks.
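A rough sketch of what submitting the copy as a Cloud Foundry task might look like; the app name pg-db-tasks, the task name, the disk/memory limits, and the SOURCE_DB_URL/DEST_DB_URL variables are all assumptions, and the leading echo keeps this from submitting anything:

```shell
#!/usr/bin/env bash
set -euo pipefail

# The dump/restore pipeline, run entirely inside the Cloud.gov environment.
# SOURCE_DB_URL and DEST_DB_URL are assumed to be set in the task app's
# environment (e.g. from service credentials).
PIPELINE='pg_dump -Fc --no-acl --no-owner "$SOURCE_DB_URL" | pg_restore --clean --no-acl --no-owner -d "$DEST_DB_URL"'

# Remove the leading echo to actually submit the task.
echo cf run-task pg-db-tasks --command "$PIPELINE" --name v12-to-v15-copy -k 4G -m 2G
```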
Email-msg-from-cloud.gov.pdf

Expect to develop a plan to implement option #1 described in the email, including dropping the PostgreSQL v12 databases for dev, test, perf, and beta.