Now that we have a test DB server (calv-panthergres.med.usc.edu; 207.151.20.155) to use for a PAINT test environment, we should decide how to regularly refresh its data. In principle, this should be as simple as:
# On prod DB server
pg_dump -d Curation --username postgres > Curation.dump
scp Curation.dump calv-panthergres.med.usc.edu:/pgres_data/
# On test DB server (a plain-format dump doesn't create the DB itself)
createdb --username postgres Curation
psql -d Curation --username postgres -f /pgres_data/Curation.dump
However, the Curation DB is currently 300 GB, which creates two problems (one possible workaround is sketched below):
- The dump-and-restore process will be slow.
- The prod DB server doesn't even have enough space to hold the dump file.
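One way to sidestep the disk-space problem is to stream the dump over SSH so it never touches the prod filesystem. A minimal sketch, assuming the account running pg_dump can SSH from prod to the test server and an empty Curation DB already exists there:

# On prod DB server: pipe the dump straight into the test server's psql;
# nothing is written to the prod disk.
pg_dump -d Curation --username postgres \
  | ssh calv-panthergres.med.usc.edu "psql -d Curation --username postgres"

Alternatively, a custom-format dump (pg_dump -Fc) is compressed by default and usually much smaller than a plain SQL dump, though it still needs scratch space somewhere and a pg_restore step on the test side.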
We should try to reduce the size of the DB and, of course, free up enough space to make the dump. @mugitty @xiaosonghuang @huaiyumi Got any ideas for panthertestdb server files to delete or Curation DB data we can do without?
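To ground that discussion, here's a quick way to see where the space is going, using the standard pg_class catalog and pg_total_relation_size:

# On prod DB server: list the 20 largest tables by total size
# (including indexes and TOAST data)
psql -d Curation --username postgres -c "
  SELECT relname,
         pg_size_pretty(pg_total_relation_size(oid)) AS total_size
  FROM pg_class
  WHERE relkind = 'r'
  ORDER BY pg_total_relation_size(oid) DESC
  LIMIT 20;"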
Note that we'll also need to figure out how to get the foreign data wrapper extension to work on the postgres server in order to mirror the update process between test and prod.
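For reference, here's a minimal sketch of wiring that up, assuming the wrapper in question is postgres_fdw; the host, user, and password below are placeholders, not real values:

# On test DB server: point postgres_fdw at the prod Curation DB.
psql -d Curation --username postgres <<'SQL'
CREATE EXTENSION IF NOT EXISTS postgres_fdw;
CREATE SERVER prod_curation
  FOREIGN DATA WRAPPER postgres_fdw
  OPTIONS (host 'prod-db.example.edu', dbname 'Curation');
CREATE USER MAPPING FOR CURRENT_USER
  SERVER prod_curation
  OPTIONS (user 'postgres', password 'changeme');
-- Expose the prod tables locally under a separate schema
CREATE SCHEMA IF NOT EXISTS prod_mirror;
IMPORT FOREIGN SCHEMA public
  FROM SERVER prod_curation
  INTO prod_mirror;
SQL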