Hey @MattCallahan-NOAA,
I think this is generally a good idea; the delay from when GAP_PRODUCTS tables are updated to when they are on the AKFIN server (on the order of days to weeks) can certainly be streamlined with fewer people involved. I'd suggest we meet about this, but the next time we can all meet is probably mid-December or perhaps in the new year. Before then, I have questions:
@MattCallahan-NOAA
1) Does this require changes to WRITE privileges on the GAP_PRODUCTS_STAGE schema on the AKFIN server? Who administers Oracle privileges on the AKFIN side?
2) Do we need to add a timestamp field before writing to GAP_PRODUCTS_STAGE, or is that created automatically when records are inserted?
3) How are tables in GAP_PRODUCTS_STAGE updated? Are tables entirely dropped and then replaced with the updated table?
4) Can we test this approach while, in the meantime, continuing to do the transfers as they are done currently?
@EmilyMarkowitz-NOAA
5) Does this imply that we don't need to house the AKFIN_ tables in GAP_PRODUCTS anymore? I think a few of the AKFIN_ tables are used by other processes (creation of the FOSS tables, the Bering Sea Data Report?), so we need to reconcile which of these tables are used outside of AKFIN.
@Ned-Laman-NOAA
6) Is there any context I'm missing here? Are there advantages to the "rumble strips" of how we currently do the AKFIN transfer?
A mid-December meeting sounds good.
1) We would give you the password to the GAP_PRODUCTS_STAGE schema, and you would connect as GAP_PRODUCTS_STAGE to upload.
2) I recommend that AKFIN add that with the push to production, but adding it in R prior to the upload is also possible.
3) I like truncating the tables and then appending rather than dropping. It probably doesn't matter as much for a stage schema, but in general grants, comments, etc. are preserved that way.
4) Yes! Plus 10 to testing before deploying. If you log into your personal schema on AKFIN, you could test uploading some tables to that. I am also working on a general akfinupload R package, but it's still in internal testing, and honestly your Oracle upload code is pretty good as is, so it might be easier to just use what you already use to load to the AFSC db.
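For reference, here is a minimal sketch of what a truncate-then-append upload with a timestamp added in R (points 2 and 3 above) could look like. It assumes a DBI/odbc connection; the DSN, environment variable, column name, and table name are placeholders, not anything that exists yet on the AKFIN side.

```r
# Sketch only: truncate-then-append upload of one table to the AKFIN stage schema.
# The DSN, password variable, timestamp column, and table name are placeholders.
library(DBI)
library(odbc)

con <- dbConnect(odbc::odbc(),
                 dsn = "AKFIN",                       # hypothetical DSN for the AKFIN server
                 uid = "GAP_PRODUCTS_STAGE",
                 pwd = Sys.getenv("AKFIN_STAGE_PWD")) # credential kept out of the script

upload_stage_table <- function(con, df, table_name) {
  # Option from (2): add the timestamp in R prior to the upload
  df$LOAD_TIMESTAMP <- Sys.time()

  # (3): truncate rather than drop so grants, comments, etc. on the table are preserved
  dbExecute(con, paste0("TRUNCATE TABLE GAP_PRODUCTS_STAGE.", table_name))

  # Append the refreshed records
  dbWriteTable(con,
               DBI::Id(schema = "GAP_PRODUCTS_STAGE", table = table_name),
               df,
               append = TRUE)
}

# e.g. upload_stage_table(con, cpue_df, "CPUE")
dbDisconnect(con)
```

For the testing in (4), the same function could point at a personal schema instead of GAP_PRODUCTS_STAGE.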
Issue
Zack, you are not allowed to respond to this until you are back from vacation.
At today's office hours we discussed GAP uploading tables directly to the AKFIN database through an Oracle connection in R, in order to streamline the transfer of data to AKFIN. I believe this would entail updating the update_production_tables.R script to also push to the GAP_PRODUCTS_STAGE schema on the AKFIN server. Once that ran, we could script a notification to AKFIN to load to production. This would remove the back and forth with OFIS to transfer updated data. Let me know if this is something you are interested in pursuing, and if so we can set up a test load.
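To make the proposal a bit more concrete, a rough sketch of the kind of step that could be appended to update_production_tables.R is below, assuming DBI/odbc connections to both servers; the DSNs, credential variables, and table list are illustrative placeholders rather than the script's actual contents.

```r
# Sketch only: mirror a set of GAP_PRODUCTS tables from the AFSC server to the
# GAP_PRODUCTS_STAGE schema on the AKFIN server. All names below are placeholders.
library(DBI)
library(odbc)

afsc  <- dbConnect(odbc::odbc(), dsn = "AFSC",  uid = "GAP_PRODUCTS",
                   pwd = Sys.getenv("AFSC_PWD"))
akfin <- dbConnect(odbc::odbc(), dsn = "AKFIN", uid = "GAP_PRODUCTS_STAGE",
                   pwd = Sys.getenv("AKFIN_STAGE_PWD"))

# Illustrative subset of production tables to push to the stage schema
tables_to_push <- c("CPUE", "BIOMASS", "SIZECOMP", "AGECOMP")

for (tbl in tables_to_push) {
  dat <- dbReadTable(afsc, DBI::Id(schema = "GAP_PRODUCTS", table = tbl))
  dbExecute(akfin, paste0("TRUNCATE TABLE GAP_PRODUCTS_STAGE.", tbl))
  dbWriteTable(akfin, DBI::Id(schema = "GAP_PRODUCTS_STAGE", table = tbl),
               dat, append = TRUE)
}

dbDisconnect(afsc)
dbDisconnect(akfin)

# The scripted "notification to AKFIN to load to production" could be an automated
# email or similar message sent once this loop completes; omitted from the sketch.
```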