gabrielspadon closed this issue 8 months ago
The process of importing data from Spire into the Postgres database is a two-phase operation:
The script for the first phase was provided by Matt and appears to be a legacy script from the previous data manager. It works independently of the AISdb library, so we can keep using it without modification.
For the second phase, I have written and locally tested a new script, and it looks good. I am currently using it to process October's data for insertion into the database. This script depends on the AISdb library, so future updates to the library that are not backward-compatible may require revisions to the script.
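The second-phase script is not shown in this thread, but the "commit only after the insertion is validated" pattern it needs can be sketched as follows. This is a minimal illustration only: the table name, columns, and validation rules are hypothetical, and `sqlite3` stands in for Postgres so the example is self-contained (with `psycopg2` the transaction/commit/rollback calls are analogous).

```python
import sqlite3

# Hypothetical minimal AIS rows; real Spire messages carry many more fields.
ROWS = [
    (316001234, 1696118400, 44.64, -63.57),
    (316005678, 1696118460, 44.65, -63.58),
]

def rows_look_valid(rows):
    """Basic sanity checks before committing: MMSI range and coordinate bounds."""
    for mmsi, ts, lat, lon in rows:
        if not (100000000 <= mmsi <= 999999999):
            return False
        if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
            return False
    return True

def insert_batch(conn, rows):
    """Insert rows in one transaction; commit only if validation passes."""
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO ais_msgs (mmsi, t, lat, lon) VALUES (?, ?, ?, ?)", rows
    )
    # Re-count what landed and validate before making the batch permanent.
    cur.execute("SELECT COUNT(*) FROM ais_msgs")
    inserted = cur.fetchone()[0]
    if inserted == len(rows) and rows_look_valid(rows):
        conn.commit()
        return True
    conn.rollback()
    return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ais_msgs (mmsi INTEGER, t INTEGER, lat REAL, lon REAL)")
ok = insert_batch(conn, ROWS)
```

Rolling back on any validation failure keeps a partially written batch out of the database, which matters if the same monthly file is re-run after a crash.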
The challenge with relying on an automation service is the unpredictability of Spire's data uploads to Wasabi, which seem to be conducted manually and without a consistent schedule.
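One way around the irregular upload schedule is to poll the bucket and diff its listing against the set of keys already processed, rather than assuming a fixed arrival time. A minimal sketch, where the bucket name, prefix, and endpoint are assumptions (Wasabi is S3-compatible, so `boto3` can list it); the network call is kept under the `__main__` guard since it needs credentials:

```python
# Sketch of a polling check for new Spire uploads on Wasabi.
# Only the pure diff logic runs on import; the listing itself is illustrative.

def new_keys(seen: set, listed: list) -> list:
    """Return keys present in the latest bucket listing but not yet processed."""
    return sorted(k for k in listed if k not in seen)

if __name__ == "__main__":
    import boto3  # third-party; Wasabi exposes the S3 API

    s3 = boto3.client("s3", endpoint_url="https://s3.wasabisys.com")
    # Bucket and prefix below are hypothetical placeholders.
    resp = s3.list_objects_v2(Bucket="spire-bucket", Prefix="ais/")
    listed = [obj["Key"] for obj in resp.get("Contents", [])]
    print(new_keys(set(), listed))
```

The `seen` set would be persisted between runs (a small state table or file), so each poll only triggers imports for genuinely new objects.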
The October dataset has been successfully integrated into our Postgres database 🎉
This issue has been moved to "Done," so I'm closing it. Thank you!
I'm re-opening this issue because we are running into problems with pg_export and the data-import script when exporting and re-creating the database from scratch. This issue will track our efforts on these tasks.
Please create a script that can import data from Spire into the Postgres database. The script should establish connections to the required services such as Wasabi and Postgres, create the databases in the AISdb structure, and commit the results once the insertion is validated. If possible, the process should be automated as a background service on BigData 1, using crontab or another suitable mechanism that can run without user intervention. If it is automated, detailed documentation should be provided for future maintainers on how to run and debug the service.
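For the crontab route, the entry could look something like the sketch below. Every path, script name, and schedule here is a placeholder, not the actual deployment; the `flock` wrapper and log redirection are suggestions to make the service safe to leave unattended and easy to debug later.

```shell
# Hypothetical crontab entry on BigData 1 (installed via `crontab -e`).
# Runs daily at 03:00; flock -n skips the run if a previous import is
# still holding the lock, and all output is appended to a log file.
0 3 * * * /usr/bin/flock -n /tmp/spire_import.lock /usr/bin/python3 /opt/aisdb/spire_import.py >> /var/log/spire_import.log 2>&1
```

Pointing the maintainer documentation at the log file and the lock path covers the two most common questions: "did it run?" and "why is it not running now?".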