Open enzoampil opened 4 years ago
Store contents into a separate database and ingest daily via a cron job (or pipeline tool of choice).
DB: Google BigQuery
Pipeline: cron on a compute instance / cloud function, running daily
This makes the data more fault tolerant in case any of our data sources is deleted, stops being updated, or is made private.
My thought is to resort to this only if the APIs prove to be sufficiently unstable, i.e. people start complaining.
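The daily ingest could be sketched roughly as below. This is a minimal, hypothetical outline only: the `fetch` and `store` callables, the field names, and the sample row are all placeholders, and the BigQuery wiring is left as comments since the actual client setup would depend on the project's credentials and table schema.

```python
from datetime import datetime, timezone

def ingest_daily(fetch, store):
    """Fetch the latest snapshot from the upstream API and persist it.

    `fetch` returns a list of row dicts; `store` appends them to the
    backing table (e.g. a BigQuery load job in production).
    """
    rows = fetch()
    # Stamp each row so stale or duplicated upstream data can be detected later.
    stamped = [
        {**row, "ingested_at": datetime.now(timezone.utc).isoformat()}
        for row in rows
    ]
    store(stamped)
    return len(stamped)

# In production, `store` could wrap google-cloud-bigquery, e.g.:
#   client = bigquery.Client()
#   client.insert_rows_json("project.dataset.table", stamped)
# and the script would be triggered daily by cron or Cloud Scheduler.

if __name__ == "__main__":
    buffer = []  # stand-in for the real database sink
    n = ingest_daily(lambda: [{"ticker": "JFC", "close": 250.0}], buffer.extend)
    print(n)  # number of rows ingested
```

Keeping `fetch` and `store` as injected callables also makes the job easy to test without touching the real API or database.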