This commit allows Mastercard updates to fetch the data directly from
the bucket. For example, to load one month of Australia:
PGSERVICE=postgres10 docker-compose run -d -e LOGGING_FILE=etl_mc.data.log -e GOOGLE_APPLICATION_CREDENTIALS=/bigmetadata/tmp/mrli-reader.json bigmetadata luigi --module tasks.mc.data tasks.mc.data.AllMCData --country au --month 201811 --log-level INFO
This will be used to update the data. See CartoDB/do_tiler/issues/66.
The updated rows can be dumped with scripts/dump_mastercard_months.sh.
The dumped rows can be loaded with scripts/load_mastercard_months.sh.
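Taken together, one month's manual refresh could be wrapped like this. This is only a sketch: the argument interface of the two scripts (country code plus YYYYMM month) is an assumption, so check the scripts themselves for the real one.

```shell
# Hypothetical wrapper around the two scripts above.
# ASSUMPTION: both scripts accept a country code and a YYYYMM month;
# verify their actual interfaces before relying on this.
mc_refresh_month() {
  country="$1"
  month="$2"
  scripts/dump_mastercard_months.sh "$country" "$month"   # dump the updated rows
  scripts/load_mastercard_months.sh "$country" "$month"   # load them back
}

# Usage: mc_refresh_month au 201811
```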
For complete automation we should:
1. Trigger the load automatically when the bucket content changes.
2. Run the dump for the updated months.
3. Copy the dump files.
4. Run the load.
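Steps 2-4 can already be scripted; step 1 needs a bucket-change notification hook. A rough sketch of the handler, assuming object paths are laid out as country/month/... and that the dump/load scripts take country and month arguments (both are assumptions):

```shell
# Sketch of the automation loop. ASSUMPTIONS: object paths look like
# "au/201811/part-000.csv.gz", and the dump/load scripts take
# country + month arguments; adjust to the real layout and interfaces.

# Derive country and month from a changed object's path (step 1 input).
country_from_path() { printf '%s\n' "$1" | cut -d/ -f1; }
month_from_path()   { printf '%s\n' "$1" | cut -d/ -f2; }

# Steps 1-4 for one changed object.
handle_bucket_change() {
  path="$1"
  country=$(country_from_path "$path")
  month=$(month_from_path "$path")
  # Step 1: re-run the ETL load for the changed month (see the example command above).
  docker-compose run -e LOGGING_FILE=etl_mc.data.log bigmetadata \
    luigi --module tasks.mc.data tasks.mc.data.AllMCData \
    --country "$country" --month "$month" --log-level INFO
  scripts/dump_mastercard_months.sh "$country" "$month"  # step 2: dump the updated month
  # Step 3: copy the dump files to the target environment (mechanism TBD).
  scripts/load_mastercard_months.sh "$country" "$month"  # step 4: run the load
}
```

The only missing piece is wiring `handle_bucket_change` to an actual bucket notification (e.g. a Cloud Storage event), which is outside the scope of this commit.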
Now it's just a matter of wiring the pipeline together; all the tools are there :-)
For acceptance, updating all the staging data would be quick and would cover everything. I've already updated Australia.