The https://github.com/google-research/vaccination-search-insights project needs somewhere to host a large number of GeoJSON files in order to show ZIP-code-level data on the map.
@owahltinez mentioned that we can use one of our existing GCS buckets to host this data and use Xenon to transfer it. Once everything is working, the transfer should be a one-time exercise, since this data rarely changes.
Unfortunately, unless one uses the Cloud SDK to upload files to a bucket, one can't also set object metadata. Since these GeoJSON files are quite large, we need to enable gzip compression. That, in turn, requires the `Content-Encoding: gzip` metadata to be set on each object so that the gzip-encoded data is served correctly.
This small utility is meant to be run after these assets are uploaded.
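A rough sketch of what such a utility could look like, using the `google-cloud-storage` client library. This is an illustration, not the actual utility: the bucket name, object-name filter, and metadata values (content type, cache control) are assumptions.

```python
def gzip_metadata_patch(blob_name):
    """Return the metadata updates needed to serve a gzip-compressed
    GeoJSON file correctly, or None if the object should be skipped.

    The ".json" suffix filter and the cache-control value are
    hypothetical choices for this sketch.
    """
    if not blob_name.endswith(".json"):
        return None
    return {
        "content_encoding": "gzip",            # payload bytes are gzip-compressed
        "content_type": "application/json",    # decompressed payload is JSON
        "cache_control": "public, max-age=86400",  # example value only
    }


def fix_bucket(bucket_name, prefix=""):
    """Apply the gzip metadata to every matching object in the bucket."""
    # Imported here so the pure helper above works without the SDK installed.
    from google.cloud import storage

    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        patch = gzip_metadata_patch(blob.name)
        if patch is None:
            continue
        blob.content_encoding = patch["content_encoding"]
        blob.content_type = patch["content_type"]
        blob.cache_control = patch["cache_control"]
        blob.patch()  # one metadata-only PATCH request per object
```

Since the metadata decision is a pure function, it can be unit-tested without credentials; only `fix_bucket` touches GCS.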