We decided to migrate WhoTracks.me datasets from Git LFS to Amazon S3.
Git LFS worked very well for us, but the amount of data grows every month, and we started hitting the limits Philipp described in #231.
I have also changed the GitHub Actions workflow. It now fetches much smaller datasets, because we started running out of disk space: a GitHub-hosted runner has only 14 GB of SSD storage.
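For illustration, the new fetch step could look roughly like the sketch below. It assumes the AWS CLI is preinstalled on the runner (it is on GitHub-hosted images); the `latest/` key prefix is hypothetical, not the real bucket layout.

```yaml
# Sketch of a workflow step that pulls only the datasets a job needs
# from the public S3 bucket instead of checking out Git LFS objects.
- name: Fetch datasets from S3
  run: |
    # --no-sign-request works because the bucket is world-readable.
    # The key prefix below is illustrative only.
    aws s3 cp s3://data.whotracks.me/latest/ whotracksme/data/assets/ \
      --recursive --no-sign-request
```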
What's changed:

- All datasets from the `whotracksme/data/assets` directory are on Amazon S3 now.
- The S3 bucket `data.whotracks.me` is publicly available: anyone can read all objects and list the bucket.
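Because the bucket is public, objects can be fetched over plain HTTPS with no AWS credentials. A minimal sketch (the path-style URL form is used because the bucket name contains dots, which breaks TLS for virtual-hosted URLs; the object key in the example is hypothetical):

```python
from urllib.request import urlopen

BUCKET = "data.whotracks.me"

def public_url(key: str) -> str:
    """Build a path-style HTTPS URL for an object in the public bucket."""
    return f"https://s3.amazonaws.com/{BUCKET}/{key}"

# Hypothetical key -- the real layout mirrors whotracksme/data/assets.
url = public_url("example/dataset.csv")

# No signing is needed because all objects are world-readable:
# with urlopen(url) as resp:
#     payload = resp.read()
```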
Fixes #231