In CTI Butler we store CVE & CPE records.
Because hundreds of records are added and updated each day, someone needs to run this script daily.

We may as well bring the whole flow into this logic using GitHub Actions.

It will work something like this:
0. R2 data structure

- `nvd_cve_daily`
- `nvd_cpe_daily`
1. Run CVE/CPE backfill daily

At 05:00 UTC each day, two jobs will be run using GitHub Actions: one for CVEs and one for CPEs.

The CVE job will create files like `cve-bundle-2023_01_01-2023_01_01.json`, and the CPE job will create files like `cpe-bundle-2023_01_01-2023_01_01.json`.
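A minimal sketch of how each daily job could derive its date window and expected filename (everything here beyond the filename pattern shown above is an assumption; the actual backfill scripts do the real work):

```python
from datetime import datetime, timedelta, timezone

# The daily job only needs to cover the previous day, so the earliest
# and latest dates in the bundle filename are the same.
yesterday = (datetime.now(timezone.utc) - timedelta(days=1)).date()
stamp = yesterday.strftime("%Y_%m_%d")  # e.g. 2023_01_01

# One bundle per job: a CVE bundle and a CPE bundle.
bundles = [f"{kind}-bundle-{stamp}-{stamp}.json" for kind in ("cve", "cpe")]
print(bundles)  # ['cve-bundle-2023_01_01-2023_01_01.json', 'cpe-bundle-...']
```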
2. Upload CVE/CPE bundles to R2

The script will then upload the bundles to the respective Cloudflare R2 buckets, something along the lines of:
https://community.dogesec.com/t/uploading-large-objects-to-cloudflare-r2-using-rclone/94
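A minimal sketch of that upload step, assuming an rclone remote named `r2` has already been configured for Cloudflare R2 as in the linked post (the helper name and directory argument are illustrative):

```python
import subprocess

# Map each bundle type to its R2 bucket from step 0.
BUCKETS = {"cve": "nvd_cve_daily", "cpe": "nvd_cpe_daily"}

def upload(kind: str, bundle_path: str, directory: str = "") -> None:
    """Copy a local bundle file into the matching R2 bucket with rclone."""
    dest = f"r2:{BUCKETS[kind]}/{directory}".rstrip("/")
    subprocess.run(["rclone", "copy", bundle_path, dest], check=True)

# e.g. put the daily CVE bundle into a monthly directory (see note below)
upload("cve", "cve-bundle-2023_01_01-2023_01_01.json", "2023-01")
```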
Note: it might be worth adding logic in the script to bucket the output into directories. If `file_time_range` is set to `d`, then monthly directories are created (`YYYY-MM`) -- same logic as posting to Cloudflare; if `m` is selected, then yearly directories are created (`YYYY`); if `y` is selected, then no directories are created.
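A minimal sketch of that bucketing logic (the function name and signature are illustrative assumptions):

```python
from datetime import date

def output_directory(file_time_range: str, bundle_date: date) -> str:
    """Return the directory a bundle should be written to.

    Mirrors the note above: daily files are grouped by month,
    monthly files by year, and yearly files sit at the bucket root.
    """
    if file_time_range == "d":
        return bundle_date.strftime("%Y-%m")  # e.g. 2023-01
    if file_time_range == "m":
        return bundle_date.strftime("%Y")     # e.g. 2023
    if file_time_range == "y":
        return ""                             # no directory
    raise ValueError(f"unknown file_time_range: {file_time_range!r}")

# usage: output_directory("d", date(2023, 1, 1)) -> "2023-01"
```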