rehanvdm / serverless-website-analytics

A CDK construct that consists of a serverless backend, frontend and client side code to track website analytics
GNU General Public License v2.0

Janitor for pruning data #1

Closed: rehanvdm closed this issue 12 months ago

rehanvdm commented 1 year ago

Create a cron job Lambda that runs a CTAS query to group the records by page_id, keeping the highest time_on_page, and combine the initial page track with the final page track. This cuts down on the data stored, and the system has been designed to cater for it. It is essentially the equivalent of a VACUUM command in Postgres terms, but it can be done without locking.
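
A minimal sketch of the idea, assuming a scheduled Lambda that issues an Athena CTAS query to compact one day of data; the table, column, and S3 location names below are illustrative placeholders rather than the project's actual schema (see the linked PR for the real implementation):

```typescript
// Hypothetical rollup Lambda: run an Athena CTAS query that keeps one row per
// page view (page_id) with the highest time_on_page, writing the compacted
// result to a new S3 location. All identifiers below are illustrative.
import { AthenaClient, StartQueryExecutionCommand } from "@aws-sdk/client-athena";

const athena = new AthenaClient({});

export async function handler(): Promise<void> {
  const ctas = `
    CREATE TABLE analytics_rolled_up_2023_01_01
    WITH (
      external_location = 's3://example-analytics-bucket/rollup/2023-01-01/',
      format = 'PARQUET'
    ) AS
    SELECT page_id,
           MAX(time_on_page) AS time_on_page,
           ARBITRARY(page_url) AS page_url
    FROM analytics_raw
    WHERE page_opened_at_date = DATE '2023-01-01'
    GROUP BY page_id
  `;

  // Athena runs the query asynchronously; the real function would poll for
  // completion before deleting the raw files it just rolled up.
  await athena.send(new StartQueryExecutionCommand({
    QueryString: ctas,
    QueryExecutionContext: { Database: "analytics_db" },
    ResultConfiguration: { OutputLocation: "s3://example-athena-results/" },
  }));
}
```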

rehanvdm commented 12 months ago

Released in v1.2.0 #45

The PR contains more details about the implementation; also see docs/CONTRIBUTING.md.

The Lambda function runs one hour past midnight UTC. At that point Firehose has finished writing the previous day's data and no new records will arrive for it. The function is idempotent and only deletes the raw files after the rollup has completed successfully.
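
For illustration, a small CDK sketch of how a rollup Lambda could be scheduled for 01:00 UTC with an EventBridge rule (EventBridge cron expressions are evaluated in UTC); the helper and names here are hypothetical and not the construct's actual wiring:

```typescript
import { aws_events as events, aws_events_targets as targets, aws_lambda as lambda } from "aws-cdk-lib";
import { Construct } from "constructs";

// Hypothetical helper: trigger the rollup Lambda daily at 01:00 UTC, after
// Firehose has finished writing the previous day's partition.
export function scheduleRollup(scope: Construct, rollupFn: lambda.IFunction): void {
  new events.Rule(scope, "DailyRollupRule", {
    schedule: events.Schedule.cron({ minute: "0", hour: "1" }),
    targets: [new targets.LambdaFunction(rollupFn)],
  });
}
```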

During testing, the rollup compacted the data substantially:

| | S3 objects | Input rows | Input bytes | Output rows | Output bytes |
|---|---|---|---|---|---|
| Before rollup | 388 | 650 | 181.34 KB | 148 | 37.37 KB |
| After rollup | 4 | 148 | 33.73 KB | 148 | 37.26 KB |

The important part is the almost 100x reduction in S3 object count (388 down to 4). This reduces the number of files Athena has to scan by orders of magnitude, which is where the big savings from the rollup come from.