We want one stack for each deployment type (local, dev, stage, prod)
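A minimal sketch, assuming AWS CDK v2 in Python (the app, stack, and construct names here are hypothetical), of instantiating one stack per deployment type:

```python
# Minimal sketch, assuming AWS CDK v2 (Python); names are hypothetical.
import aws_cdk as cdk
from constructs import Construct

class MetricsStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, *, deployment: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        self.deployment = deployment  # 'local' | 'dev' | 'stage' | 'prod'
        # Trail, EventBridge rule, Lambda, and Timestream resources go here,
        # each scoped to `deployment` (e.g., via bucket-name suffixes).

app = cdk.App()
for deployment in ("local", "dev", "stage", "prod"):
    MetricsStack(app, f"metrics-{deployment}", deployment=deployment)
app.synth()
```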
Need a CloudTrail trail for each deployment whose data-event selector is scoped to that deployment, i.e., to buckets named with the suffix 'local', 'dev', etc. See https://us-west-2.console.aws.amazon.com/cloudtrail/home?region=us-west-2#/trails/arn:aws:cloudtrail:us-west-2:889772541283:trail/test-logging-for-metrics/edit/dataEvents for an example, but note that we also need to listen for upload events, which that example does not cover. A sketch of the selector configuration follows.
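A minimal sketch, assuming boto3 (the trail and bucket names are hypothetical), of scoping a trail's data events to one deployment's buckets while capturing both downloads and uploads:

```python
# Minimal sketch, assuming boto3; trail and bucket names are hypothetical.
import boto3

def scope_trail_to_deployment(trail_name: str, deployment: str) -> None:
    cloudtrail = boto3.client("cloudtrail")
    cloudtrail.put_event_selectors(
        TrailName=trail_name,
        EventSelectors=[
            {
                # 'All' captures both reads (downloads) and writes (uploads).
                "ReadWriteType": "All",
                "IncludeManagementEvents": False,
                "DataResources": [
                    {
                        "Type": "AWS::S3::Object",
                        # Trailing '/' scopes logging to every object in the bucket.
                        "Values": [f"arn:aws:s3:::some-bucket-{deployment}/"],
                    }
                ],
            }
        ],
    )

scope_trail_to_deployment("metrics-trail-dev", "dev")
```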
Need an EventBridge rule and a Lambda function as defined in this tutorial: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-log-s3-data-events.html
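A minimal sketch, assuming boto3, of creating the rule with an event pattern that matches S3 data events recorded via CloudTrail; the rule name and the exact eventName list are assumptions and should be adjusted to the upload/download calls we care about:

```python
# Minimal sketch, assuming boto3; rule name and eventName list are assumptions.
import json
import boto3

events = boto3.client("events")
events.put_rule(
    Name="s3-data-events-to-timestream-dev",
    EventPattern=json.dumps(
        {
            "source": ["aws.s3"],
            "detail-type": ["AWS API Call via CloudTrail"],
            "detail": {
                "eventSource": ["s3.amazonaws.com"],
                # Downloads and uploads; multipart-upload calls may also be needed.
                "eventName": ["GetObject", "PutObject"],
            },
        }
    ),
    State="ENABLED",
)
# The Lambda function is then attached with put_targets, plus
# lambda add_permission so EventBridge is allowed to invoke it.
```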
The Lambda function's implementation will insert the event into a Timestream table instead of just logging it. The function's execution role will need permission to write to the Timestream table (e.g., timestream:WriteRecords on the table and timestream:DescribeEndpoints).
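A minimal sketch, assuming boto3, of such a handler; the database/table names, dimensions, and record shape are hypothetical stand-ins (the real event parsing lives in parse_cloudtrail_event in msdlive-rdm-app/scripts/timestream.py):

```python
# Minimal sketch; database/table names and record shape are assumptions.
import time
import boto3

timestream = boto3.client("timestream-write")

def handler(event, context):
    detail = event["detail"]  # the CloudTrail record delivered by EventBridge
    record = {
        "Dimensions": [
            {"Name": "bucket", "Value": detail["requestParameters"]["bucketName"]},
            {"Name": "key", "Value": detail["requestParameters"]["key"]},
        ],
        "MeasureName": "event",
        "MeasureValue": detail["eventName"],  # e.g. GetObject / PutObject
        "MeasureValueType": "VARCHAR",
        "Time": str(int(time.time() * 1000)),  # milliseconds since epoch
    }
    # Requires timestream:WriteRecords on the table and
    # timestream:DescribeEndpoints on * in the execution role.
    timestream.write_records(
        DatabaseName="msdlive-metrics",
        TableName="downloads" if detail["eventName"] == "GetObject" else "uploads",
        Records=[record],
    )
```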
There is currently a boto3 script that converts an event into a Timestream row (see parse_cloudtrail_event) and inserts it into the appropriate upload/download table (see write_item_to_table): msdlive-rdm-app/scripts/timestream.py