The SQL file here captures the views and tables we've created for billing so far in Redshift. The Python scripts correspond to two other lambdas, plus the helper modules they share:
- `upload_billing_resource_hours_to_lago.py`: uploads all billing resource counts to Lago to be counted in each user's usage (and eventually, invoice).
- `track_resource_counts.py`: processes incoming events from SDF into a coherent table, `workspace_resource_counts`, with a row for each time a workspace's resource count changed, as well as the timespan it was active, so you can query it to find out what count a workspace had at any arbitrary time. This will be replaced with `workspace_resource_counts_view` in the SQL when I have time for testing that.
- `si_logging.py`: just sets up a common logging object.
- `si_redshift.py`: a little query module letting you do `Redshift.from_env().query("SELECT * FROM table")`; does the heavy lifting of getting our env vars, setting up the AWS client, and interpreting the results into something comprehensible. @johnrwatson wrote much of the underlying code.
- `si_lago_api.py`: handles authentication and error interpretation for the Lago API.
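To make the `si_redshift.py` idea concrete, here's a minimal sketch of what a `Redshift.from_env().query(...)` wrapper over the `redshift-data` API could look like. This is not the actual module: the env var names, the serverless `WorkgroupName` target, and the polling loop are all assumptions; only the shape of the `GetStatementResult` payload is standard.

```python
import os
import time


def rows_from_result(result):
    """Turn a redshift-data GetStatementResult payload into a list of dicts.

    Each record field is a dict like {"stringValue": "x"}, {"longValue": 3},
    or {"isNull": True}; we unwrap the single value or substitute None.
    """
    columns = [col["name"] for col in result["ColumnMetadata"]]
    rows = []
    for record in result["Records"]:
        row = {}
        for name, field in zip(columns, record):
            row[name] = None if field.get("isNull") else next(iter(field.values()))
        rows.append(row)
    return rows


class Redshift:
    """Hypothetical wrapper over the redshift-data API (names are guesses)."""

    def __init__(self, client, workgroup, database, secret_arn):
        self.client = client
        self.workgroup = workgroup
        self.database = database
        self.secret_arn = secret_arn

    @classmethod
    def from_env(cls):
        import boto3  # imported lazily so rows_from_result has no AWS dependency

        return cls(
            boto3.client("redshift-data"),
            os.environ["REDSHIFT_WORKGROUP"],  # assumed env var names
            os.environ["REDSHIFT_DATABASE"],
            os.environ["REDSHIFT_SECRET_ARN"],
        )

    def query(self, sql):
        stmt = self.client.execute_statement(
            WorkgroupName=self.workgroup,
            Database=self.database,
            SecretArn=self.secret_arn,
            Sql=sql,
        )
        # Poll until the statement finishes, then fetch and decode the result.
        while True:
            desc = self.client.describe_statement(Id=stmt["Id"])
            if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
                break
            time.sleep(0.5)
        if desc["Status"] != "FINISHED":
            raise RuntimeError(desc.get("Error", desc["Status"]))
        return rows_from_result(self.client.get_statement_result(Id=stmt["Id"]))
```

The "interpreting the results into something comprehensible" part is `rows_from_result`: redshift-data returns typed value wrappers per cell, and unwrapping them into plain dicts is what makes the caller's life easy.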
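The point-in-time lookup that `workspace_resource_counts` enables can be sketched in a few lines. The column shapes here (a count, a `valid_from`, and an open-ended `valid_to` for the still-active row) are an assumption about the table, not its actual schema:

```python
from datetime import datetime


def count_at(rows, moment):
    """Find the resource count in effect at `moment` for one workspace.

    rows: (resource_count, valid_from, valid_to) tuples, where valid_to is
    None for the row that is still active. Returns None if the workspace had
    no count at that time (e.g. before its first event).
    """
    for count, valid_from, valid_to in rows:
        if valid_from <= moment and (valid_to is None or moment < valid_to):
            return count
    return None


# Example: the workspace had 3 resources Jan 1-5, then 5 from Jan 5 onward.
rows = [
    (3, datetime(2024, 1, 1), datetime(2024, 1, 5)),
    (5, datetime(2024, 1, 5), None),
]
```

The same half-open-interval logic is what a SQL query against the table would express with a `BETWEEN`-style predicate on the timespan columns.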
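For the Lago side, here's a rough sketch of what `si_lago_api.py`-style authentication and error interpretation could look like, using Lago's usage-event ingestion endpoint. The endpoint path, payload shape, `LAGO_API_KEY` env var, and the `interpret_error` helper are all assumptions for illustration, not the actual module:

```python
import json
import os
import urllib.error
import urllib.request

LAGO_API_URL = "https://api.getlago.com/api/v1"  # assumed default base URL


class LagoApiError(Exception):
    pass


def interpret_error(status, body):
    """Map a Lago HTTP error status to a readable message (sketch)."""
    if status == 401:
        return "authentication failed: check LAGO_API_KEY"
    if status == 422:
        return f"Lago rejected the event: {body}"
    return f"unexpected Lago response {status}: {body}"


def send_usage_event(event, api_key=None):
    """POST one usage event to Lago, raising LagoApiError on failure."""
    api_key = api_key or os.environ["LAGO_API_KEY"]
    req = urllib.request.Request(
        f"{LAGO_API_URL}/events",
        data=json.dumps({"event": event}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as e:
        raise LagoApiError(interpret_error(e.code, e.read().decode())) from e
```

Centralizing the bearer-token header and the status-code translation in one place is the whole value of the module: the uploader lambda just builds events and calls `send_usage_event`.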
Copy/pasted versions of these are currently running in prod. We'll get them driven off these files after they merge.