MattTriano / analytics_data_where_house

An analytics engineering sandbox focusing on real estate prices in Cook County, IL
https://docs.analytics-data-where-house.dev/
GNU Affero General Public License v3.0

Implement DAG to clean up downloaded files and XComs in Airflow metadata database #6

Closed MattTriano closed 1 year ago

MattTriano commented 1 year ago

If ingestion proves reliable enough, it might even be feasible to add a task to the DAG that cleans up the downloaded file after it has been ingested.
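A minimal sketch of that idea, assuming a simple download → ingest → cleanup chain (the task names, file path, and DAG id below are illustrative, not the repo's actual code):

```python
from pathlib import Path

import pendulum
from airflow.decorators import dag, task


@dag(schedule=None, start_date=pendulum.datetime(2023, 1, 1), catchup=False)
def ingest_then_cleanup():
    @task
    def download() -> str:
        # Hypothetical stand-in for the real download step; returns the local file path.
        file_path = Path("/opt/airflow/data/raw/example.csv")
        file_path.parent.mkdir(parents=True, exist_ok=True)
        file_path.write_text("col_a,col_b\n1,2\n")
        return str(file_path)

    @task
    def ingest(file_path: str) -> str:
        # Hypothetical stand-in for loading the file into the warehouse.
        return file_path

    @task
    def cleanup(file_path: str) -> None:
        # Only runs after ingest succeeds, so the local copy is safe to delete.
        Path(file_path).unlink(missing_ok=True)

    cleanup(ingest(download()))


ingest_then_cleanup()
```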

MattTriano commented 1 year ago

I've started working on this. So far I've implemented functionality to identify and delete data files that are identical to an earlier pull. I've structured things so that some of that functionality can be reused in the next step, where I'll implement a DAG to delete some non-duplicated data, although I haven't settled on retention logic yet. Maybe it will be a keep_last_n_data_versions policy, maybe it would be better to have a keep_data_versions_from_past_n_days policy, or maybe both. I'll think through the cases (a rough sketch of what combining both rules might look like is below).
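A rough sketch of retention logic combining both rules, assuming data files carry a sortable timestamp suffix in their names (the naming scheme, parameters, and function name here are illustrative, not the repo's actual implementation):

```python
from datetime import datetime, timedelta
from pathlib import Path


def select_files_to_delete(
    data_dir: Path,
    keep_last_n_data_versions: int = 3,
    keep_data_versions_from_past_n_days: int = 30,
) -> list[Path]:
    """Return data files that fall outside both retention rules.

    Assumes file names end in a sortable timestamp, e.g.
    table_name_2023-01-31T120000.csv (illustrative, not the repo's scheme).
    """
    # Newest first, relying on the timestamp suffix sorting lexicographically.
    files = sorted(data_dir.glob("*.csv"), key=lambda p: p.name, reverse=True)

    # Rule 1: always keep the most recent n versions.
    keep = set(files[:keep_last_n_data_versions])

    # Rule 2: also keep anything pulled within the past n days.
    cutoff = datetime.now() - timedelta(days=keep_data_versions_from_past_n_days)
    for file_path in files:
        timestamp_str = file_path.stem.rsplit("_", 1)[-1]
        pulled_at = datetime.strptime(timestamp_str, "%Y-%m-%dT%H%M%S")
        if pulled_at >= cutoff:
            keep.add(file_path)

    return [file_path for file_path in files if file_path not in keep]
```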

MattTriano commented 1 year ago

After reviewing the sizes of XComs stored in the airflow_metadata_db and of logs in all non-scheduler log directories, I see that the contents of the /logs/scheduler dir account for 94% of the /logs disk usage. Upon inspecting a few scheduler log files, I see that there are ~25 MB of logs per DAG per day, overwhelmingly driven by an unnecessary warning that's slated to be removed in Airflow v2.5.2 (we're on v2.5.1 right now). So I'll settle for just clearing out old scheduler logs for now.
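A minimal sketch of a DAG that clears out old scheduler logs, assuming they live in per-date directories (YYYY-MM-DD) under /opt/airflow/logs/scheduler; the path, DAG id, and 14-day retention window are illustrative assumptions, not the repo's actual values:

```python
import shutil
from datetime import datetime, timedelta
from pathlib import Path

import pendulum
from airflow.decorators import dag, task

# Illustrative values; the real log path and retention window may differ.
SCHEDULER_LOG_DIR = Path("/opt/airflow/logs/scheduler")
MAX_LOG_AGE_DAYS = 14


@dag(schedule="@daily", start_date=pendulum.datetime(2023, 1, 1), catchup=False)
def clean_up_scheduler_logs():
    @task
    def delete_old_scheduler_log_dirs() -> None:
        # Scheduler logs are grouped into per-date directories (YYYY-MM-DD);
        # remove any directory older than the retention window.
        cutoff = datetime.now() - timedelta(days=MAX_LOG_AGE_DAYS)
        for date_dir in SCHEDULER_LOG_DIR.glob("????-??-??"):
            dir_date = datetime.strptime(date_dir.name, "%Y-%m-%d")
            if dir_date < cutoff:
                shutil.rmtree(date_dir, ignore_errors=True)

    delete_old_scheduler_log_dirs()


clean_up_scheduler_logs()
```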