Building the 411 for air quality in the United States: a texting platform accessible to all that provides actionable local information to protect you and your community.
Per this discussion, I think it's reasonable to begin storing sensor readings in Postgres and to update those readings every 10 minutes. This will also simplify our architecture, allowing us to store all zipcode data in Postgres as well (and build it once a week).
This PR is a biggie. It:
- Sets up Celery as a task worker.
- Creates two new ECS services, `worker` and `scheduler`, to run tasks of type `worker` and `scheduler` respectively.
- The scheduler will instruct the worker to pull all the data from the PurpleAir JSON API once every 10 minutes. This data will then be inserted into Postgres.
- The app will query Postgres and only fall back to directly querying PurpleAir if data is missing in Postgres.
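The Postgres-first read path with a PurpleAir fallback could be sketched roughly as below. The helper names (`load_from_postgres`, `fetch_from_purpleair`) are hypothetical stand-ins for the real query and API functions, injected as callables here so the fallback logic is self-contained:

```python
# Sketch of the read path: try Postgres first, hit PurpleAir only on a miss.
# `load_from_postgres` and `fetch_from_purpleair` are hypothetical stand-ins
# for the real query/API helpers, passed in so this function stays testable.
from typing import Callable, Optional


def get_reading(
    sensor_id: int,
    load_from_postgres: Callable[[int], Optional[dict]],
    fetch_from_purpleair: Callable[[int], dict],
) -> dict:
    """Return the stored reading, falling back to the PurpleAir API."""
    reading = load_from_postgres(sensor_id)
    if reading is not None:
        return reading
    # Miss: query PurpleAir directly. This should be rare, since the worker
    # refreshes Postgres every 10 minutes.
    return fetch_from_purpleair(sensor_id)
```

Keeping the data-access calls injected like this also makes it easy to unit-test the fallback behavior without a database or network.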
This will allow us to:
- Remove the dependency on sqlite3. We can start storing geographic data in Postgres and rebuilding it once per week in a Celery task.
- Remove the dependency on Memcached, since we will almost always find the data in Postgres.
- Start considering how we might store historical/time-series data.
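Putting the two recurring jobs together (the 10-minute sensor pull and the weekly zipcode/geographic rebuild), the Celery beat schedule might look something like this. The task names are placeholders, not the actual module paths in this repo:

```python
# Hypothetical Celery beat schedule for the two recurring jobs in this PR.
# Task names are placeholders; schedules are plain intervals in seconds.
beat_schedule = {
    "pull-purpleair-readings": {
        "task": "app.tasks.pull_purpleair_readings",  # placeholder name
        "schedule": 10 * 60.0,  # every 10 minutes
    },
    "rebuild-zipcode-data": {
        "task": "app.tasks.rebuild_zipcode_data",  # placeholder name
        "schedule": 7 * 24 * 60 * 60.0,  # once per week
    },
}
```

This dict would be assigned to `app.conf.beat_schedule` on the Celery app, with the `scheduler` ECS service running `celery beat` and the `worker` service running the tasks.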