datamade / clearstreets-web

Website that tracks where Chicago plows have been during a snowstorm.
http://clearstreets.org
MIT License

Start up and shut down ClearStreets based on the city's activation and deactivation of the snow tracker #9

Open fgregg opened 10 years ago

fgregg commented 10 years ago

Write a script to watch http://www.cityofchicago.org/city/en/depts/mayor/iframe/plow_tracker.html

When it is activated, turn the plow tracker on. When it is deactivated, turn the plow tracker off and archive the data.
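
A minimal sketch of such a watcher, assuming the page shows some fixed marker text while the tracker is active; the marker string and the start/stop hooks below are placeholders, not the real implementation:

```bash
#!/bin/bash
# Poll the city's plow tracker page and flip ClearStreets on/off when its state changes.
# MARKER and the start/stop commands are assumptions.
URL="http://www.cityofchicago.org/city/en/depts/mayor/iframe/plow_tracker.html"
MARKER="Plow Tracker is now active"      # placeholder text to look for
STATE_FILE=/tmp/plow_tracker_state

while true; do
    if curl -s "$URL" | grep -q "$MARKER"; then
        current=on
    else
        current=off
    fi
    previous=$(cat "$STATE_FILE" 2>/dev/null || echo off)
    if [ "$current" != "$previous" ]; then
        if [ "$current" = on ]; then
            echo "Tracker activated: start ClearStreets"              # e.g. launch_clearstreets.sh
        else
            echo "Tracker deactivated: stop ClearStreets and archive" # e.g. backup_clearstreets.sh
        fi
        echo "$current" > "$STATE_FILE"
    fi
    sleep 300   # check every 5 minutes
done
```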

derekeder commented 10 years ago

Oh boy ... you ready to automate a serious kludge?

My current process for this: I have an email & text message alert from Pingdom checking for specific text on the City's plow tracker website. This has proven to be pretty reliable, except when they change the text on the page :wink:

When I get the alert, I delete all rows in the clearstreets_live CartoDB table:

DELETE FROM clearstreets_live;
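
If this step were automated, the same statement could presumably be sent through CartoDB's SQL API; the ACCOUNT name and API key below are placeholders:

```bash
# Truncate the live table via the CartoDB SQL API (ACCOUNT and API key are placeholders).
curl -G "https://ACCOUNT.cartodb.com/api/v2/sql" \
  --data-urlencode "q=DELETE FROM clearstreets_live;" \
  --data-urlencode "api_key=YOUR_API_KEY"
```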

Then, I fire up the @smartchicago ClearStreets processing EC2 server, ssh in, and run bash launch_clearstreets.sh in clearstreets-processing (script here).
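
Scripted, that step might look roughly like this; the instance ID, user, and hostname are placeholders, and the real launch script lives in clearstreets-processing:

```bash
# Start the processing server and kick off the launch script (placeholders throughout).
aws ec2 start-instances --instance-ids i-0123456789abcdef0
aws ec2 wait instance-running --instance-ids i-0123456789abcdef0
ssh ubuntu@clearstreets-processing.example.com \
  'cd clearstreets-processing && nohup bash launch_clearstreets.sh > launch.log 2>&1 &'
```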

To make sure it's running properly, I check ps aux, the size of slurp2gpx/plows.db, and the presence of files in /gpx and /osm. I also run this query in CartoDB to get the most recent date:

SELECT date_stamp from clearstreets_live ORDER BY date_stamp DESC LIMIT 1;
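
Those checks could be bundled into a quick status script, something like the following; paths assume the clearstreets-processing checkout, and the CartoDB ACCOUNT name is again a placeholder:

```bash
# Rough health checks for a processing run.
ps aux | grep -v grep | grep clearstreets   # processing scripts still running?
ls -lh slurp2gpx/plows.db                   # sqlite db present and growing?
ls gpx/ osm/ | head                         # gpx/osm output being written?

# Most recent processed timestamp, via the CartoDB SQL API (ACCOUNT is a placeholder).
curl -G "https://ACCOUNT.cartodb.com/api/v2/sql" \
  --data-urlencode "q=SELECT date_stamp FROM clearstreets_live ORDER BY date_stamp DESC LIMIT 1"
```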

Then, I add some text to the home page with the proper date and message like this:

[screenshot: home page notice with date and message, 2013-12-17]

and then deploy to Heroku and hit http://clearstreets.org/flush_cache to clear the website cache (or wait for it to expire in 1 hour).
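
Scripted, those last two steps are just the following, assuming the standard heroku git remote:

```bash
# Deploy the updated home page text and clear the site cache.
git push heroku master
curl http://clearstreets.org/flush_cache
```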

derekeder commented 10 years ago

Oh I forgot the best part.

When the city's plow tracker turns off, I back up all of our data and upload it to an S3 bucket:

  • kill -9 all running scripts
  • nohup bash process_all_clearstreets.sh & re-runs path inference on all plow points (takes anywhere from 5-24 hours)
  • bash backup_clearstreets.sh zips up the gpx, osm and sqlite database and moves them to an S3 bucket with the current day's date (a rough sketch of this follows the list)
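
A rough sketch of what that backup amounts to, assuming the same gpx/, osm/, and slurp2gpx/plows.db paths as above; the bucket name is a placeholder, and this is not the contents of the real backup_clearstreets.sh:

```bash
# Zip up the run's outputs and push them to S3 under today's date (placeholders throughout).
DATE=$(date +%Y-%m-%d)
zip -r "clearstreets_$DATE.zip" gpx/ osm/ slurp2gpx/plows.db
aws s3 cp "clearstreets_$DATE.zip" "s3://clearstreets-backups/$DATE/clearstreets_$DATE.zip"
```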

fgregg commented 10 years ago

Didn't say we were ready for this, just saying this is an issue.
