Setting up Logstash (or an equivalent) might be the best way to log all of the work being done by the scrapers. Right now the city of Rochester's website is erroring out, and so much is being logged that it is difficult to pick out the actual error. Writing to disk sounds risky, and pushing logs to MongoDB might be excessive. More research is needed.
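Whatever backend ends up being chosen, the noise problem could be tackled independently with structured, per-scraper logging. A minimal sketch, assuming Python scrapers and using only the standard library: each scraper gets a named logger that emits JSON lines (the shape Logstash ingests naturally), so errors can be filtered out of the flood with a one-liner. The logger name `"rochester"` and the field names are assumptions for illustration.

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON line, tagged with the scraper name."""

    def format(self, record):
        return json.dumps({
            "scraper": record.name,          # which scraper produced the record
            "level": record.levelname,       # INFO / ERROR / ...
            "message": record.getMessage(),
        })

def make_scraper_logger(name, stream):
    """One named logger per scraper, writing JSON lines to `stream`."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    logger.propagate = False                 # keep records out of the root logger
    handler = logging.StreamHandler(stream)
    handler.setFormatter(JsonFormatter())
    logger.addHandler(handler)
    return logger

if __name__ == "__main__":
    buf = io.StringIO()                      # stand-in for a file or socket
    log = make_scraper_logger("rochester", buf)
    log.info("fetched page 1")
    log.error("HTTP 500 from city site")

    # Pulling just the errors out of the noise is now trivial:
    errors = [json.loads(line) for line in buf.getvalue().splitlines()
              if json.loads(line)["level"] == "ERROR"]
    print(errors[0]["message"])
```

Swapping the `StringIO` for a TCP handler pointed at Logstash (or a rotating file handler, if writing to disk turns out to be acceptable) would not change the scraper-facing code at all, which makes this easy to prototype before committing to a backend.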