talos closed this issue 9 years ago
Sounds right to me (we wanted to use Postgres and Flask earlier this summer but decided that we didn't need to for now). @rcackerman and @derekeder what do you think?
Yah, looks right. It's best not to have people install unnecessary requirements. When you do end up using Postgres and/or Flask, it's easy enough to add them back in!
Cool! If it turns out you want to provide some sort of heavyweight frontend + database, I would recommend bundling it either as a separate project from the scraper + data (keeping it modular), or at least splitting the requirements into separate files, so that anyone who wants to can isolate one box for the scrape, another for the server, etc.
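For example, the split into separate requirements files could look something like this (the file names and exact package names here are just an illustration, not taken from the repo):

```text
# requirements.txt -- minimal deps for the scraper box
beautifulsoup4
scrapelib

# requirements-server.txt -- extra deps only the frontend/database box needs
Flask
SQLAlchemy
psycopg2
```

Then the scrape box only runs `pip install -r requirements.txt`, and the server box installs both files.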
Thanks @derekeder and @talos!
I agree with @talos about keeping things modular. That gives you, and anyone interested in this data, much more flexibility in using this scraper and its data.
Thanks @talos - I'll take this as a bump to finally get rid of all the extraneous files on the master branch.
Sounds great @rcackerman.
Installation is very bloated by a lot of heavy requirements -- psycopg2 for Postgres, plus Flask and its attendant ORM (SQLAlchemy). It looks like all that really gets used in `do.py` is beautifulsoup and scrapelib, so that's all we should make our target machines install.
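One quick way to double-check which third-party modules a script like `do.py` actually uses is to walk its AST and collect the top-level import names. A minimal sketch (this helper is not part of the repo, just an illustration):

```python
import ast


def top_level_imports(source: str) -> set:
    """Return the top-level module names imported by a Python source string."""
    tree = ast.parse(source)
    mods = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                mods.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom):
            # Skip relative imports (level > 0); they are local modules.
            if node.module and node.level == 0:
                mods.add(node.module.split(".")[0])
    return mods


sample = "import scrapelib\nfrom bs4 import BeautifulSoup\nimport os\n"
print(sorted(top_level_imports(sample)))  # -> ['bs4', 'os', 'scrapelib']
```

Running it over `do.py` (e.g. `top_level_imports(open("do.py").read())`) and filtering out the standard library leaves the packages that actually belong in requirements.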