We need a way to automatically re-run the web scraping.
We could run a cron job every day or week. It shouldn't interrupt the Go server, since our Python scripts run separately from it.
If we also expect incremental updates, we should be writing to a database instead of JSON files.
Overall, a database plus separate scripts will help with scraping multiple schools #4. However, the scripts should not overwrite everything on each run; one option is writing per semester/quarter, though we still need to decide when each semester/quarter actually gets written.
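A minimal sketch of the cron idea, assuming the scraper lives at a hypothetical path like `/opt/scraper/scrape.py` (the path, schedule, and log file are all placeholders, not decided yet):

```cron
# Run the scraper every Monday at 03:00, in its own process,
# so it never touches the Go server.
0 3 * * 1 /usr/bin/python3 /opt/scraper/scrape.py >> /var/log/scraper.log 2>&1
```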
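One way to sketch the "write per semester/quarter without overwriting" idea is an upsert keyed by (school, term, course), so re-scraping one term only refreshes that term's rows. This is a hypothetical SQLite example; the table name, columns, and keys are assumptions, not our actual schema:

```python
import sqlite3

def upsert_sections(conn, school, term, sections):
    """Insert or refresh scraped rows for one (school, term).

    Re-running the scraper for a term updates only that term's rows;
    other terms and schools are left untouched.
    """
    conn.execute(
        """CREATE TABLE IF NOT EXISTS sections (
               school TEXT,
               term   TEXT,
               course TEXT,
               seats  INTEGER,
               PRIMARY KEY (school, term, course))"""
    )
    conn.executemany(
        """INSERT INTO sections (school, term, course, seats)
           VALUES (?, ?, ?, ?)
           ON CONFLICT(school, term, course)
           DO UPDATE SET seats = excluded.seats""",
        [(school, term, course, seats) for course, seats in sections],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
upsert_sections(conn, "demo-school", "2024-fall", [("CS101", 30)])
upsert_sections(conn, "demo-school", "2025-spring", [("CS101", 25)])
# Re-scraping fall updates only the fall row:
upsert_sections(conn, "demo-school", "2024-fall", [("CS101", 12)])
rows = conn.execute(
    "SELECT term, seats FROM sections ORDER BY term"
).fetchall()
# rows -> [("2024-fall", 12), ("2025-spring", 25)]
```

This also maps cleanly onto multiple schools (#4), since the school is just part of the key.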