johndpjr / AgTern


Make the web scraper more robust #75

Closed JeremyEastham closed 11 months ago

JeremyEastham commented 1 year ago

Currently, if the web scraper hits certain errors, we lose all of the data we just scraped! That obviously isn't desirable, but some errors make it impossible to write to the database at all (such as a failure in the database itself). We probably shouldn't try to commit the data automatically either, since it could contain errors, so what do we do with it?

At least during development, it could be useful to have some sort of data dumping mechanism to help figure out what went wrong.
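
As a starting point, here's a minimal sketch of what that could look like. The names (`dump_scraped_data`, `commit_or_dump`, the `data_dumps/` directory) are placeholders, not existing AgTern code; the idea is just to wrap the commit step so that records already scraped get written to a timestamped JSON file instead of being lost:

```python
import json
import logging
from datetime import datetime
from pathlib import Path

# Hypothetical location for dumps from failed runs.
DUMP_DIR = Path("data_dumps")

def dump_scraped_data(records: list[dict], reason: str) -> Path:
    """Write scraped records to a timestamped JSON file so a failed
    run doesn't lose everything. Returns the path of the dump file."""
    DUMP_DIR.mkdir(parents=True, exist_ok=True)
    timestamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dump_path = DUMP_DIR / f"scrape-{timestamp}.json"
    payload = {"reason": reason, "records": records}
    # default=str keeps non-serializable fields (dates, etc.) from crashing the dump
    dump_path.write_text(json.dumps(payload, indent=2, default=str))
    logging.error("Commit failed (%s); dumped %d records to %s",
                  reason, len(records), dump_path)
    return dump_path

def commit_or_dump(records: list[dict], commit) -> None:
    """Try to persist records; on any failure, dump them to disk and re-raise."""
    try:
        commit(records)  # e.g. whatever wraps the database session commit
    except Exception as exc:
        dump_scraped_data(records, reason=repr(exc))
        raise
```

Wrapping only the commit step (rather than the whole scrape) means a database outage still leaves us with a dump we can inspect, and later re-import manually once we've confirmed the data is good.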