To make adding new datasets easy for enterprising civic tech hackers, there needs to be a guide that explains how to contribute new scrapers and pull the data into Got Gastro.
Should cover:
- [x] Writing and publishing a scraper on Morph
- [x] Pulling the data into gotgastro_scraper
- [x] What to do in your scraper, vs what to do in gotgastro_scraper
- [x] What the Got Gastro architecture looks like (nested scrapers, gotgastro_scraper, the /reset function, how it interplays with email alerts)
- [x] The PR process
- [x] To geocode, or not to geocode
- [x] When to normalise the data (fixing up dates, multiple addresses)
- [x] Handling complex HTML data (by converting it to Markdown)