When beginning to write a new scraper, I first had to get the server running with test data to make sure I understood what the parser should output. This is my attempt to automate that process for other new contributors.
The primary changes are:
- Copy the example data from the README to `scrapers/example_data.csv`
- Write `scrapers/collate_outputs.py`, which collates data from each `scrapers/$REGION/parse_output.csv` (or, if it doesn't find any, from `scrapers/example_data.csv`) and dumps the result to `data/all_regions.csv`
- Add yarn scripts to populate the data and start the server, respectively
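For reviewers, the collation step in the second bullet can be sketched roughly as follows. This is a simplified, hypothetical version of `scrapers/collate_outputs.py` (column handling and de-duplication in the real script may differ):

```python
import csv
import glob
import os

SCRAPERS_DIR = "scrapers"
EXAMPLE_DATA = os.path.join(SCRAPERS_DIR, "example_data.csv")
OUTPUT_PATH = os.path.join("data", "all_regions.csv")


def collate():
    # Find every region's parser output, e.g. scrapers/us_ca/parse_output.csv.
    paths = glob.glob(os.path.join(SCRAPERS_DIR, "*", "parse_output.csv"))

    # Fall back to the bundled example data when no scraper has run yet.
    if not paths:
        paths = [EXAMPLE_DATA]

    header = None
    rows = []
    for path in paths:
        with open(path, newline="") as f:
            reader = csv.reader(f)
            file_header = next(reader)
            # Assumes all parse_output.csv files share the same header.
            if header is None:
                header = file_header
            rows.extend(reader)

    # Dump the combined rows to data/all_regions.csv.
    os.makedirs(os.path.dirname(OUTPUT_PATH), exist_ok=True)
    with open(OUTPUT_PATH, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)


if __name__ == "__main__":
    collate()
```

The yarn scripts in the third bullet would then just run this script before launching the server.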
I know this PR is pretty trivial, so feel free to reject if you think it adds more clutter than value. 😁