As @Kirkman found in #597, a scraper can stop producing output without triggering an error in the workflow.
While a few states keep WARN and non-WARN layoffs in the same database, it's unlikely that many states would ever see a reduction in the number of incidents reported in the files getting scraped. So if a state drops from 283 reports to 123, or to 0, that should get flagged. A simple comparison of CSV row counts against earlier snapshots would have caught the Missouri problem.
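A minimal sketch of that check, assuming the scraped CSVs land in an `exports/` directory and the prior snapshot of counts is kept as a JSON file; the paths and file layout here are hypothetical:

```python
# Hypothetical row-count regression check: compare each CSV's current row
# count against the last snapshot and fail if any count dropped.
import csv
import json
import sys
from pathlib import Path

EXPORT_DIR = Path("exports")        # hypothetical location of scraped CSVs
SNAPSHOT = Path("row_counts.json")  # hypothetical snapshot of prior counts


def count_rows(csv_path: Path) -> int:
    """Count data rows in a CSV, excluding the header."""
    with csv_path.open(newline="", encoding="utf-8") as f:
        return max(sum(1 for _ in csv.reader(f)) - 1, 0)


def main() -> None:
    previous = json.loads(SNAPSHOT.read_text()) if SNAPSHOT.exists() else {}
    current = {p.name: count_rows(p) for p in EXPORT_DIR.glob("*.csv")}

    flagged = []
    for name, count in current.items():
        before = previous.get(name)
        # Flag any state whose count shrank, e.g. 283 -> 123 or 283 -> 0.
        if before is not None and count < before:
            flagged.append(f"{name}: {before} -> {count}")

    # Save the new counts for the next run.
    SNAPSHOT.write_text(json.dumps(current, indent=2))

    if flagged:
        print("Row counts dropped:\n" + "\n".join(flagged))
        sys.exit(1)  # non-zero exit fails the scheduled run


if __name__ == "__main__":
    main()
```

Exiting non-zero means a scheduled runner would surface the drop as a failed run rather than silently publishing a shrunken file.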
A weekly GitHub Action built in the warn-support repo, perhaps?