There's the potential here to broaden this to a general report/doc collection project. CRS reports, CBO reports, GAO reports, committee reports, statements of administration policy, etc. Those are all individually centralized, but collectively fragmented. :)
As a start, I'm already doing this for GAO reports in Scout, via a Ruby script I could refactor into a standalone tool.
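For anyone curious what a standalone version might look like, here's a minimal sketch of such a scraper. The listing URL and CSS selectors below are hypothetical placeholders, not the actual structure of gao.gov or the Scout script:

```ruby
require 'net/http'
require 'uri'
require 'json'
require 'nokogiri' # gem install nokogiri

# Hypothetical listing page -- the real URL and page structure on
# gao.gov would need to be inspected before using this.
LISTING_URL = "http://www.gao.gov/reports/month"

def fetch(url)
  Net::HTTP.get(URI.parse(url))
end

def scrape_reports(html)
  doc = Nokogiri::HTML(html)
  # "div.listing", "a", and "span.date" are placeholder selectors,
  # not the GAO site's actual markup.
  doc.css("div.listing").map { |node|
    link = node.at_css("a")
    next unless link
    date = node.at_css("span.date")
    {
      "title" => link.text.strip,
      "url"   => link["href"],
      "date"  => date && date.text.strip
    }
  }.compact
end

if __FILE__ == $0
  reports = scrape_reports(fetch(LISTING_URL))
  File.write("gao-reports.json", JSON.pretty_generate(reports))
  puts "Saved #{reports.size} reports to gao-reports.json"
end
```

A real version would probably write one JSON file per report, keyed by report number, so repeated runs only fetch what's new.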
This is now proceeding at unitedstates/inspectors-general, so closing this to focus attention there.
I'm waiting to get at least one or two scrapers done before formally making a public call for contributions, but we have a public spreadsheet that tracks the location of every IG's site, what each one publishes, and includes a cell for people to claim IGs. When we formalize this further, we'll start using GitHub's issue tracker for claiming instead.
How many Inspector General sites would need to be scraped to collect these reports? Is this feasible?
Or is there an internal government network or website that already collects this information, which we could try to convince someone to make public?