Closed · phildini closed this 2 years ago
Hello! 👋 Thanks for initiating the process to configure a new CDP instance.
I am a bot that will validate the information provided in your form. If any check fails, please update your issue by opening the '⋯' dropdown in the top-right-corner of your GitHub Issue and selecting 'Edit'. I will automatically rerun the checks after you update the issue to validate the changes.
A member from the CDP team will respond as soon as possible!
✅ Governing body type is an accepted value.
✅ @phildini has been marked as the instance maintainer.
✅ CouncilDataProject/alameda is available.
✅ The municipality's Legistar instance contains the minimum required CDP event ingestion data.
```json
{
  "municipality": "The City of Alameda",
  "governing_body_type": "city council",
  "municipality_slug": "alameda",
  "python_municipality_slug": "alameda",
  "infrastructure_slug": "cdp-alameda-d3dabe54",
  "maintainer_or_org_full_name": "phildini",
  "hosting_github_username_or_org": "CouncilDataProject",
  "hosting_github_repo_name": "alameda",
  "hosting_github_url": "https://github.com/CouncilDataProject/alameda",
  "hosting_web_app_address": "https://councildataproject.github.io/alameda",
  "firestore_region": "us-central"
}
```
To proceed with the deployment process, please do the following:
1. Run `make login` in `cdp-backend/dev-infrastructure` and log in to the CDP gcloud and pulumi accounts.
2. Run `make init project=cdp-alameda-d3dabe54` in `cdp-backend/dev-infrastructure`.

More details on the `make` commands can be found in cdp-backend.
This comment was written by a bot!
Hey @phildini Thanks for opening this issue! Excited at the potential for deploying in Alameda!
I am attempting to rerun the bot just to see if we get the weird `city_council` bug again, but on inspection of the actual data, the original bot report of "we have everything we need except the video links" does seem accurate.
Here is the data we received from Legistar for the last month of meetings, for example: alameda.md (JSON blob as markdown because GitHub can't handle JSON upload).
What that basically means is: we can easily pull information about the event dates and times, the minutes items, the body, and a few more items.
Interestingly, I don't think the Alameda Legistar instance stores voting records or councilmember information, which is quite frustrating: even after a few minutes of searching, I can't find any city council voting records for Alameda. What is it with cities not making councilmember voting records easily accessible? :exploding_head:
The big item that is missing is the "video link," but I see that Alameda follows the same pattern as our existing King County deployment, which is basically: the Legistar data includes a link to the "event page," the event page has a link to the "video page," and you can get the video from there.
Example:
Our King County scraper does basically that process and you can see that code here: https://github.com/CouncilDataProject/cdp-scrapers/blob/main/cdp_scrapers/instances/kingcounty.py#L83
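The event-page-to-video-page hop can be sketched roughly like this. This is a minimal, standard-library-only illustration; the HTML structure and "Video" link text it assumes are hypothetical, not the actual Alameda Legistar markup:

```python
# Hypothetical sketch of the "event page -> video page" hop described above.
# The markup and link text here are assumptions for illustration only.
from html.parser import HTMLParser


class VideoLinkFinder(HTMLParser):
    """Collect href values of anchors whose link text mentions 'video'."""

    def __init__(self):
        super().__init__()
        self._current_href = None
        self.video_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._current_href and "video" in data.lower():
            self.video_links.append(self._current_href)

    def handle_endtag(self, tag):
        if tag == "a":
            self._current_href = None


def find_video_links(event_page_html: str) -> list:
    """Return candidate video-page links found in an event page's HTML."""
    finder = VideoLinkFinder()
    finder.feed(event_page_html)
    return finder.video_links


# In a real scraper you would fetch the EventInSiteURL, run the response
# body through find_video_links, then repeat the hop on the video page.
sample = '<a href="/meeting/123">Agenda</a><a href="https://example.org/v/123">Video</a>'
print(find_video_links(sample))  # ['https://example.org/v/123']
```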
Why am I telling you all of this:
If you want to deploy a CDP instance for Alameda and for us to cover it, you would need to fill in that function I linked for King County, but for Alameda (honestly, you could probably copy and paste a lot of the code in the King County scraper file kingcounty.py as a good starting point). (cc @dphoria: this is where we should try to generalize this workflow; if the returned `EventInSiteURL` is populated, we should potentially try this automatically?)
Once that is done, we can deploy a new instance of CDP for Alameda which would have:
but it would be missing:
All in all, Alameda should be decently easy to spin up and get running if you are interested in writing the scraper, and CDP would definitely be able to host it and help get you set up.
Let me know what you think and thanks for even stopping by and making an issue!
Hey! Thanks for getting back to me.
So, hilariously, they keep the votes in the PDF of the minutes (see the "Minutes" doc linked here), but they don't put them into Legistar; I bet it's because they don't know how.
I'm happy to take a stab at making the scraper, where would I make a PR for that?
> I'm happy to take a stab at making the scraper, where would I make a PR for that?
Awesome! Feel free to make a PR to cdp-scrapers and place it in the "instances" sub-module.
> So, hilariously, they keep the votes in the PDF of the minutes (see "Minutes" doc linked here) but they don't put them into Legistar, I bet because they don't know how
Oh, interesting. I didn't even open those because it looked like they were posted with quite a delay. But that makes sense, I guess. Still sad there isn't a single, easily findable "show me the voting records for this person" view (again, I don't live in Alameda, so maybe I just missed something) -- and I guess that's somewhat what CDP is trying to do anyway :joy:
The good note, then, is that because they do store them, if you wanted to get an Alameda scraper and CDP instance set up, we can do that now; and if in the future you wanted to get that info in as well, you would just need to update the scraper to pull and ingest it too.
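For whenever that future scraper update happens, the parsing half might look something like this. The roll-call format assumed here ("AYES: ... NOES: ...") is hypothetical and would need to be checked against the actual Alameda minutes PDFs; this only illustrates the step after PDF text extraction:

```python
# Hypothetical sketch of pulling votes out of extracted minutes text,
# assuming a roll-call layout like "AYES: Smith, Jones NOES: Lee ABSENT: None".
# The real Alameda PDFs may use a different format entirely.
import re

VOTE_PATTERN = re.compile(
    r"(AYES|NOES|ABSTAIN|ABSENT):\s*(.*?)(?=(?:AYES|NOES|ABSTAIN|ABSENT):|$)",
    re.S,
)


def parse_roll_call(minutes_text: str) -> dict:
    """Map each councilmember name to a vote decision string."""
    votes = {}
    for decision, names in VOTE_PATTERN.findall(minutes_text):
        for name in names.split(","):
            name = name.strip()
            if name and name.lower() != "none":
                votes[name] = decision
    return votes


sample = "AYES: Smith, Jones NOES: Lee ABSENT: None"
print(parse_roll_call(sample))  # {'Smith': 'AYES', 'Jones': 'AYES', 'Lee': 'NOES'}
```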
Anyway, happy to help with that scraper over GitHub issue back and forth or over Slack: https://join.slack.com/t/councildataproject/shared_invite/zt-112g7qylg-I0B5_AvtLvER9gFwP0je9Q
Note to self- this deployment request fixed a bug in the validation bot: https://github.com/CouncilDataProject/cookiecutter-cdp-deployment/commit/9a1b3c968d64e76bc4aa34bd4bd12499f37deb9e
@phildini due to some recent work by @dphoria, we can now fully deploy this with ease. Are you still interested? I can fully deploy it later today.
I am definitely still interested.
Will attempt to deploy later today then!
/cdp-deploy
:tada: :tada: Repository Created :tada: :tada:
A new CouncilDataProject Instance Repository was created (CouncilDataProject/alameda), external collaborator added (@phildini), and cookiecutter files generated and pushed to repository.
The instance is setting itself up right now and the process will take around 10 minutes to complete. Once completed, a CDP maintainer will comment on this issue with your instance's website link. See the instance's GitHub Action job history for more details on the deployment setup progress.
Your CDP instance will be populated with data within 6 hours of website creation.
At any point in the future if you would like to destroy this instance, please just add a comment to this thread and a maintainer will help you.
- Run `make gen-key project=cdp-alameda-d3dabe54` and store the generated key as the `GOOGLE_CREDENTIALS` secret.
- Run `make set-cors project=cdp-alameda-d3dabe54`.

Security rules for the `cdp-alameda-d3dabe54` Firebase project:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read;
    }
  }
}
```

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
    }
  }
}
```
To destroy this instance:

1. Run `make login` and log in to the CDP gcloud and pulumi accounts.
2. Run `make destroy project=cdp-alameda-d3dabe54`.

More details on the `make` commands can be found in cdp-backend.
Deployment Status - Complete
Website: https://councildataproject.org/alameda
Initial Data Backfill (just the last couple of days): https://github.com/CouncilDataProject/alameda/runs/6632548845?check_suite_focus=true
Data backfill should finish processing in an hour or so. Feel free to backfill more data if you would like to.
Highly recommend reading the admin-docs for further setup / customization.
Beautiful, thank you! I'll review soon.
Seems like you may want to add a couple of "ignore minutes item patterns" to the scraper init: https://github.com/CouncilDataProject/alameda/blob/main/python/cdp_alameda_backend/scraper.py#L18 -- see the `LegistarScraper` docstring for details: https://github.com/CouncilDataProject/cdp-scrapers/blob/main/cdp_scrapers/legistar_utils.py#L567
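For a standalone picture of what those ignore patterns accomplish, here is a sketch of pattern-based minutes-item filtering. The real behavior lives in cdp-scrapers' `LegistarScraper`; the specific patterns below are illustrative assumptions, not recommendations for Alameda:

```python
# Standalone illustration of "ignore minutes item patterns" filtering.
# The actual filtering is done inside cdp-scrapers' LegistarScraper (see the
# docstring linked above); this just shows the idea with stdlib regex matching.
import re

# Example patterns -- which items are worth ignoring for Alameda is an
# assumption here, not taken from the real scraper config.
IGNORE_MINUTES_ITEM_PATTERNS = [
    "roll call",
    "pledge of allegiance",
    "adjournment",
]


def keep_minutes_item(item_name: str, patterns=IGNORE_MINUTES_ITEM_PATTERNS) -> bool:
    """Return False when the minutes item matches any ignore pattern."""
    return not any(re.search(p, item_name, re.IGNORECASE) for p in patterns)


items = ["Roll Call", "Approval of Minutes", "Adjournment"]
print([i for i in items if keep_minutes_item(i)])  # ['Approval of Minutes']
```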
From: https://councildataproject.org/alameda/#/events/9ebe703133a0
Municipality Name: The City of Alameda
Governing Body Type: city council
Maintainer GitHub Name: phildini
Legistar Client Id: alameda
Municipality Timezone: America/Los_Angeles
Municipality Slug: alameda
Firestore Region: No response
Code of Conduct
edit by @JacksonMaxfield, testing if this instance can be deployed with new LegistarScraper setup