@marshmallowrobot has some more detail.
Some ideas:
Adding some more details...
The build fails in the `get_contributors.js` file with the following error messages:
```
(node:1722) UnhandledPromiseRejectionWarning: HttpError: You have exceeded a secondary rate limit. Please wait a few minutes before you try again.
    at /home/runner/work/InnerSourceLearningPath/InnerSourceLearningPath/InnerSourceLearningPath/scripts/node_modules/@octokit/request/dist-node/index.js:66:23
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
    at async module.exports (/home/runner/work/InnerSourceLearningPath/InnerSourceLearningPath/InnerSourceLearningPath/scripts/get_contributors.js:13:49)
    at async /home/runner/work/InnerSourceLearningPath/InnerSourceLearningPath/InnerSourceLearningPath/scripts/generate_learning_path_markdown.js:65:30
```
For everyone's reference, here are GitHub's rate limit guidelines:
I added the `rateLimit` info to the `getContributors.js` query and learned that each call (now) costs 3. We make one call per asciidoc file:

- `/introduction`, `/product-owner`, etc
- `/introduction/de`, `/introduction/zh`, etc

At 3 per file, that's not enough to trip the 5k/hour request limit.

Did ya'll notice how the error message says "secondary rate limit"?!?? Because I TOTALLY DID NOT NOTICE THIS. Turns out a secondary rate limit is its own thing. Quoth the GitHubs:
Reading comprehension FTW!
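(For anyone reproducing this, here's a minimal sketch of what asking GraphQL to report its own `rateLimit` cost can look like. The query body, variable names, and cost shown are illustrative, not the actual `getContributors.js` query.)

```js
// Hypothetical sketch: ask the GraphQL API to report the cost of a query.
// The repository selection below is illustrative, not the real contributor query.
const { graphql } = require("@octokit/graphql");

async function main() {
  const result = await graphql(
    `
      query ($owner: String!, $name: String!) {
        rateLimit {
          cost      # points this particular call costs (the contributor query reportedly costs 3)
          remaining # points left in the current hourly window
          resetAt   # when the window resets
        }
        repository(owner: $owner, name: $name) {
          nameWithOwner
        }
      }
    `,
    {
      owner: "InnerSourceCommons",
      name: "InnerSourceLearningPath",
      headers: { authorization: `token ${process.env.GITHUB_TOKEN}` },
    }
  );
  console.log(result.rateLimit);
}

main();
```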
Great info, @marshmallowrobot ❗️ I wonder if there is some kind of open source module out there for calling GitHub that takes all of this stuff into account (with appropriate retries)?
I wonder if you've already tried the SDK that GitHub apparently provides: https://octokit.github.io/rest.js/v19#throttling (I'd guess you're already using it, so ignore this if that's the case. If not, maybe it's still useful.) The docs mention a plugin that handles, or at least helps with, the rate limiter.
And there's this unrelated JS thing: https://www.npmjs.com/package/express-rate-limit and this article that does sound sort-of useful: https://www.useanvil.com/blog/engineering/throttling-and-consuming-apis-with-429-rate-limits/
@marshmallowrobot says that the throttling module appears to work ‼️ Just need to play around with the throttling settings.
@lenucksi Thank you so much for that reference! That plugin ended up being the solution!
Please check out the PR here and let me know your feedback. TY! #503 throttle graphql requests
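For anyone landing here later, here's a minimal sketch of what wiring up that throttling plugin can look like; the handler names come from the plugin's docs, but the settings and usage are illustrative rather than the exact code in the PR.

```js
// Hypothetical sketch of @octokit/plugin-throttling, not the exact PR code.
const { Octokit } = require("@octokit/rest");
const { throttling } = require("@octokit/plugin-throttling");

const ThrottledOctokit = Octokit.plugin(throttling);

const octokit = new ThrottledOctokit({
  auth: process.env.GITHUB_TOKEN,
  throttle: {
    onRateLimit: (retryAfter, options, kit, retryCount) => {
      kit.log.warn(`Rate limit hit for ${options.method} ${options.url}`);
      // Returning true tells the plugin to wait and retry; give up after a couple of tries.
      return retryCount < 2;
    },
    onSecondaryRateLimit: (retryAfter, options, kit, retryCount) => {
      kit.log.warn(`Secondary rate limit hit for ${options.method} ${options.url}`);
      return retryCount < 2;
    },
  },
});

// Requests made through this instance (REST and GraphQL) are throttled and retried.
octokit
  .graphql(`{ rateLimit { cost remaining resetAt } }`)
  .then((result) => console.log(result.rateLimit));
```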
If the added build time (due to throttling) is too much to bear, I have an alternative idea:
Use the git command line to fetch committers. I know that attributing folks who don't git-commit changes was something important to ya'll. But, if we require that all contributors use GitHub going forward, then this may not be as important tomorrow as it is today.
The `git shortlog` command can quickly spit out a list of names and email addresses of all committers to a given file path. We could make a one-time API request for all repo users, and match the avatars up with the `git shortlog` output to populate the contributors per file.
That said, I'm not sure how to make git commands execute from within node, or whether those commands will also be rate limited by GitHub.
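In case it helps evaluate the idea, here's a minimal sketch of running `git shortlog` from Node with the built-in `child_process` module (the file path and parsing below are illustrative). Since it reads the local clone's history, this part shouldn't touch GitHub's API at all.

```js
// Hypothetical sketch: list committers for one file using the local git history,
// with no GitHub API calls. The path passed in at the bottom is illustrative.
const { execFileSync } = require("child_process");

function committersForFile(filePath) {
  // -s: commit counts, -n: sort by count, -e: include emails.
  // HEAD must be given explicitly when shortlog isn't run from a terminal.
  const output = execFileSync(
    "git",
    ["shortlog", "-sne", "HEAD", "--", filePath],
    { encoding: "utf8" }
  );

  // Each line looks like: "    12\tJane Doe <jane@example.com>"
  return output
    .split("\n")
    .map((line) => line.trim().match(/^(\d+)\t(.+) <(.+)>$/))
    .filter(Boolean)
    .map(([, commits, name, email]) => ({ commits: Number(commits), name, email }));
}

console.log(committersForFile("introduction/README.md"));
```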
Thanks for this work and for thinking things through, @marshmallowrobot ❗️ How long is the build? I’m inclined to leave things just the way they are and wait and see if it becomes a problem.
Nah, it's not bad. It's like 2-3 minutes currently.
Seems fine. A little slow for development, but we could remedy that by having an English-only build or something if needed. Seems fine for now, though.
We can't publish to the website. See https://github.com/InnerSourceCommons/InnerSourceLearningPath/actions/runs/3228272591/jobs/5284148344