mastermatt closed this issue 4 years ago
Looks like you have too many deps.
Also, those release notes are only cached per repo; this is by design, so that credentials used to fetch them aren't shared across repos.
We do cache all HTTP GET requests here:
https://github.com/renovatebot/renovate/blob/c57bb612940831e1b6be29c1a86affe05bb92fee/lib/util/http/index.ts#L122-L125
That cache is never reset, so it is shared across all repos as long as the URL and headers (auth) match.
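For illustration, that pattern amounts to something like the sketch below: memoize GET responses keyed on URL plus auth header, so responses fetched with one repo's credentials are never served to another. The names are mine, not Renovate's actual internals.

```ts
// Sketch of per-run GET memoization, keyed on URL + Authorization header.
// Illustrative names only, not the actual Renovate code.
const memCache = new Map<string, Promise<unknown>>();

async function cachedGetJson<T>(
  url: string,
  headers: Record<string, string> = {},
): Promise<T> {
  const key = `${url}:${headers['authorization'] ?? ''}`;
  let result = memCache.get(key);
  if (!result) {
    // Cache the promise itself so concurrent callers share one in-flight request.
    result = fetch(url, { headers }).then((res) => res.json());
    memCache.set(key, result);
  }
  return result as Promise<T>;
}
```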
We actually do some non-ephemeral caching too: https://github.com/renovatebot/renovate/blob/78c2d6a42fe40e519c56526b6bab41e9782a2367/lib/workers/pr/changelog/release-notes.ts#L287
If I'm reading this right, those caches only have a 5-minute TTL. Is there a general rule for picking TTLs for different data? It seems like release notes would be a good candidate for a longer-lived cache. Of course, I'm biased.
Oh, I missed that those caches are in-memory instead of using the file-based cache or, in my case, Redis. So, extending my general question about TTLs: is there a rule on when to use the in-memory cache?
Our current setup doesn't run all ~250 repos in one execution; it batches them by org. I could change that if it turns out to be necessary, but I'm really looking for guidance on best practices.
Willing to accept a PR to make the release notes cache time configurable; that may be the quickest way to address the problem for now. Longer term, I'd like the release notes caching to be intelligent about how old the release is: the chance that release notes are updated or added drops off sharply after the first hours and days.
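A sketch of what such an age-based policy could look like (the thresholds here are made up for illustration):

```ts
// Hypothetical TTL picker: the older the release, the less likely its
// notes will still change, so cache it longer. Thresholds are illustrative.
function releaseNotesCacheTtlMinutes(releaseDate: Date, now = new Date()): number {
  const ageDays = (now.getTime() - releaseDate.getTime()) / (1000 * 60 * 60 * 24);
  if (ageDays < 1) return 5; // fresh release: notes may still be edited
  if (ageDays < 7) return 60; // first week: re-check hourly
  if (ageDays < 30) return 24 * 60; // first month: daily is plenty
  return 30 * 24 * 60; // older releases are effectively immutable
}
```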
I think some parts of renovatebot/renovate#6964 can help here, so we'd have a cache for release notes that uses the packageCache (Redis/file) for long-term caching.
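Roughly, the idea would be a wrapper like the one below. The interface is assumed from how Renovate's packageCache works (async get/set with a TTL in minutes); the namespace and function names are hypothetical.

```ts
// Assumed interface, modeled on Renovate's packageCache (Redis- or file-backed).
interface PackageCache {
  get<T>(namespace: string, key: string): Promise<T | undefined>;
  set(namespace: string, key: string, value: unknown, ttlMinutes: number): Promise<void>;
}

// Hypothetical wrapper: serve release notes from the long-term cache when possible.
async function getCachedReleaseNotes(
  cache: PackageCache,
  repo: string,
  version: string,
  fetchNotes: () => Promise<string>,
): Promise<string> {
  const key = `${repo}:${version}`;
  const cached = await cache.get<string>('release-notes', key);
  if (cached !== undefined) {
    return cached;
  }
  const notes = await fetchNotes();
  // Fixed 7-day TTL here; the age-based picker sketched above could slot in instead.
  await cache.set('release-notes', key, notes, 7 * 24 * 60);
  return notes;
}
```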
This issue has been automatically marked as stale because it has not had recent activity. It will be closed soon if no further activity occurs. If this question is not done (either you plan to update it or are waiting on someone to respond) then please add a comment here to bump it and/or get the other person's attention. We aim to do our best to solve every problem. This bot is here to help us clean up issues which are no longer of use to the original poster, and not to close anything prematurely, so bump as you need!
Which Renovate are you using?
Docker
renovate/renovate @ latest
Which platform are you using?
GitHub Enterprise
Have you checked the logs? Don't forget to include them if relevant
I'm getting hundreds of the following per hour:
What would you like to do?
I'm looking to see what the options are to reduce the volume of calls to github.com, since we keep getting rate limited. We are using `GITHUB_COM_TOKEN`, but are still making more than the 5k requests per hour. We are using the Redis caching option, which seems to help some, but not enough.

Our process runs hourly, using the Docker image against >250 repos, comprised mostly of npm with Docker. The rate limiting seems to come primarily from `getReleaseList` calls, which don't appear to use the caching mechanism. Maybe this is a feature request to add caching to `getReleaseList` 🤷
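For what it's worth, the kind of caching being asked for here could look like the sketch below: persisting the release list in Redis so that separate hourly runs reuse it instead of re-hitting api.github.com. Everything in it (names, TTL, the fetch callback) is hypothetical, not Renovate's actual API.

```ts
import { createClient } from 'redis';

// Hypothetical: persist getReleaseList results in Redis so hourly runs
// (separate processes) can reuse them. A 1-hour TTL matches the run cadence.
const redis = createClient({ url: process.env.REDIS_URL });

async function getReleaseListCached(
  repo: string,
  fetchReleases: (repo: string) => Promise<unknown[]>,
): Promise<unknown[]> {
  const key = `release-list:${repo}`;
  const hit = await redis.get(key);
  if (hit) {
    return JSON.parse(hit);
  }
  const releases = await fetchReleases(repo);
  await redis.set(key, JSON.stringify(releases), { EX: 60 * 60 });
  return releases;
}

// usage: await redis.connect() once at startup, then
//   const releases = await getReleaseListCached('owner/repo', fetchFromGitHub);
```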