Open WillAbides opened 1 year ago
This seems like a good idea to me. Anything I can do to help move it forward?
Using those libraries, does anyone know whether they also implement backoff when hitting rate limits or other client-side issues (4xx responses)?
I'm currently running into rate-limiting issues with the SCM generator, and figured I could work on implementing the httpcache approach if no one else is currently working on this.
Would it be a good idea to use the Redis deployment that the repo server uses to hold the cache of these requests? @crenshaw-dev
Summary
Make the GitHub API client used by ApplicationSet cache responses.
Motivation
Using an `scm_provider` generator with `allBranches` leads to a lot of queries to the GitHub API. This takes some time and eats through GitHub's rate limit.

We can save time and rate limit by caching responses and using the `If-None-Match` and `If-Modified-Since` headers to check whether the cached response is up to date. This is described in GitHub's API documentation here: https://docs.github.com/en/rest/overview/resources-in-the-rest-api#conditional-requests

Proposal
I propose using https://github.com/gregjones/httpcache and https://github.com/die-net/lrucache to implement the cache. `httpcache` provides an HTTP transport that does the header checks and wraps a cache. `lrucache` is an in-memory least-recently-used cache made to be used with httpcache. I have used this combination with the go-github client successfully in other projects.

An environment variable `ARGOCD_APPLICATIONSET_CONTROLLER_GITHUB_CLIENT_CACHE_SIZE` can be added to set the cache size. If it is unset, no cache will be used. Otherwise, an LRU cache will be created with the given size in kilobytes.

This is what the code might look like: https://github.com/AWholeNewOrg/argo-cd/commit/7516d647441f66862cb5f7618e7efa2a2222bcba?w=1