cboulanger closed this issue 6 years ago
As @cajus has rightly pointed out, the current behaviour of qx contrib update is not production-ready: without a GitHub token, it immediately runs into GitHub’s API rate limit, and requiring a GitHub token makes it unappealing to use. @cajus will set up an hourly Travis CI job that produces the JSON data that qx contrib update would normally produce locally, and checks this data into the qx-contrib repository. I propose the following behaviour: by default, qx contrib update will try to download the data from there. If a GitHub token is passed via the --token parameter, the local cache will instead be created by querying the GitHub API directly; this is what happens in the CI job. Unless anyone objects, this is how I will implement it.
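The proposed fallback could be sketched roughly as follows. Note that the function name, argument shape, and cache URL below are hypothetical illustrations, not the actual qooxdoo-cli implementation:

```javascript
// Hypothetical sketch of the proposed update strategy: prefer the
// pre-generated catalog data unless a GitHub token was passed explicitly.
// The URL is an assumption, not the actual location of the cache file.
const CACHE_URL =
  "https://raw.githubusercontent.com/qooxdoo/qooxdoo-contrib/master/cache.json";

function chooseUpdateSource(argv) {
  if (argv.token) {
    // Token given: query the GitHub API directly and (re)build the
    // cache locally, as the hourly CI job would do.
    return { mode: "github-api", token: argv.token };
  }
  // Default: download the JSON produced by the CI job, which costs
  // no GitHub API requests at all.
  return { mode: "prebuilt-cache", url: CACHE_URL };
}
```

This keeps the token-based path available for the CI job while unauthenticated users never touch the rate-limited API.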
The immediate problem has been fixed with https://github.com/qooxdoo/qooxdoo-cli/commit/e61e0f662855fb27a0fe2fee03b16d61cbf97c53. However, anticipating possible stricter rate limits, we should keep conditional requests in mind in order to avoid unnecessary use of the GitHub API.
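For reference, GitHub's conditional requests work via ETags: if a request carries the ETag from a previous response in an If-None-Match header and nothing has changed, GitHub answers 304 Not Modified, which does not count against the rate limit. A minimal sketch of that pattern (helper names are illustrative, not part of qooxdoo-cli):

```javascript
// Build request headers for a conditional GitHub API request.
// If we stored an ETag from a previous response, send it back via
// If-None-Match so GitHub can answer 304 Not Modified.
function conditionalHeaders(cachedEtag) {
  const headers = { "User-Agent": "qx-contrib" };
  if (cachedEtag) {
    headers["If-None-Match"] = cachedEtag;
  }
  return headers;
}

// Decide what to keep after the response arrives: on 304, reuse the
// cached data (no rate-limit cost); otherwise store the fresh data
// together with the new ETag for the next request.
function handleResponse(status, newEtag, cachedEntry, freshData) {
  if (status === 304) {
    return cachedEntry;
  }
  return { data: freshData, etag: newEtag };
}
```

With this, repeated `qx contrib update` runs would only consume rate-limit quota when the upstream data actually changed.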
I think this issue isn't really relevant right now, so I would close it. If we run into rate-limit problems again, we can reopen it.
Since nobody objected, closing this.
GitHub's API rate limit presents a problem for using GitHub directly as a contribution catalog, with no intermediate caching. The number of requests currently made by the `qx contrib` commands without authentication (token) exhausts the rate limit too early, and forcing people to use a token is a show-stopper. This might force us to use an automatically generated catalog in the end, but I'd like to explore other options first, to avoid the overhead and additional fragility that this adds to the process.