peter-evans / create-or-update-comment

A GitHub action to create or update an issue or pull request comment
MIT License

[Feature Request] Is it possible to use Throttling to gracefully handle issues with API limits? #241

Closed · kunaltyagi closed this 1 year ago

kunaltyagi commented 1 year ago
const github = require("@actions/github");
const { throttling } = require("@octokit/plugin-throttling");

function getOctokit(token) {
  const options = {
    throttle: {
      onRateLimit: (retryAfter, options, octokit, retryCount) => {
        octokit.log.warn(
          `Request quota exhausted for request ${options.method} ${options.url}`
        );
        // Always retry
        octokit.log.info(`Retrying after ${retryAfter} seconds (retry ${retryCount + 1})`);
        return true;
      },
      onSecondaryRateLimit: (retryAfter, options, octokit) => {
        octokit.log.warn(
          `SecondaryRateLimit detected for request ${options.method} ${options.url}`
        );
        // Always retry
        octokit.log.info(`Retrying after ${retryAfter} seconds`);
        return true;
      },
    },
  };
  // Pass the throttling plugin as an additional plugin to @actions/github
  return github.getOctokit(token, options, throttling);
}

This returned octokit could then be used transparently in place of the current one, handling API limits caused by factors outside the action's control:

const octokit = getOctokit(token);
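
For illustration, a minimal sketch of how the throttled client could then be used to create a comment; this assumes it runs inside an async function, and the issue number and comment body below are placeholders rather than the action's actual inputs:

// Hypothetical usage of the throttled client. If a rate limit is hit,
// the throttling plugin retries transparently as configured above.
await octokit.rest.issues.createComment({
  owner: github.context.repo.owner,
  repo: github.context.repo.repo,
  issue_number: 1, // placeholder issue number
  body: "Example comment body", // placeholder body
});
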
peter-evans commented 1 year ago

Hi @kunaltyagi

If you are hitting rate-limits, it's likely to be the secondary limits for resource creation. There is a thread about rate-limiting in GitHub Actions, with a summary here: https://github.com/peter-evans/create-pull-request/issues/855#issuecomment-900797502

The short version is that it's not really feasible to handle secondary rate-limits well in actions. The best advice is to redesign your workflows to avoid hitting the limits.

kunaltyagi commented 1 year ago

I do understand the need to redesign the workflow based on the details. Would it be possible to add an option for a pre-comment delay? Setting it to 1 second would achieve what the recommendation suggests.

I could wrap your action inside another, but that would be a lot of work for just a 1-second sleep.
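
To illustrate the idea, a minimal sketch of what such a delay would amount to; the delay option is hypothetical, not something the action currently provides:

// Rough sketch of the proposed pre-comment delay (hypothetical option,
// not an existing input of this action). Runs inside an async function.
const preCommentDelayMs = 1000; // "wait at least one second between requests"
await new Promise((resolve) => setTimeout(resolve, preCommentDelayMs));
// ...then create the comment as the action normally would.
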

peter-evans commented 1 year ago

When you hit secondary rate-limits, they aren't like "normal" rate-limits where you can just wait a second or two. That's why throttling in the action code doesn't work. GitHub actually blocks resource creation for a significant period of time, on the order of minutes. It's not feasible or desirable for the action to wait that long. That's why the advice is to redesign your workflow and make sure you avoid hitting the secondary rate-limits for resource creation.

kunaltyagi commented 1 year ago

> If you're making a large number of POST, PATCH, PUT, or DELETE requests for a single user or client ID, wait at least one second between each request

Based on this extract from your comment, waiting 1 second between requests should stop us from hitting the secondary rate-limits. Maybe my understanding is wrong...

peter-evans commented 1 year ago

It most likely will not resolve it, because the rate limit this action hits is the one for creating resources.

> Requests that create content which triggers notifications, such as issues, comments and pull requests, may be further limited and will not include a Retry-After header in the response. Please create this content at a reasonable pace to avoid further limiting.

These are the rate-limits that I'm talking about.

I don't want to start another long thread going back and forth about this whole issue again. Please read the thread I linked carefully. You can redesign your workflow to avoid it, or perhaps use multiple PATs from different user accounts.