Hmm, how random is it? Is it all calls past a certain point in time until you restart, or X% of calls, etc.?
Maybe this is related to this issue: octokit/octokit.rb#1096
I doubt it, as Shipit includes code to handle token expiration.
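For context: unlike a personal access token, a GitHub App installation token expires (roughly an hour after it is minted), so clients cache the token and mint a new one once its expires_at has passed. Below is a minimal sketch of that pattern using the octokit and jwt gems; this is not Shipit's actual code, and the class name and constructor arguments are made up for illustration.

require 'octokit'
require 'jwt'
require 'openssl'

# Hypothetical cache: fetch an installation token and reuse it until expiry.
class InstallationTokenCache
  def initialize(app_id:, private_key_pem:, installation_id:)
    @app_id = app_id
    @private_key = OpenSSL::PKey::RSA.new(private_key_pem)
    @installation_id = installation_id
    @token = nil
    @expires_at = Time.at(0)
  end

  def token
    # Refreshing only once the claimed expiry has passed leaves a small
    # window where a request built just before the deadline races the
    # expiry and comes back as 401 Unauthorized.
    refresh! if Time.now >= @expires_at
    @token
  end

  private

  def refresh!
    # A GitHub App authenticates with a short-lived RS256 JWT...
    jwt = JWT.encode(
      { iat: Time.now.to_i - 60, exp: Time.now.to_i + 9 * 60, iss: @app_id },
      @private_key,
      'RS256'
    )
    app_client = Octokit::Client.new(bearer_token: jwt)
    # ...and exchanges it for an installation token, which GitHub returns
    # together with its expires_at timestamp.
    installation_token = app_client.create_app_installation_access_token(@installation_id)
    @token = installation_token.token
    @expires_at = installation_token.expires_at
  end
end

If the expiry check uses the exact expires_at returned by GitHub, clock skew or an in-flight job can plausibly produce the occasional 401s reported here.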
We have 7 failed requests out of 406 requests in the last 24 hours.
This is the histogram of the failed requests. Maybe it's related to the token expiring in the middle of the job; I'm pretty lost.
Yeah, we could probably consider the token as expired a couple of minutes before the claimed expiry. This would protect against shifting system clocks and such.
Are you willing to deploy a branch of shipit-engine to test a fix, or maybe just monkey patch the fix in?
module ShipitExpiryPatch
  def initialize(token, expires_at)
    # Shave 5 minutes off the expiry GitHub reports, so the token is
    # considered stale before it can actually expire mid-request.
    super(token, expires_at - 5.minutes)
  end
end

Shipit::GitHubApp::Token.prepend(ShipitExpiryPatch)
The above patch, dropped into config/initializers, should let you confirm whether it's an expiry race or not.
Ok, applied, let's see tomorrow.
Only had one error during the night; I'm increasing it to 15 minutes to see if that helps.
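A minimal sketch of the adjusted initializer, assuming the same config/initializers patch as above with the buffer widened from 5 to 15 minutes:

module ShipitExpiryPatch
  def initialize(token, expires_at)
    # Treat the token as expired 15 minutes before GitHub's claimed expiry,
    # to rule out a larger clock-skew / in-flight-job window.
    super(token, expires_at - 15.minutes)
  end
end

Shipit::GitHubApp::Token.prepend(ShipitExpiryPatch)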
Hi, we are experiencing random 401 Unauthorized errors from the GitHub API.
This has been happening since we moved from a GitHub API token to a GitHub Application.
Maybe this is related to this issue: https://github.com/octokit/octokit.rb/issues/1096
and payload:
This is another example:
With this job payload:
This is another example:
With this job payload: