Closed: 0x4007 closed this 5 months ago
The error log that usually accompanies this is:
```json
{
  "error": {
    "name": "HttpError",
    "status": 404,
    "request": {
      "url": "https://api.github.com/orgs/ubiquity/memberships/ubiquity",
      "method": "GET",
      "headers": {
        "accept": "application/vnd.github.v3+json",
        "user-agent": "probot/12.3.3 octokit-core.js/3.6.0 Node.js/20.12.0 (linux; x64)",
        "authorization": "token [REDACTED]"
      },
      "request": {
        "retryCount": 1
      }
    },
    "response": {
      "url": "https://api.github.com/orgs/ubiquity/memberships/ubiquity",
      "data": {
        "message": "Not Found",
        "documentation_url": "https://docs.github.com/rest/orgs/members#get-organization-membership-for-a-user"
      },
      "status": 404,
      "headers": {
        "date": "Mon, 13 May 2024 19:24:52 GMT",
        "vary": "Accept-Encoding, Accept, X-Requested-With",
        "server": "GitHub.com",
        "content-type": "application/json; charset=utf-8",
        "referrer-policy": "origin-when-cross-origin, strict-origin-when-cross-origin",
        "x-frame-options": "deny",
        "content-encoding": "gzip",
        "x-ratelimit-used": "21",
        "x-xss-protection": "0",
        "transfer-encoding": "chunked",
        "x-ratelimit-limit": "7150",
        "x-ratelimit-reset": "1715631888",
        "x-github-media-type": "github.v3; format=json",
        "x-github-request-id": "BA3C:26DB23:3839439:603C045:66426904",
        "x-ratelimit-resource": "core",
        "x-ratelimit-remaining": "7129",
        "x-content-type-options": "nosniff",
        "content-security-policy": "default-src 'none'",
        "strict-transport-security": "max-age=31536000; includeSubdomains; preload",
        "access-control-allow-origin": "*",
        "access-control-expose-headers": "ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Used, X-RateLimit-Resource, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type, X-GitHub-SSO, X-GitHub-Request-Id, Deprecation, Sunset",
        "x-accepted-github-permissions": "members=read",
        "x-github-api-version-selected": "2022-11-28"
      }
    }
  },
  "caller": "renderCatchAll",
  "revision": null,
  "handlerType": {
    "type": "main",
    "actions": [
      null
    ]
  },
  "activeHandler": "issueClosed"
}
```
Related Supabase entry. I suspect some call to Octokit that checks whether the user belongs to the organization is failing, is not caught, and leads to the error.
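For context, `GET /orgs/{org}/memberships/{username}` responds 404 when the user is not a (visible) member, and Octokit surfaces that as a rejected promise carrying an `HttpError`, so the caller must catch it. A minimal sketch of that pattern, with a stub standing in for the real Octokit request (all names and data below are hypothetical):

```javascript
// Simulated Octokit-style membership lookup: rejects with a 404 HttpError-like
// object for non-members, mirroring GET /orgs/{org}/memberships/{username}.
async function fakeMembershipRequest(org, username) {
  const members = new Set(["gentlementlegen"]); // stand-in member list
  if (!members.has(username)) {
    throw Object.assign(new Error("Not Found"), { name: "HttpError", status: 404 });
  }
  return { state: "active", role: "member" };
}

// The caller must catch the rejection; otherwise it bubbles up as an
// uncaught error, exactly like the log above.
async function isOrgMember(org, username) {
  try {
    await fakeMembershipRequest(org, username);
    return true;
  } catch (error) {
    if (error.status === 404) return false; // not a member (or membership not visible)
    throw error; // real failures (auth, rate limit) should still surface
  }
}
```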
I'm aware that I can generate permits on repos where I see "Settings" at the end of the repo tabs:
as you may see:
https://github.com/ubiquibot/assistive-pricing/issues/2
as opposed to:
@molecula451 Yep, that's one of the criteria, but also being a billing manager in the organization should allow for generating permits despite not having rights on the specific repo.
I agree, it's more like we're missing something on the bot level
I have spent quite some time checking this today and couldn't pinpoint the issue.
First, I assume that the latest deployment of the bot is on the refactor/move-to-delegate-compute branch.
According to the error, it happened on issueClosed while poking the API at https://api.github.com/orgs/ubiquity/memberships/ubiquity, which returned a 404 and triggered the error.
First, it is weird to get a 404, since I should still be among the members. Second, the only spot where this endpoint is called is checkUserPermissionForOrg, and as you can see, it is wrapped in a try/catch clause.
Locally, I get error logs but no crash. In Supabase, the record indicates that the associated message is "action has an uncaught error", which would mean the error bubbled all the way up. My theory is that this error is not the real one, and the stack is actually destroyed somewhere. Sadly, I do not have access to Cloudflare to debug there.
Edit: I could finally repro the error. It seems the catch doesn't catch; going down the rabbit hole.
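One common way a `try`/`catch` "doesn't catch" in async code is a promise-returning call that isn't awaited: the rejection escapes the `try` block and surfaces later as an uncaught error. A minimal illustration of the failure mode (not the actual bot code):

```javascript
// Any async function whose body throws returns a rejected promise.
async function failingCheck() {
  throw Object.assign(new Error("Not Found"), { name: "HttpError", status: 404 });
}

// Looks safe, but is not: without `await`, the rejection happens outside the
// synchronous try block and is never caught here.
function brokenCaller() {
  try {
    failingCheck(); // missing `await` -- rejection escapes
  } catch {
    return "caught";
  }
  return "no error seen";
}

// Correct version: awaiting inside the try keeps the rejection catchable.
async function fixedCaller() {
  try {
    await failingCheck();
  } catch {
    return "caught";
  }
  return "no error seen";
}
```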
! No price label has been set. Skipping permit generation.
View | Contribution | Count | Reward |
---|---|---|---|
Issue | Comment | 3 | 0 |
Review | Comment | 9 | 29.452 |
View | Contribution | Count | Reward |
---|---|---|---|
Issue | Comment | 2 | 2.864 |
Comment | Formatting | Relevance | Reward |
---|---|---|---|
![image_2024-05-14_051857954](https://github.com/ubiquity/ubiqui… | 2.8p: count: 28 score: 1 img: count: 2 score: 0 | 0.68 | 1.904 |
I agree, it's more like we're missing something on the bot level | 1.2p: count: 12 score: 1 | 0.8 | 0.96 |
View | Contribution | Count | Reward |
---|---|---|---|
Issue | Specification | 1 | 0.15 |
Review | Comment | 6 | 3.039 |
Comment | Formatting | Relevance | Reward |
---|---|---|---|
_Originally posted by @0x4007 in https://github.com/ubiquity/wor… | 0.6p: count: 6 score: 1 em: count: 6 score: 0 | 0.25 | 0.15 |
Maybe that's simplest yes. | 0.4p: count: 4 score: 1 | 0.06 | 0.024 |
It's fine unless something breaks lol | 0.6p: count: 6 score: 1 | - | - |
https://github.com/ubiquity/ubiquibot/commit/bc4baaf2a9c9b4f3e89… | 0.1p: count: 1 score: 1 | 0.47 | 0.047 |
Just revert the value to what it was originally. | 0.9p: count: 9 score: 1 | - | - |
No comment on bun | 0.4p: count: 4 score: 1 | 0.7 | 0.28 |
Check the files view at the bottom. There's an [e2e test error](… | 4.8p: count: 41 score: 1 a: count: 7 score: 1 | 0.56 | 2.688 |
View | Contribution | Count | Reward |
---|---|---|---|
Review | Comment | 4 | 10.734 |
Comment | Formatting | Relevance | Reward |
---|---|---|---|
The change looks safe. | 0.4p: count: 4 score: 1 | 0.21 | 0.084 |
lol let me check this. Six hours should be maximum but normally … | 2p: count: 20 score: 1 | 0.43 | 0.86 |
@0x4007 @gentlementlegen I found a most likely cause for the six… | 9.4p: count: 93 score: 1 code: count: 1 score: 1 | 0.95 | 8.93 |
For this PR I am not sure about configuration error https://gith… | 2p: count: 20 score: 1 | 0.43 | 0.86 |
> My theory is that this error is not the real one, and the stack is actually destroyed somewhere.
Sorry for being late to comment on this. It's a valid theory, because I manually implemented a ton of display logic for managing the stack to seamlessly support GitHub comments, Supabase logs, and console logs.
There's a good possibility that there is a mistake somewhere. I was also trimming some of the stack so as not to log the internals of the logging function, for example.
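Stack trimming like that is easy to get wrong: if the filter matches too broadly, it can also drop the frames that identify the real failure site, which would produce exactly the kind of misleading error described above. A hypothetical sketch of that failure mode (the file names and frames below are made up for illustration):

```javascript
// Hypothetical stack trimmer: drops frames matching the logger's own files.
function trimStack(stack, internalPattern) {
  const [message, ...frames] = stack.split("\n");
  const kept = frames.filter((frame) => !internalPattern.test(frame));
  return [message, ...kept].join("\n");
}

const stack = [
  "HttpError: Not Found",
  "    at checkUserPermissionForOrg (helpers/issue.ts:42:11)",
  "    at logger.error (utils/logger.ts:10:5)",
  "    at logger.format (utils/logger.ts:22:3)",
].join("\n");

// A precise pattern removes only the logger internals and keeps the useful frame:
const good = trimStack(stack, /utils\/logger\.ts/);

// An over-broad pattern strips every frame, destroying the real failure site:
const bad = trimStack(stack, /\.ts/);
```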
No worries. In this case it was the check for collaborator status that was not wrapped in a try/catch, so the error bubbled all the way up, hence the strange error. It should hopefully be fixed by https://github.com/ubiquity/ubiquibot/pull/928
_Originally posted by @0x4007 in https://github.com/ubiquity/work.ubq.fi/issues/41#issuecomment-2108901126_