Closed gentlementlegen closed 3 months ago
@gentlementlegen the deadline is at 2024-08-15T09:35:49.992Z
I think @Keyrxng might be the best to handle this task actually @gentlementlegen given their experience with building the RPC handler.
@Keyrxng rfc. Your input to the specification can also be useful if you don't want to handle the code.
Sure thing, I've also fiddled quite a lot with the RPC package lately. I thought @Keyrxng had quite a lot on his plate already, but feel free to take it.
I'm not at a computer right now so I can't investigate anything, but I will be able to in about an hour or so.
I'm happy to take it if that is suitable
I'd double-check that the rpc-handler package is the latest version; it likely isn't.
I haven't seen that specific error (`require(false)`) before, so I don't have much value to add without debugging, sorry.
Checked in with @gentlementlegen and he's going to take care of it, as it's a level playing field in terms of understanding etc.
It's really strange that it would have failed for a read-op like `symbol()`, which takes no args and returns a hardcoded string value, yet it errors with "no data present", which reads as if that function doesn't exist at the target address, but it certainly does.
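A hedged illustration of why this error reads so strangely: when an RPC node answers `eth_call` with empty return data (`0x`), e.g. because it is out of sync or serving the wrong chain, the ABI decoder has no bytes to decode, and that surfaces as a "no data present"-style error, exactly as if the function did not exist at the address. The helper below is illustrative only, not the rpc-handler's actual code:

```typescript
// Illustrative only: mimics how an ABI decoder reacts to empty eth_call
// return data. A healthy node returns ABI-encoded bytes for symbol();
// a bad node returning "0x" makes the decode fail with "no data present",
// which looks like a missing function rather than a flaky endpoint.
function decodeStringResult(returnData: string): string {
  if (returnData === "0x" || returnData.length <= 2) {
    throw new Error("no data present");
  }
  // Simplified decode: real ABI strings are offset/length prefixed;
  // here we just hex-decode the payload for illustration.
  return Buffer.from(returnData.slice(2), "hex").toString("utf8");
}
```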
`rpc-handler` needs bumping from 1.1.0 to 1.2.3. `rpc-handler@1.1.0` was able to default to localhost (although I'm sure I wrote a fix for that from the beginning: so long as the networkId wasn't 31337, it would race and return another).

I'm happy to take the task if the need calls for it for whatever reason, but I'm sure it's easily resolved with a package bump. I may be wrong.
I suppose we can bump the package and wait a month to see this error not happen again before closing as complete.
@Keyrxng I will try to bump the package then. I wanted to check on the retry logic, because it would also be possible for endpoints to fail, and in such a case I do not know how it is currently handled retry-wise. Back to you in a bit.
I'm sure we just loop through our object of valid latencies, retrying the call on each provider once and then moving on, before repeating the entire object loop again n times according to the config settings.

In theory, with a handful of RPCs, we should expect to never see a failed call which exhausts all endpoints for any reason other than human error. Well, that was my thinking originally, and the purpose of the Proxy.
I've found authoring something which appears to have no bugs is far more anxiety-inducing than something you had to really fight with to make it do what you want. So I hope you do find a bug, because while I'm confident in my work, I am very hesitant to say that it's completely error-free.
> I suppose we can bump the package and wait a month to see this error not happen again before closing as complete.
We can also close it if it works and reopen later if needed. I just bumped the packages in the linked pull-request.
```
# Issue was not closed as completed. Skipping.
```
_Originally posted by @ubiquityos[bot] in https://github.com/ubiquibot/conversation-rewards/issues/76#issuecomment-2290670929_

The following error occurred during the comment evaluation, which seems linked to the `ethers` package failing at some point, and needs investigating.
Mixed feelings on the crediting of this because we don't have the root cause diagnosed and are guessing that this is solved.
@0x4007 Sure, maybe we can reopen if that occurs again?
@gentlementlegen something was definitely recently broken here because I'm seeing it all the time now
Update: perhaps it's the HTML comment notation not being properly escaped, which could be an old problem.
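If that hunch is right, the failure mode would be that metadata embedded in an HTML comment contains `-->`, terminating the comment early so the raw JSON leaks into the rendered body. A minimal sketch of one possible fix; the function name and escaping scheme are hypothetical, not the actual plugin code:

```typescript
// Hypothetical: escape the HTML comment terminator before embedding metadata,
// so an embedded "-->" cannot close the comment prematurely.
function embedMetadata(json: string): string {
  const safe = json.replace(/-->/g, "--\\>"); // illustrative escaping scheme
  return `<!-- ${safe} -->`;
}
```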
@0x4007 Since you deleted the comments, I cannot see them. Is that also occurring in v2?
Those are the reward comments. I assumed that if I closed as unplanned first, and then merged the pull, it would skip permit generation, but unfortunately it was marked as complete upon merge.
@0x4007 So the output of both bots was like the comment you posted, I mean? Asking if there is something I should investigate / fix for v2.
No everything is fine here. Carry on.