Hi @pozil any way we can dig into this together somehow? It's caused our production app to stop working with the Salesforce live sync
Hi all, sorry for the delay, I was out of office. I don't think that this is an issue that I can fix on the client side. Let me raise this in the official Pub/Sub API repo and engage with the product team.
Thanks for passing it along. This bug seems to cascade into further failures, and the message appears to be corrupted: avro-js throws a "truncated buffer" exception when trying to read a string later in the message, misreading its length as several MB in a message of only a few hundred bytes.
I can provide the message and schema, but don't want to post it publicly since I can't decode to verify it doesn't contain sensitive information.
For this project, I think there should be better error handling to assist in debugging. The current implementation exits the Node process with an unhandled exception if there are errors parsing the data; that should probably be wrapped in a try/catch to at least log errors, something like the sketch below.
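Something along these lines would let the client log and skip a bad event instead of crashing (a minimal sketch only; the function name and event shape are placeholders, not the library's actual internals):

```js
// Minimal sketch of defensive parsing, assuming an avro-js parsed type
// with a fromBuffer() method. Names and shapes are illustrative.
function safeParseEvent(schema, payloadBuffer) {
    try {
        return schema.fromBuffer(payloadBuffer);
    } catch (error) {
        // Log and skip the event instead of letting the exception
        // bubble up and kill the Node process
        console.error(`Failed to parse event payload: ${error.message}`);
        return null;
    }
}
```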
Hi @ashiemke and @branburciaga, while we wait for the product team to respond, I have improved the client's error handling logic. Whenever an event fails to parse, I now wrap the error in an EventParseError object and emit it as an `error` event on the PubSubEventEmitter. This means that event parsing errors will no longer interrupt the client, so make sure that you handle them:
```js
eventEmitter.on('error', (error) => {
    if (error instanceof EventParseError) {
        // Handle event parsing error
        const replayId = error.replayId;
        // ...
    } else {
        // Generic gRPC stream error
    }
});
```
Hi @ashiemke and @branburciaga, I have even better news. After chatting with the engineering team and exploring the Avro library, we realized that there's an avro-js-specific bug that prevents us from properly deserializing the event payload. Long story short: the JS library doesn't support all the values that fit in a 64-bit Long; it's limited to the values that fit in a JS Number (2^53 − 1). Fortunately, I was able to write a custom deserializer for the Avro Long type to address this case.
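For anyone curious how that's possible: Avro encodes a long as a zig-zag varint, so it can be decoded directly into a BigInt without ever routing through an unsafe Number. A rough sketch of the idea (not the client's actual implementation):

```js
// Decode an Avro long (zig-zag varint) straight into a BigInt so values
// beyond Number.MAX_SAFE_INTEGER survive intact. Sketch only.
function decodeAvroLongAsBigInt(buf, offset = 0) {
    let unsigned = 0n;
    let shift = 0n;
    let pos = offset;
    let byte;
    do {
        // Each byte carries 7 data bits; the high bit signals continuation
        byte = buf[pos++];
        unsigned |= BigInt(byte & 0x7f) << shift;
        shift += 7n;
    } while (byte & 0x80);
    // Undo zig-zag encoding: value = (u >> 1) ^ -(u & 1)
    const value = (unsigned >> 1n) ^ -(unsigned & 1n);
    return { value, bytesRead: pos - offset };
}
```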
Starting with the latest version of the client (v3.2.1), whenever your data contains a value that doesn't fit in JavaScript's maximum safe integer, such as the commit number you shared earlier, it is automatically returned as a BigInt.
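In practice this means a field can now come back as either a Number or a BigInt depending on its magnitude, so consumers should handle both. The field path below is illustrative and depends on your event type:

```js
// Mock event for illustration; in practice this comes from the
// subscribe callback.
const event = {
    payload: { ChangeEventHeader: { commitNumber: 1698100138468892700n } }
};

const commitNumber = event.payload.ChangeEventHeader.commitNumber;
if (typeof commitNumber === 'bigint') {
    // BigInt values can't be passed to JSON.stringify() directly,
    // so convert explicitly when serializing
    console.log(`Commit number (BigInt): ${commitNumber.toString()}`);
} else {
    console.log(`Commit number: ${commitNumber}`);
}
```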
Amazing, thank you so much @pozil !!! Very much appreciated. We will check this out.
A very large commitNumber (e.g. 1698100138468892700) fails the isSafeLong check in avro-js and throws an uncaught exception, so every event crashes the Node process.
The fromBuffer call should be wrapped in a try/catch to avoid an uncaught exception, but I'm not sure how to work around the actual issue. Can we omit commitNumber, or update the schema to accurately reflect the type of commitNumber?
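For reference, here's why that particular value trips the check: it's larger than Number.MAX_SAFE_INTEGER (2^53 − 1 = 9007199254740991), so it can't be represented exactly as a JS Number:

```js
// The literal is already rounded to the nearest double before any
// Avro code runs, which is why avro-js's safe-long check rejects it.
console.log(Number.MAX_SAFE_INTEGER);                     // 9007199254740991
console.log(Number.isSafeInteger(1698100138468892700));   // false
console.log(1698100138468892700 === 1698100138468892672); // true: precision lost
```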