If the Exif offsets happen to be random garbage, the package tries to read at those random positions without any validation. This can cause an Out Of Memory error and bring down the Node process itself.
I'd suggest adding, early in the readTag function, validation of abnormal valueOffset and numValues:
function readTag(buffer, offset, bigEndian) {
  ... // read numValues, valueSize and valueOffset for the entry
  // Prevent damaged Exif data from causing Out Of Memory
  if (valueOffset >= buffer.length ||
      numValues > 1000000 ||
      valueOffset + numValues * valueSize > buffer.length)
    throw new Error("Damaged Exif data");
  ...
}
numValues > 1000000 is just an assumed "very large" cap, but without this check the last comparison may not always work: numValues * valueSize can overflow (roll over) so that the sum appears to fall back inside buffer.length.
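For illustration, here is a minimal, self-contained sketch of the same guard run against a deliberately corrupted offset (checkTagBounds and MAX_VALUES are names invented for this example, not identifiers from the package):

const MAX_VALUES = 1000000; // arbitrary sanity cap, as suggested above

function checkTagBounds(buffer, valueOffset, numValues, valueSize) {
  // Reject out-of-range offsets and absurd counts before trusting
  // the final bounds arithmetic.
  if (valueOffset >= buffer.length ||
      numValues > MAX_VALUES ||
      valueOffset + numValues * valueSize > buffer.length)
    throw new Error("Damaged Exif data");
}

const buf = Buffer.alloc(16); // stand-in for a real Exif buffer
try {
  checkTagBounds(buf, 0xdeadbeef, 4, 2); // garbage valueOffset
} catch (e) {
  console.log(e.message); // "Damaged Exif data"
}

With the guard in place, garbage offsets fail fast with a catchable error instead of triggering huge reads or allocations.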
Attached: damaged_header.zip, an Exif sample with a damaged header.