arthurschreiber opened 3 months ago
Attention: Patch coverage is 91.20370% with 19 lines in your changes missing coverage. Please review. Project coverage is 79.53%. Comparing base (fc2aa28) to head (39cbe28).
| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/message-io.ts | 90.83% | 3 Missing and 9 partials :warning: |
| src/connection.ts | 90.27% | 4 Missing and 3 partials :warning: |
@MichaelSun90 @mShan0 Mind taking a look? I've only moved the PRELOGIN payload construction and parsing over to the new methods so far. I think this is already large enough to warrant merging before moving more things over; this PR would not be reviewable if we moved everything into it.
This introduces some initial refactoring around `MessageIO`/`IncomingMessageStream`/`OutgoingMessageStream`.

Namely, the goal is to completely remove `IncomingMessageStream`/`OutgoingMessageStream`/`Message` and replace them with two straightforward functions instead:

- `MessageIO.readMessage` reads the contents of a message from a `Readable` stream, exposing it as an `AsyncIterable<Buffer>`.
- `MessageIO.writeMessage` writes a stream of buffers (generated synchronously via an `Iterable<Buffer>` or asynchronously via an `AsyncIterable<Buffer>`) to a `Writable` stream.
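Conceptually, this reduces message handling to plain async iteration over buffers. The following is only a rough sketch of that shape; the function names mirror the description above, but the signatures and the TDS packet framing details are illustrative assumptions, not the actual implementations from this PR:

```typescript
import { Readable, Writable } from 'node:stream';

// Illustrative sketch only: the real MessageIO.readMessage would parse TDS
// packet headers and yield only payload bytes, stopping at the end-of-message
// packet. Here we simply pass chunks through to show the iteration shape.
async function* readMessage(stream: Readable): AsyncIterable<Buffer> {
  for await (const chunk of stream) {
    yield chunk as Buffer;
  }
}

// Illustrative sketch only: the real MessageIO.writeMessage would frame the
// payload into TDS packets before writing. Accepts sync or async iterables,
// and respects backpressure by awaiting 'drain' when write() returns false.
async function writeMessage(
  stream: Writable,
  payload: Iterable<Buffer> | AsyncIterable<Buffer>
): Promise<void> {
  for await (const chunk of payload) {
    if (!stream.write(chunk)) {
      await new Promise<void>((resolve) => stream.once('drain', resolve));
    }
  }
}
```

Because `for await...of` handles both `Iterable` and `AsyncIterable`, a single code path covers synchronously generated payloads (like PRELOGIN) and streamed ones.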
Both of these new methods are much simpler than the previous `IncomingMessageStream`/`OutgoingMessageStream` implementation, in terms of both logical complexity and implementation complexity. They're also significantly faster than the current implementations. I added benchmarks that compare the two implementations (benchmarks run using Node.js v18.20.4):
The current implementation spends a lot of time setting up new stream objects for each incoming and outgoing message (via the `Message` class, which inherits from the `PassThrough` transform stream), and that causes quite a dent in performance, especially when v8 optimizations haven't kicked in yet.
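To make the overhead concrete, here is a minimal, self-contained comparison of the two approaches (this is not the tedious benchmark, just an illustration): routing each message through a freshly constructed `PassThrough` stream versus yielding it from a plain async generator. Both produce the same bytes, but the stream-based version allocates and wires up a new stream object per message, which is exactly the per-message setup cost described above.

```typescript
import { PassThrough } from 'node:stream';

// Stream-based approach: a fresh PassThrough per "message", loosely
// mirroring the old per-message Message/PassThrough pattern.
async function viaPassThrough(messages: Buffer[]): Promise<Buffer[]> {
  const out: Buffer[] = [];
  for (const message of messages) {
    const stream = new PassThrough(); // new stream object for every message
    stream.end(message);
    for await (const chunk of stream) {
      out.push(chunk as Buffer);
    }
  }
  return out;
}

// Generator-based approach: no per-message stream machinery at all,
// just async iteration over the same buffers.
async function* viaGenerator(messages: Buffer[]): AsyncIterable<Buffer> {
  for (const message of messages) {
    yield message;
  }
}
```

The generator version also avoids the event-loop hops and internal buffering that each `PassThrough` brings along, which is where the benchmark difference comes from.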