darky closed this issue 5 years ago
What exactly do you mean? Could you provide some code samples/API proposals/feature description?
For example, in the gRPC microservice stream implementation, the observer simply pushes data on every data event. If my subscriber is slower than the incoming data (e.g. a DB insert), a memory leak occurs: not-yet-handled subscriber callbacks pile up.
Backpressure is needed, like this: https://nodejs.org/en/docs/guides/backpressuring-in-streams/#too-much-data-too-quickly
But I don't know how to implement it with the existing RxJS Observable API.
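To illustrate the difference, here is a minimal sketch (not Nest's actual gRPC code) of the pull-based alternative: with `for await` over a Node `Readable`, the stream stays paused while a slow async handler runs, so unhandled items never accumulate the way they do when an observer pushes on every data event.

```typescript
import { Readable } from "stream";

// A slow consumer (e.g. a DB insert), simulated with a short delay.
const slowInsert = async (_item: number): Promise<void> => {
  await new Promise((resolve) => setTimeout(resolve, 10));
};

async function consume(): Promise<number[]> {
  const source = Readable.from([1, 2, 3, 4, 5]);
  const handled: number[] = [];
  // for-await pulls one item at a time: the source is not read again
  // until slowInsert has finished, so memory stays bounded.
  for await (const item of source) {
    await slowInsert(item);
    handled.push(item);
  }
  return handled;
}
```

The push-based Observable version of the same pipeline has no built-in way to tell the producer to slow down, which is exactly the gap described above.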
Yeah, that's definitely one downside to Observables. @mattpodwysocki and I have been working on IxJS/AsyncIterable to accommodate this use-case, including native interop with node Streams. If Nest is interested in automatically composing backpressure through the streaming pipeline, we'd love to collaborate or even talk about possible migration strategies. For example, we can more closely align the Ix and Rx APIs (where appropriate), automatically convert to/from Observables/AsyncIterables in both Ix and Rx, etc. Just hit us up!
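A rough sketch of the kind of interop being discussed: converting a push-based source into a pull-based AsyncIterable by queuing notifications until the consumer asks for them. This is a hand-rolled illustration, not Ix's actual implementation; `Subscribe` here is a plain callback signature invented for the example, not RxJS's `subscribe`.

```typescript
type Teardown = () => void;
type Subscribe<T> = (
  next: (value: T) => void,
  complete: () => void
) => Teardown;

// Adapt a push-based source to an AsyncIterable. Values arriving faster
// than the consumer pulls are queued; a production implementation would
// also bound this queue or apply a drop/latest strategy.
async function* toAsyncIterable<T>(subscribe: Subscribe<T>): AsyncIterable<T> {
  const queue: T[] = [];
  let done = false;
  let wake: (() => void) | null = null;
  const teardown = subscribe(
    (value) => {
      queue.push(value);
      wake?.(); // resume a waiting consumer
    },
    () => {
      done = true;
      wake?.();
    }
  );
  try {
    while (true) {
      while (queue.length > 0) yield queue.shift()!;
      if (done) return;
      // Nothing buffered yet: sleep until the source pushes or completes.
      await new Promise<void>((resolve) => (wake = resolve));
      wake = null;
    }
  } finally {
    teardown(); // unsubscribe when the consumer stops iterating
  }
}
```

Once the source is an AsyncIterable, the consumer's `await` in the loop body is what naturally throttles the pipeline.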
If Nest is interested in automatically composing backpressure through the streaming pipeline, we'd love to collaborate or even talk about possible migration strategies
Thanks for your input @trxcllnt. I would love to talk about possible strategies. Just to give you a bit more context: Nest doesn't use the native Node API much internally. However, it is heavily used in the microservices module (communication over the network is wrapped in RxJS streams).
We do plan to stick with RxJS and leave the user the ability to decide which strategy should be used (buffering by time, by count, etc.).
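The count-based variant of that strategy can be sketched without any library: batch the stream into chunks of at most `n` items so the slow consumer handles one bounded batch at a time. `bufferByCount` is an illustrative name, not an existing Nest API; it mirrors the spirit of RxJS's `bufferCount` operator but stays pull-based.

```typescript
// Group a stream into arrays of at most `size` items. Because this is
// pull-based, the source is only read as fast as the consumer finishes
// processing each batch.
async function* bufferByCount<T>(
  source: AsyncIterable<T> | Iterable<T>,
  size: number
): AsyncIterable<T[]> {
  let batch: T[] = [];
  for await (const item of source) {
    batch.push(item);
    if (batch.length === size) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch; // flush the final partial batch
}
```

A time-based variant would flush on an interval instead of a count; both keep the in-flight buffer bounded, which is the point of the strategy.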
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
I'm submitting a...
Current behavior
A gRPC stream call does not support backpressure, which can cause a memory leak.
Expected behavior
There should be a way to receive gRPC stream data with controlled throughput, to avoid memory leaks.