Closed · jkonowitch closed this 5 months ago
Hello, thanks for opening this discussion. We'll discuss this internally and get back to you!
Hello, your assumption about the current behavior of the library is correct. We will open a PR implementing an approach similar to the Java stream client's and work on it in the coming days/weeks. Any feedback on that PR will be appreciated!
Thanks again
@jkonowitch, may I ask why?
Other clients, like .NET, Go, and Python, send the credit request just before parsing the chunk. The important thing is that the client does not request too many chunks.
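To make the "request just before parsing, but never too many" idea concrete, here is a minimal sketch of the credit accounting involved. All names (`CreditAccount`, `onChunkArrived`, `requestCredit`) are illustrative assumptions, not the library's real API:

```typescript
// Hypothetical sketch: the client starts with an initial credit budget,
// and just before parsing each chunk it requests one more credit. This
// keeps the number of chunks the broker may push in flight bounded by
// the initial budget. Not the real rabbitmq-stream-js-client API.
class CreditAccount {
  constructor(private outstanding: number, private readonly limit: number) {}

  // Called just before parsing a chunk: the broker consumed one credit
  // to send this chunk, so we top the budget back up by exactly one.
  onChunkArrived(requestCredit: (n: number) => void): void {
    this.outstanding -= 1; // one credit was used for the arriving chunk
    if (this.outstanding < this.limit) {
      requestCredit(1); // ask for exactly one more, never a burst
      this.outstanding += 1;
    }
  }

  get pending(): number {
    return this.outstanding;
  }
}
```

Because each arriving chunk triggers at most one new credit request, the outstanding budget stays at its configured limit and the broker can never flood the client with more chunks than that limit.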
Wow, awesome work @icappello! I missed this notification earlier, so I'm sorry I did not provide PR feedback, but I just reviewed it, and the CreditPolicy API looks well designed and clearly solves this problem.
@Gsantomaggio By requesting credits, the client is effectively requesting more chunks, correct? I had in mind a client that is resource constrained, and where processing each chunk takes some time. If the broker keeps pushing messages while the client is overloaded, busy, or failing, you could easily instigate a crash. Having an easy mechanism to signal to the RabbitMQ client that it should slow down and not request a new credit yet, as in the design of the Java client, seems like a good solution. Absent that, the only way I could see to control the flow of chunks is perhaps to disconnect the client, which would entail a lot of overhead. I see you are a RabbitMQ maintainer though, so let me know if you think there is an alternative way of thinking about the problem!
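The backpressure behavior described here can be sketched as a consumer loop that only grants the broker a new credit after the handler for the previous chunk has resolved. The names (`consumeWithBackpressure`, `requestCredit`, `handleChunk`) are hypothetical, not part of any real client:

```typescript
// Hypothetical sketch of credit-gated consumption: a credit is only
// requested once the application has finished processing a chunk, so a
// slow or failing handler naturally pauses the flow of chunks instead
// of letting the broker overrun a resource-constrained client.
type Chunk = { messages: string[] };

async function consumeWithBackpressure(
  chunks: AsyncIterable<Chunk>,
  handleChunk: (c: Chunk) => Promise<void>,
  requestCredit: (n: number) => void,
): Promise<number> {
  let processed = 0;
  for await (const chunk of chunks) {
    // Process first; only when the handler resolves do we grant the
    // broker another credit. While handleChunk is pending, no credit is
    // sent, so the broker stops pushing once outstanding credits run out.
    await handleChunk(chunk);
    processed += 1;
    requestCredit(1);
  }
  return processed;
}
```

The key design choice is that the credit request sits *after* the `await`, coupling the broker's send rate to the application's actual processing rate rather than to network parsing speed.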
Thank you for writing this nodejs implementation of RabbitMQ Streams!
Regarding flow control. I noticed the Java library will only request new credits when some percentage of messages have been processed: https://rabbitmq.github.io/rabbitmq-stream-java-client/stable/htmlsingle/#flow-control
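The threshold behavior the Java documentation describes, renewing credits only after a given fraction of a chunk's messages has been processed, can be sketched roughly like this. The class name, method, and default ratio below are illustrative assumptions, not the Java client's actual implementation or the CreditPolicy API:

```typescript
// Hypothetical sketch of a ratio-based credit policy: after each handled
// message, decide whether enough of the current chunk has been processed
// to justify requesting the next chunk's credit.
class RatioCreditPolicy {
  private handled = 0;
  private renewed = false;

  constructor(
    private readonly chunkSize: number, // messages in the current chunk
    private readonly ratio: number,     // e.g. 0.5 = renew at half the chunk
  ) {}

  // Returns how many credits to request after one more message has been
  // handled: 0 until the threshold is crossed, then 1 exactly once.
  onMessageHandled(): number {
    this.handled += 1;
    if (!this.renewed && this.handled >= this.chunkSize * this.ratio) {
      this.renewed = true;
      return 1;
    }
    return 0;
  }
}
```

With a ratio of 0.5, the next chunk is requested while the second half of the current one is still being processed, overlapping network transfer with handler work without letting unprocessed chunks pile up.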
Reading through the source code for rabbitmq-stream-js-client, it seems like new credits are requested regardless of how quickly/successfully handlers return (client.ts#L503). Do I have that correct? And if so, any thoughts on making this more configurable? I would be happy to help.