weyoss / redis-smq

A simple high-performance Redis message queue for Node.js.
MIT License

Consumer throttling #69

Closed PhilHannent closed 2 years ago

PhilHannent commented 2 years ago

This project is very interesting and straightforward to use.

One use case I'd like is to throttle message consumption for the purpose of rate-limiting 3rd-party API calls. It would be good to have a configuration option for the consumer that lets you set the interval and the quantity allowed within that interval, much like this library: https://www.npmjs.com/package/p-throttle
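
For context, this is roughly how that package expresses the limit/interval configuration (a quick sketch based on p-throttle's documented usage; the loop and the log output are just placeholders for real API calls):

  import pThrottle from 'p-throttle';

  // Allow at most 2 calls per 1000 ms; additional calls are queued and delayed.
  const throttle = pThrottle({ limit: 2, interval: 1000 });

  const callApi = throttle(async (i) => {
    // A real 3rd-party API call would go here.
    console.log(`call ${i} at ${Date.now()}`);
  });

  for (let i = 0; i < 6; i += 1) callApi(i);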

Is this something you would be interested in adding?

PhantomRay commented 2 years ago

Thought about that as well. If a queue has too many messages, dumping everything to the consumer at once may not be ideal.

weyoss commented 2 years ago

@PhilHannent Thank you for opening this issue and for sharing your suggestion.

> If a queue has too many messages, dumping everything to the consumer at once may not be ideal.

@PhantomRay RedisSMQ does not dump everything at once to a consumer. A consumer can receive only one message at a time. Once the message has been (un-)acknowledged, a new message may be received.
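
For illustration only, consuming messages one at a time looks roughly like this (a minimal sketch; the exact Consumer and consume() signatures vary between redis-smq versions, so treat the shape below as an assumption and refer to the project docs):

  const { Consumer } = require('redis-smq');

  // Configuration omitted for brevity; treat constructor and consume()
  // signatures as assumptions, since they differ between versions.
  const consumer = new Consumer();

  consumer.consume(
    'notifications', // queue name (illustrative)
    (message, cb) => {
      // One message is delivered at a time; the next one arrives only
      // after this one has been acknowledged via cb().
      const payload = message.getBody();
      // ... call the 3rd-party API or do other work with the payload ...
      cb(); // acknowledge
    },
    (err) => {
      if (err) console.error(err);
    },
  );

  consumer.run();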

Returning to your question, I want to point out that the main goal of this project is to provide a simple, usable message queue with a focus on message processing performance.

If a feature would impact overall performance or require a lot of changes and refactoring, it would not be promoted for implementation.

> One use case I'd like is to throttle message consumption for the purpose of rate-limiting 3rd-party API calls. It would be good to have a configuration option for the consumer that lets you set the interval and the quantity allowed within that interval

A queue may have many consumers, so implementing message throttling at the consumer level would not be a good idea. A better alternative would be to set a rate limit for delivering messages from a given queue.

I am not going to draw any early conclusions for now. Let me think about it for a while.

I will keep you updated.

PhantomRay commented 2 years ago

Thank you for the explanation. I shouldn't have said "dumping everything to the consumer". But let me explain:

My use case is that a client sets up a persistent connection using server-sent events (SSE) or a WebSocket. The server then uses a consumer to consume the messages. On the server side, the consumer may consume messages quickly enough and push them over the persistent connection, but the client may not be able to handle them quickly enough.

Limiting how many messages can be sent from the server to the client, or in other words, how many messages can be consumed within a given time window, would be good. I think this is where throttling comes in.

weyoss commented 2 years ago

This feature will be implemented in v6.2.0

PhantomRay commented 2 years ago

super awesome!

weyoss commented 2 years ago

This feature has been implemented in v6.2.0.

See https://github.com/weyoss/redis-smq/blob/master/docs/queue-rate-limiting.md

weyoss commented 2 years ago

Closing as resolved.

weyoss commented 2 years ago

A bug regarding queue rate limiting has recently been fixed in redis-smq@6.2.2. If you are using an older redis-smq version, please upgrade.

The bug occurred only when the queue parameter was provided as a string, and it caused invalid Redis keys to be saved. For example:

  queueManager.setQueueRateLimit('notifications', { limit: 200, interval: 60000 }, (err) => {
    // ...
  })

In the example, the queue parameter is provided as a string (notifications), which implies the default namespace. But due to the bug, no namespace was used and an invalid Redis key was saved.

redis-smq@6.2.2 has fixed this issue.
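
As a side note, if you want to avoid relying on default-namespace resolution altogether, the queue can also be passed with an explicit namespace (a sketch under the assumption that the object form of the queue parameter is accepted here; please check the linked docs for the exact parameter format):

  // Sketch: the same rate limit with the queue passed in object form and an
  // explicit namespace, instead of relying on default-namespace resolution.
  // The { name, ns } shape and the 'default' namespace value are assumptions.
  queueManager.setQueueRateLimit(
    { name: 'notifications', ns: 'default' },
    { limit: 200, interval: 60000 },
    (err) => {
      if (err) console.error(err);
    },
  );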