Open candlerb opened 5 years ago
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Pinging @elastic/elastic-agent-data-plane (Team:Elastic-Agent-Data-Plane)
+1
As a customer, I see Redis Streams as a lightweight alternative that gives me Kafka's features without the overhead of setting up Zookeeper and Kafka for a small development and production logging environment.
One detail that would aid with my specific use case for streams is:
Hi! We just realized that we haven't looked into this issue in a while. We're sorry!
We're labeling this issue as Stale to make it hit our filters and make sure we get back to it as soon as possible. In the meantime, it'd be extremely helpful if you could take a look at it as well and confirm its relevance. A simple comment with a nice emoji will be enough :+1.
Thank you for your contribution!
:+1
Describe the enhancement:
Currently libbeat supports two types of redis output: datatype "list" (RPUSH) and datatype "channel" (PUBLISH). However, redis 5.0+ supports another type of delivery: redis streams.
Implementing this ought to be straightforward. It would use XADD to deliver a message, and the stream name to use should be configurable - the existing key/keys setting can do this. It should also be possible to configure an optional MAXLEN ~ n value (so that the queue is bounded). In order to be useful in the context of the whole stack, logstash would also need to be extended to be able to consume from redis streams (XREAD/XREADGROUP/XACK).
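To make the shape of this concrete, here is a minimal producer sketch using the go-redis client (not libbeat's actual output code). The "filebeat" stream name, the "message" field, and the 1,000,000 cap are placeholders for illustration only; the single-value encoding it uses is the one argued for in the note below.

```go
package main

import (
	"context"
	"encoding/json"

	"github.com/redis/go-redis/v9"
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Serialize the whole event once and ship it as a single value.
	event := map[string]interface{}{
		"@timestamp": "2019-01-01T00:00:00Z",
		"message":    "hello world",
	}
	payload, err := json.Marshal(event)
	if err != nil {
		panic(err)
	}

	// Equivalent to: XADD filebeat MAXLEN ~ 1000000 * message <payload>
	err = rdb.XAdd(ctx, &redis.XAddArgs{
		Stream: "filebeat", // in libbeat this would come from the existing key/keys setting
		MaxLen: 1000000,    // optional bound on the stream length
		Approx: true,       // MAXLEN ~ : approximate trimming, cheaper than an exact cap
		ID:     "*",        // let redis assign the entry ID
		Values: map[string]interface{}{"message": string(payload)},
	}).Err()
	if err != nil {
		panic(err)
	}
}
```

The ~ form of MAXLEN lets redis trim in whole macro-nodes, which keeps the bound cheap to enforce while still limiting memory use.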
Note: XADD natively carries key/value pairs. I think it would be simplest to carry the entire JSON payload as a single value, under a fixed key (e.g. empty string, or configurable). Whilst it would be possible to break up the top-level JSON object into its constituent keys and values, the values supported by redis streams are only strings; it would therefore be necessary to reserialize each value as JSON to preserve its type (e.g. a string value would be enclosed in double quotes) or to support nested values. That seems expensive for little or no benefit.
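For completeness, a rough sketch of what the consuming side (the logstash extension mentioned above) might do with XREADGROUP/XACK, again using go-redis with placeholder stream, group, consumer, and field names rather than anything logstash actually implements:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/redis/go-redis/v9"
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Create the consumer group if it does not exist yet ("$" = only new entries).
	// A BUSYGROUP error on re-creation can be ignored.
	_ = rdb.XGroupCreateMkStream(ctx, "filebeat", "logstash", "$").Err()

	for {
		// Block for up to 5s waiting for entries assigned to this consumer.
		streams, err := rdb.XReadGroup(ctx, &redis.XReadGroupArgs{
			Group:    "logstash",
			Consumer: "logstash-1",
			Streams:  []string{"filebeat", ">"},
			Count:    100,
			Block:    5 * time.Second,
		}).Result()
		if err == redis.Nil {
			continue // no new entries within the block timeout
		} else if err != nil {
			panic(err)
		}

		for _, s := range streams {
			for _, msg := range s.Messages {
				// The whole event arrives as one JSON string under a single field.
				fmt.Println(msg.Values["message"])
				// Acknowledge so the entry leaves the pending entries list.
				rdb.XAck(ctx, "filebeat", "logstash", msg.ID)
			}
		}
	}
}
```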
Describe a specific use case for the enhancement or feature:
Redis Streams have semantics similar to a Kafka queue: messages can be stored indefinitely, replayed on demand, distributed amongst consumers within a consumer group, and you can have multiple consumer groups processing the same set of messages, with catch-up if they fall behind.
Redis is much easier to set up and manage, and much less resource-heavy, than Kafka, yet still scales to very high throughput (the Redis authors suggest on the order of a million messages per second).
On the minus side, Redis Streams probably requires the entire stream to reside in RAM for efficient behaviour.
Still, Redis Streams appears to be an attractive middleware for small to medium installations, where the amount of backlogging is bounded and the stream is not being used as a long-term archive.