Closed vicradon closed 2 years ago
👋 @vicradon Good afternoon, and thank you for submitting your topic suggestion. Your topic form has been entered into our queue and should be reviewed (for approval) once a content moderator has finished reviewing the ones ahead of it.
@vicradon Closing due to inactivity. If you'd like to continue working on this topic, please let us know.
Topic Suggestion
How to process data streams using Kafka Streams
Writing sample(s):
Include any links or writing samples to help our team better gauge your writing quality.
Proposal Submission
Proposed title of the article
How to process data streams using Kafka Streams
Proposed article introduction
[Languages] How to process event streams using Kafka Streams
In a distributed application system, different parts need to communicate to keep the system fast and scalable. One way of achieving such communication is through a message broker like Apache Kafka. Kafka allows producer applications to send messages to consumer applications. However, consumers often need to transform the data they receive into a more suitable form. To simplify this data processing, Kafka Streams was developed.
Kafka Streams is a library for processing event streams from topics in a Kafka cluster. It eliminates the difficulties of processing data streams inside individual Kafka consumers and provides a functional way to perform stream processing. Kafka Streams is packaged as a Java dependency and is typically deployed with a Spring Boot application. It also supports other languages in the JVM ecosystem, such as Scala and Kotlin. In this article, you will learn how to aggregate, join, and window data using Kafka Streams.
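To give a feel for the kind of processing the article would cover, here is a minimal sketch of a windowed aggregation with the Kafka Streams DSL. The topic names ("page-views"), the application ID, the broker address, and the page-view scenario are illustrative assumptions, not part of the proposal; running it requires the `kafka-streams` dependency and a reachable Kafka broker.

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class PageViewCounter {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical application ID and broker address.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "page-view-counter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Read the raw event stream; keys are user IDs, values are page names.
        KStream<String, String> views =
            builder.stream("page-views", Consumed.with(Serdes.String(), Serdes.String()));

        // Group events by user and count views in tumbling one-minute windows.
        views.groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
             .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
             .count()
             .toStream()
             .foreach((windowedUser, count) ->
                 System.out.printf("%s viewed %d pages in window %s%n",
                     windowedUser.key(), count, windowedUser.window()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The `windowedBy` step is what distinguishes this from a plain per-key count: each emitted count is scoped to a one-minute window, which is the windowing technique the article promises to cover.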
Key takeaways
Article quality
I follow the approach of introducing the Kafka Streams API and its importance in distributed application architectures. I also cover how to perform windowing on data, which is not commonly addressed in other tutorials.
References