gsakel1 opened 5 years ago
can you let me know a use case?
To be ingested by Logstash using the "netflow" codec. Right now, that codec in Logstash won't read vflow's decoded IPFIX/NetFlow because the output is technically in JSON format. Also, I believe data going to Kafka should stay raw so that consumers have the original, unaltered data set. This could be an option added to the vflow config. I'm not recommending changing the existing behavior, just adding raw output as an option.
You can use Kafka input plugin in Logstash to directly read the data from any partition in JSON format.
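For reference, a minimal Logstash pipeline along those lines might look like this (the broker address and topic name are assumptions for illustration; the actual topic depends on your vflow configuration):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["vflow.ipfix"]
    # vflow publishes decoded records as JSON, so decode with the json codec
    codec             => "json"
  }
}

output {
  stdout { codec => rubydebug }
}
```

Since the messages are already JSON, the `json` codec parses them directly and no netflow codec is needed on the Logstash side.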
I think sending raw IPFIX/NetFlow data to Kafka would be extremely beneficial instead of decoding it to JSON. Is this possible?