elastic / beats

:tropical_fish: Beats - Lightweight shippers for Elasticsearch & Logstash
https://www.elastic.co/products/beats

Demultiplex multiline messages #11900

Closed kvch closed 3 years ago

kvch commented 5 years ago

The existing multiline filter is able to aggregate multiple lines that arrive continuously. With the appropriate configuration, the following example is forwarded as two events:

my multiline message1
    more multiline
    more multiline
my multiline message2
    more multiline
    more multiline

However, Filebeat is not able to aggregate messages that do not come immediately after the starting line. Example:

09/May/2017:22:28:26 +0530 [1485] -> GET /editor.html/content/geometrixx-outdoors/en.html HTTP/1.1
09/May/2017:22:28:26 +0530 [1486] -> GET /editor.html/content/geometrixx-outdoors/de.html HTTP/1.1
09/May/2017:22:28:27 +0530 [1485] <- 200 text/html;charset=utf-8 518ms

A new option named demultiplex should be added to the multiline filter which lets the user define a pattern to aggregate such lines. A user could provide a regex whose capture group contains the value to group by.

multiline.demultiplex: '\[(\d+)\]'

The expected message is the following after applying the filter:

{
    "message": "-> GET /editor.html/content/geometrixx-outdoors/en.html HTTP/1.1\n<- 200 text/html;charset=utf-8 518ms"
}
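To illustrate the proposed behavior, here is a minimal, hypothetical sketch (not Filebeat code) of demultiplexing: each line's key is extracted with the first capture group of a regex, and lines sharing a key are joined into one multiline message. The function name `demultiplex` and the in-memory map are assumptions for illustration only; a real implementation would work on a stream and handle flush timeouts.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// demultiplex groups interleaved log lines by a key extracted with the
// first capture group of pattern, and joins the remainder of each
// group's lines (text after the key match) into one multiline message.
func demultiplex(lines []string, pattern string) map[string]string {
	re := regexp.MustCompile(pattern)
	groups := map[string][]string{}
	for _, line := range lines {
		m := re.FindStringSubmatchIndex(line)
		if m == nil {
			continue // no key on this line; ignored in this sketch
		}
		key := line[m[2]:m[3]]
		// keep only the text after the key marker
		rest := strings.TrimSpace(line[m[1]:])
		groups[key] = append(groups[key], rest)
	}
	out := map[string]string{}
	for k, parts := range groups {
		out[k] = strings.Join(parts, "\n")
	}
	return out
}

func main() {
	lines := []string{
		`09/May/2017:22:28:26 +0530 [1485] -> GET /editor.html/content/geometrixx-outdoors/en.html HTTP/1.1`,
		`09/May/2017:22:28:26 +0530 [1486] -> GET /editor.html/content/geometrixx-outdoors/de.html HTTP/1.1`,
		`09/May/2017:22:28:27 +0530 [1485] <- 200 text/html;charset=utf-8 518ms`,
	}
	events := demultiplex(lines, `\[(\d+)\]`)
	fmt.Println(events["1485"])
}
```

Running this prints the two lines tagged `[1485]` merged into the single message shown in the expected output above.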
ph commented 5 years ago

@kvch You might want to take a look at https://github.com/logstash-plugins/logstash-filter-aggregate for inspiration.

botelastic[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

elasticmachine commented 4 years ago

Pinging @elastic/integrations-services (Team:Services)

botelastic[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.