kafkapre opened this issue 8 years ago (status: open)
The multiline filter is not thread safe, so it is not a good way to collapse logs as you described. One improvement would be to use the multiline codec in your input section; see https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html for details.
Closing this, as it is a known problem of the multiline filter; feel free to reopen or file any other issue you might find. Keep in mind https://discuss.elastic.co/, a great source of community support.
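For reference, a minimal sketch of the multiline codec attached to an input, per the linked docs. The file input and path here are placeholders, not from the original report; the point is the codec block nested inside the input:

```conf
input {
  file {
    # Placeholder path; the relevant part is the codec block below.
    path => "/var/log/app/app.log"
    codec => multiline {
      # Any line NOT starting with a timestamp is glued to the previous line.
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}
```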
Exactly, I would prefer to put multiline into input section. But, as I described above, it does not work correctly.
Use the codec syntax; your example is wrong, as I understand it.
/purbon
On Wed, 9 Dec 2015 13:11 kafkapre notifications@github.com wrote:
Oh, sorry, I wrote it incorrectly. This is the correct version (I have also corrected it above):
input {
  gelf {
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}
And for this case, multilining does not work.
Hmm, I think this is a problem that the gelf input needs to get together with the IdentityMapCodec. Thoughts, @guyboertje?
I will look into it
As I understand it, after looking at the code base https://github.com/logstash-plugins/logstash-input-gelf/blob/master/lib/logstash/inputs/gelf.rb, the gelf input has the codec directive, but the codec is never used within the plugin. So this is the bug, as I understand it.
@kafkapre I agree there is a bug; no codec is actually applied there. Can you please reopen this issue in the https://github.com/logstash-plugins/logstash-input-gelf repo? Thanks a lot for your finding.
For others that might be searching for a solution to this. This worked beautifully for me: https://stackoverflow.com/questions/34075538/elk-process-multiline-logs-from-multiple-docker-images
This plugin does not use a codec because it receives a JSON string that represents an event. Codecs are meant to take a line and create an Event. Filters are meant to operate on Events. The multiline codec buffers lines until some condition is seen; it would have to be changed to buffer events and match on a field, but wait, that is what the Multiline Filter does. So we would need a way to run the multiline filter after an input but before the queue. We can't do that at the moment.
@kafkapre did you ever find a workaround?
@jmreicha and other future readers:
The solution offered in the Stackoverflow link from @uschtwill IS the solution for this issue. There are no workarounds. There may well be a performance impact when using the multiline filter because, by design, it needs to receive every event. This is so because we can't ensure that events with the same identity will always travel down the same path in parallel multithreaded filter stages.
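A sketch of that Stack Overflow approach: the multiline filter keyed by stream identity, so parallel container streams do not get interleaved. The grok pattern and the field names in stream_identity are assumptions and must match the fields actually present on your events:

```conf
filter {
  multiline {
    pattern => "^%{TIMESTAMP_ISO8601}"
    negate => true
    what => "previous"
    # Keep each container's lines in their own buffer; container_id is an
    # assumed field name, adjust to whatever your gelf events carry.
    stream_identity => "%{host}.%{container_id}"
  }
}
```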
@guyboertje Thanks for the update. I was worried about performance but maybe it's okay? Do you have a ballpark of how many events a single thread can handle?
@jmreicha - unfortunately, the answer to that question is very difficult to give because it depends on the version of LS and the complexity of the configuration, i.e. whether you have other filters before or after the multiline filter, the rate at which the events are 'pumped' in via the input(s), and the time taken for the output to deliver an event out.
This is a fantastic presentation http://www.slideshare.net/avleenvig/elk-mooseively-scaling-your-log-system from Avleen at Etsy that gives very good advice on systematic tuning of the Elastic Stack.
I'll check it out, thanks.
I'm a bit confused that this issue is closed.
I'm in a similar scenario, in which I'm unable to make the multiline codec work properly with the gelf plug-in in ELK 5.x.
The solution from @uschtwill uses the multiline filter with the stream_identity parameter, but the multiline filter is deprecated and unavailable now, so I think we are back at the starting point :-(
Bump. This really needs to be reopened.
Logstash is the wrong place to fix multiline events; you need to fix them as early as possible, inside the app or just after Docker:
https://github.com/elastic/beats/issues/918
I hope someone implements the new docker logger beat plugin with multiline support, as in filebeat
@danielmotaleite I completely disagree. One of the original killer features of Logstash was the ability to handle multiline events.
@maxblaze I'll reopen this.
For folks curious about the future, there's some problems to discuss:
1) The multiline filter is gone and probably not coming back. For performance, Logstash will process events out of order (when using multiple pipeline worker threads), which means multiline filtering, even if present, would harm performance.
2) The aggregate filter can do what the multiline filter did. It has the same single-threaded, ordered-processing limitations as the multiline filter did.
3) It could be possible to add codec support to the gelf input, but it is unclear if this would solve y'all's problems.
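A rough sketch of the aggregate-filter route mentioned in point 2, assuming each event carries a container_id field to key the stream on (that field name is an assumption). This buffers messages per container and flushes the buffer as a new event when a different container_id arrives or the timeout fires, which only approximates true multiline behaviour:

```conf
filter {
  aggregate {
    # Assumed field: container_id identifies the log stream.
    task_id => "%{container_id}"
    code => "
      map['message'] ||= ''
      map['message'] += event.get('message').to_s + '\n'
      event.cancel
    "
    # Emit the buffered map as a new event when the task_id changes.
    push_previous_map_as_event => true
    timeout => 5
  }
}
```

Like the multiline filter, this only works correctly with a single pipeline worker (`-w 1`), since event order matters.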
With filebeat being able to automatically process docker logs (via the json-file driver) and merge multiline output into single JSON events, I would say again that trying to do multiline in Logstash is a lost battle: either you use only one thread so logs stay in order and it can work, or you will sooner or later get badly merged log lines or lines that are not merged at all. The more multi-line events you have and the more load you have, the higher the risk of broken logs.
The only sane way to solve multi-line is as close to the source as possible. If it is not possible in the app producing the logs, the next component in line (filebeat, fluentd, etc.) should be the one to do it.
Please close this. It would be nice if the Elastic docs incorporated a note about this.
@Shokodemon filebeat is the recommended solution for what you are doing. https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-docker.html
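For reference, a minimal filebeat sketch along those lines, based on the linked docker input docs. The multiline pattern and the Logstash host are placeholders that must be adapted to your log format and environment:

```yaml
filebeat.inputs:
- type: docker
  containers.ids:
    - '*'
  # Assumption: log lines start with an ISO date; continuation lines do not.
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after

output.logstash:
  hosts: ["logstash.example.com:5044"]
```

Because filebeat tails each container's log file separately, the streams never interleave, which is exactly the problem the gelf-plus-multiline setup could not solve.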
@jordansissel fixed. and thank you. luckily, i have a filebeat setup at hand that i've prepared before.
also thank you for the emotional circuit breaker nudge.
@Shokodemon :+1: I hope filebeat gets you on a path to success! Let us know (probably in elastic/beats or on discuss.elastic.co) if you have issues doing multiline w/ filebeat for docker logs.
Per https://github.com/moby/moby/issues/17763 , docker considers the json-files as private and won't support changes that make it easier to read them. This makes gelf logging attractive as it doesn't require fighting dockerd or running filebeat as privileged.
From that issue, docker specifically said this in relation to exposing the json-files for logstash consumption:
If you want to use logstash, please use one of the available logging drivers (syslog, fluentd, journald, gelf), which logstash seems to support any/all of natively.
Yes, Docker does not officially support parsing of its log files and prefers the docker logging plugin API. Elastic built the docker support on top of the log file because it works on all docker versions (not only recent ones, as with docker log plugins) and is easier to work with, being based on the existing filebeat code.
As a bonus, running docker logs still works fine, and huge messages can be merged with the multi-line support. If using the docker plugin support, you would be limited to the latest docker versions, it would take longer to include support in filebeat, and docker logs would not output anything. While the docker plugin code supports partial messages generated by docker, most plugins do not support them.
But fear not: if you do not trust the filebeat docker log parser, just use a docker log plugin, like the kafka-logdriver to output logs to Kafka: https://github.com/MickayG/moby-kafka-logdriver
or redis: https://github.com/pressrelations/docker-redis-log-driver
or sematext (I do not know enough about this one, but I think it can output to elasticsearch): https://github.com/sematext/sematext-agent-docker https://hub.docker.com/_/sematext-agent-monitoring-and-logging
I run docker containers with the gelf driver and would like to collapse multiline logs in Logstash. My Logstash conf.
It works perfectly when I process logs from one docker container, but for two or more it does not work, because it collapses messages from both (or more) log streams.
I would expect that setting up multilining in the input would solve the problem,
but multilining does not work correctly with this setup (seemingly because of a bug). Any suggestions? Thanks.
I am using: Docker 1.9.1, Logstash 2.1
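For context, the gelf driver setup described above looks roughly like this; the address and image name are placeholders, and the port must match the one configured on the Logstash gelf input:

```sh
# Send this container's stdout/stderr to Logstash's gelf input over UDP.
docker run \
  --log-driver=gelf \
  --log-opt gelf-address=udp://logstash.example.com:12201 \
  my-app-image
```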