Open stevedearl opened 9 years ago
Sorry - I should probably have attached my custom patterns too, for completeness. They're provided below:

###############################
TIBCO_DATESTAMP ((%{YEAR}[\s/-]%{MONTH}[\s/-]%{MONTHDAY}[\s]%{TIME})|(%{TIMESTAMP_ISO8601}))
BW_APP_NAME (BW.%{WORD})
###############################
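For what it's worth, against the sample entries in my original post it's the first alternative of TIBCO_DATESTAMP that fires. My reading of how the stock grok sub-patterns line up (treat this as a sketch of my understanding, not output from a grok debugger):

2015 Aug 16 09:35:52:000
%{YEAR}     -> 2015
%{MONTH}    -> Aug
%{MONTHDAY} -> 16
%{TIME}     -> 09:35:52:000   (the stock SECOND pattern allows a trailing [:.,][0-9]+, which is what absorbs the :000 milliseconds)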
Hi,
A quick update on this. I retested with the Multiline codec rather than filter and could not recreate this issue.
Unfortunately, the multiline codec doesn't seem capable of flushing the last entry in the input file through to Elasticsearch/the output file - but this seems to be a known limitation of the codec (and the usual suggestion is to switch back to the filter instead :().
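For anyone who wants to try the codec route, the input block I tested with was roughly the following (same pattern and patterns_dir as in my original config; the commented-out auto_flush_interval option is one I believe newer versions of the multiline codec added to address exactly this flushing problem, but it wasn't available in my version, so check yours):

input {
  file {
    path => "C:/Applications/Elastic/BWLogReader_Test/test_logs/test_multiline_3.log"
    codec => multiline {
      patterns_dir => "C:\Applications\Elastic\BWLogReader_Test\logstash_patterns"
      pattern => "^%{TIBCO_DATESTAMP} "
      negate => true
      what => "previous"
      # auto_flush_interval => 2  # newer codec versions only: flush the pending (last) event after 2s of inactivity
    }
  }
}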
Cheers, Steve
Hi, any updates or work-arounds for this bug?
I have just started using Logstash today and have encountered this exact symptom. Pretty disappointing.
Everything works fine until I use Elasticsearch as the output; then the single grok parse failure appears. I'm using Logstash version 1.5.4.
-Henrik
Hi Logstash,
First off, I'm relatively new to Logstash/Elasticsearch so apologies if this is either a known issue or a stupid error on my part (more likely to be the latter, I think ;)).
I've got an odd issue with Logstash/Elasticsearch. I have an input file containing 3,000 log entries, each of which spans multiple lines. When I read the file with Logstash into Elasticsearch with the multiline filter enabled, I get a grokparsefailure on one of the records, basically because Logstash has split one of the multiline records in half (the second half doesn't conform to the grok pattern, so I get an error).
I don't know why I get this error, which happens consistently. However, from further testing I have noticed the following:
My Logstash configuration file is shown below:
input {
  file {
    path => "C:/Applications/Elastic/BWLogReader_Test/test_logs/test_multiline_3.log"
  }
}

filter {
  multiline {
    patterns_dir => "C:\Applications\Elastic\BWLogReader_Test\logstash_patterns"
    pattern => "^%{TIBCO_DATESTAMP} "
    negate => true
    what => "previous"
  }
  if [type] == "tibco_bw" {
    # Extract main fields
  }
}

output {
  elasticsearch {
    host => "localhost"
  }
}
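The grok under "Extract main fields" isn't shown here; purely as an illustration, a grok for the sample entries below would be along these lines (the field names and the handling of the "GMT +1" offset are illustrative guesses, not my exact config):

grok {
  # Matches the first line of each multiline event; continuation lines remain in the message field
  match => { "message" => "^%{TIBCO_DATESTAMP:log_time} GMT %{INT:tz_offset} %{NOTSPACE:app_name} %{WORD:log_level} \[%{DATA:log_role}\] Job-%{INT:job_id} %{GREEDYDATA:log_text}" }
}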
A sample of the input file I was using is shown below. I couldn't find a way to attach the actual file to this issue (perhaps someone can tell me how to do that):
2015 Aug 16 09:35:52:000 GMT +1 BW.APPLICATION-ENGINE Info [BW-User] Job-123456789 Log Entry 1
11
12
13
14
15
16
17
2015 Aug 16 09:35:52:001 GMT +1 BW.APPLICATION-ENGINE Info [BW-User] Job-123456789 Log Entry 2
21
22
23
24
25
26
27
2015 Aug 16 09:35:52:002 GMT +1 BW.APPLICATION-ENGINE Info [BW-User] Job-123456789 Log Entry 3
31
32
33
34
35
36
37
2015 Aug 16 09:35:52:003 GMT +1 BW.APPLICATION-ENGINE Info [BW-User] Job-123456789 Log Entry 4
41
42
43
44
45
46
47
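Given the multiline settings above (negate => true, what => "previous"), my understanding is that every line which does not start with a TIBCO_DATESTAMP should be appended to the preceding timestamped line, so each record ought to arrive in Elasticsearch as a single event along these lines (newlines shown as \n):

2015 Aug 16 09:35:52:000 GMT +1 BW.APPLICATION-ENGINE Info [BW-User] Job-123456789 Log Entry 1\n11\n12\n13\n14\n15\n16\n17

That is what happens for all but one of the 3,000 records; the failing record is split into two events, and the second event (which starts mid-record) is what trips the grok.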
I didn't see the error until I got up to Log Entry #24.
I've also attached an image from BeyondCompare which shows the comparison output with the Elasticsearch output enabled and disabled.
Hopefully someone can tell me if this is a known issue, or what I'm doing wrong. For information, I'm using Logstash 1.5.3 and Elasticsearch 1.7.1.
Cheers, Steve