elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

Not able to parse logs with spaces between key-value pairs in JSON #16017

Open dhruvishah15 opened 8 months ago

dhruvishah15 commented 8 months ago

Logstash information:

Please include the following information:

  1. Logstash version (e.g. bin/logstash --version) [8.11.4]
  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker) [expanded from tar]
  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes) [using systemd]

JVM (e.g. java -version): openjdk version "17.0.8" 2023-07-18 LTS

If the affected version of Logstash is 7.9 (or earlier), or if it is NOT using the bundled JDK or using the 'no-jdk' version in 7.10 (or higher), please provide the following information:

  1. JVM version (java -version)
  2. JVM installation source (e.g. from the Operating System's package manager, from source, etc).
  3. Value of the LS_JAVA_HOME environment variable if set.

OS version (uname -a if on a Unix-like system): GNU/Linux 4.18.0-477.27.1.el8_8.x86_64

Description of the problem including expected versus actual behavior: I have a log containing a JSON object. The log is parsed correctly when the JSON object contains no spaces, but when there is a space between a key and its value, the log is not parsed.

Configuration file used:

```
input {
  syslog {
    port => 3011
  }
}

filter {
  grok {
    match => {
      "message" => [
        "%{SYSLOGTIMESTAMP:timestamp4} %{DATA:time_ms}|%{DATA:field1}|%{DATA:field2}|99|%{DATA:field3}|%{DATA:field4}|%{DATA:field5}|%{DATA:field6}|%{DATA:field7}|%{DATA:field8}|%{DATA:field9}|%{DATA:field10}|%{DATA:field11}|%{DATA:field12}|%{GREEDYDATA:field13}"
      ]
    }
  }
  date {
    match => ["timestamp4", "MMM dd HH:mm:ss"]
  }
  if [field13] {
    mutate {
      add_field => { "log_type" => "my-logs" }
    }
  }
}

output {
  if [log_type] == "my-logs" {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => ["ES_HOST:9200"]
      index => "my-logs-000001"
    }
  }
}
```

Provide logs (if relevant):

Log that gets parsed:

```
echo "Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id":"200000337"}|200" | nc localhost 3011
```

Log that does not get parsed:

```
echo "Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id": "200000337"}|200" | nc localhost 3011
```
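For anyone trying to reproduce this, the grok pattern above can be approximated with a plain regex: grok defines `DATA` as `.*?` and `GREEDYDATA` as `.*`, and `SYSLOGTIMESTAMP` matches a timestamp like `Mar 21 13:27:11`. (Note that in the pattern as posted, the `|` characters are unescaped; in regex, and therefore in grok, an unescaped `|` means alternation, so the sketch below assumes literal pipe delimiters, i.e. `\|`, were intended.) A minimal Python sketch, using the two sample lines from this report, suggests the pattern itself accepts a `": "` inside a field:

```python
import re

# Approximation of the posted grok pattern, assuming literal pipe delimiters:
#   DATA          -> .*?   (non-greedy)
#   GREEDYDATA    -> .*
#   SYSLOGTIMESTAMP -> roughly "Mar 21 13:27:11"
pattern = re.compile(
    r"\w{3} +\d{1,2} \d{2}:\d{2}:\d{2} "
    r"(?P<time_ms>.*?)\|(?P<field1>.*?)\|(?P<field2>.*?)\|99\|"
    r"(?P<field3>.*?)\|(?P<field4>.*?)\|(?P<field5>.*?)\|(?P<field6>.*?)\|"
    r"(?P<field7>.*?)\|(?P<field8>.*?)\|(?P<field9>.*?)\|(?P<field10>.*?)\|"
    r"(?P<field11>.*?)\|(?P<field12>.*?)\|(?P<field13>.*)"
)

no_space = 'Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id":"200000337"}|200'
with_space = 'Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id": "200000337"}|200'

# Both lines match the pattern; the JSON (with or without the space)
# lands in field12 either way.
print(bool(pattern.match(no_space)))
print(bool(pattern.match(with_space)))
print(pattern.match(with_space).group("field12"))
```

If both lines match here, the pattern itself is unlikely to be the cause, which points toward something earlier in the pipeline (for example, the syslog input's own message parsing) behaving differently between the two environments.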

dhruvishah15 commented 7 months ago

When I installed logstash in another environment, it was able to parse both the logs with same conf file. But in the environment where I need logstash, there this issue is observed. In fact, this is not just observed with json logs. Even for log patterns where datatype is greedydata if the value has "data: 1" or any such value with ": " in the log it is not getting parsed. Can anyone help me out with this issue?