billgraziano / xelogstash

Send SQL Server Extended Events to Logstash, Elastic Search, or JSON

Trying to add fields to the transfer #91

Closed fribse closed 1 year ago

fribse commented 1 year ago

I have a Logstash -> Elasticsearch output that looks like this:

output {
  elasticsearch {
    hosts => "https://elasticsearch:9200"
    index => "%{[fields][logtype]}-%{[@metadata][version]}-%{+YYYY.MM}"
    document_type => "%{[@metadata][type]}"
    cacert => "/usr/share/logstash/config/certs/ca/ca.crt"
    user => "user"          # logstash_internal
    password => "password"
  }
}

So I get the indexes split up by log type, the agent version ends up in the index name, and the month stamp keeps any single index from growing too big.
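
For illustration, if [fields][logtype] resolves to sqllogs and [@metadata][version] to, say, 1.6 (a made-up value; it depends on what the input puts in @metadata), an event ingested in March 2023 would land in an index named:

sqllogs-1.6-2023.03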

I tried adding this to the sqlxewriter config:

timestamp_field_name = "@timestamp"

adds = [
    "global.log.vendor:Microsoft",
    "global.log.type:Application",
    "global.log.collector.application:sqlxewriter.exe",
    "global.log.collector.version:'$(VERSION)'",
    "fields.logtype:sqllogs",
]

copies = [
    "event.mssql_computer:global.host.name",
    "event.mssql_domain:global.host.domain",
    "event.mssql_version:global.log.version",
]
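
Based on that config, the JSON payload arriving at Logstash should look roughly like this. This is a sketch: the nesting of the dotted names into objects is my assumption, and all sample values (timestamp, host names, version) are invented.

{
  "@timestamp": "2023-03-01T12:00:00.000Z",
  "global": {
    "log": {
      "vendor": "Microsoft",
      "type": "Application",
      "collector": { "application": "sqlxewriter.exe", "version": "1.6" }
    },
    "host": { "name": "SQLHOST01", "domain": "example.local" }
  },
  "fields": { "logtype": "sqllogs" },
  "event": { "mssql_computer": "SQLHOST01", "mssql_domain": "example.local" }
}

The catch, as the rest of the thread shows, is that this whole document arrives as a JSON string inside the message field, so Logstash can't see [fields][logtype] until something parses it.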

But Logstash can't get those fields out of the message. In Elasticsearch it looks like this: [screenshot]. Logstash doesn't pick up the fields, and thus the index is not named correctly. I even tried adding a 'mutate' to force the names for the tcp-input logs (the only input I have so far), but that didn't help either.
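
A mutate along these lines (a hypothetical sketch, not the exact filter from the attempt above) can hard-code the one field the index pattern needs:

filter {
  if "tcp-input" in [type] {
    mutate {
      # hypothetical: force the field the index pattern expects
      add_field => { "[fields][logtype]" => "sqllogs" }
    }
  }
}

But it leaves every other field buried in the unparsed message string, which is why it can't fix the problem on its own.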

fribse commented 1 year ago

OK, I found out that I need to add the json filter. So I added this to Logstash:

filter {
  if "tcp-input" in [type] {
    json {
      source => "message"
    }
  }
}

And now it works; now I just need to make sure I get the fields out as I need them.
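
If events can ever arrive without [fields][logtype] in the decoded JSON, a guard after the json filter keeps the index pattern resolving. A sketch, assuming the default sqllogs from the adds list above is an acceptable fallback:

filter {
  if "tcp-input" in [type] {
    json {
      source => "message"
    }
    # fallback: if the decoded document did not carry [fields][logtype],
    # set it so the elasticsearch index name still resolves
    if ![fields][logtype] {
      mutate {
        add_field => { "[fields][logtype]" => "sqllogs" }
      }
    }
  }
}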