logstash-plugins / logstash-input-syslog

Apache License 2.0

Syslog changes since logstash version 2.4.x #32

Closed bcecchinato closed 8 years ago

bcecchinato commented 8 years ago

Hi there!

I've migrated from logstash 2.4.x to 5.0.1, and found out that the input-syslog plugin doesn't work the same way as before.

Before the update, when a log was sent to the logstash input, the following fields were set and removed from the message:

  • facility
  • facility_label
  • priority
  • program
  • severity
  • severity_label

I'm using the following versions:

  • Version: Logstash 5.0.1 - Plugin 3.1.1
  • Operating System: CentOS7 with docker container
  • Config File (if you have sensitive info, please remove it):

input {
  syslog {
    port => 514
    timezone => "Europe/Paris"
    type => "syslog"
  }
}

Example of event:

<134>Nov 18 17:27:40 tqdbr001s default: data/data-nginx-private: 10.42.179.10 - - es.qlf-fpl-data-middle.svc.meshcore.net [18/Nov/2016:17:27:40 +0100] POST "/_xpack/monitoring/_bulk" "system_id=kibana&system_api_version=2&interval=10000ms" "HTTP/1.1" 200 27 0.419 "-" "-",data/data-nginx-private: 10.42.179.10 - - es.qlf-fpl-data-middle.svc.meshcore.net [18/Nov/2016:17:27:40 +0100] POST "/_xpack/monitoring/_bulk" "system_id=kibana&system_api_version=2&interval=10000ms" "HTTP/1.1" 200 27 0.419 "-" "-"
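For reference, those fields are derived from the PRI value at the start of the message (the <134> above). A minimal sketch of that computation (not the plugin's actual code, just the RFC 3164 arithmetic; the label strings are approximations of the plugin's defaults):

```python
import re

# RFC 3164: PRI = facility * 8 + severity
FACILITY_LABELS = [
    "kernel", "user-level", "mail", "daemon", "security/authorization",
    "syslogd", "line printer", "network news", "uucp", "clock",
    "security/authorization", "ftp", "ntp", "log audit", "log alert",
    "clock", "local0", "local1", "local2", "local3", "local4",
    "local5", "local6", "local7",
]
SEVERITY_LABELS = [
    "emergency", "alert", "critical", "error",
    "warning", "notice", "informational", "debug",
]

def parse_pri(message):
    """Extract priority/facility/severity from a <PRI>-prefixed syslog line."""
    m = re.match(r"<(\d{1,3})>", message)
    if not m:
        return None  # no PRI prefix: the input would tag it _grokparsefailure_sysloginput
    pri = int(m.group(1))
    facility, severity = pri // 8, pri % 8
    return {
        "priority": pri,
        "facility": facility,
        "severity": severity,
        "facility_label": FACILITY_LABELS[facility],
        "severity_label": SEVERITY_LABELS[severity],
    }

print(parse_pri("<134>Nov 18 17:27:40 tqdbr001s default: ..."))
# <134> decodes to facility 16 (local0), severity 6 (informational)
```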

Is this a normal change or a regression?

Regards,

jordansissel commented 8 years ago

Can you include a sample syslog message to test with?


bcecchinato commented 8 years ago

@jordansissel Just did.

bcecchinato commented 8 years ago

What's weird is that there have been no changes to the sources that send the logs to the platform. I've done some digging, and it seems that logstash adds some extra information before the syslog data:

{
  "_index": "logstash-failed-2016.11.18",
  "_type": "logs",
  "_id": "AVh4Q9NpVsG_MQi7dxom",
  "_score": null,
  "_source": {
    "@timestamp": "2016-11-18T16:27:40.138Z",
    "@version": "1",
    "client": "failed",
    "message": "2016-11-18T16:27:40.003Z 10.42.179.24 <134>Nov 18 17:27:40 tqdbr001s default: data/data-nginx-private: 10.42.179.10 - - es.qlf-fpl-data-middle.svc.meshcore.net [18/Nov/2016:17:27:40 +0100] POST \"/_xpack/monitoring/_bulk\" \"system_id=kibana&system_api_version=2&interval=10000ms\" \"HTTP/1.1\" 200 27 0.419 \"-\" \"-\",data/data-nginx-private: 10.42.179.10 - - es.qlf-fpl-data-middle.svc.meshcore.net [18/Nov/2016:17:27:40 +0100] POST \"/_xpack/monitoring/_bulk\" \"system_id=kibana&system_api_version=2&interval=10000ms\" \"HTTP/1.1\" 200 27 0.419 \"-\" \"-\"",
    "index_day": "2016.11.18",
    "tags": []
  },
  "fields": {
    "@timestamp": [
      1479486460138
    ]
  },
  "sort": [
    1479486460138
  ]
}

Note the 2016-11-18T16:27:40.003Z 10.42.179.24 prefix that has been added to the message in the document. More interestingly, the _type should be syslog, but I have logs instead. Hope this helps.

bcecchinato commented 8 years ago

@jordansissel I think I've found out where the issue is. We are using a kafka broker, and the output codec had changed from json to plain. I've made a test using the json codec again, and the fields are there.
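For anyone hitting the same symptom: with the plain codec the output serializes the whole event as a single line (roughly "timestamp host message", which matches the prefix seen in the document above), so all the parsed fields are flattened away. Setting the codec back to json on the output preserves them. A sketch of the fix (the topic and server values here are hypothetical placeholders, not our real config):

output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "logstash"
    codec => json    # "plain" flattens the event to one line and drops the fields
  }
}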

Sorry for the issue.

Regards,

jordansissel commented 8 years ago

I am glad you got things working :)
