Closed TheBig159 closed 7 years ago
Solved. After filing this issue I found Add a notice about deserializers in logstash-5, tried the suggestion there, and it solved the problem.
It would be nice to have an example of how to use this codec with Kafka that highlights that the parameters mentioned in Add a notice about deserializers in logstash-5 have to be set.
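For anyone hitting the same problem, here is a minimal sketch of the input configuration that worked for me, assuming the fix from that issue (switching the Kafka input's value_deserializer_class to the ByteArrayDeserializer so the codec receives the raw Avro bytes). Broker address, topic name, schema path and output file are placeholders:

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker list
    topics => ["my_topic"]                  # placeholder topic name
    # Key line: hand the codec the raw bytes instead of a string-decoded payload.
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => avro {
      schema_uri => "/path/to/schema.avsc"  # placeholder path to the Avro schema
    }
  }
}

output {
  file {
    path => "/tmp/avro_events.log"          # placeholder test output file
  }
}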
Thanks. Regards
Hello, I've installed the plugin on my LS installation (v5.2.1) and configured my kafka input section to use it. I started LS, the events began flowing from my Kafka topic to a file (just for testing), and I was surprised to see that every field is correctly converted except the time fields.
{"eventId":"my_eventId","@timestamp":"2017-08-11T10:26:53.799Z","activity":{"activityTime":-111032204562303474814398456,"startTime":-52944280892516839416,"type":"my_type","watchType":"my_watch_type","uuid":"my_uuid"},"@version":"1","type":"My_type","device":{"deviceId":"my_deviceId","version":"My_version"},"timestamp":-111032204562303474814398456}
I thought there might be a problem with the data, so I consumed the same messages with a plain Kafka consumer, and there they are correctly interpreted and translated:
{"eventId":"my-eventId","timestamp":1502445430,"device":{"deviceId":"my_deviceId","version":"my_version"},"activity":{"activityTime":1502445430,"type":"my_type","watchType":"my_watch_type","startTime":1501848509,"uuid":"my_uuid"}}
As you can see, the events that pass through Logstash are damaged: every field containing a date is wrongly converted, while the Kafka consumer decodes them to the correct UNIX timestamps.
In the Avro schema the fields activityTime, startTime and timestamp are defined as int. Logstash does nothing apart from taking the input from the Kafka topic and writing it to a test file, adding its own service fields (no filters are applied).
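For completeness, the pipeline is essentially the following sketch (broker, topic and paths are placeholders). Note that value_deserializer_class is not set, so the Kafka input falls back to its default string deserializer, which is presumably what corrupts the binary Avro payload before the codec decodes it:

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker list
    topics => ["my_topic"]                  # placeholder topic name
    codec => avro {
      schema_uri => "/path/to/schema.avsc"  # placeholder path to the Avro schema
    }
    # value_deserializer_class is left at its default
    # (org.apache.kafka.common.serialization.StringDeserializer).
  }
}

output {
  file {
    path => "/tmp/avro_events.log"          # placeholder test output file
  }
}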
Regards.