davewat opened this issue 5 years ago
The issue is an invalid field definition for EID 148: the wrong data type causes the error, and the field name is also incorrect. I will submit a PR with a fix.
This issue is still relevant as of 4.2.1. Merging this PR would be very helpful.
Seeing the same error on v7.16.1. Is there any workaround?
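One possible workaround until the fix is merged is to override the codec's built-in definition for field 148 so its type matches the 8-byte value the exporter actually sends. This is an untested sketch: the codec does expose `netflow_definitions` (and `ipfix_definitions` for IPFIX) options pointing at a YAML file of field definitions, but the file path and the corrected type here are my assumptions based on this thread, not a confirmed fix:

```
input {
  udp {
    port => 2055
    codec => netflow {
      # hypothetical override file; only field 148 needs correcting
      netflow_definitions => "/etc/logstash/netflow_custom.yml"
    }
  }
}
```

with `/etc/logstash/netflow_custom.yml` containing something like:

```yaml
# Keys are NetFlow field IDs; each entry is [type, name].
148:
- :uint64    # was :uint32 in the shipped definitions
- :conn_id
```

My understanding is that custom definitions are merged over the built-in ones, but verify that against the codec's documentation for your version.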
I found that Filebeat handles NetFlow very well and doesn't have this issue (or others), so we switched over for NetFlow. It is a simple listener that forwards everything to a Kafka topic, where we process it further. To help anyone get started, I've included a sample of my config. YMMV:
```yaml
filebeat.inputs:
- type: netflow
  max_message_size: 10KiB
  host: "0.0.0.0:2055"
  protocols: [ v5, v9, ipfix ]
  expiration_timeout: 30m
  queue_size: 8192
  #custom_definitions:
  #- /data/override.yml
  detect_sequence_reset: true

processors:
- add_tags:
    tags: [netflow]
- add_fields:
    target: ''
    fields:
      log.group: netflow

output.kafka:
  # initial brokers for reading cluster metadata
  enabled: true
  hosts: ["10.10.10.1:9092", "10.10.10.2:9092", "10.10.10.3:9092"]
  topic: "raw"

logging.level: debug
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644
```
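The commented-out `custom_definitions` lines are where field overrides would go if a definition ever needs correcting on the Filebeat side too. As a hedged illustration (the path and the corrected type are assumptions based on this issue; per the Filebeat docs, the file uses the same YAML layout as the Logstash netflow codec), an override for field 148 might look like:

```yaml
# /data/override.yml — hypothetical custom field definitions for the netflow input.
# Keys are NetFlow field IDs; each entry is [type, name].
148:
- :uint64    # 8-byte type, matching what the exporter actually sends
- :conn_id
```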
Docker container, Logstash 7.3 with logstash-codec-netflow, receiving NetFlow from a Palo Alto firewall. Millions of these errors, and no data makes it through:

```
[logstash.codecs.netflow ] Reduced-size encoding for uint32 is larger than uint32 {:field=>[:uint32, :conn_id], :length=>8}
```
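For anyone puzzled by the error message itself: IPFIX/NetFlow reduced-size encoding (RFC 7011, section 6.2) lets an exporter send an integer field in *fewer* bytes than its registered type, but never more. Because the codec's definition for field 148 declares a 4-byte uint32 while the Palo Alto exporter sends the field at an 8-byte (uint64) width, the length check fails on every record. A minimal Python sketch of that check (my own illustration, not the codec's actual code):

```python
def decode_unsigned(data: bytes, declared_size: int) -> int:
    """Decode a reduced-size-encoded unsigned integer field.

    `declared_size` is the byte width of the field's declared type
    (4 for uint32, 8 for uint64). Shorter wire values are fine;
    longer ones violate reduced-size encoding.
    """
    if len(data) > declared_size:
        raise ValueError(
            f"Reduced-size encoding: got {len(data)} bytes, "
            f"but declared type holds only {declared_size}"
        )
    return int.from_bytes(data, "big")

payload = (12345).to_bytes(8, "big")   # 8-byte field 148 as sent on the wire

print(decode_unsigned(payload, 8))     # uint64 definition decodes fine: 12345
try:
    decode_unsigned(payload, 4)        # uint32 definition: same failure as the codec's
except ValueError as e:
    print("error:", e)
```

This is why fixing the definition to uint64 (or overriding it via custom definitions) makes the error disappear.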