pvcasillasg opened 6 months ago
Pinging @elastic/elastic-agent-data-plane (Team:Elastic-Agent-Data-Plane)
I will let @belimawr keep me honest here, but max_bytes is only for the log input; message_max_bytes is the way to go with Filestream.
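A minimal sketch of the difference (the paths, id, and 50 MiB value below are placeholders, not taken from this issue):

filebeat.inputs:
  # log input: the option is called max_bytes
  - type: log
    paths:
      - /var/log/example/*.log      # placeholder path
    max_bytes: 52428800             # 50 MiB

  # filestream input: the equivalent option is message_max_bytes
  - type: filestream
    id: example-filestream          # placeholder id
    paths:
      - /var/log/example/*.log      # placeholder path
    message_max_bytes: 52428800     # 50 MiB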
Yeah, I know, but I was desperate and tried a lot of things, including combinations like those.
Finally I managed to make it work with message_max_bytes, which I had also tried before.
But I still get the same error prompt:
write error: data size (11554286 bytes) is greater than the max file size (10485760 bytes)
Yes, for the Filestream input, you need to use message_max_bytes, as stated in our documentation.
@pvcasillasg, could you post here the whole log file at debug level?
Just to clarify your case: you're trying to ingest a file that is 10Mb+ as a single event with Filebeat and using Logstash as output, is that correct?
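For that scenario, a minimal sketch could look like the following (the id, path, and host are placeholders; message_max_bytes only needs to be larger than the event, and the grouping of the file into a single event would still be done by a multiline parser, as in the config posted below):

filebeat.inputs:
  - type: filestream
    id: example-large-xml             # placeholder id
    paths:
      - /path/to/reports/*.xml        # placeholder path
    message_max_bytes: 52428800       # raise the 10 MiB default so a 10 MB+ event fits

output.logstash:
  hosts: ["logstash.example:5044"]    # placeholder host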
Yes, that's exactly the case, and I have message_max_bytes set up correctly in my configuration file. I will upload the log within a few days, since I'm away from home.
root@docker:/var/lib/filebeat/registry/filebeat# filebeat -c /etc/filebeat/filebeat.yml
2024-06-03 16:17:14.301447465 +0000 UTC m=+21.541617428 write error: data size (11491365 bytes) is greater than the max file size (10485760 bytes)
2024-06-03 16:17:14.312396991 +0000 UTC m=+21.552566934 write error: data size (11554385 bytes) is greater than the max file size (10485760 bytes)
Attached is the log from the Filebeat run.
Also, after cleaning up the tests in my filebeat.yml, I found that if I don't set max_bytes, the filestream input keeps sending incomplete events.
The working config for filebeat in my case is:
filebeat.inputs:
  - type: filestream
    id: xml-oscap_pre
    enabled: true
    encoding: utf-8
    message_max_bytes: 52428800
    max_bytes: 52428800
    paths:
      - /home/ansible/ansible_openscap/reports/pre_reports/*.xml
    parsers:
      - multiline:
          type: pattern
          pattern: '^<\?xml*'
          negate: true
          match: after
          max_lines: 30000000
          timeout: 20s
Without max_bytes everything crashes
@pvcasillasg could you also post your output configuration? Redact all sensitive information like credentials, domains, IPs, etc.
Here is the full YAML file. As I said before, if I comment out or delete the max_bytes line, it stops working.
filebeat.inputs:
  - type: filestream
    id: xml-oscap
    enabled: true
    encoding: utf-8
    message_max_bytes: 52428800
    max_bytes: 52428800
    paths:
      - $PATH
    parsers:
      - multiline:
          type: pattern
          pattern: '^<\?xml*'
          # flush_pattern: '^[\S]*<\/Benchmark>'
          negate: true
          match: after
          max_lines: 3000000
          timeout: 20s

logging.level: debug
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: "0644"

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:

output.logstash:
  hosts: ["192.168.1.220:5044"]
  bulk_max_size: 4096

processors:
  # - add_host_metadata:
  #     when.not.contains.tags: forwarded
  # - add_cloud_metadata: ~
  # - add_docker_metadata: ~
  # - add_kubernetes_metadata: ~
  # - decode_xml:
  #     field: message
  #     target_field: TestResult
  #     to_lower: true
@pvcasillasg I think the error is arising because you have logging.level set to debug, and it actually logs the entire event (a 20 MB XML in your case). logging has a maximum permissible limit of 10 MB per file by default.
You can fix it via two options:
- set logging.files > rotateeverybytes to more than 20 MB (maybe 21 MB, as the log event has some other information as well)
- set logging.level to info
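As a sketch of option 1, reusing the logging section from the config above (22020096 bytes, roughly 21 MiB, is just an example value above the event size):

logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: "0644"
  rotateeverybytes: 22020096   # default is 10485760 (10 MiB); raise it above the ~21 MB log entry
# alternatively, option 2: set logging.level: info so full events are not written to the log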
Huh, I don't think so.
I only enabled the debug log level in order to attach the logs for this issue. Without configuring any log level, I'm still getting the same error prompt.
@pvcasillasg Okay, understood. Can you still update logging.files.rotateeverybytes to a bigger value, try running it, and see whether you face any error?
In my case, this issue was reproducible, and increasing logging.files.rotateeverybytes fixed it.
@pvcasillasg Hi! Just checking in if the workaround worked for you?
For confirmed bugs, please report:
filebeat.yaml
Error:
2024-05-28 16:54:38.965305489 +0000 UTC m=+6.335550826 write error: data size (11554286 bytes) is greater than the max file size (10485760 bytes)
It seems that Filebeat is not applying the max_bytes parameter I configured in filebeat.yaml.