Closed KarnGusain closed 5 years ago
This warning is logged infrequently, IIRC.
It is saying that if the file input reaches the open file limit (default 4095) you will get a warning that this limit has been reached every 300 seconds.
This limit can be reached if your path glob discovers more than 4095 files (some users in the past have had millions), so we open only a fixed number of files for tailing. To give the other files a chance of being processed we have close_older, which closes files (but does not abandon them) to make way for others.
The limit reached warning is important because you might never otherwise know that some files are not being processed especially if you have a "dropbox" style mechanism in place.
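For example, a file input tuned along these lines would look like the following (the path glob and values here are purely illustrative, not a recommendation; check your Logstash version's docs for whether close_older takes a number of seconds or a duration string):

```
input {
  file {
    path => "/var/log/remote/**/*.log"   # illustrative glob
    max_open_files => 4095               # width of the tailing window (this is the default)
    close_older => "1 hour"              # release idle files so others can be opened
  }
}
```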
You can quieten the filewatch.tailmode.handlers.createinitial logging sub-system only by adding an entry in the log4j config file, e.g.
logger.createinitial.name = filewatch.tailmode.handlers.createinitial
logger.createinitial.level = error
The logging API allows for different levels of logging for different components in LS.
Thanks a mile for your take on this @guyboertje. However, is there a way to increase the file input limit?
Secondly, in case it reaches the open file limit, what would be an appropriate value to set for close_older, like close_older => 100?
Will adding the suggested entries in log4j suppress the warning only?
Be aware that that warning is not saying you have reached the limit. It is saying that if you reach the limit you'll get a warning every 300 seconds - so as not to flood the logs.
max_open_files is the setting that sets the limit. Do you have more than 4095 discoverable files?
The log4j config I gave will suppress all logging at WARN, INFO and TRACE for that component only.
@guyboertje, yes, I have more than 4095 discoverable files, as I have 5k servers throwing syslogs that Logstash is processing. Each time there is a log update on a syslog file it will process them all, and I also have other custom logs (rmlogs) which again may have some number of files, maybe 100 or 10000.
Ahh.
Now, you do not need to increase the max_open_files limit in order to process more than 4095 files.
You should think of max_open_files as a 'sliding window' N files wide.
The close_older setting controls when the window slides over by a few files.
Each file is handled by a state machine and can be in one of the 'watched', 'active', 'closed', or 'ignored' states, plus some other more transient states ('delayed_delete', 'rotation_in_progress').
close_older will move a file from the 'active' state to the 'closed' state when the file size has not changed within the close_older interval.
The exact effect of this depends on whether the discovered files are actually being written to (as in the classic 'tailing' scenario) or dropped in place with a fixed size by some script or rotation activity.
If the current set of open files has some that are written to frequently and some that are written to infrequently, then the file input favours the frequent ones and will close the infrequent ones.
A file in the 'closed' state is still checked for changes; if a change is detected it is put back into the 'watched' state, meaning it becomes eligible for moving to the 'active' state when there is capacity in the sliding window.
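The sliding-window behaviour described above can be sketched roughly as follows. This is a minimal, hypothetical Python sketch, not the actual filewatch implementation; the state names follow the list above, and the constants mirror the defaults discussed in this thread:

```python
import time

MAX_OPEN_FILES = 4095   # default max_open_files (the window width)
CLOSE_OLDER = 60 * 60   # assumed close_older interval, in seconds

class TrackedFile:
    """One discovered file and its state in the (hypothetical) state machine."""
    def __init__(self, path, size):
        self.path = path
        self.size = size
        self.state = "watched"
        self.last_change = time.time()

    def record_size(self, new_size):
        # A size change marks the file as modified; a 'closed' file that
        # changes goes back to 'watched' so it can re-enter the window.
        if new_size != self.size:
            self.size = new_size
            self.last_change = time.time()
            if self.state == "closed":
                self.state = "watched"

def slide_window(files, now=None):
    """Close idle 'active' files, then promote 'watched' files into capacity."""
    now = now or time.time()
    # Close 'active' files whose size has not changed within close_older.
    for f in files:
        if f.state == "active" and now - f.last_change > CLOSE_OLDER:
            f.state = "closed"
    # Promote 'watched' files while there is capacity in the window.
    active = sum(1 for f in files if f.state == "active")
    for f in files:
        if active >= MAX_OPEN_FILES:
            break
        if f.state == "watched":
            f.state = "active"
            active += 1
    return files
```

This illustrates why the limit is not a hard cap on how many files get processed: closed files are remembered, re-checked, and rotated back into the window over time.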
If you have tens of thousands of frequently updated files that need monitoring then you should consider using Filebeat across the 5000 servers. We should talk more about this in Discuss.
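If you go the Filebeat route, a minimal filebeat.yml on each server might look something like this (the path and host here are placeholders, not values from this thread):

```
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog
output.logstash:
  hosts: ["logstash.example.com:5044"]
```

Each server then ships its own logs, so a single Logstash instance no longer has to tail thousands of files itself.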
Thanks again @guyboertje, I have already opened a thread on Discuss; let's talk more there.
Closing, because we successfully discussed this in Discuss.
Until recently I've been facing a very peculiar warning in the logstash log file: [filewatch.tailmode.handlers.createinitial] open_file OPEN_WARN_INTERVAL is '300'. I've searched around the web for every possible help to dig into it myself but could not find any thread providing a solution, so I thought to post it here with complete details. I had already opened a question on our forum but did not get anything.
Details:
LOG FILE: /var/log/logstash/logstash-plain.log
AND STATUS:
LOGSTASH OPEN FILE SETTING:
I created the file below in order to set the open file limit for logstash.
The file below was already there; I have just adjusted the limit:
SYSTEM LEVEL File Descriptor settings:
LOGSTASH JVM Settings:
LOGSTASH INPUT Configurations:
Logstash config for syslog:
Logstash config for other custom logs:
LOGSTASH logstash.yml file:
Would appreciate any help.