elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root" #8744

Open robin13 opened 6 years ago

robin13 commented 6 years ago

Unpack Logstash from the tar.gz. If you start Logstash from within the unpacked directory, it starts without errors:

logstash-6.0.0$ bin/logstash -e 'input { stdin { } } output { stdout {} }'
Sending Logstash's logs to /home/rclarke/elastic/stack/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-28T09:47:00,382][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/rclarke/elastic/stack/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-28T09:47:00,386][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/rclarke/elastic/stack/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-11-28T09:47:00,722][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-11-28T09:47:01,045][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-28T09:47:01,517][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x7724a1d2@/home/rclarke/elastic/stack/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-11-28T09:47:01,565][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
The stdin plugin is now waiting for input:
[2017-11-28T09:47:01,587][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}

However, if you start it from any directory that contains the log4j2.properties file (such as the config directory), a lot of errors are produced:

rclarke@es-rclarke:config$ /home/rclarke/elastic/stack/logstash-6.0.0/bin/logstash -e 'input { stdin { } } output { stdout {} }'
2017-11-28 10:11:30,920 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-11-28 10:11:30,922 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-11-28 10:11:30,924 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-11-28 10:11:30,924 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
2017-11-28 10:11:35,620 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-11-28 10:11:35,621 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-11-28 10:11:35,623 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-11-28 10:11:35,624 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash's logs to /home/rclarke/elastic/stack/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-28T10:11:35,838][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/rclarke/elastic/stack/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-28T10:11:35,845][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/rclarke/elastic/stack/logstash-6.0.0/modules/fb_apac

Quite oddly, a directory literally named ${sys:ls.logs} is created:

rclarke@es-rclarke:config$ ls -al
total 36
drwxrwxr-x  3 rclarke rclarke 4096 Nov 28 10:11 .
drwxrwxr-x 12 rclarke rclarke 4096 Nov 23 10:06 ..
-rw-r--r--  1 rclarke rclarke 1873 Nov 10 20:59 jvm.options
-rw-r--r--  1 rclarke rclarke 3958 Nov 10 20:59 log4j2.properties
-rw-r--r--  1 rclarke rclarke 6368 Nov 10 20:59 logstash.yml
-rw-r--r--  1 rclarke rclarke 3190 Nov 10 20:59 pipelines.yml
-rw-r--r--  1 rclarke rclarke 1702 Nov 10 20:59 startup.options
drwxrwxr-x  2 rclarke rclarke 4096 Nov 28 10:11 ${sys:ls.logs}

This happens if you use the --path.settings parameter as well:

rclarke@es-rclarke:config$ /home/rclarke/elastic/stack/logstash-6.0.0/bin/logstash -e 'input { stdin { } } output { stdout {} }' --path.settings /home/rclarke/elastic/stack/logstash-6.0.0/config
2017-11-28 10:16:31,687 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-11-28 10:16:31,688 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-11-28 10:16:31,689 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-11-28 10:16:31,689 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
2017-11-28 10:16:36,471 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-11-28 10:16:36,472 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-11-28 10:16:36,479 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-11-28 10:16:36,480 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
lhcunha commented 6 years ago

I am having a similar problem...

bat9r commented 6 years ago

+1 Same error with same config

Update: Previously I used the deb package installed via apt-get and had this bug. After I reinstalled by simply unpacking logstash-6.0.0.tar.gz into /etc/logstash, everything started working correctly.

rbabyuk-vs commented 6 years ago

I faced exactly the same situation. LS version 6.1.0.

Bharathkumarraju commented 6 years ago

status = error
name = LogstashPropertiesConfig

appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.policies.time.modulate = true
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %-.10000m%n
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 100MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 30

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true
appender.json_rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling.policies.size.size = 100MB
appender.json_rolling.strategy.type = DefaultRolloverStrategy
appender.json_rolling.strategy.max = 30

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling

# Slowlog

appender.console_slowlog.type = Console
appender.console_slowlog.name = plain_console_slowlog
appender.console_slowlog.layout.type = PatternLayout
appender.console_slowlog.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

appender.json_console_slowlog.type = Console
appender.json_console_slowlog.name = json_console_slowlog
appender.json_console_slowlog.layout.type = JSONLayout
appender.json_console_slowlog.layout.compact = true
appender.json_console_slowlog.layout.eventEol = true

appender.rolling_slowlog.type = RollingFile
appender.rolling_slowlog.name = plain_rolling_slowlog
appender.rolling_slowlog.fileName = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}.log
appender.rolling_slowlog.filePattern = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.rolling_slowlog.policies.type = Policies
appender.rolling_slowlog.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling_slowlog.policies.time.interval = 1
appender.rolling_slowlog.policies.time.modulate = true
appender.rolling_slowlog.layout.type = PatternLayout
appender.rolling_slowlog.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %.10000m%n
appender.rolling_slowlog.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling_slowlog.policies.size.size = 100MB
appender.rolling_slowlog.strategy.type = DefaultRolloverStrategy
appender.rolling_slowlog.strategy.max = 30

appender.json_rolling_slowlog.type = RollingFile
appender.json_rolling_slowlog.name = json_rolling_slowlog
appender.json_rolling_slowlog.fileName = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}.log
appender.json_rolling_slowlog.filePattern = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.json_rolling_slowlog.policies.type = Policies
appender.json_rolling_slowlog.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling_slowlog.policies.time.interval = 1
appender.json_rolling_slowlog.policies.time.modulate = true
appender.json_rolling_slowlog.layout.type = JSONLayout
appender.json_rolling_slowlog.layout.compact = true
appender.json_rolling_slowlog.layout.eventEol = true
appender.json_rolling_slowlog.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling_slowlog.policies.size.size = 100MB
appender.json_rolling_slowlog.strategy.type = DefaultRolloverStrategy
appender.json_rolling_slowlog.strategy.max = 30

logger.slowlog.name = slowlog
logger.slowlog.level = trace
logger.slowlog.appenderRef.console_slowlog.ref = ${sys:ls.log.format}_console_slowlog
logger.slowlog.appenderRef.rolling_slowlog.ref = ${sys:ls.log.format}_rolling_slowlog
logger.slowlog.additivity = false

It is also creating a weird directory named ${sys:ls.logs}. Any clue, guys?
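One idea that might be worth trying (an assumption, not a confirmed fix): every lookup that fails is a ${sys:...} system property (ls.logs, ls.log.format, ls.log.level), so defining those properties before start-up gives log4j2 something to substitute even if it parses log4j2.properties early. A minimal sketch, assuming LS_JAVA_OPTS is honoured by bin/logstash in your install, with example values you would adapt:

# Pre-define the system properties the log4j2 config refers to, then start Logstash.
# Values below are placeholders only; point ls.logs at your real log directory.
export LS_JAVA_OPTS="-Dls.logs=/var/log/logstash -Dls.log.format=plain -Dls.log.level=info"
bin/logstash -e 'input { stdin { } } output { stdout {} }'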

rbabyuk-vs commented 6 years ago

This one works well for me:

status = error
name = LogstashPropertiesConfig

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log-%i.zip
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 5GB
appender.rolling.policies.time.modulate = true
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %-.10000m%n
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5
appender.rolling.strategy.fileIndex = max
appender.rolling.strategy.compressionLevel = 9
appender.rolling.strategy.action.type = Delete
appender.rolling.strategy.action.basepath = ${sys:ls.logs}
appender.rolling.strategy.action.condition.type = IfLastModified
appender.rolling.strategy.action.condition.age = 7D
appender.rolling.strategy.action.PathConditions.type = IfFileName
appender.rolling.strategy.action.PathConditions.glob = logstash-${sys:ls.log.format}-*

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log-%i.zip
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling.policies.size.size = 5GB
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true
appender.json_rolling.strategy.type = DefaultRolloverStrategy
appender.json_rolling.strategy.max = 5
appender.json_rolling.strategy.fileIndex = max
appender.json_rolling.strategy.compressionLevel = 9
appender.json_rolling.strategy.action.type = Delete
appender.json_rolling.strategy.action.basepath = ${sys:ls.logs}
appender.json_rolling.strategy.action.condition.type = IfLastModified
appender.json_rolling.strategy.action.condition.age = 7D
appender.json_rolling.strategy.action.PathConditions.type = IfFileName
appender.json_rolling.strategy.action.PathConditions.glob = logstash-${sys:ls.log.format}-*

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling
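For anyone wanting to try it, a usage sketch; the paths below assume the deb/rpm layout mentioned elsewhere in this thread and are only illustrative, so adjust them to your own install:

# Hypothetical paths: /etc/logstash as the settings dir, /usr/share/logstash as the install dir.
cp log4j2.properties /etc/logstash/log4j2.properties
/usr/share/logstash/bin/logstash --path.settings /etc/logstash -e 'input { stdin { } } output { stdout {} }'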

vicky23 commented 6 years ago

2018-01-03 11:04:32,915 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2018-01-03 11:04:35,440 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"

I am still getting this error in version 6.1.*. When I use another config file with less data to import it works, but with a large dataset of around 9 million records it throws the above error.

Ax0r commented 6 years ago

Hello all,

Any update on this matter?

Thank you

cl0udgeek commented 6 years ago

same here!

wolfguoliang commented 6 years ago

Do not run it with the root account; when I tried another user, it works normally.

edperry commented 6 years ago

It should work the same, since root is a normal user (with extra abilities). If anything, it should work better running as root, not worse.

wangwanyou commented 6 years ago

I use a non-root user and I still get an error...

wolfguoliang commented 6 years ago

Which version of Logstash are you using?

umnya commented 6 years ago

Any update on this?

ghost commented 6 years ago

The same problem here ...

fxjean commented 6 years ago

Try:

export LOGSTASH_HOME=/usr/share/logstash
export LOGSTASH_CONF=/etc/logstash
cd ${LOGSTASH_HOME}
bin/logstash-keystore --path.settings ${LOGSTASH_CONF} [create|list|add]

snowater commented 6 years ago

2018-06-07 16:23:42,305 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2018-06-07 16:23:42,306 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2018-06-07 16:23:42,307 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2018-06-07 16:23:42,307 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"

I get the same stack of errors too. Has anybody solved this?

colinsurprenant commented 5 years ago

I've had another report of this problem from a user who upgraded from 6.3.1 to 6.3.2 using «rpmnew files to update the existing files where it was appropriate, like the log4j, jvm and startup files».

I personally haven't been able to reproduce. I am suspecting either permissions problems or environment problems. This also seems to only happen with deb/rpm packages.

colinsurprenant commented 5 years ago

Also reported on Discuss: https://discuss.elastic.co/t/logstash-is-creating-a-directory-literally-called-sys-ls-logs/100126

securesean commented 5 years ago

I'm having this issue too

Martin-Arockiaraj commented 5 years ago

Hi,

I have implemented logging (log4j2) in my project, and I am able to write the log output to a specific file. Can anyone please tell me what changes I need to make if I want to print the same output to the console as well?

Below is the sample implementation:


status = error
dest = err
name = PropertiesConfig

property.filename = target/rolling/rollingtest.log

filter.threshold.type = ThresholdFilter
filter.threshold.level = debug

appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %m%n
appender.console.filter.threshold.type = ThresholdFilter
appender.console.filter.threshold.level = error

appender.rolling.type = RollingFile
appender.rolling.name = RollingFile
appender.rolling.fileName = ${filename}
appender.rolling.filePattern = target/rolling2/test1-%d{MM-dd-yy-HH-mm-ss}-%i.log.gz
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = %d %p %C{1.} [%t] %m%n
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 2
appender.rolling.policies.time.modulate = true
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 100MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5

logger.rolling.name = com.example.my.app
logger.rolling.level = debug
logger.rolling.additivity = false
logger.rolling.appenderRef.rolling.ref = RollingFile

rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT


Implemented by following this guide: https://logging.apache.org/log4j/2.x/manual/configuration.html

thanks in advance!

yaauie commented 5 years ago

@Martin-Arockiaraj this open issue is about a specific open bug; if you have questions about your logstash configuration that do not involve this bug, please ask in the forum.

Martin-Arockiaraj commented 5 years ago

@yaauie ok got it, thanks for addressing.

yuleiqq commented 5 years ago

What?

krishanranditha commented 5 years ago

Double check whether you have killed the existing Logstash process. In my case the PID was still there even though I had killed it before. After killing that running PID, Logstash started successfully.

$ ps -ef | grep logstash

dev 16820 11257 0 05:16 pts/0 00:00:00 grep --color=auto logstash
dev 50216 1 2 May10 ? 03:46:27 /bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -Djruby.jit.threshold=0 -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -cp /home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/commons-compiler-3.0.8.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/google-java-format-1.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/guava-19.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-annotations-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-core-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-databind-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-dataformat-cbor-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/janino-3.0.8.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jruby-complete-9.1.13.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-api-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-core-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-slf4j-impl-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/logstash-core.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.commands-3.6.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.contenttype-3.4.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.expressions-3.4.300.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.filesystem-1.3.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.jobs-3.5.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.resources-3.7.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.runtime-3.7.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.app-1.3.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.common-3.6.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.preferences-3.4.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.registry-3.5.101.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.jdt.core-3.10.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.osgi-3.7.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.text-3.5.101.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/slf4j-api-1.7.25.jar org.logstash.Logstash -f kafkaLogstash.conf
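A minimal sketch of that check (PID 50216 is taken from the example output above; substitute your own):

# List running Logstash processes without matching the grep itself.
ps -ef | grep '[l]ogstash'
# Stop the stale instance, then start Logstash again.
kill 50216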

keithpjolley commented 4 years ago

I was able to fix this error by changing the owner/group of the log/data directories (recursively) to "elasticsearch".

and0x000 commented 3 years ago

I think I ran into the same issue when switching from elasticsearch 5.x to elasticsearch-oss 6.x on Debian Stretch. It looks like elasticsearch-oss uses different data/log directories than Debian expects.

What fixed it for me was specifically setting

path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch

in elasticsearch.yml, as those two directories were the ones used by the previous elasticsearch 5.x installation.

Mohamed-Hamouda commented 3 years ago

I'm not sure if your issues are the same as mine.

My issue: I was trying to change the data and log directories in "/etc/elasticsearch/elasticsearch.yml" to new paths I had created, and I got the same errors.

Solution: I just had to change the owner of those two new directories to "elasticsearch" with the "chown" command, and that fixed it for me.
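A sketch of that chown fix, assuming the stock /var/lib and /var/log locations mentioned in the earlier comment; substitute your own data and log paths:

# Give the elasticsearch user/group ownership of the data and log directories.
sudo chown -R elasticsearch:elasticsearch /var/lib/elasticsearch /var/log/elasticsearch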