Open robin13 opened 6 years ago
I am having a similar problem...
+1 Same error with same config
Update: Previously I used the deb package installed via apt-get and had this bug. After reinstalling by simply unpacking logstash-6.0.0.tar.gz into /etc/logstash, everything started working correctly.
I faced exactly the same situation. LS ver. 6.1.0
```
status = error
name = LogstashPropertiesConfig

appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.policies.time.modulate = true
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %-.10000m%n
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 100MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 30

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true
appender.json_rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling.policies.size.size = 100MB
appender.json_rolling.strategy.type = DefaultRolloverStrategy
appender.json_rolling.strategy.max = 30

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling

appender.console_slowlog.type = Console
appender.console_slowlog.name = plain_console_slowlog
appender.console_slowlog.layout.type = PatternLayout
appender.console_slowlog.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

appender.json_console_slowlog.type = Console
appender.json_console_slowlog.name = json_console_slowlog
appender.json_console_slowlog.layout.type = JSONLayout
appender.json_console_slowlog.layout.compact = true
appender.json_console_slowlog.layout.eventEol = true

appender.rolling_slowlog.type = RollingFile
appender.rolling_slowlog.name = plain_rolling_slowlog
appender.rolling_slowlog.fileName = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}.log
appender.rolling_slowlog.filePattern = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.rolling_slowlog.policies.type = Policies
appender.rolling_slowlog.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling_slowlog.policies.time.interval = 1
appender.rolling_slowlog.policies.time.modulate = true
appender.rolling_slowlog.layout.type = PatternLayout
appender.rolling_slowlog.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %.10000m%n
appender.rolling_slowlog.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling_slowlog.policies.size.size = 100MB
appender.rolling_slowlog.strategy.type = DefaultRolloverStrategy
appender.rolling_slowlog.strategy.max = 30

appender.json_rolling_slowlog.type = RollingFile
appender.json_rolling_slowlog.name = json_rolling_slowlog
appender.json_rolling_slowlog.fileName = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}.log
appender.json_rolling_slowlog.filePattern = ${sys:ls.logs}/logstash-slowlog-${sys:ls.log.format}-%d{yyyy-MM-dd}-%i.log.gz
appender.json_rolling_slowlog.policies.type = Policies
appender.json_rolling_slowlog.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling_slowlog.policies.time.interval = 1
appender.json_rolling_slowlog.policies.time.modulate = true
appender.json_rolling_slowlog.layout.type = JSONLayout
appender.json_rolling_slowlog.layout.compact = true
appender.json_rolling_slowlog.layout.eventEol = true
appender.json_rolling_slowlog.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling_slowlog.policies.size.size = 100MB
appender.json_rolling_slowlog.strategy.type = DefaultRolloverStrategy
appender.json_rolling_slowlog.strategy.max = 30

logger.slowlog.name = slowlog
logger.slowlog.level = trace
logger.slowlog.appenderRef.console_slowlog.ref = ${sys:ls.log.format}_console_slowlog
logger.slowlog.appenderRef.rolling_slowlog.ref = ${sys:ls.log.format}_rolling_slowlog
logger.slowlog.additivity = false
```
and it is creating a weird directory named `${sys:ls.logs}`. Any clue, guys?
This one works well for me:

```
status = error
name = LogstashPropertiesConfig

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log-%i.zip
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 5GB
appender.rolling.policies.time.modulate = true
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %-.10000m%n
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5
appender.rolling.strategy.fileIndex = max
appender.rolling.strategy.compressionLevel = 9
appender.rolling.strategy.action.type = Delete
appender.rolling.strategy.action.basepath = ${sys:ls.logs}
appender.rolling.strategy.action.condition.type = IfLastModified
appender.rolling.strategy.action.condition.age = 7D
appender.rolling.strategy.action.PathConditions.type = IfFileName
appender.rolling.strategy.action.PathConditions.glob = logstash-${sys:ls.log.format}-*

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log-%i.zip
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling.policies.size.size = 5GB
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true
appender.json_rolling.strategy.type = DefaultRolloverStrategy
appender.json_rolling.strategy.max = 5
appender.json_rolling.strategy.fileIndex = max
appender.json_rolling.strategy.compressionLevel = 9
appender.json_rolling.strategy.action.type = Delete
appender.json_rolling.strategy.action.basepath = ${sys:ls.logs}
appender.json_rolling.strategy.action.condition.type = IfLastModified
appender.json_rolling.strategy.action.condition.age = 7D
appender.json_rolling.strategy.action.PathConditions.type = IfFileName
appender.json_rolling.strategy.action.PathConditions.glob = logstash-${sys:ls.log.format}-*

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling
```
```
2018-01-03 11:04:32,915 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2018-01-03 11:04:35,440 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
```
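A note on what this error means (my reading, not taken from the Logstash source): the appender names in the config are literal (`plain_rolling`, `json_rolling`), while the root logger's appenderRef is built from the `ls.log.format` Java system property. If that property is not set when log4j2 loads the file, the `${sys:ls.log.format}` lookup is left unresolved, so the ref matches no appender, and the unresolved literal even shows up on disk as a directory name. A shell sketch of the substitution:

```shell
# Normally bin/logstash sets -Dls.log.format=plain (or json) for the JVM.
format="plain"
ref="${format}_rolling"
echo "$ref"   # -> plain_rolling, which matches a defined appender
# If launching by hand, the properties can be passed via LS_JAVA_OPTS, e.g.:
# LS_JAVA_OPTS="-Dls.logs=/var/log/logstash -Dls.log.format=plain" bin/logstash
```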
I am still getting the error in version 6.1.*. When I use another config file with less data to import, it works, but with a large data set of around 9 million records it throws the above error.
Hello all,
Any update on this matter?
Thank you
same here!
Do not run it with the root account. When I tried another user, it works normally.
It should work the same, as root is a normal user (with extra abilities); if anything, it should work better running as root, not worse.
I use a non-root user and I still get an error...
Which version of Logstash are you using?
Any update on this?
The same problem here ...
Try

```
export LOGSTASH_HOME=/usr/share/logstash
export LOGSTASH_CONF=/etc/logstash
cd ${LOGSTASH_HOME}
bin/logstash-keystore --path.settings ${LOGSTASH_CONF} [create|list|add]
```
```
2018-06-07 16:23:42,305 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2018-06-07 16:23:42,306 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2018-06-07 16:23:42,307 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2018-06-07 16:23:42,307 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
```
I get the same stack trace too. Has anybody solved this?
I've had another report of this problem from a user who upgraded from 6.3.1 to 6.3.2 using "rpmnew files to update the existing files where it was appropriate, like log4j, jvm and startup files".
I personally haven't been able to reproduce. I am suspecting either permissions problems or environment problems. This also seems to only happen with deb/rpm packages.
I'm having this issue too
Hi,
I have implemented logging (log4j2) in my project. I'm able to write the log output to a specific file, but can anyone please tell me what changes I need to make to also print the same output to the console?
Below is the sample implementation:
```
status = error
dest = err
name = PropertiesConfig

property.filename = target/rolling/rollingtest.log

filter.threshold.type = ThresholdFilter
filter.threshold.level = debug

appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %m%n
appender.console.filter.threshold.type = ThresholdFilter
appender.console.filter.threshold.level = error

appender.rolling.type = RollingFile
appender.rolling.name = RollingFile
appender.rolling.fileName = $(unknown)
appender.rolling.filePattern = target/rolling2/test1-%d{MM-dd-yy-HH-mm-ss}-%i.log.gz
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = %d %p %C{1.} [%t] %m%n
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 2
appender.rolling.policies.time.modulate = true
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 100MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5

logger.rolling.name = com.example.my.app
logger.rolling.level = debug
logger.rolling.additivity = false
logger.rolling.appenderRef.rolling.ref = RollingFile

rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT
```
implemented through this link: https://logging.apache.org/log4j/2.x/manual/configuration.html
thanks in advance!
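One likely answer to the console question above (a sketch based on the posted config, not verified against that project): with `additivity = false`, events on the `com.example.my.app` logger never bubble up to the root logger's STDOUT appender, so the console appender has to be referenced on that logger directly. Note also that the console appender's ThresholdFilter is set to `error`, which will hide `debug` output unless it is lowered:

```
logger.rolling.name = com.example.my.app
logger.rolling.level = debug
logger.rolling.additivity = false
logger.rolling.appenderRef.rolling.ref = RollingFile
logger.rolling.appenderRef.stdout.ref = STDOUT
```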
@Martin-Arockiaraj this open issue is about a specific open bug; if you have questions about your logstash configuration that do not involve this bug, please ask in the forum.
@yaauie ok got it, thanks for addressing.
What?
Double-check whether you have killed the existing Logstash process. In my case the PID was still there even though I had killed it before. After killing that running PID, Logstash started successfully.
```
$ ps -ef | grep logstash
dev 16820 11257 0 05:16 pts/0 00:00:00 grep --color=auto logstash
dev 50216 1 2 May10 ? 03:46:27 /bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -Djruby.jit.threshold=0 -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -cp /home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/commons-compiler-3.0.8.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/google-java-format-1.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/guava-19.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-annotations-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-core-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-databind-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jackson-dataformat-cbor-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/janino-3.0.8.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/jruby-complete-9.1.13.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-api-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-core-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/log4j-slf4j-impl-2.9.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/logstash-core.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.commands-3.6.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.contenttype-3.4.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.expressions-3.4.300.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.filesystem-1.3.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.jobs-3.5.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.resources-3.7.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.core.runtime-3.7.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.app-1.3.100.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.common-3.6.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.preferences-3.4.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.equinox.registry-3.5.101.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.jdt.core-3.10.0.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.osgi-3.7.1.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/org.eclipse.text-3.5.101.jar:/home/dev/elk/logstash-6.2.4/logstash-core/lib/jars/slf4j-api-1.7.25.jar org.logstash.Logstash -f kafkaLogstash.conf
```
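The stale-PID check above can be scripted; a minimal sketch (the process pattern is an example, adjust to your install):

```shell
# Find a stale Logstash JVM before restarting. The [l] bracket trick keeps
# grep from matching its own process line in the ps output.
pid=$(ps -ef | grep '[l]ogstash' | awk '{print $2}' | head -n 1)
if [ -n "$pid" ]; then
  echo "killing stale Logstash pid $pid"
  kill "$pid"   # graceful SIGTERM; escalate to kill -9 only if it hangs
fi
```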
I was able to fix this error by recursively changing the owner/group of the log and data directories to "elasticsearch".
I think I ran into the same issue when switching from elasticsearch 5.x to elasticsearch-oss 6.x on a Debian Stretch. Looks like elasticsearch-oss has different data/log directory expectations than expected by Debian.
What fixed it for me was specifically setting

```
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
```

in elasticsearch.yml, as those two directories were the ones used by the previous elasticsearch 5.x installation.
I'm not sure if your issues are the same as mine.
My Issue: I was trying to change the directories for data and logs in the "/etc/elasticsearch/elasticsearch.yml" to new paths I created and I had the same errors.
Solution: I just had to change the owner of those two new directories to "elasticsearch" with the chown command, and that fixed it for me.
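The chown fix described above, as a sketch (the directory paths here are placeholders for whatever custom paths you set in elasticsearch.yml; on deb/rpm installs the service user and group are both typically named elasticsearch):

```shell
# Hypothetical paths: substitute your custom data/log directories.
# -R is recursive, so any files already created inside are fixed too.
sudo chown -R elasticsearch:elasticsearch /srv/es-data /srv/es-logs
# Confirm the new ownership:
ls -ld /srv/es-data /srv/es-logs
```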
Unpack Logstash from the tar.gz. If you start Logstash from within the unpacked directory, it starts without errors. However, if you start it from any directory which contains the `log4j.options` file, a lot of errors are produced, and quite oddly, a directory `${sys:ls.logs}` is created. This happens when you use the `--path.settings` parameter as well.