ghost opened this issue 7 years ago
@Pero1 I think the issue you are experiencing is that the file output cannot write to the path you are specifying, "/etc/logstash/log.log". Changing the path to a location that Logstash can write to will fix your problem.
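For example, a minimal sketch of that change, assuming the stock image, where /usr/share/logstash is owned by the logstash user so the file output can create files beneath it:

```
output {
  file {
    # /usr/share/logstash belongs to the logstash user in the official
    # image, so no extra chown/chmod is needed for this path.
    path => "/usr/share/logstash/logs/log.log"
  }
}
```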
Details:
Dockerfile (build image "my-logstash"):
FROM logstash
RUN logstash-plugin install logstash-filter-multiline logstash-output-file
RUN mkdir /etc/logstash/logs
CMD ["-f", "/etc/logstash/conf.d"]
Running container:
docker run -d --restart=always --name=logstash \
-p 5000:5000 \
-v $PWD/logstash/config:/etc/logstash/conf.d \
-v $PWD/logstash/patterns:/etc/logstash/patterns \
-v $PWD/logstash/logs:/etc/logstash/logs \
my-logstash
Logstash config:
input {
  tcp {
    port => 5000
  }
  udp {
    port => 5000
  }
}
filter {
}
output {
  file {
    path => ["/etc/logstash/logs/log.log"]
  }
}
Logstash starts correctly, but when I send some log data it fails with this error:
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
15:14:29.291 [main] INFO logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
15:14:29.318 [LogStash::Runner] INFO logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"8f389b84-b16c-47a1-8fc6-96a574d425e3", :path=>"/var/lib/logstash/uuid"}
15:14:29.677 [[main]-pipeline-manager] INFO logstash.inputs.tcp - Starting tcp input listener {:address=>"0.0.0.0:5000"}
15:14:29.702 [[main]<udp] INFO logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5000"}
15:14:29.727 [[main]<udp] INFO logstash.inputs.udp - UDP listener started {:address=>"0.0.0.0:5000", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
15:14:29.864 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/ruby"}
15:14:29.866 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/aws"}
15:14:29.867 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/mongodb"}
15:14:29.869 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/rails"}
15:14:29.870 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/junos"}
15:14:29.872 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/redis"}
15:14:29.873 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/linux-syslog"}
15:14:29.874 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/haproxy"}
15:14:29.876 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/firewalls"}
15:14:29.880 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/bro"}
15:14:29.886 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/postgresql"}
15:14:29.887 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/nagios"}
15:14:29.891 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/bacula"}
15:14:29.894 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/grok-patterns"}
15:14:29.900 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/mcollective"}
15:14:29.903 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/java"}
15:14:29.905 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/exim"}
15:14:29.906 [[main]-pipeline-manager] INFO logstash.filters.multiline - Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/mcollective-patterns"}
15:14:29.977 [[main]-pipeline-manager] WARN logstash.pipeline - Defaulting pipeline worker threads to 1 because there are some filters that might not work with multiple worker threads {:count_was=>4, :filters=>["multiline"]}
15:14:29.979 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
15:14:29.981 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
15:14:30.040 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
15:14:58.060 [[main]>worker0] INFO logstash.outputs.file - Opening file {:path=>"/etc/logstash/logs/log.log"}
15:14:58.111 [LogStash::Runner] FATAL logstash.runner - An unexpected error occurred! {:error=>#<Errno::EACCES: Permission denied - /etc/logstash/logs/log.log>, :backtrace=>["org/jruby/RubyFile.java:370:in `initialize'", "org/jruby/RubyIO.java:871:in `new'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.1/lib/logstash/outputs/file.rb:280:in `open'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.1/lib/logstash/outputs/file.rb:132:in `multi_receive_encoded'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.1/lib/logstash/outputs/file.rb:131:in `multi_receive_encoded'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.1/lib/logstash/outputs/file.rb:130:in `multi_receive_encoded'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:90:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:12:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:336:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:335:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:293:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'"]}
error=>#<Errno::EACCES: Permission denied - /etc/logstash/logs/log.log>,
Indeed, this error indicates that Logstash has no permission from the operating system to write to that path.
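A quick way to confirm that diagnosis is to attempt the failing operation by hand, as the same user the Logstash process runs under. A minimal sketch, using /tmp/logstash_logs as a stand-in for /etc/logstash/logs (run it inside the container, e.g. via docker exec, against the real path):

```shell
# Try to create the file exactly the way the file output would.
# /tmp/logstash_logs stands in for /etc/logstash/logs.
dir=/tmp/logstash_logs
mkdir -p "$dir"
if touch "$dir/log.log" 2>/dev/null; then
  echo "writable: $dir"
else
  echo "not writable: $dir"   # the same condition that raises Errno::EACCES
fi
```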
Hi.
I'm having the same issue here. It definitely points to a permissions problem, but I'm not sure where the root cause is. I have tried different combinations, but unfortunately none of them has worked so far.
These are the relevant bits of my Dockerfile:
FROM logstash:5.2
..
RUN mkdir -p /var/log/logstash/
RUN chown -R logstash.logstash /var/log/logstash/
RUN chmod -R 777 /var/log/logstash/
Inspecting the container yields the following, which confirms the mount is RW:
"Mounts": [
    {
        "Type": "bind",
        "Source": "/Users/israel/sf/global/logstash",
        "Destination": "/var/log/logstash",
        "Mode": "rw",
        "RW": true,
        "Propagation": ""
    }
]
And this is my pipeline configuration:
input {
  gelf {
    type => docker
  }
}
output {
  stdout { codec => rubydebug }
  file {
    file_mode => 0777
    dir_mode => 0777
    path => "/var/log/logstash/%{container_name}.log"
    codec => line { format => "%{message}" }
  }
}
So, what I'm seeing is the following:
root@5b473ab20a4c:/# ls -la /var/log/logstash/
total 200
drwxrwxrwx 1 1000 staff 170 Feb 13 07:57 .
drwxr-xr-x 12 root root 4096 Feb 13 07:28 ..
-rwxrwxrwx 1 1000 staff 0 Feb 13 07:12 p-frontend.log
-rw-r--r-- 1 1000 staff 0 Feb 13 07:57 p-git-http-server.log
-rwxrwxrwx 1 1000 staff 199524 Feb 13 07:57 p-platform.log
My log directory used by logstash-output-file is created with the expected permissions, as per my Dockerfile. Without any manual intervention, the plugin creates the file 'p-git-http-server.log' with 644 permissions, even though I'm setting a file_mode of 777 in my pipeline output configuration (just for testing purposes). After creating the file, Logstash dies with:
08:02:15.456 [[main]>worker0] INFO logstash.outputs.file - Opening file {:path=>"/var/log/logstash/p-git-http-server.log"}
08:02:15.481 [LogStash::Runner] FATAL logstash.runner - An unexpected error occurred! {:error=>#<Errno::EACCES: Permission denied - /var/log/logstash/p-git-http-server.log>, :backtrace=>["org/jruby/RubyFile.java:370:in `initialize'", "org/jruby/RubyIO.java:871:in `new'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.0/lib/logstash/outputs/file.rb:278:in `open'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.0/lib/logstash/outputs/file.rb:132:in `multi_receive_encoded'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.0/lib/logstash/outputs/file.rb:131:in `multi_receive_encoded'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.0/lib/logstash/outputs/file.rb:130:in `multi_receive_encoded'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:90:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:12:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:42:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:297:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:296:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:225:in `start_workers'"]}
08:02:15.487 [[main]<gelf] WARN logstash.inputs.gelf - gelf listener died {:exception=>#<SocketError: recvfrom: name or service not known>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-gelf-3.0.2/lib/logstash/inputs/gelf.rb:101:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-gelf-3.0.2/lib/logstash/inputs/gelf.rb:77:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:331:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:325:in `start_input'"]}
If I create the file myself with 777, or change the file permissions (what I called manual intervention above) either through the mount directory or by exec'ing into the container, Logstash is then able to write to the log files (see p-platform.log above). So Logstash definitely can write to the directory/file in that scenario.
The same pipeline configuration running on Logstash 5.1.1 on the host has no permission issues, though I think that is expected, as this looks like an issue specific to running Logstash in Docker.
Looking at the stack trace, it seems the file_mode set in the plugin configuration may not actually be applied by file.rb when it opens the file: https://github.com/logstash-plugins/logstash-output-file/blob/v4.0.0/lib/logstash/outputs/file.rb#L278
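If that line in file.rb does open the file without passing file_mode, the observed 644 is exactly what the OS defaults would produce: new files get mode 0666 masked by the process umask, which is typically 022. A quick sketch of that arithmetic:

```shell
# Kernel default creation mode (0666) masked by a typical umask of 022 —
# exactly the 644 seen on p-git-http-server.log.
printf '%o\n' $((0666 & ~0022))   # prints 644
```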
Hopefully this is the underlying reason, as I have run out of ideas and leads to pull on in my testing. Any help would be greatly appreciated.
Cheers.
I solved this problem with ACLs and group-ownership inheritance.
FROM logstash:8.0.0
USER root
RUN apt-get update && \
    apt-get install -y acl && \
    apt-get clean
RUN mkdir "/var/log/logstash_out/" && \
    chmod -R g+s /var/log/logstash_out/ && \
    chown -R logstash:logstash /var/log/logstash_out/ && \
    setfacl -R -m d:g:logstash:rwX /var/log/logstash_out/ && \
    setfacl -R -m g:logstash:rwX /var/log/logstash_out/
USER logstash
Logstash will then create its log files in /var/log/logstash_out/.
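The two halves of this fix are the setgid bit (new files inherit the directory's group) and the default ACL (that group gets write access). A minimal sketch of the setgid half on a throwaway directory; getfacl on /var/log/logstash_out/ inside the container would show the ACL half:

```shell
# A directory with the setgid bit set (the leading 2, what chmod g+s adds):
# files created inside it inherit the directory's group instead of the
# creating process's primary group.
d=$(mktemp -d)
chmod 2775 "$d"
stat -c '%a' "$d"   # prints 2775
```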
I have a problem with the Logstash Docker image (latest) when I try to output logs to a file:
output {
  file {
    path => ["/etc/logstash/logs/log.log"]
  }
}
Can someone help me?