This bug is currently burning me. It means I have to run separate ingestion ports depending on which client I'm using. I started poking at the lumberjack output plugin, and I'm wondering if the fix is really as easy as changing https://github.com/logstash-plugins/logstash-output-lumberjack/blob/master/lib/logstash/outputs/lumberjack.rb#L25 from
@client.write({ 'line' => payload })
to
@client.write({ 'line' => payload }.merge(event.to_hash.reject { |key, value| ["message", "@timestamp", "@version"].include?(key) }))
@jordansissel, tell me what's wrong with that approach?
Edit: to merge the event keys correctly.
Edit 2: it looks like this gets fields but skips tags; working on a fix.
Edit 3: actually, it looks like any fields that are arrays get stripped. Where would this happen? In the codec? I did some sketchy prints and an abort to dump event.to_hash in the lumberjack input and didn't see the data there, so where would it be getting removed?
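Concretely, here is a rough, untested sketch of what I'm poking at. It assumes the codec callback yields both the event and the encoded payload, which may not match the plugin's actual API:

# Untested sketch; assumes on_event yields (event, payload), which may
# differ from the plugin's real codec API.
@codec.on_event do |event, payload|
  # Forward the encoded line plus every other event field (tags included),
  # skipping the ones already represented by the line itself.
  extra = event.to_hash.reject do |key, _value|
    ["message", "@timestamp", "@version"].include?(key)
  end
  @client.write({ 'line' => payload }.merge(extra))
end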
I think the solution here is to set codec => json for the lumberjack output and for your input, and things should just work after that. Can anyone confirm?
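For instance, a sender/receiver pair with that workaround applied might look like the following sketch (hosts, ports, and certificate paths are placeholders, not taken from this thread):

# Sending side: json codec on the lumberjack output
output {
  lumberjack {
    hosts => ["central.example.com"]
    port => 9998
    ssl_certificate => "/etc/pki/lumberjack.crt"
    codec => json
  }
}

# Receiving side: matching json codec on the lumberjack input
input {
  lumberjack {
    port => 9998
    ssl_certificate => "/etc/pki/lumberjack.crt"
    ssl_key => "/etc/pki/lumberjack.key"
    codec => json
  }
}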
On 1.5.0, if you use tags with the json codec for input, logstash-forwarder fails with an EOF error:
logstash lumberjack json output -> logstash lumberjack json input: works
logstash-forwarder -> logstash lumberjack json input: fails
Using this config for logstash-forwarder:
{
  "network": {
    "servers": [ "localhost:9998" ],
    "ssl certificate": "./lumberjack.crt",
    "ssl key": "./lumberjack.key",
    "ssl ca": "./lumberjack.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [
        "./messages"
      ],
      "fields": { "type": "syslog", "tags": "foobar" }
    }
  ]
}
Using this config for logstash:
input {
  lumberjack {
    port => 9998
    ssl_certificate => "/home/msimos/logstash/lumberjack.crt"
    ssl_key => "/home/msimos/logstash/lumberjack.key"
    codec => json
    tags => [ "blah_logs" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
logstash-forwarder throws this error repeatedly:
2015/06/02 16:34:32.595952 Loading client ssl certificate: ./lumberjack.crt and ./lumberjack.key
2015/06/02 16:34:32.701675 Setting trusted CA from file: ./lumberjack.crt
2015/06/02 16:34:32.701860 Connecting to [127.0.0.1]:9998 (localhost)
2015/06/02 16:34:32.755777 Connected to 127.0.0.1
2015/06/02 16:34:32.760652 Read error looking for ack: EOF
If you remove tags, it works fine.
Setting codec => json in both input and output solved this problem for me. We had to add a second lumberjack input on a different port for it so that we didn't break regular lumberjack messages.
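A sketch of that two-port arrangement (port numbers and certificate paths here are illustrative, not the ones we actually use):

input {
  # Plain input for logstash-forwarder clients
  lumberjack {
    port => 5043
    ssl_certificate => "/etc/pki/lumberjack.crt"
    ssl_key => "/etc/pki/lumberjack.key"
  }
  # Second port with codec => json for logstash-to-logstash traffic
  lumberjack {
    port => 5044
    ssl_certificate => "/etc/pki/lumberjack.crt"
    ssl_key => "/etc/pki/lumberjack.key"
    codec => json
  }
}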
For now, a workaround (possibly the correct fix?) is to set codec => json here. Closing this with a known solution.
Alright, I just ran into this, and it is very annoying. Can there at least be an update to the documentation stating that this is a known issue and what the workaround is?
This is also a problem I had. Adding codec => json to both the input and the output made it work. I agree this should be documented somewhere. I spent a while scratching my head trying to figure out why my logs were not being received, when in reality they were there; they just weren't being caught by my filters because the [type] field was not detected. Glad I stumbled across this thread.
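For what it's worth, a type-conditioned filter like this hypothetical one never matches once the output has stripped the [type] field, so events sail through unparsed and look as if they were never received:

filter {
  # Never true when [type] was stripped in transit, so no parsing happens
  if [type] == "syslog" {
    grok { match => { "message" => "%{SYSLOGLINE}" } }
  }
}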
Same thing here. This really should be in the documentation pages.
(This issue was originally filed by @Mister-X- at https://github.com/elastic/logstash/issues/2843)
I'm building a sort of log collector using logstash (not the forwarder, since we won't collect logs from files) and forwarding all the logs to a central location for analysis. I am using Logstash 1.5RC2 from the Debian package on Ubuntu 14.04.2 LTS 64-bit (on both sides).
Here is the configuration of the log collector (this is a test system, which is why it only has a single input):
and here is one log message I get from that output.log file:
And we can see that it is parsed correctly and the location field has been added.
On the other side, on the central system, fields from that log entry are stripped:
Here is the config from that central system:
Based on a discussion I had on IRC in #logstash, it sounds like Lumberjack stripping all those fields is a bug.