Hi,

Perhaps I'm doing something wrong, but I can't figure out what.
I'm trying out a simple setup with two Vagrant servers running Ubuntu 12.04.
One is an Apache2 web server with node-logstash that should ship the access log entries to the other server, which runs node-logstash, Elasticsearch, and Kibana.
I can send the entries to the logging server, but somehow the ZeroMQ input cannot parse the data. This is the message I get for every line in the access log:
ERROR [input_zeromq] Unable to parse data {"@message":"192.168.33.1 - - [23/Jul/2013:10:41:26 -0300] \"GET /favicon.ico HTTP/1.1\" 404 503 \"-\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/28.0.1500.52 Chrome/28.0.1500.52 Safari/537.36\"","@source":"/var/log/apache2/access.log","@source_host":"vagrant-ubuntu","@timestamp":"2013-07-23T13:56:18.752Z"}
On my 'client' (the Apache2 server) I'm running node-logstash with this config file:

And on my 'logserver' I'm running node-logstash with this config file:
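(The config files themselves didn't make it into this post. For context, a node-logstash shipper/receiver pair is typically wired with URL-style plugin specs along these lines; the hosts, ports, and addresses below are illustrative guesses, not my actual files:)

```shell
# Client (Apache2 box): tail the access log and ship each line over ZeroMQ.
# 192.168.33.20:5555 is a hypothetical address for the log server.
node bin/node-logstash-agent \
  input://file:///var/log/apache2/access.log \
  output://zeromq://tcp://192.168.33.20:5555

# Log server: receive events over ZeroMQ and index them into a local Elasticsearch.
node bin/node-logstash-agent \
  input://zeromq://tcp://0.0.0.0:5555 \
  output://elasticsearch://localhost:9200
```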
I also tried different combinations of the serializer/unserializer parameters on both the client and the logserver, but nothing helped. Removing the filter doesn't help either.
So my question is: am I doing something wrong here, or have I found a bug?