bpaquet / node-logstash

Simple logstash implementation in nodejs: file log collection, sent with ZeroMQ

ZeroMQ Unable to parse data #35

Closed ronaldvaneede closed 11 years ago

ronaldvaneede commented 11 years ago

Hi,

Perhaps I'm doing something wrong, but I can't figure out what.

I'm trying out a simple setup with two Vagrant servers running Ubuntu 12.04. One is an Apache2 web server with node-logstash that should send the access log entries to the other server, which runs node-logstash, Elasticsearch and Kibana.

I can send the entries to the logging server, but somehow the ZeroMQ input cannot parse the data. This is the message I get for every line in the access log:

ERROR [input_zeromq] Unable to parse data {"@message":"192.168.33.1 - - [23/Jul/2013:10:41:26 -0300] \"GET /favicon.ico HTTP/1.1\" 404 503 \"-\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/28.0.1500.52 Chrome/28.0.1500.52 Safari/537.36\"","@source":"/var/log/apache2/access.log","@source_host":"vagrant-ubuntu","@timestamp":"2013-07-23T13:56:18.752Z"}

On my 'client' (Apache2 server) I'm running node-logstash with this config file:

input://file:///var/log/apache2/access.log?start_index=0
output://zeromq://tcp://192.168.33.11:5555

And on my 'logserver' I'm running node-logstash with this config file:

input://zeromq://tcp://192.168.33.11:5555
filter://regex://http_combined
output://elasticsearch://localhost:9200
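
For reference, each agent is started by pointing it at its config file. Assuming the --config_file flag from the node-logstash README (the binary name and flag may differ in your checkout), that looks roughly like:

node bin/node-logstash-agent --config_file client.conf      # on the Apache2 server
node bin/node-logstash-agent --config_file logserver.conf   # on the logserver

where client.conf and logserver.conf are just placeholder names for the two config files above.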

I also tried different combinations of the unserializer/serializer parameters on both the client and the logserver, but nothing helped. Removing the filter does not help either.
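
To check what actually arrives on the wire, a minimal standalone listener can be bound on the same port. This is only a debugging sketch: it assumes the zeromq output pushes to a PULL socket, and the node-logstash input on the logserver has to be stopped first so port 5555 is free.

var zmq = require('zmq');                      // the old "zmq" npm module
var sock = zmq.socket('pull');                 // assumption: the output side is a PUSH socket
sock.bindSync('tcp://192.168.33.11:5555');     // same address as in the configs above
sock.on('message', function (msg) {
  console.log('raw event: ' + msg.toString()); // should print the JSON event from the error above
});

The raw output can then be compared with the JSON event shown in the error message.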

So my question is: am I doing something wrong here, or did I find a bug?

bpaquet commented 11 years ago

Hi,

Thanks for reporting this.

You can add &type=toto to your Apache2 file input as a workaround, or update node-logstash to pick up the fix.
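
For example, the client input line with the workaround applied would be:

input://file:///var/log/apache2/access.log?start_index=0&type=toto
output://zeromq://tcp://192.168.33.11:5555

(toto is only an example value for the type field.)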

Regards

ronaldvaneede commented 11 years ago

Thanks for fixing.