bpaquet / node-logstash

Simple logstash implementation in Node.js: file log collection, sent with ZeroMQ

Logstash schema change coming soon #21

Closed jordansissel closed 11 years ago

jordansissel commented 11 years ago

Howdy!

Just wanted to let you know that the logstash 'json event schema' is going to change (for the better, I hope!) in a few weeks/months

The specific details of this are discussed here: https://logstash.jira.com/browse/LOGSTASH-675

I intend to support the old schema for some time, but use the new one as the default (eventually). If you have specific concerns, I would love to hear them, but otherwise this is just an informational notice :)
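To make the change concrete, here is a hedged sketch of the difference between the two shapes. The field names are assumptions based on the LOGSTASH-675 proposal (the old wrapper-style `@fields`/`@message` layout versus a flatter layout with `@version`), not the final released schema, and `v0ToV1` is a hypothetical helper, not part of logstash or node-logstash:

```javascript
// Hypothetical sketch: converting an old-style ("v0") logstash JSON
// event, where custom fields live under an "@fields" wrapper, into
// the flatter shape proposed in LOGSTASH-675. Field names here are
// assumptions based on the proposal, not the final released schema.
function v0ToV1(event) {
  var out = {
    '@timestamp': event['@timestamp'],
    '@version': '1',                 // proposed schema version marker
    message: event['@message'],      // "@message" loses its "@" prefix
    type: event['@type']
  };
  // Custom fields move from the "@fields" wrapper to the top level.
  var fields = event['@fields'] || {};
  Object.keys(fields).forEach(function (k) {
    out[k] = fields[k];
  });
  return out;
}

// An illustrative old-style event:
var oldEvent = {
  '@timestamp': '2013-07-22T14:19:00.000Z',
  '@message': 'GET /index.html 200',
  '@type': 'apache',
  '@fields': { status: 200 }
};

console.log(JSON.stringify(v0ToV1(oldEvent)));
```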

Thanks for helping make the logstash ecosystem better!

bpaquet commented 11 years ago

Hi @jordansissel,

I'm happy to know you are following this project :)

I will update node-logstash to the new schema when it is available in logstash and Kibana.

The only problem I have with the schema is typing: how to tell Elasticsearch that a field is an int, not a string, or vice versa. Currently, Elasticsearch chooses the field type on the first insertion of the day. If that first insertion is unusual, the whole day is wrong. E.g., I have fields containing hash codes. If the first hash code contains only digits, ES chooses int, and subsequent inserts with alpha characters will fail ...
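One way around the first-insert type guess is to declare the field's mapping up front, before any documents arrive. A hedged sketch, using the 0.90-era mapping API; the index name, type name (`apache`), and field name (`hash`) are illustrative, not taken from node-logstash:

```json
{
  "mappings": {
    "apache": {
      "properties": {
        "hash": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}
```

Submitting a mapping like this at index-creation time pins `hash` to a string, so a first value that happens to be all digits can no longer lock the field to an integer type.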

Regards,

Bertrand

jordansissel commented 11 years ago

The new logstash schema is implemented! The next release (1.2.0) will have it by default. I don't know when it'll land yet.

To answer your ES schema typing question -

ElasticSearch has a feature called 'dynamic mapping': http://www.elasticsearch.org/guide/reference/mapping/dynamic-mapping/ This feature makes ElasticSearch inspect previously-unknown fields to determine the best mapping. You can configure this in ElasticSearch and avoid the headaches you described. Another way to avoid headaches, when you literally have no control over the data schema, is to tell ElasticSearch to disable dynamic mapping and force everything to be strings, for example - but this may not be desirable :(
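The "force everything to strings" approach mentioned above can be expressed with a `dynamic_templates` rule in the mapping. A hedged sketch against the 0.90-era mapping API; the template name `strings_only` is illustrative:

```json
{
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings_only": {
            "match": "*",
            "mapping": { "type": "string" }
          }
        }
      ]
    }
  }
}
```

With this in place, every dynamically discovered field is mapped as a string regardless of what the first value looks like, which trades away numeric range queries for predictable indexing.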

Ultimately, I don't think there is a best answer. Hope this helps! :)

vid commented 11 years ago

We're adopting logstash for a new project (devopsjs) and I'd much rather use this node.js version for all the excellent reasons cited. We'd hopefully have some chance to contribute code. However being compatible at the database level with logstash ruby would be critical. @bpaquet can you comment on the upcoming change? Thanks!

bpaquet commented 11 years ago

Hi,

Yes, I plan to update node-logstash. But I have not yet studied all the impacts, or whether I should keep backward compatibility.

Regards,

Bertrand


bpaquet commented 11 years ago

Hi,

It's not yet released in logstash 1.1.13.

I will implement it in node-logstash as soon as possible after the public logstash release.

Regards,

Bertrand

bpaquet commented 11 years ago

@jordansissel @vid

The migration is now completed. Thanks to @dax for the pull request.