bpaquet / node-logstash

Simple logstash implementation in nodejs: file log collection, sent with zeromq

db_file not created #126

Closed jerome83136 closed 8 years ago

jerome83136 commented 8 years ago

Hello, I am using the file input plugin.

I launch my node-logstash process this way:

node-logstash-agent --log_level=info --config_file=/conf/logstash/logstash.webshop.conf --log_file=/logs/logstash/logs.node-logstash.webshop.log --db_file=/var/tmp/logs.node-logstash.webshop.dbfile
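
For illustration, a minimal config of the kind --config_file points at could look like this (a sketch only; the path is a placeholder, assuming the logstash-style config syntax node-logstash accepts):

```
input {
  file {
    path => "/logs/webshop/access.log"
  }
}

output {
  stdout {}
}
```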

But the .dbfile is not created by node-logstash; there is no file /var/tmp/logs.node-logstash.webshop.dbfile.

File permissions are OK on that path.

What did I do wrong?

Another question about the .dbfile: my input log files are rotated, so every 24 hours a new log file gets created. It seems the input plugin has no problem with truncated files, so that is OK. But what will happen if node-logstash crashes today and I have to restart it tomorrow?

I think node-logstash will read tomorrow's log file, but starting from which line? Will it resume at the last line read in today's log file (by using the .dbfile) or at the beginning of tomorrow's log file?

Thank you for your help

Best regards Jérôme

bpaquet commented 8 years ago

The db_file should be created when you stop node-logstash by hitting Control-C. Maybe I can add a scheduled task to dump it periodically. But this feature is meant to properly handle a node-logstash restart, not a crash. What do you think about that?
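
A rough sketch of the kind of scheduled dump being proposed here (the helper names are hypothetical, not actual node-logstash internals):

```js
// Periodically snapshot the in-memory file positions to the db_file.
// getPositions() is an assumed callback returning something like
// { "/path/to/log": { offset: 12345, lastLine: "..." } }.
const fs = require('fs');

function startDbFileDumper(dbFilePath, getPositions, intervalMs) {
  const timer = setInterval(() => {
    const snapshot = JSON.stringify(getPositions());
    const tmp = dbFilePath + '.tmp';
    // Write to a temp file and rename, so a crash mid-write does not
    // leave a truncated db_file behind.
    fs.writeFile(tmp, snapshot, (err) => {
      if (err) return console.error('db_file dump failed:', err);
      fs.rename(tmp, dbFilePath, (err2) => {
        if (err2) console.error('db_file rename failed:', err2);
      });
    });
  }, intervalMs);
  timer.unref(); // do not keep the process alive just for this timer
  return timer;
}
```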

node-logstash stores the last read lines in the db_file. When you restart it, node-logstash will compare the lines, see that they do not match, and restart reading from the beginning of the file.
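
A minimal sketch of that comparison, assuming the db_file stores a per-file offset and the last line read (the field names are invented for illustration; this is not the actual node-logstash code):

```js
const fs = require('fs');

// Decide where to resume reading logFilePath after a restart.
function resumePosition(dbFilePath, logFilePath) {
  let saved;
  try {
    saved = JSON.parse(fs.readFileSync(dbFilePath, 'utf8'))[logFilePath];
  } catch (e) {
    return 0; // no usable db_file: start from the beginning
  }
  if (!saved) return 0;

  // Re-read the bytes that should contain the last saved line.
  const fd = fs.openSync(logFilePath, 'r');
  const buf = Buffer.alloc(saved.lastLine.length);
  const start = Math.max(0, saved.offset - saved.lastLine.length);
  fs.readSync(fd, buf, 0, buf.length, start);
  fs.closeSync(fd);

  // If the file was rotated, the bytes no longer match the saved line,
  // so reading restarts at offset 0 of the new file.
  return buf.toString('utf8') === saved.lastLine ? saved.offset : 0;
}
```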


jerome83136 commented 8 years ago

Hello, I'm OK with the idea of implementing a scheduler which dumps the .dbfile from time to time, so that if node-logstash crashes, it will be able to resume parsing the input log files by reading the .dbfile.

Thanks Best regards Jérôme

bpaquet commented 8 years ago

No. node-logstash will resume reading the input log at the last saved point, wherever that is.


jerome83136 commented 8 years ago

OK, so scheduling a dump of the db_file is useless. No need to implement it from my point of view, unless you want it. Best regards Jérôme