hzxGitHub opened this issue 7 years ago
@hzxGitHub You should add a more detailed description.
@kubejenkis Thank you so much!
@kubejenkis If I have many identical log lines, what should I do to prevent data loss when using Logstash to write logs to InfluxDB? Apart from adding another field to distinguish the log lines from each other, do you have any good advice? Thank you again!
In what way are the logs the same: same timestamp or same contents? InfluxDB does not drop logs just because they have the same contents.
@kubejenkis I use Logstash to insert logs into InfluxDB. By "same logs" I mean they have the same content; I don't set the timestamp myself, I rely on the timestamp created while Logstash inserts the logs into InfluxDB. When lines have the same content, some of the log data is lost.
I don't think so; you could run a test to find what causes that. In Logstash, use stdin as the input, no filter, and both stdout and influxdb as outputs. Then type lines in the console and watch what happens in InfluxDB (admin UI on port 8083) and on stdout.
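A minimal pipeline for that test might look like the following sketch; the host, database name, and measurement are placeholders you would adapt to your setup:

```
input {
  stdin { }
}
# no filter section: events pass through unchanged
output {
  stdout { codec => rubydebug }
  influxdb {
    host        => "localhost"      # placeholder host
    db          => "logstash_test"  # placeholder database
    measurement => "test"
    data_points => { "message" => "%{message}" }
  }
}
```

Typing a few distinct lines and then a few identical ones should show whether only the identical ones go missing in InfluxDB while stdout shows them all.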
@kubejenkis I use Logstash to read logs and output to stdout, a file, and InfluxDB at the same time; stdout and the file receive the full data, while InfluxDB receives only part of it.
I know what you mean, but I didn't see this problem in my environment, so I hope you run the test above. My Logstash version is 5.5.1 and my influxdb plugin version is 5.0.1. Maybe try setting flush_size => 1?
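For reference, that is just the flush_size option inside the influxdb output (host and database here are placeholders):

```
output {
  influxdb {
    host        => "localhost"  # placeholder host
    db          => "mydb"       # placeholder database
    data_points => { "message" => "%{message}" }
    flush_size  => 1            # send each event in its own batch instead of buffering
  }
}
```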
@kubejenkis Thank you so much. I will give it a try.
@hzxGitHub I also have an idea. Going from input to output in Logstash is a very quick process, so two events may be processed in the same second or even the same millisecond. But what is the default precision of InfluxDB time, seconds or milliseconds? If two events land in the same millisecond, only one point will be written.
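If millisecond collisions are indeed the cause, one thing to experiment with is the plugin's time_precision option, which I believe defaults to "ms" in logstash-output-influxdb 5.x; a sketch with placeholder host/db, raising precision to microseconds:

```
output {
  influxdb {
    host           => "localhost"  # placeholder host
    db             => "mydb"       # placeholder database
    data_points    => { "message" => "%{message}" }
    time_precision => "u"          # microseconds instead of the default "ms"
  }
}
```

Note this only helps if the timestamps actually differ at the finer precision; identical timestamps still collide.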
I had the same issue, but setting the time explicitly (instead of using @timestamp) fixed it:

```
allow_time_override => true
data_points => { "time" => "%{time}" ...
```
The root cause is that InfluxDB overwrites points that share a duplicate timestamp. To learn more about why InfluxDB overrides duplicate-timestamp entries, see http://techathon.techinnolab.com/why-influxdb-does-override-duplicate-timestamp-entry/
Add a uuid filter:

```
uuid {
  target    => "uuid"
  overwrite => true
}
```

Then, in the output, send the uuid as a tag:

```
send_as_tags => ["uuid"]
data_points => {
  "uuid" => "%{[uuid]}"
}
```
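Putting the two snippets together, a complete pipeline might look like this sketch (the file path, host, and database are placeholders):

```
input {
  file { path => "/var/log/app.log" }  # placeholder path
}
filter {
  # attach a fresh UUID to every event
  uuid {
    target    => "uuid"
    overwrite => true
  }
}
output {
  influxdb {
    host         => "localhost"  # placeholder host
    db           => "mydb"       # placeholder database
    send_as_tags => ["uuid"]     # the uuid becomes an InfluxDB tag
    data_points  => {
      "message" => "%{message}"
      "uuid"    => "%{[uuid]}"
    }
  }
}
```

Because each point now carries a unique tag set, two events with identical content and identical timestamps are distinct points to InfluxDB and no longer overwrite each other.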
I encountered the same problem. Thank you, guys.
I used Logstash to read a log file and output its contents to InfluxDB. There are 115 lines in the log file, but InfluxDB receives fewer than 115 points, and strangely it receives a different number of points each time. When I changed Logstash to output to a file and to stdout, there was no problem at all. Can you tell me what's wrong? Thanks a lot.