Open vaijab opened 9 years ago
Hi @vaijab, yes, that could cause the error above. A log stream is a sequence of log events from a single emitter of logs. You can read more about CloudWatch Logs concepts at http://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/WhatIsCloudWatchLogs.html#CloudWatchLogsConcepts. Pointing multiple logstash instances at the same log stream can cause duplicated data and reduce ingestion throughput.
Here is a simple example. Suppose you have several Apache servers, each generating an access log and an error log. You could create two log groups (e.g. AccessLog and ErrorLog) and use the instance id as the log stream name, which gives you the log streams below.
#log group/log stream
AccessLog/i-instance123
AccessLog/i-instance456
AccessLog/i-instance789
ErrorLog/i-instance123
ErrorLog/i-instance456
ErrorLog/i-instance789
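The layout above can be sketched in a few lines of Python. This is only an illustration of the naming convention, not part of the plugin; the stream_names helper is hypothetical:

```python
# Hypothetical sketch: build one "group/stream" name per (log group, emitter)
# pair, mirroring the AccessLog/ErrorLog listing above.
def stream_names(log_groups, instance_ids):
    """Cross each log group with each instance id to name the log streams."""
    return [f"{group}/{instance}"
            for group in log_groups
            for instance in instance_ids]

print(stream_names(["AccessLog", "ErrorLog"],
                   ["i-instance123", "i-instance456", "i-instance789"]))
```

Because each emitter writes to its own stream, no two logstash instances ever contend for the same stream's sequence tokens.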
@wanghq thanks for the explanation. I had a feeling that was the case. I'd love to be able to create dynamic streams based on event types, but see #1 for why, unfortunately, I cannot do that.
Today, in a containerized world, instances don't mean much. Containers move around and can be short-lived.
I only have 1 output, and I'm seeing these errors.
I am getting errors like the one below:

I have multiple instances running logstash and pushing to the same LOG_GROUP_NAME/LOG_STREAM_NAME. Would running multiple logstash instances cause an error like the one above?