Closed nareshnani529 closed 4 years ago
Hello,
It's been a while since I've worked with this plugin and I remember having a similar issue. Take a look at the copy plugin https://docs.fluentd.org/output/copy. This is most likely what you're looking for.
Jason
Yeah, @jayslife's answer is right. The AWS ES output plugin itself should emit nothing. Instead, we can replicate Fluentd events with the copy plugin.
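Something along these lines should work (the endpoint URL, bucket name, and credentials below are placeholders): route all records through a single `<match>` with `@type copy`, and nest each destination in its own `<store>`:

```
<match *.**>
  @type copy
  <store>
    @type aws-elasticsearch-service
    logstash_format true
    <endpoint>
      url https://vpc-example.us-east-1.es.amazonaws.com
      region us-east-1
    </endpoint>
  </store>
  <store>
    @type s3
    s3_bucket example-bucket
    s3_region us-east-1
    path logs/
  </store>
</match>
```

Note that with two separate top-level `<match *.**>` blocks, only the first one ever receives events; copy is the mechanism for fanning out one event stream to multiple outputs.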
Thanks for the reply @cosmo0920 @jayslife
I tried with the copy plugin but it is not working. Could you give us an example here? That would be great!
Config that I tried:
<source>
  @type forward
  @id input_forward
</source>
<filter *.**>
  @type record_transformer
  <record>
    container_id ${record["container_id"]}
  </record>
</filter>
<match *.**>
  type aws-elasticsearch-service
  logstash_format true
  logstash_prefix ${tag[0]}
  include_tag_key true
  tag_key "Application_name"
  flush_interval 5s
  <buffer>
    flush_at_shutdown true
    flush_mode immediate
    flush_thread_count 8
    flush_thread_interval 1
    flush_thread_burst_interval 1
    retry_forever true
    retry_type exponential_backoff
  </buffer>
  <endpoint>
    url https://vpc-xxxx.us-east-1.es.amazonaws.com
    region us-east-1
    access_key_id "xxxx"
    secret_access_key "xxxx"
  </endpoint>
</match>
<match *.**>
  type copy
  <store>
    type s3
    aws_key_id "xxxxx"
    aws_sec_key "xxxx"
    s3_bucket "logstream-testbucket"
    s3_region "us-east-1"
    path logs/
    buffer_path /var/log/td-agent/buffer/s3
    time_slice_format %Y-%m-%d/%H
    time_slice_wait 10m
  </store>
</match>
This discussion should be continued in #58. This issue should be closed.
Problem
Unable to send logs to AWS ES and S3 at the same time, and nothing shows up in the logs. At this point we would like to check: does the AWS ES plugin support S3?
Td-agent version : 2.5.x
Infrastructure: AWS ECS > Fluentd Aggregators (2 instances, load balanced) > ES > Kibana
As we see some lag (5 to 6 minutes) in log processing, we would like to try adding another output (like S3) to view the logs. Is this the right action to take?