fluent / fluent-plugin-s3

Amazon S3 input and output plugin for Fluentd
https://docs.fluentd.org/output/s3

s3 input is not separating log entries #420

Open tfmm opened 1 year ago

tfmm commented 1 year ago

Describe the bug

I'm using the s3 input to read CloudWatch Logs data (gzipped JSON) from S3 and send it to OpenSearch, and all entries from each log file in S3 are being put into a single entry in OpenSearch.
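
For context, when CloudWatch Logs are delivered to S3 through a subscription (for example via Kinesis Data Firehose), each gzipped object typically contains one JSON document per delivery, with the individual log lines nested inside a logEvents array. A hypothetical sample (all field values made up for illustration):

{
  "messageType": "DATA_MESSAGE",
  "owner": "123456789012",
  "logGroup": "example-log-group",
  "logStream": "example-log-stream",
  "subscriptionFilters": ["example-filter"],
  "logEvents": [
    { "id": "0001", "timestamp": 1691400000000, "message": "first log line" },
    { "id": "0002", "timestamp": 1691400000001, "message": "second log line" }
  ]
}

If the objects look like this, format json would parse each document as a single record, logEvents array and all, which would match the one-entry-per-file behavior described here.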

To Reproduce

Set up Fluentd using the config below, with proper permissions on S3 and SQS.

Expected behavior

Each log entry should be parsed into a separate entry in OpenSearch.

Your Environment

- Fluentd version: 1.16-1
- TD Agent version:
- fluent-plugin-s3 version: latest
- aws-sdk-s3 version:
- aws-sdk-sqs version:
- Operating system:
- Kernel version:

Your Configuration

<source>
  @type s3
  s3_bucket S3_BUCKET_NAME
  s3_region us-west-2
  add_object_metadata true
  format json
  <sqs>
    queue_name SQS_QUEUE_NAME
  </sqs>
</source>

<match **>
  @type opensearch
  host OPENSEARCH_HOST
  port 9200
  user %{OPENSEARCH_USER}
  password OPENSEARCH_PASSWORD
  scheme https
  include_timestamp true 
  logstash_format true
  logstash_prefix OS_INDEX_NAME
  suppress_type_name true
  ssl_verify false
  include_tag_key true
  tag_key _key
</match>
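
As a quick way to narrow this down (a minimal debugging sketch, not part of the original report), a temporary match block for the s3 input's tag can dump the raw records to Fluentd's own log before they reach the opensearch output. input.s3 is assumed here as the source tag (the plugin's documented default); adjust it if a custom tag is configured:

# Debug only: print each record emitted by the s3 input to Fluentd's log.
# Place this above the existing <match **> block so it is matched first.
<match input.s3>
  @type stdout
</match>

If each S3 object shows up as exactly one record containing the whole logEvents array, the problem is in how the object content is parsed rather than in the opensearch output.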

Your Error Log

No applicable errors are being shown.

Additional context

No response

valentinacala commented 1 year ago

@tfmm did you find any solution?

tfmm commented 1 year ago

Not yet, but I haven't been looking into it recently.
