Closed — zmoog closed this issue 1 year ago
Docs on how to set up an S3 client: https://www.linode.com/docs/guides/migrate-to-linode-object-storage/
I used the Frankfurt, DE (eu-central-1) region, so the bucket URL is https://mbranca-esf-logs.eu-central-1.linodeobjects.com/
The `endpoint` config setting for the custom AWS Logs integration is https://eu-central-1.linodeobjects.com (the bucket URL minus the bucket name).
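The "bucket URL minus the bucket name" derivation can be sketched in Python. The URL and bucket name come from this setup; the helper function itself is illustrative:

```python
from urllib.parse import urlparse

def endpoint_from_bucket_url(bucket_url: str, bucket_name: str) -> str:
    """Strip the bucket name from a virtual-hosted-style bucket URL,
    leaving the bare object-storage endpoint."""
    parsed = urlparse(bucket_url)
    prefix = bucket_name + "."
    if not parsed.netloc.startswith(prefix):
        raise ValueError(f"{parsed.netloc} does not start with {prefix}")
    return f"{parsed.scheme}://{parsed.netloc[len(prefix):]}"

endpoint = endpoint_from_bucket_url(
    "https://mbranca-esf-logs.eu-central-1.linodeobjects.com/",
    "mbranca-esf-logs",
)
print(endpoint)  # https://eu-central-1.linodeobjects.com
```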
Another essential setting is the access key pair (access key ID and secret access key):
Time to run some tests.
Here are the `aws-s3` input settings from the agent policy:
```yaml
inputs:
  - id: aws-s3-aws_logs-afe56f1c-6312-411f-a8c8-b369327c943f
    name: aws_logs-1
    revision: 9
    type: aws-s3
    use_output: default
    meta:
      package:
        name: aws_logs
        version: 0.5.1
    data_stream:
      namespace: default
    package_policy_id: afe56f1c-6312-411f-a8c8-b369327c943f
    streams:
      - id: aws-s3-aws_logs.generic-afe56f1c-6312-411f-a8c8-b369327c943f
        data_stream:
          dataset: aws_logs.generic
        access_key_id: <REDACTED>
        secret_access_key: <REDACTED>
        parsers: null
        sqs.max_receive_count: 5
        max_bytes: 10MiB
        non_aws_bucket_name: mbranca-esf-logs
        max_number_of_messages: 5
        tags:
          - preserve_original_event
          - forwarded
        publisher_pipeline.disable_host: true
        file_selectors: null
        endpoint: 'https://eu-central-1.linodeobjects.com'
        bucket_list_prefix: 2023-02-14-13-41-08-79BF7A8FA7821B47_D
        number_of_workers: 5
        sqs.wait_time: 20s
        bucket_list_interval: 120s
```
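Note that the policy has no `queue_url`: Linode offers no SQS, so the input runs in S3 bucket-list polling mode, driven by `bucket_list_interval` and `bucket_list_prefix`. A rough Python sketch of that mode selection (the function and the mode names are illustrative, not the actual Beats implementation):

```python
def s3_input_mode(config: dict) -> str:
    """Illustrative approximation of how the aws-s3 input picks its
    operating mode: with a queue_url it reads SQS notifications;
    with only a bucket configured it polls the bucket listing."""
    if config.get("queue_url"):
        return "sqs-notification"
    if config.get("bucket_arn") or config.get("non_aws_bucket_name"):
        return "bucket-list-polling"
    raise ValueError("either queue_url or a bucket must be configured")

mode = s3_input_mode({
    "non_aws_bucket_name": "mbranca-esf-logs",
    "bucket_list_interval": "120s",
})
print(mode)  # bucket-list-polling
```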
I uploaded a couple of files into the bucket:
And here is the end result in Elasticsearch:
The test is successful!
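The events land in a data stream named after Elastic's `{type}-{dataset}-{namespace}` scheme; with the `dataset` and `namespace` from the policy above, that is `logs-aws_logs.generic-default`. A minimal sketch (the helper is illustrative):

```python
def data_stream_name(dataset: str, namespace: str, ds_type: str = "logs") -> str:
    """Build a data stream name per the {type}-{dataset}-{namespace} scheme."""
    return f"{ds_type}-{dataset}-{namespace}"

print(data_stream_name("aws_logs.generic", "default"))  # logs-aws_logs.generic-default
```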
I logged into https://cloud.linode.com/object-storage/buckets with my existing account and created a new Object Storage bucket named `mbranca-esf-logs`.