zmoog / public-notes


Figure out how to deploy ESF (CloudFormation install method) #64

Open · zmoog opened this issue 10 months ago

zmoog commented 10 months ago

I want to install ESF (Elastic Serverless Forwarder) to ingest files using the s3-sqs input.

I will use the CloudFormation install method.

zmoog commented 10 months ago

Requirements

We need to create the following resources:

  1. One S3 bucket (with the data to ingest)
  2. One SQS queue (for receiving the S3 object creation notifications)
  3. One S3 bucket (to store the config.yml file)


(1) S3 bucket for the data

You probably already have an S3 bucket with actual data. For this research, I will create a bucket with sample data for testing.

$ aws s3api create-bucket \
    --bucket zmoog-esf-howto-data \
    --region eu-west-1 \
    --create-bucket-configuration LocationConstraint=eu-west-1
{
    "Location": "http://zmoog-esf-howto-data.s3.amazonaws.com/"
}

(2) SQS queue

We need an SQS queue to receive the S3 object creation notifications for the zmoog-esf-howto-data bucket.

Create a new SQS queue named zmoog-esf-howto-notifications and set its visibility timeout to 910 seconds (a bit longer than the ESF Lambda's 900-second timeout, as the ESF docs recommend).

$ cat create-queue.json
{
  "VisibilityTimeout": "910"
}

$ aws sqs create-queue --queue-name zmoog-esf-howto-notifications --attributes file://create-queue.json
{
    "QueueUrl": "https://sqs.eu-west-1.amazonaws.com/123/zmoog-esf-howto-notificaions"
}
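
The access policy and the bucket notification configuration below reference the queue by ARN rather than URL. The ARN can be fetched with get-queue-attributes; with the placeholder account ID 123 used in these notes, the output should look like this:

$ aws sqs get-queue-attributes \
    --queue-url https://sqs.eu-west-1.amazonaws.com/123/zmoog-esf-howto-notifications \
    --attribute-names QueueArn
{
    "Attributes": {
        "QueueArn": "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications"
    }
}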

Enable S3 object creation notifications from the zmoog-esf-howto-data bucket to the zmoog-esf-howto-notifications queue:

$ cat policy.json
{
  "Version": "2008-10-17",
  "Id": "__default_policy_ID",
  "Statement": [
    {
      "Sid": "__owner_statement",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123:root"
      },
      "Action": "SQS:*",
      "Resource": "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "SQS:SendMessage",
      "Resource": "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications"
    }
  ]
}

# Set the SQS access policy
TBD
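
One way to fill in this step is set-queue-attributes, passing policy.json as the Policy attribute. This is a sketch, assuming jq is installed (it is only used to JSON-escape the policy document):

$ aws sqs set-queue-attributes \
    --queue-url https://sqs.eu-west-1.amazonaws.com/123/zmoog-esf-howto-notifications \
    --attributes "{\"Policy\": $(jq -c tojson policy.json)}"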

# Enable notifications
$ cat notifications.json
{
    "QueueConfigurations": [
        {
            "Id": "Creations",
            "QueueArn": "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications",
            "Events": [
                "s3:ObjectCreated:*"
            ],
            "Filter": {
                "Key": {
                    "FilterRules": [
                        {
                            "Name": "Prefix",
                            "Value": ""
                        },
                        {
                            "Name": "Suffix",
                            "Value": ""
                        }
                    ]
                }
            }
        }
    ]
}

$ aws s3api put-bucket-notification-configuration \
    --bucket zmoog-esf-howto-data \
    --notification-configuration file://notifications.json
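
To verify it was applied, the notification configuration can be read back (optional):

$ aws s3api get-bucket-notification-configuration --bucket zmoog-esf-howto-data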

Create a sample.log file and upload it to the zmoog-esf-howto-data bucket:

$ cat sample.log
Sample log line 1
Sample log line 2

$ aws s3 cp sample.log s3://zmoog-esf-howto-data/
upload: ./sample.log to s3://zmoog-esf-howto-data/sample.log

(3) S3 bucket for the config file

Create a bucket to store the ESF configuration:

$ aws s3api create-bucket \
    --bucket zmoog-esf-howto-configs \
    --region eu-west-1 \
    --create-bucket-configuration LocationConstraint=eu-west-1
{
    "Location": "http://zmoog-esf-howto-configs.s3.amazonaws.com/"
}

And upload a basic configuration file like this:

inputs:
  - type: "s3-sqs"
    id: "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence
          elasticsearch_url: "<REDACTED>"
          # either api_key or username/password, api_key takes precedence
          api_key: "<REDACTED>"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500
          batch_max_bytes: 10485760
          ssl_assert_fingerprint: ""

$ aws s3 cp config.yml s3://zmoog-esf-howto-configs/config.yml

zmoog commented 10 months ago

Deploy ESF

Using https://www.elastic.co/guide/en/esf/master/aws-deploy-elastic-serverless-forwarder.html#aws-serverless-forwarder-deploy-cloudformation

List the ESF versions available for deployment:

aws serverlessrepo list-application-versions \
  --application-id arn:aws:serverlessrepo:eu-central-1:267093732750:applications/elastic-serverless-forwarder
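
To see just the version numbers (the latest one goes into SemanticVersion in the template below), the same call can be filtered with --query; this is optional:

aws serverlessrepo list-application-versions \
  --application-id arn:aws:serverlessrepo:eu-central-1:267093732750:applications/elastic-serverless-forwarder \
  --query 'Versions[].SemanticVersion'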

Save the deployment template as sar-application.yaml:

$ cat sar-application.yaml
Transform: AWS::Serverless-2016-10-31
Resources:
  SarCloudformationDeployment:
    Type: AWS::Serverless::Application
    Properties:
      Location:
        ApplicationId: 'arn:aws:serverlessrepo:eu-central-1:267093732750:applications/elastic-serverless-forwarder'
        SemanticVersion: '1.9.0'  ## SET TO CORRECT SEMANTIC VERSION (MUST BE GREATER THAN 1.6.0)
      Parameters:
        ElasticServerlessForwarderS3ConfigFile: "s3://zmoog-esf-howto-configs/config.yml"
        ElasticServerlessForwarderSSMSecrets: ""
        ElasticServerlessForwarderKMSKeys: ""
        ElasticServerlessForwarderSQSEvents: ""
        ElasticServerlessForwarderS3SQSEvents: "arn:aws:sqs:eu-west-1:123:zmoog-esf-howto-notifications"
        ElasticServerlessForwarderKinesisEvents: ""
        ElasticServerlessForwarderCloudWatchLogsEvents: ""
        ElasticServerlessForwarderS3Buckets: "arn:aws:s3:::zmoog-esf-howto-data"
        ElasticServerlessForwarderSecurityGroups: ""
        ElasticServerlessForwarderSubnets: ""

Deploy ESF using the template in sar-application.yaml:

aws cloudformation deploy \
    --template-file sar-application.yaml \
    --stack-name esf-cloudformation-deployment \
    --capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND

Waiting for changeset to be created..
Waiting for stack create/update to complete
Successfully created/updated stack - esf-cloudformation-deployment
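
To double-check the deployment, query the stack status (it should report CREATE_COMPLETE or UPDATE_COMPLETE):

aws cloudformation describe-stacks \
    --stack-name esf-cloudformation-deployment \
    --query 'Stacks[0].StackStatus'
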
zmoog commented 9 months ago

Test

Create a new object in the S3 bucket:

$ cat sample.log
Sample log line 1
Sample log line 2

aws s3 cp sample.log s3://zmoog-esf-howto-data/sample.2.log

And then check if the two log lines landed in the data stream logs-generic-default:

(Screenshot: the two log lines shown in the logs-generic-default data stream.)
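
A command-line way to run the same check is to query the data stream directly. This is a sketch, assuming ES_URL and ES_API_KEY hold the same Elasticsearch endpoint and API key redacted in config.yml above:

$ curl -s -H "Authorization: ApiKey $ES_API_KEY" "$ES_URL/logs-generic-default/_count"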