## Goal

Suppose I own an AWS account, and I want to export log events from CloudWatch Logs to an Elastic cluster.

## Context

What are the CloudWatch logs? CloudWatch Logs is the AWS service that collects and stores log events from AWS services and applications; many services, AWS Lambda included, send their logs to a CloudWatch log group out of the box.

## Requirements & Limitations

We need to keep in mind a few requirements and limitations when using Amazon Data Firehose to ingest logs into an Elastic stack.

## Preparation

Install the latest versions of the relevant integrations on your Elastic cluster, so that the destination data stream (`logs-aws.generic-default`, used later in this tutorial) has its index templates and ingest pipelines in place.

## Steps
Exporting logs from CloudWatch to an Elastic deployment using Firehose requires setting up a subscription filter to send log events to a Firehose stream.
In practice, in the following steps we will:

- create a Lambda function whose logs end up in a CloudWatch log group (optional, if you already have one);
- create a Firehose stream that forwards log events to an Elastic stack;
- set up a subscription filter on the log group to send its log events to the Firehose stream;
- verify that log events are being delivered.
If you already have a Lambda function, or any other service or application that logs to a CloudWatch log group, you can skip this section: take note of the log group you want to collect log events from, and move to the next section. Otherwise, let's create a Lambda function.
In this tutorial, we will write a simple AWS Lambda-based app, collect the application logs, and forward them to Elastic. Like many other services and platforms in AWS, Lambda functions log to CloudWatch out of the box, which makes them a great tool for experimenting on AWS.
When AWS completes the creation of the function, visit the Code source section and paste the following Python code as function source code:
```python
import json

print('Loading function')


def lambda_handler(event, context):
    # Echo the incoming event; each invocation lands in CloudWatch Logs.
    print("Received event: " + json.dumps(event))
```
Important: Click on Deploy to deploy the updated source code.
With the function ready to go, we can invoke it a few times to generate sample logs. On the function page, open the Test tab and click the Test button a few times to invoke the function with a sample event.
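If you prefer the AWS SDK to clicking around the console, here is a minimal boto3 sketch that invokes the function in a loop; the function name is a placeholder for whatever you named yours, and any payload works since the function just logs the event it receives:

```python
import json

import boto3

# Placeholders: replace with your function's name and region.
FUNCTION_NAME = "firehose-tutorial-fn"
REGION = "eu-north-1"

lambda_client = boto3.client("lambda", region_name=REGION)

# Invoke the function a few times; each invocation produces log events
# in the function's CloudWatch log group.
for i in range(5):
    response = lambda_client.invoke(
        FunctionName=FUNCTION_NAME,
        Payload=json.dumps({"invocation": i}).encode("utf-8"),
    )
    print(f"invocation {i}: status {response['StatusCode']}")
```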
Visit the function's log group (the AWS console usually offers a handy link to jump straight to the log group it created for this function's logs). You should see the log events generated by the invocations.
Take note of the log group name for this Lambda function, and move to the next section.
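By default, the log group name follows the `/aws/lambda/<function name>` convention. If you'd rather look it up programmatically, here is a small boto3 sketch (the region is an assumption, matching the ARNs used later in this issue):

```python
import boto3

logs = boto3.client("logs", region_name="eu-north-1")

# Lambda log groups are named /aws/lambda/<function-name> by default.
response = logs.describe_log_groups(logGroupNamePrefix="/aws/lambda/")
for group in response["logGroups"]:
    print(group["logGroupName"])
```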
We need a Firehose stream to collect the Lambda function logs and send them to a data stream on an Elastic stack.
To create a Firehose stream, follow the instructions at Monitor Amazon Web Services (AWS) with Amazon Data Firehose up to step 3.
However, you must set the Parameters in the Destination settings section. Use the following parameters:

Name | Value |
---|---|
`es_datastream_name` | `logs-aws.generic-default` |
`include_cw_extracted_fields` | ? |
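The guide above walks you through the console, but for reference here is a hedged boto3 sketch of an equivalent stream definition, showing where the parameters from the table travel (the stream name, endpoint URL, API key, role, and bucket ARNs are all placeholder assumptions):

```python
import boto3

firehose = boto3.client("firehose", region_name="eu-north-1")

# Everything in angle brackets (and the stream name) is a placeholder.
firehose.create_delivery_stream(
    DeliveryStreamName="my-cloudwatch-logs-stream",
    DeliveryStreamType="DirectPut",
    HttpEndpointDestinationConfiguration={
        "EndpointConfiguration": {
            "Url": "<YOUR ELASTIC ENDPOINT URL>",
            "Name": "Elastic",
            "AccessKey": "<YOUR ELASTIC API KEY>",
        },
        "RequestConfiguration": {
            "ContentEncoding": "GZIP",
            # The parameters from the table above travel as common
            # attributes on the HTTP endpoint destination.
            "CommonAttributes": [
                {
                    "AttributeName": "es_datastream_name",
                    "AttributeValue": "logs-aws.generic-default",
                },
                # include_cw_extracted_fields is left open ("?") above;
                # add it here once you settle on a value.
            ],
        },
        # Firehose requires an S3 bucket for documents it fails to deliver.
        "S3BackupMode": "FailedDataOnly",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::<YOUR ACCOUNT ID>:role/<FIREHOSE ROLE>",
            "BucketARN": "arn:aws:s3:::<YOUR BACKUP BUCKET>",
        },
    },
)
```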
The Firehose stream is now ready to send logs to our Elastic Cloud deployment. Next, we will connect the CloudWatch log group to it with a subscription filter.
Please open the log group where the Lambda function is sending its log events. We must forward these events to the Elastic stack using the Firehose stream.
CloudWatch log groups offer subscription filters, which allow users to pick log events from the log group and forward them to other services, such as an Amazon Kinesis stream, an Amazon Data Firehose stream, or AWS Lambda.
Please select the Firehose stream we created in the previous step.
Grant the CloudWatch Logs service permission to send log events to the Firehose stream.
This step is made of multiple parts:
Create a new role and use the following JSON as the trust policy:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "logs.eu-north-1.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringLike": {
          "aws:SourceArn": "arn:aws:logs:eu-north-1:<YOUR ACCOUNT ID>:*"
        }
      }
    }
  ]
}
```
Create and assign a new IAM policy to the role using the following JSON:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "firehose:PutRecord",
      "Resource": "arn:aws:firehose:eu-north-1:<YOUR ACCOUNT ID>:deliverystream/mbranca-dev-cloudtrail-logs"
    }
  ]
}
```
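If you prefer to script this part, the same role and inline policy can be created with boto3; the role and policy names below are placeholder assumptions, and the two policy documents are the ones shown above:

```python
import json

import boto3

iam = boto3.client("iam")

# The trust policy from the first JSON document above.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "logs.eu-north-1.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringLike": {
                    "aws:SourceArn": "arn:aws:logs:eu-north-1:<YOUR ACCOUNT ID>:*"
                }
            },
        }
    ],
}

# The permissions policy from the second JSON document above.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "firehose:PutRecord",
            "Resource": "arn:aws:firehose:eu-north-1:<YOUR ACCOUNT ID>:deliverystream/mbranca-dev-cloudtrail-logs",
        }
    ],
}

# Role and policy names are placeholders for this sketch.
role = iam.create_role(
    RoleName="cloudwatch-to-firehose",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName="cloudwatch-to-firehose",
    PolicyName="firehose-put-record",
    PolicyDocument=json.dumps(permissions_policy),
)

print(role["Role"]["Arn"])
```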
When the new role is ready, you can select it in the subscription filter.
Select "Other" as the Log format option.
TBA
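Alternatively, once the role exists, the whole subscription filter can be created with a single boto3 call; the log group name, stream ARN, and role ARN below are placeholders:

```python
import boto3

logs = boto3.client("logs", region_name="eu-north-1")

logs.put_subscription_filter(
    logGroupName="/aws/lambda/<YOUR FUNCTION NAME>",
    filterName="forward-to-firehose",
    filterPattern="",  # an empty pattern forwards every log event
    destinationArn="arn:aws:firehose:eu-north-1:<YOUR ACCOUNT ID>:deliverystream/<YOUR STREAM NAME>",
    roleArn="arn:aws:iam::<YOUR ACCOUNT ID>:role/<YOUR ROLE NAME>",
)
```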
Visit the AWS Lambda page again, select the function we created, and execute it a few more times to generate log events.
Next, check for destination error logs: on the AWS console, visit your Firehose stream and look at the "Destination error logs" section.
If everything is running smoothly, this list will be empty. If there's an error, you can check the details there; for example, a delivery stream that fails to send records to the Elastic stack due to bad authentication settings will report the failed deliveries in this list.
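Beyond the console checks, you can also verify that documents are landing in the destination data stream, for example with a quick count query against the Elasticsearch `_count` API. Here is a minimal sketch using the `requests` package; the endpoint URL and API key are placeholders:

```python
import requests

# Placeholders: your Elasticsearch endpoint and an API key with read
# access to the data stream.
ES_URL = "https://<YOUR DEPLOYMENT>.es.<REGION>.aws.elastic-cloud.com"
API_KEY = "<YOUR ELASTIC API KEY>"

response = requests.get(
    f"{ES_URL}/logs-aws.generic-default/_count",
    headers={"Authorization": f"ApiKey {API_KEY}"},
)
response.raise_for_status()
print("documents in logs-aws.generic-default:", response.json()["count"])
```

If the count grows after a few more Lambda invocations, the whole pipeline is working end to end.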