zmoog opened 5 months ago
We need a Firehose stream to collect the VPC Flow logs and send them to a data stream in an Elastic stack.
To create a Firehose stream, you can follow the instructions in Monitor Amazon Web Services (AWS) with Amazon Data Firehose up to step 3. However, you must configure two things differently:
**Name**

Pick a name for your Firehose stream.

**Parameters**

Use the following parameters:

| Name | Value |
|---|---|
| `es_datastream_name` | `logs-aws.vpcflow-default` |
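As a rough sketch of how that parameter reaches Firehose, the snippet below assembles the HTTP-endpoint destination fragment you would pass to the `CreateDeliveryStream` API: `es_datastream_name` travels as a "common attribute" on the request. The endpoint URL and access key are placeholders, not values from this guide; verify the exact shape against the Firehose API reference for your setup.

```python
# Sketch (assumptions flagged inline): the es_datastream_name parameter is
# passed to Firehose as a common attribute on the HTTP endpoint destination.
def http_endpoint_destination(endpoint_url: str, access_key: str) -> dict:
    return {
        "EndpointConfiguration": {
            "Url": endpoint_url,      # placeholder: your Elastic Firehose endpoint
            "AccessKey": access_key,  # placeholder: your Elasticsearch API key
        },
        "RequestConfiguration": {
            "ContentEncoding": "GZIP",
            "CommonAttributes": [
                # Routes incoming records into the matching Elastic data stream.
                {"AttributeName": "es_datastream_name",
                 "AttributeValue": "logs-aws.vpcflow-default"},
            ],
        },
    }

config = http_endpoint_destination("https://example.elastic.cloud", "API_KEY")
attrs = {a["AttributeName"]: a["AttributeValue"]
         for a in config["RequestConfiguration"]["CommonAttributes"]}
print(attrs["es_datastream_name"])  # logs-aws.vpcflow-default
```

With boto3 you would pass a fragment like this (plus stream name and type) to `firehose_client.create_delivery_stream(...)`.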
If you're publishing flow logs to a different account, create the required IAM roles, as described in IAM roles for cross account delivery.
1. Select the network interface, VPC, or subnet that you want to log, then choose Actions, Create flow log.
2. For Filter, specify the type of traffic to log.
3. For Maximum aggregation interval, choose the maximum period of time during which a flow is captured and aggregated into one flow log record.
4. For Destination, choose either of the following options: send to an Amazon Data Firehose stream in the same account, or send to a stream in a different account.
5. For Amazon Data Firehose stream name, choose the delivery stream that you created.
6. (Cross account delivery only) For IAM roles, specify the required roles (see IAM roles for cross account delivery).
7. For Log record format, specify the format for the flow log record.
8. For Additional metadata, select whether you want to include metadata from Amazon ECS in the log format.
9. (Optional) Choose Add tag to apply tags to the flow log.
10. Choose Create flow log.
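The console steps above map onto a single `CreateFlowLogs` API call. The sketch below builds the request parameters you would pass to boto3's `ec2_client.create_flow_logs(**kwargs)`; the VPC ID and Firehose ARN are placeholders, and the field choices (traffic type, aggregation interval) mirror the steps, not required values.

```python
# Sketch of the same configuration via the EC2 CreateFlowLogs API.
# vpc_id and firehose_arn below are placeholders.
def flow_log_request(vpc_id: str, firehose_arn: str) -> dict:
    return {
        "ResourceType": "VPC",            # or "NetworkInterface" / "Subnet"
        "ResourceIds": [vpc_id],
        "TrafficType": "ALL",             # the Filter step: ALL / ACCEPT / REJECT
        "LogDestinationType": "kinesis-data-firehose",
        "LogDestination": firehose_arn,   # ARN of the stream created earlier
        "MaxAggregationInterval": 600,    # seconds; the aggregation-interval step
    }

req = flow_log_request(
    "vpc-0123456789abcdef0",
    "arn:aws:firehose:us-east-1:111111111111:deliverystream/example",
)
print(req["LogDestinationType"])  # kinesis-data-firehose
```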
Now the network interface, VPC, or subnet you set up is sending VPC flow logs to the Firehose stream, which forwards them to the Elasticsearch cluster.
The VPC flow logs are now visible in the `logs-aws.vpcflow-default` data stream, in both Discover and Log Explorer.
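To get a feel for what each record in that data stream contains, here is a small sketch that parses a flow log line in the default (version 2) format into named fields; the sample line is made up for illustration.

```python
# Default (version 2) flow log record fields, in order.
FIELDS = ["version", "account-id", "interface-id", "srcaddr", "dstaddr",
          "srcport", "dstport", "protocol", "packets", "bytes",
          "start", "end", "action", "log-status"]

def parse_record(line: str) -> dict:
    """Split a space-separated default-format flow log record into named fields."""
    return dict(zip(FIELDS, line.split()))

# Made-up sample record: an accepted SSH flow (dstport 22, protocol 6 = TCP).
sample = ("2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 "
          "20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK")
rec = parse_record(sample)
print(rec["srcaddr"], rec["action"])  # 172.31.16.139 ACCEPT
```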
## Goal

Suppose I own an AWS account, and I want to export AWS VPC Flow log events from AWS to an Elastic cluster.

## Context

## What are the VPC Flow logs?

## Requirements & Limitations

## Preparation

## Steps

## Overview

## Resources