It is often useful to stream data, as it gets generated, for indexing in an Amazon Elasticsearch Service (ES) domain. This makes fresh data available for search and analytics. Doing so requires a way to be notified when new data is available, code to fetch and transform that data and add it to ES, and a place to host and run that code.
Lambda is an AWS service that takes care of these requirements. Put simply, it is an "event handling" service in the cloud. Lambda lets us implement the event handler (in Node.js or Java), which it hosts and invokes in response to an event.
The handler can be triggered by a "push" or a "pull" approach. Certain event sources (such as S3) push an event notification to Lambda. Others (such as Kinesis) require Lambda to poll for events and pull them when available.
For more details on AWS Lambda, please see the documentation.
This package contains sample Lambda code (in Node.js) to stream data to ES from two common AWS data sources: S3 and Kinesis. The S3 sample takes Apache log files, parses them into JSON documents, and adds them to ES. The Kinesis sample reads JSON data from the stream and adds the documents to ES.
Note that the sample code has been kept simple for the sake of clarity. It does not handle ES document batching, eventual-consistency issues for S3 updates, and so on.
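For illustration, the core transformation in the S3 sample, turning one line of an Apache access log into a JSON document, can be sketched as follows. The field names here are our own choice, not necessarily those used by the sample.

```javascript
// Sketch: parse one line of Apache common log format into a JSON
// document suitable for indexing in ES.
function parseLogLine(line) {
    var m = line.match(
        /^(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)/);
    if (!m) return null; // line did not match the expected format
    return {
        ip: m[1],
        timestamp: m[4],
        request: m[5],
        status: Number(m[6]),
        bytes: m[7] === '-' ? 0 : Number(m[7])
    };
}

var doc = parseLogLine(
    '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] ' +
    '"GET /apache_pb.gif HTTP/1.0" 200 2326');
// doc.ip === '127.0.0.1', doc.status === 200, doc.bytes === 2326
```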
While some detailed instructions are covered later in this file and elsewhere (in the Lambda documentation), this section aims to show the larger picture that the individual steps work to accomplish. We assume that the data source (an S3 bucket or a Kinesis stream, in this case) and an ES domain are already set up.
Deployment Package: The "deployment package" is the event handler code and its dependencies, packaged as a zip file. The first step in creating a new Lambda function is to prepare and upload this zip file.
Lambda Configuration: When creating the Lambda function, set its handler to the name of the main code file with a .handler suffix.

Authorization: Since various AWS services must make calls to each other here, appropriate authorization is required. This takes the form of an IAM role to which authorization policies are attached; the Lambda function assumes this role when running.
To prepare the deployment package:
1. On your development machine, download and install Node.js.
2. Anywhere, create a directory structure similar to the following:

   eslambda (place sample code here)
   |
   +-- node_modules (dependencies will go here)

3. Modify the sample code with the correct ES endpoint, region, index, and document type.
4. Install each dependency imported by the sample code (with a require() call), as follows:

   npm install <dependency>

   Verify that these are installed within the node_modules subdirectory.
5. Create a zip file to package the code and the node_modules subdirectory:

   zip -r eslambda.zip *

The zip file thus created is the Lambda deployment package.
Set up the Lambda function and the S3 bucket as described in the Lambda-S3 Walkthrough. Please keep in mind the following notes and configuration overrides:
The walkthrough uses the AWS CLI, but it's probably more convenient to use the AWS Console (web UI) for Lambda configuration.
The S3 bucket must be created in the same region as the Lambda function, so that it can push events to Lambda.
When registering the S3 bucket as the data source in Lambda, add a filter for files with the .log suffix, so that Lambda picks up only Apache log files.
The following authorizations are required: (1) basic Lambda execution, (2) reading from the S3 bucket, and (3) making calls to ES.
The Lambda console provides a simple way to create an IAM role with policies for (1). For (2), when creating the IAM role, choose the "S3 execution role" option; this will load the role with permissions to read from the S3 bucket. For (3), attach the following access policy to the role to permit ES operations.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "es:*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
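Inside the handler, the bucket and key of the newly added log file come out of the S3 event notification. Roughly, assuming the standard S3 event format:

```javascript
// Sketch: extracting the bucket name and object key from an S3 event.
// Keys arrive URL-encoded (spaces as '+'), so they are decoded first.
function objectLocation(event) {
    var record = event.Records[0];
    return {
        bucket: record.s3.bucket.name,
        key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '))
    };
}

var loc = objectLocation({
    Records: [{ s3: { bucket: { name: 'my-log-bucket' },
                      object: { key: 'logs/access.log' } } }]
});
// loc.bucket === 'my-log-bucket', loc.key === 'logs/access.log'
```

The handler would then fetch the object at that location from S3, parse it, and index the resulting documents.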
Set up the Lambda function and the Kinesis stream as described in the Lambda-Kinesis Walkthrough. Please keep in mind the following notes and configuration overrides:
The walkthrough uses the AWS CLI, but it's probably more convenient to use the AWS Console (web UI) for Lambda configuration.
To the IAM role assigned to the Lambda function, add the following access policy to permit ES operations.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "es:*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
For testing: if you have a Kinesis client, use it to put a record on the stream. If not, the AWS CLI can be used to push a JSON document onto the stream.
aws kinesis put-record --stream-name
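On the receiving side, each record in the Kinesis event carries its payload base64-encoded. The sample's core decoding step looks roughly like this sketch, which assumes the standard Kinesis event format:

```javascript
// Sketch: decoding the base64-encoded payloads of a Kinesis event
// back into JSON documents ready for indexing.
function extractDocuments(event) {
    return event.Records.map(function (record) {
        var json = Buffer.from(record.kinesis.data, 'base64').toString('utf8');
        return JSON.parse(json);
    });
}

// Example event with one record whose payload is the document {"id": 1}
var docs = extractDocuments({
    Records: [{ kinesis: { data: Buffer.from('{"id":1}').toString('base64') } }]
});
// docs[0].id === 1
```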
Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
SPDX-License-Identifier: MIT-0