amazon-archives / cloudwatch-logs-subscription-consumer

A specialized Amazon Kinesis stream reader (based on the Amazon Kinesis Connector Library) that can help you deliver data from Amazon CloudWatch Logs to any other system in near real-time using a CloudWatch Logs Subscription Filter.

Can't be used with Amazon Elasticsearch Service #9

Open exidy opened 8 years ago

exidy commented 8 years ago

This connector can't be used in conjunction with the Amazon Elasticsearch service because it requires the ES transport protocol, which Amazon ES doesn't expose. Would it be possible for this connector to use the REST protocol?
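
For reference, an Amazon ES domain only exposes its REST API over HTTPS on the domain endpoint, so a plain HTTP request works while the transport port used by the Java TransportClient does not (the domain endpoint below is just a placeholder):

curl "https://search-<DOMAIN_NAME>-abc123.us-east-1.es.amazonaws.com/_cluster/health?pretty"   # REST API over HTTPS: succeeds
nc -zv search-<DOMAIN_NAME>-abc123.us-east-1.es.amazonaws.com 9300                             # transport port: connection fails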

Jinkxed commented 8 years ago

Would love to see this as well. No point in standing up your own ES clusters anymore :)

dvassallo commented 8 years ago

Hello - We're looking into making this consumer compatible with the Amazon Elasticsearch Service. In the meantime, CloudWatch Logs offers an alternative integration option with the Amazon Elasticsearch Service which you can set up from the AWS Management Console. If you already have an Amazon ES cluster set up, you can simply click on a log group from the CloudWatch Logs section and choose "Actions -> Start Streaming to Amazon Elasticsearch Service":

[Screenshot: a log group selected in the CloudWatch Logs console, with "Start Streaming to Amazon Elasticsearch Service" chosen from the Actions menu]

That solution uses a Lambda function to convert the CWL logs to ES documents and it is implemented very similarly to this consumer application. One other benefit of that setup is that you do not need to run any EC2 instances either. It's a completely serverless setup between CWL and your ES cluster.
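
If you go that route and want to confirm the console wired everything up, you can inspect the resulting subscription filter from the CLI; for example (the log group name is a placeholder):

aws logs describe-subscription-filters --log-group-name "<LOG_GROUP_NAME>"

The destination ARN in the output should point to the Lambda function that the console created for the Elasticsearch integration.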

exidy commented 8 years ago

Unfortunately that integration can't be used in AWS Regions where Lambda isn't available (e.g. Sydney).

dvassallo commented 8 years ago

Yes, unfortunately the feature in the AWS Management Console is only available in AWS Regions where Lambda is currently available. However, it is actually possible to have just the Lambda function in one region (e.g. Tokyo) and have the Amazon ES cluster and the CWL log group in another (e.g. Sydney).

The following solution is definitely a sub-optimal setup experience, but it's a one-time effort and should be quite straightforward:

aws logs put-subscription-filter \
   --log-group-name "<LOG_GROUP_NAME>" \
   --filter-name "LambdaToElasticsearch" \
   --filter-pattern "" \
   --destination-arn "arn:aws:lambda:ap-southeast-2:<AWS_ACCOUNT_ID>:function:<LAMBDA_FUNCTION_NAME>"
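
Note that when the subscription filter is created from the CLI rather than the console, CloudWatch Logs may also need permission to invoke the Lambda function. A sketch of that grant, where the statement ID and the <REGION> placeholders are illustrative values you would substitute:

aws lambda add-permission \
   --function-name "<LAMBDA_FUNCTION_NAME>" \
   --statement-id "CWLInvokeLambda" \
   --action "lambda:InvokeFunction" \
   --principal "logs.<REGION>.amazonaws.com" \
   --source-arn "arn:aws:logs:<REGION>:<AWS_ACCOUNT_ID>:log-group:<LOG_GROUP_NAME>:*" \
   --source-account "<AWS_ACCOUNT_ID>"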

The filter pattern option is important for getting the fields properly indexed in Elasticsearch (unless your log data is already in JSON format). You may want to check the "Getting CloudWatch Logs data indexed in Elasticsearch" section in this project's README.md for more info and example filter patterns for common log formats.
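
As a concrete illustration, if your log events are in the Apache Common Log Format, a space-delimited filter pattern along the lines below (the field names are arbitrary labels chosen for this example) lets the individual fields be extracted and indexed:

aws logs put-subscription-filter \
   --log-group-name "<LOG_GROUP_NAME>" \
   --filter-name "LambdaToElasticsearch" \
   --filter-pattern "[host, ident, authuser, datetime, request, status_code, bytes]" \
   --destination-arn "arn:aws:lambda:ap-southeast-2:<AWS_ACCOUNT_ID>:function:<LAMBDA_FUNCTION_NAME>"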

NareshDealer commented 7 years ago

@DVassallo It's a very interesting alternative. Thank you for sharing this info.

However, I have one tricky situation where I'd like some guidance.

I have a setup where I streamed all log groups to an Amazon Elasticsearch Service cluster using Lambda. Now that we have achieved that part, we want to set up our in-house Elasticsearch cluster (for various reasons), and I want to use the Lambda function to send those CloudWatch logs to our in-house cluster. I thought of modifying that Lambda function to point it to our cluster, but it doesn't seem to like that.

Any guidance or help is really appreciated.

Thanks.

vegardvaage commented 7 years ago

@DVassallo is the above approach still the best alternative for this? I'm trying to solve a cross-account CloudTrail -> centralized AWS ES setup in the most streamlined manner possible.

Nomane commented 6 years ago

Is it also possible to implement this trick (CW Logs to AES) in CloudFormation as well?

Thanks