aws-samples / amazon-connect-data-analytics-sample


Configuration to account for existing Kinesis Stream/Firehose and S3 buckets #7


trkelly320 commented 1 year ago

This is less an issue and more of a feature request. Could the package be designed to detect and prompt for existing Kinesis Data Stream/Firehose resources and S3 paths for CTRs and Agent Events? We already have solutions built on those resources, and it would be great if this package could be configured to reuse them as part of the deployment.

angieyu commented 11 months ago

Hi, you can make the change in 2 different ways.

1) Through the CDK code. The solution is designed to handle any customizations you want to develop.

Start by importing the existing AE Kinesis stream. In the file https://github.com/aws-samples/amazon-connect-data-analytics-sample/tree/main/cdk-stacks/lib/agent-events/ae-stack.ts, on line 48, instead of creating a new Kinesis stream, you would import an existing one with the static method fromStreamArn(scope, id, streamArn) (see https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_kinesis.Stream.html).
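For reference, a minimal sketch of what that substitution could look like is below. The construct IDs, variable names, and the stream ARN are placeholders rather than the identifiers actually used in the sample, and the surrounding stack code in ae-stack.ts is heavily abridged:

```typescript
import * as cdk from 'aws-cdk-lib';
import * as kinesis from 'aws-cdk-lib/aws-kinesis';
import { Construct } from 'constructs';

export class AeStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // Instead of creating a new stream, e.g.:
    //   const aeKinesisStream = new kinesis.Stream(this, 'AEKinesisStream', { ... });
    // import the existing one by its ARN:
    const aeKinesisStream = kinesis.Stream.fromStreamArn(
      this,
      'ImportedAEKinesisStream',
      'arn:aws:kinesis:us-east-1:111122223333:stream/my-existing-agent-events-stream'
    );

    // Pass aeKinesisStream to the downstream constructs (e.g. the Firehose
    // delivery stream) wherever the original new Stream(...) instance was used.
  }
}
```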

If you make the change, you can create a pull request and we will review your work.

2) Manually through the console

Deploy the Data Analytics Sample project as is. In the console, go to Kinesis Data Streams and select the existing stream you want to use. Create a consumer that is a Firehose delivery stream. In the wizard, copy over the same configuration as DataAnalyticsSample-AEKinesisFirehose.
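If you would rather script the equivalent of those console steps, a sketch using the AWS SDK for JavaScript (v3) might look like the following. This is not part of the sample; every name, ARN, and buffering value here is a placeholder, and the real settings to copy should be taken from the deployed DataAnalyticsSample-AEKinesisFirehose resource:

```typescript
import {
  FirehoseClient,
  CreateDeliveryStreamCommand,
} from "@aws-sdk/client-firehose";

// Creates a Firehose delivery stream that consumes from an existing Kinesis
// data stream and writes to an existing S3 bucket. All ARNs are placeholders.
async function createAeFirehose(): Promise<void> {
  const client = new FirehoseClient({ region: "us-east-1" });

  await client.send(
    new CreateDeliveryStreamCommand({
      DeliveryStreamName: "MyExistingStream-AEKinesisFirehose",
      DeliveryStreamType: "KinesisStreamAsSource",
      KinesisStreamSourceConfiguration: {
        // The existing Agent Events stream you want to consume from
        KinesisStreamARN:
          "arn:aws:kinesis:us-east-1:111122223333:stream/my-existing-agent-events-stream",
        RoleARN: "arn:aws:iam::111122223333:role/my-firehose-source-role",
      },
      ExtendedS3DestinationConfiguration: {
        RoleARN: "arn:aws:iam::111122223333:role/my-firehose-delivery-role",
        BucketARN: "arn:aws:s3:::my-existing-analytics-bucket",
        Prefix: "agent-events/",
        BufferingHints: { IntervalInSeconds: 60, SizeInMBs: 64 },
        CompressionFormat: "GZIP",
      },
    })
  );
}
```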

Aikleong7 commented 10 months ago

Hi angieyu, I'm trying to do step 1 but I don't understand how to make the change; please enlighten me. Moving on to step 2: do you have an example or a video showing how to do that as well? What does "create a consumer that is a Firehose delivery" mean?

air720boarder commented 1 month ago

Bumping this. Would love a super easy option here to use existing streams and S3 buckets.