Closed — knihit closed this issue 3 years ago
@knihit This pattern makes sense and can be added to the backlog; let us know if you're interested in contributing it!
Suggest breaking the pattern into two constructs: (1) aws-events-rule-kinesisstream, which is under active development, and (2) aws-kinesisstream-gluejob.
aws-events-rule-kinesisstream + aws-kinesisstream-gluejob = aws-events-rule-kinesisstream-gluejob
Amazon EventBridge acts as an integration bus within an application, across applications within an enterprise, and with AWS Marketplace vendors. Events arriving through EventBridge may need to be buffered or streamed and then processed through the application infrastructure. A Kinesis data stream can provide that buffering of event messages. The event messages may require transformation depending on their source; Glue ETL jobs can then transform the data and store it in DynamoDB, Redshift, S3, or any other datastore for further processing.
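A rough sketch of how the composed pattern could be wired together in a CDK stack. The class names and package paths below follow the pattern names proposed in this issue and are hypothetical; the actual construct APIs, property names, and published packages may differ.

```typescript
// Hypothetical composition of the two proposed Solutions Constructs.
import { App, Stack, Duration } from 'aws-cdk-lib';
import * as events from 'aws-cdk-lib/aws-events';
// Package names assumed from the proposed pattern names in this issue:
import { EventsRuleToKinesisStream } from '@aws-solutions-constructs/aws-events-rule-kinesisstream';
import { KinesisStreamToGluejob } from '@aws-solutions-constructs/aws-kinesisstream-gluejob';

const app = new App();
const stack = new Stack(app, 'EventsToGlueStack');

// 1. EventBridge rule -> Kinesis data stream (buffers incoming events).
const eventsToStream = new EventsRuleToKinesisStream(stack, 'EventsToStream', {
  eventRuleProps: {
    schedule: events.Schedule.rate(Duration.minutes(5)),
  },
});

// 2. Reuse the same stream as the source for a Glue ETL job that
//    transforms records before landing them in S3/DynamoDB/Redshift.
new KinesisStreamToGluejob(stack, 'StreamToGlue', {
  existingStreamObj: eventsToStream.kinesisStream,
  glueJobProps: {
    // ETL script location, IAM role, worker configuration, etc.
  },
});
```

The key design point is the shared stream object: the first construct exposes the Kinesis stream it creates, and the second consumes it instead of creating its own, which is how aws-events-rule-kinesisstream + aws-kinesisstream-gluejob would compose into the combined pattern.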
Use Case
In my use case, I have streaming data that undergoes machine learning inference (text, image, and video, among others). The ingested data can come from different sources with different schema structures; storing the input data and the machine learning inference results in a normalized/standard structure requires an ETL transformation.
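To make the normalization step concrete, here is a minimal sketch of the kind of mapping the Glue ETL job would perform. The two source schemas, field names, and inference payload shapes are all invented for illustration; the real schemas would come from the actual upstream producers.

```typescript
// One standard record shape that all sources are normalized into.
interface NormalizedRecord {
  sourceId: string;
  mediaType: 'text' | 'image' | 'video';
  payloadUri: string;
  inference: { label: string; confidence: number };
}

// Hypothetical: source A nests the inference result, source B flattens it.
function normalize(raw: Record<string, any>): NormalizedRecord {
  if ('ml_result' in raw) {
    // Source A shape: { source, media_type, s3_uri, ml_result: { label, score } }
    return {
      sourceId: raw.source,
      mediaType: raw.media_type,
      payloadUri: raw.s3_uri,
      inference: { label: raw.ml_result.label, confidence: raw.ml_result.score },
    };
  }
  // Source B shape: { origin, kind, location, label, confidence }
  return {
    sourceId: raw.origin,
    mediaType: raw.kind,
    payloadUri: raw.location,
    inference: { label: raw.label, confidence: raw.confidence },
  };
}

const a = normalize({
  source: 'cam-1', media_type: 'image', s3_uri: 's3://bucket/x.jpg',
  ml_result: { label: 'cat', score: 0.97 },
});
const b = normalize({
  origin: 'feed-2', kind: 'text', location: 's3://bucket/y.txt',
  label: 'positive', confidence: 0.88,
});
console.log(a.inference.label, b.inference.label); // cat positive
```

Once every source maps onto `NormalizedRecord`, the downstream datastore (DynamoDB, Redshift, or S3) only ever sees one schema, regardless of which producer emitted the event.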
Proposed Solution
I can provide more details on this one if required.
Other