aws-amplify / amplify-cli

The AWS Amplify CLI is a toolchain for simplifying serverless web and mobile development.
Apache License 2.0
2.81k stars · 821 forks

Support for AWS SQS Queues and Triggers #3326

Open luislinietsky opened 4 years ago

luislinietsky commented 4 years ago

Note: If your feature-request is regarding the AWS Amplify Console service, please log it in the official AWS Amplify Console forum

Is your feature request related to a problem? Please describe. Include support for AWS SQS queues

Describe the solution you'd like It would be great if Amplify Cli would support creating AWS SQS queues as resources, assigning a Lambda trigger from existing (or new) functions and providing permissions to other Lambdas to send messages to such queue.

Describe alternatives you've considered So far I am using amplify to create the lambda that will work as the trigger and I have to create, manually and after pushing the environment, the queue and the trigger to such lambda. I also need to provide manually the permissions to the lambdas that are going to be sending messages to such queue.

Additional context This is great for building more complex solutions and architectures using Amplify.

undefobj commented 4 years ago

@luislinietsky what is the application use case you're trying to build that would require an SQS queue? You state "more complex solutions" but what is the use case in an application that you are trying to solve rather than the implementation? Thank you.

luislinietsky commented 4 years ago

I need the ability to create a worker queue. The main idea behind work queues (aka task queues) is to avoid doing a resource-intensive task immediately and having to wait for it to complete, in this case from an HTTP endpoint. Instead, I schedule the task to be done later: I encapsulate the task as a message and send it to the queue, so the Lambda function answering the HTTP request can return ASAP. Then a worker process (another Lambda function) runs in the background, pops the task from the queue, and executes the job.
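As a rough sketch of this producer/worker split, assuming boto3 and an SQS-triggered worker Lambda (the queue URL and task fields below are hypothetical placeholders, not part of any Amplify API):

```python
import json

# Hypothetical queue URL -- substitute your own queue's URL.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/worker-queue"

def build_task_message(task_type, payload):
    """Serialize a task as a JSON message body for the queue."""
    return json.dumps({"task": task_type, "payload": payload})

def enqueue_task(task_type, payload, queue_url=QUEUE_URL):
    """Send the task to SQS so the HTTP handler can return immediately."""
    import boto3  # imported lazily; actually sending requires AWS credentials
    sqs = boto3.client("sqs")
    return sqs.send_message(QueueUrl=queue_url,
                            MessageBody=build_task_message(task_type, payload))

def worker_handler(event, context):
    """Worker Lambda: SQS delivers records in batches to the event source."""
    for record in event["Records"]:
        task = json.loads(record["body"])
        # ... do the resource-intensive work here ...
        print(f"processing {task['task']}")
```

The HTTP-facing Lambda calls `enqueue_task(...)` and returns; SQS invokes `worker_handler` separately once the event source mapping is in place.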

undefobj commented 4 years ago

@luislinietsky Thanks for the reply. "worker queue" is an implementation though. Can you let us know what use cases in your application you have? Amplify is a category based system that customers program against, for instance we have an Analytics category that has several implementations (Pinpoint, Kinesis, Personalize) and similar setups for Predictions, API, etc. Understanding the use case for the application scenario you're trying to address will let us look into potential implementations in the future, which could be completely different than SQS alone.

luislinietsky commented 4 years ago

Oh, you needed a sample of the use case of the application? Sure, there are a few:

1) I am building a mobile app for requesting microcredits. At some point, I have an integration with an external provider that grants me access to the customer's history of bank transactions, after proper authorization, so that the customer can preview them from within the app. At the same time, I need to feed those transactions into another system that runs a machine learning model to predict customer behaviour and offer different products in the future. This is a completely detached process that shouldn't have an impact on the performance of the initial use case, which is just letting the user see their bank account summary.

2) The app I mentioned is used to request microcredits, i.e. loans. At some point, I need to both send and recover the funds of the loans. For that purpose, I have a DynamoDB table where I store all the transactions I need to run in the future (when, to whom, how much, status, etc.). I will have a cron job (I was going to request another feature for scheduled events on Lambda functions) that pulls all the transactions that need to be processed by an ACH provider within a window of time. There could be many of them, making it impossible to run them in a single Lambda execution, due to time limits and to prevent throttling by the ACH provider. So for this scenario, once I collect the definitions of all the ACH transactions I need to run, which is just a list of objects from DynamoDB, I want to schedule them as tasks in a queue so that each Lambda execution processes a single ACH transaction. This way I don't have to worry about Lambda's time limit or getting throttled by the ACH provider's API.

3) The same applies to the previous scenario when I need to build a polling process that retrieves the status of any open ACH transaction. They can take up to 72 hours to complete, and I need to check the status. Based on a cron process, I will collect all the tasks that need to run independently, so that each task individually checks the status of a single transaction, taking different actions depending on the status of a given ACH transaction.

So basically, in one scenario I need to run a slow process in the background, and in the other two I need to split a large task into many smaller tasks.

I have done similar things on AWS using Lambdas and SQS, while building my own CloudFormation templates for the entire solution.
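A hedged sketch of the fan-out described in scenarios 2 and 3: the cron Lambda collects the pending items and enqueues each one as its own SQS message, batched in tens because `send_message_batch` accepts at most 10 entries per call. The queue URL is a placeholder and this assumes boto3:

```python
import json

def batch_entries(items, batch_size=10):
    """Yield SQS batch-entry lists; send_message_batch takes at most 10 entries."""
    for start in range(0, len(items), batch_size):
        chunk = items[start:start + batch_size]
        yield [{"Id": str(start + i), "MessageBody": json.dumps(item)}
               for i, item in enumerate(chunk)]

def fan_out(items, queue_url):
    """Enqueue each pending transaction so each Lambda run handles one task."""
    import boto3  # imported lazily; actually sending requires AWS credentials
    sqs = boto3.client("sqs")
    for entries in batch_entries(items):
        sqs.send_message_batch(QueueUrl=queue_url, Entries=entries)
```

Each message then triggers an independent Lambda execution, sidestepping the single-invocation time limit.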

luislinietsky commented 4 years ago

I was also going to request a feature to create a scheduled event trigger for Lambda functions from the Amplify CLI, for example, after creating a Lambda: "Run this Lambda every 5 minutes". Should I create another issue for this, or can it be included in this one? I haven't found any other open issues for this. Regards, Luis.

kaustavghosh06 commented 4 years ago

@luislinietsky I believe there is an open feature request for this - #1789 and we're tracking it in our backlog.

magjack commented 4 years ago

Is there any update on the prospect of sqs support for amplify cli ?

lookea commented 4 years ago

I am surprised that a software developer asks whether there is a use case for a queue; if you don't know, ask AWS why they offer that "useless" thing. Basically, it's for anything outside the scope of a single microservice: anything that requires more than a couple of simple queries to aggregate data, building a report, doing a monthly balance, processing a video file, or having several applications notified about a change in the data without needing all of them to be online at the same time.

undefobj commented 4 years ago

@lookea We base roadmap priorities off of customer feedback in channels such as GitHub, Discord, etc. Part of operating in an open source manner is getting requirements on functional use cases which helps us design the features correctly. Amplify is based on use cases that help mobile and web developers build apps, not arbitrary service implementations. This is why we ask questions like the above so that we can work backwards from the use case and address it holistically.

foranuj commented 4 years ago

Another use case would be a photo upload application

Users upload a photo from a mobile app to a central server and that image gets stored in S3. While the upload is ongoing, one would like to run some Rekognition APIs on that image and store the response. However, doing this in the live path would be too slow and the user would not get the response fast enough.

Putting a reference to the image in SQS would make sense at that point, and there'd be a trigger that runs a consumer which performs the Rekognition steps and saves the response to DynamoDB for future retrieval.
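One possible shape for that consumer, assuming it's the S3 event notification that lands on the queue (the table name, label count, and wiring below are hypothetical; running it requires boto3 and AWS credentials):

```python
import json

def extract_s3_objects(message_body):
    """Pull (bucket, key) pairs out of an S3 event notification body."""
    event = json.loads(message_body)
    return [(r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
            for r in event.get("Records", [])]

def handler(event, context):
    """SQS-triggered consumer: label each image, persist the result."""
    import boto3  # imported lazily; the handler only runs inside Lambda
    rekognition = boto3.client("rekognition")
    table = boto3.resource("dynamodb").Table("ImageLabels")  # hypothetical table
    for record in event["Records"]:  # SQS records
        for bucket, key in extract_s3_objects(record["body"]):
            labels = rekognition.detect_labels(
                Image={"S3Object": {"Bucket": bucket, "Name": key}},
                MaxLabels=10)
            table.put_item(Item={
                "imageKey": key,
                "labels": [l["Name"] for l in labels["Labels"]],
            })
```

The upload path stays fast because labeling happens entirely off the request path.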

kristianmandrup commented 3 years ago

This was "sort of" made available when Kinesis stream support was being added, but SQS as a target option was removed (no existing category). See https://github.com/aws-amplify/amplify-cli/pull/2463#issuecomment-538174966

SourceCode commented 3 years ago

Our use case/need is a scheduled task + queue worker pattern. We have other systems publishing to SNS, alerting one or more SQS queues that an event has occurred or something has completed. The user is generally in the Amplify app, with heavier work happening outside of it in other systems. These tasks may contain data that updates their requests/content/state/etc.

I know that this could be scripted through CloudFormation, creating the queue and subscribing it to topics, as well as creating the Lambda and its invocation schedule, but it would be incredibly awesome to have this as a command or plugin that walks users through the process.

There may be a better way of accomplishing this (please provide insight), but this is the common way to safely decouple messages between systems and to scale it / add more listeners.
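For reference, the manual wiring described above can be scripted roughly like this with boto3 (queue and topic names are placeholders; the queue policy is the standard statement allowing the SNS topic to deliver into the queue):

```python
import json

def queue_policy(queue_arn, topic_arn):
    """Queue access policy letting one SNS topic deliver to the queue."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    })

def wire_topic_to_queue(topic_arn, queue_name):
    """Create the queue, allow the topic to send to it, and subscribe it."""
    import boto3  # imported lazily; running this requires AWS credentials
    sqs, sns = boto3.client("sqs"), boto3.client("sns")
    queue_url = sqs.create_queue(QueueName=queue_name)["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url,
        AttributeNames=["QueueArn"])["Attributes"]["QueueArn"]
    sqs.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={"Policy": queue_policy(queue_arn, topic_arn)})
    sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
    return queue_url
```

An Amplify command or plugin would presumably emit the CloudFormation equivalent of these calls rather than running them imperatively.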

r0zar commented 3 years ago

IMO the "Amplify way" to think about queues, would be as "triggers" to/from Lambdas/functions.

This would be similar to how Lambdas/functions can be invoked by Cloudwatch events.

hogarthww-labs commented 3 years ago

You can see SQS category/service removed here

Wish it would be added back in

hogarthww-labs commented 3 years ago

I've created the following amplify plugins which could serve as a good starting point

Both of these plugins are based on amplify-category-template

I've also created the following NodeJS utilities pack for working with SQS

Note: The above mentioned projects are all "untested" so I recommend using npm link to install locally in your project and then polish them off to suit your needs.

Note that custom Amplify resources can be found under the CloudFormation service in the AWS Console.

Happy days ;)

fkunecke commented 3 years ago

@hogarthww-labs thanks for putting that together. Using your template files combined with https://medium.com/@navvabian/how-to-add-an-sqs-queue-to-your-amplify-cli-bootstrapped-project-cb7781c636ed, I was able to piece together an SQS + Lambda trigger to fire off some machine-learning/computationally heavy tasks from a queue. Would love to see this integrated completely into Amplify, but definitely understand these things take time.

hogarthww-labs commented 3 years ago

Great to hear :) Do you have a repo with a working example you can share?

hogarthww-labs commented 3 years ago

Just added a category for SQS consumer/publisher subscribing to SNS topic. The consumer SQS triggers a lambda.

https://github.com/hogarthww-labs/amplify-category-sns-sqs-lambda

jonmifsud commented 3 years ago

If it helps to add to the list :)

We consume a lot of webhook data. In some instances webhook data is provided 1-1, which we are happy to process right away.

However, sometimes we may get a webhook with hundreds of data points to handle. While not too heavy individually, this is multi-tenant user data that requires reaching out to external APIs (for each instance) to get refreshed/updated data.

Needless to say, given API limits and whatnot, the webhook is not able to respond within 25s unless we somehow queue and process these separately.

ababushkin commented 3 years ago

Perhaps the category of "SQS" doesn't make sense given the use-cases mentioned, and something more like: job instead?

amplify add job

Which prompts you to:

Later on, you can:

amplify update function and give access to the queue so you can put jobs inside it

I would personally use this together with the api category, similar to how API Gateway lets us send messages to SQS when an API is invoked.

Some use-cases:

Naturally, some might say you can do this with events, but that is problematic because you still need to handle the underlying retry mechanisms, so a queue is almost always needed to make sure these things are done reliably.
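On the retry point: with SQS the retry behavior is declarative via a redrive policy, which moves a message to a dead-letter queue after a set number of failed receives instead of retrying forever. A minimal boto3 sketch (queue URL and DLQ ARN are placeholders):

```python
import json

def redrive_policy(dlq_arn, max_receives=5):
    """RedrivePolicy JSON: after `max_receives` failed receives, SQS moves
    the message to the dead-letter queue instead of retrying indefinitely."""
    return json.dumps({"deadLetterTargetArn": dlq_arn,
                       "maxReceiveCount": max_receives})

def attach_dlq(queue_url, dlq_arn):
    """Attach the redrive policy to an existing queue."""
    import boto3  # imported lazily; running this requires AWS credentials
    sqs = boto3.client("sqs")
    sqs.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={"RedrivePolicy": redrive_policy(dlq_arn)})
```

Messages that keep failing then accumulate in the DLQ for inspection or replay, rather than being silently dropped.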

rafaelfaria commented 2 years ago

Any updates on this subject? Would love to see some built-in support for SNS and SQS in AmplifyJS

BBopanna commented 2 years ago

Any updates on timelines for when this will be available?

ericmarcos commented 1 year ago

I guess a "good enough" solution for many use cases described in this thread is to just invoke a Lambda asynchronously from the http endpoint Lambda:

import json
import boto3

def lambda_handler(event, context):
    # Parse the JSON body from the API Gateway POST request
    body = json.loads(event['body'])

    # Set the name of the Lambda function to invoke
    function_name = 'my-function-name'

    # Set the payload to pass to the Lambda function
    payload = {
        'key': body['key'],
        'value': body['value']
    }

    # Invoke the Lambda function asynchronously
    client = boto3.client('lambda')
    response = client.invoke(
        FunctionName=function_name,
        InvocationType='Event', # Set this to 'RequestResponse' for a synchronous invocation
        Payload=json.dumps(payload)
    )

    # Return a successful response to the API Gateway
    return {
        'statusCode': 200,
        'body': json.dumps({'message': 'Lambda function invoked successfully'})
    }

Am I missing something?