swift-server / swift-aws-lambda-runtime

Swift implementation of AWS Lambda Runtime
Apache License 2.0

[Draft] Detached tasks #334

Open Buratti opened 1 week ago

Buratti commented 1 week ago

Introduces a system that allows the execution environment to keep running after the response for a synchronous invocation has been submitted.

Note:

This is a draft PR, only meant to gather feedback. The code still requires some polishing and testing. The lines commented with // To discuss: require careful consideration.

Motivation:

It is common for a Lambda function to depend on several other serverless services (SQS, APIGW WebSockets, SNS, EventBridge, X-Ray, just to name a few). On many occasions, a function uses these services for non-critical work such as sending a push notification to the user, storing metrics, or flushing a set of spans. In these scenarios, a non-recoverable error usually doesn't mean that the whole invocation should fail and result in a sad 500. At the same time, waiting for such tasks to finish provides no benefit and drastically increases latency for the API consumers.

This PR implements a hook in the runtime that allows the developer to dispatch code that can continue executing after the invocation of /response and before the environment is frozen by /next. This is a common practice described here.

Modifications:

A new class called DetachedTaskContainer has been added, which follows the design of LambdaTerminator as closely as possible. A LambdaContext now owns an instance of DetachedTaskContainer, which the handler can use to dispatch asynchronous non-critical work that may finish after the invocation of /response and before /next.
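The PR's actual implementation isn't shown in this thread; as a rough sketch of how such a container might work (names and details here are assumptions based on the description above, not the PR's code), using pure Swift concurrency:

```swift
// Hypothetical sketch of a detached-task container. The runtime would call
// awaitAll() after POSTing /response and before requesting /next, so detached
// work finishes before the environment is frozen.
actor DetachedTaskContainer {
    private var tasks: [Task<Void, Never>] = []

    // Register non-critical work; failures are logged, never propagated,
    // so they cannot fail the invocation.
    func detached(name: String, _ body: @escaping @Sendable () async throws -> Void) {
        tasks.append(Task {
            do { try await body() }
            catch { print("detached task '\(name)' failed: \(error)") }
        })
    }

    // Wait for all registered tasks, then clear the list for the next invocation.
    func awaitAll() async {
        for task in tasks { await task.value }
        tasks.removeAll()
    }
}
```

The key design point is that the container only defers the wait: the response is returned immediately, and the runtime drains the container afterwards, before /next freezes the environment.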

Result:

It is now possible to return the result of a synchronous invocation (such as an API Gateway integration, CloudFront origin, Lambda function URL, Cognito Custom Auth, etc.) as soon as it is ready, reducing overall API latency.

func handle(_ buffer: ByteBuffer, context: LambdaContext) -> EventLoopFuture<ByteBuffer?> {
    // EventLoop API
    context.tasks.detached(name: "Example 1") { eventLoop in
        return performSomeWork(on: eventLoop).map { result in
            context.logger.info("\(result) has been provided at \(Date())")
        }
    }
    // Async API
    context.tasks.detached(name: "Example 2") {
        try await Task.sleep(for: .seconds(3))
        context.logger.info("Hello World!")
    }
    let response = encodedAPIGatewayResponse()
    return context.eventLoop.makeSucceededFuture(response)
}
Buratti commented 1 week ago

@sebsto @adam-fowler any feedback is welcome

sebsto commented 1 week ago

I love the idea. I need more time to look at the implementation. FYI, we're planning to rewrite the runtime for a v2: a rewrite to simplify it and better embrace Swift concurrency and service lifecycle. I wonder whether it is pertinent to include this type of functionality now or to wait for the rewrite. @fabianfett wdyt?

Buratti commented 1 week ago

That's great! I would love to contribute to a more Swift Concurrency friendly runtime. As for this specific feature, I wouldn't mind seeing it in v1 as well. I have been testing it in our staging environment and the difference in latency is indeed noticeable. Also, there are no breaking changes in this PR.

sebsto commented 1 week ago

@swift-server-bot test this please

Buratti commented 1 week ago
  1. Actually, that's not correct. AWS Lambda freezes the execution environment when /next is invoked. The time elapsed between /response (or /error) and /next is billed normally. The article from AWS Blogs I linked in the first comment explains both approaches in detail (this one and the one leveraging the Internal Extensions API).

  2. The proposed design tries to stay as closely aligned as possible with the way Swift concurrency works.

    // Pure Swift
    Task.detached {
        try await fireAndForget()
    }

    // Lambda handler
    context.tasks.detached {
        try await fireAndForget()
    }

fabianfett commented 2 days ago

Having thought about this a bit more, I have come to the conclusion that I'm open to this patch in v1. For v2, we should 100% make the concurrency approach structured. A new API proposal will be made shortly.

@Buratti thanks for pointing out that lambda supports background processing.
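To illustrate what "making the concurrency approach structured" could mean in practice (this is purely illustrative; names and shape are assumptions, not the forthcoming proposal): background work would run as child tasks of the invocation, awaited by the runtime before it polls /next, rather than being stored as unstructured detached tasks.

```swift
// Hypothetical sketch: the handler's non-critical work runs inside a task
// group. Child tasks are structured, so they are cancellable and cannot
// outlive the invocation; the group is drained after the response is
// produced and before the runtime would call /next.
func runInvocation(handler: @escaping @Sendable () async throws -> String) async throws -> String {
    try await withThrowingTaskGroup(of: Void.self) { group in
        // Non-critical background work as a child task (e.g. flushing spans).
        group.addTask {
            try await Task.sleep(nanoseconds: 1_000_000)
        }
        // Produce the response without waiting for the background work.
        let response = try await handler()
        // Drain the group: conceptually, this is where the runtime waits
        // after /response and before /next.
        try await group.waitForAll()
        return response
    }
}
```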

fabianfett commented 2 days ago

@Buratti Sorry for not making this explicit. Thanks for pushing on this! Really appreciated. Gave me quite a bit of food for thought! Let's try to land this!

Buratti commented 2 days ago

Hi @fabianfett! Thank you for giving this deeper thought and for the feedback; I'll make sure to address it in the next couple of days. I'm very glad to hear you have reconsidered this feature, and yes, I definitely agree on having a fully structured concurrency version of it for v2!