micronaut-projects / micronaut-aws

Projects specific to integrating Micronaut and Amazon Web Services (AWS)
Apache License 2.0

Extremely slow cold starts for Micronaut functions deployed on AWS Lambda #18

Closed codependent closed 5 years ago

codependent commented 5 years ago

This issue refers to the following function: https://github.com/codependent/micronaut-aws-authentication-function

It's a basic function that generates a JWT token:

@FunctionBean("micronaut-aws-authentication-function")
class MicronautAwsAuthenticationFunction : Function<Credentials, AuthenticationResponse> {

    private val logger = LoggerFactory.getLogger(javaClass)

    override fun apply(t: Credentials): AuthenticationResponse {
        logger.info("Authenticating {} - {}", t.user, t.password)
        return if (t.user == "jose" && t.password == "MyPassword") {
            //HMAC
            val algorithm = Algorithm.HMAC256("secret")
            val token = JWT.create()
                    .withIssuer("MicronautAwsAuthenticationFunction")
                    .withSubject(t.user)
                    .sign(algorithm)
            logger.info("Generated token {}", token)
            AuthenticationResponse(token)
        } else {
            logger.info("Incorrect user/password")
            AuthenticationResponse()
        }
    }
}
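
For reference, the signature that Algorithm.HMAC256("secret") produces is just an HMAC-SHA256 over the base64url-encoded header and payload. A JDK-only sketch of that signing step (hypothetical class name, not the auth0 library's API):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch of what the JWT library's HMAC256 signing does under the hood.
public final class HmacSketch {

    private HmacSketch() {
    }

    // Computes base64url(HMAC-SHA256(secret, data)) without padding,
    // as used for the third segment of a JWT.
    public static String sign(String data, String secret) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(
                    secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            return Base64.getUrlEncoder().withoutPadding()
                    .encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception e) {
            throw new IllegalStateException("HMAC-SHA256 unavailable", e);
        }
    }
}
```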

Normally (warm behaviour) this AWS Lambda executes really fast (~5 ms):

Duration 2.05 ms Billed duration 100 ms Resources configured 320 MB Max memory used 153 MB

However, the cold start is extremely slow, ~20 s. I was expecting Micronaut-based Java 8 Lambdas to start much faster than normal Java applications, but this shows there's no improvement at all.

The function is invoked directly from the Lambda web console, through the io.micronaut.function.aws.MicronautRequestStreamHandler handler, and has 320 MB RAM assigned. Also, there's no VPC configured:

Duration 19475.13 ms Billed duration 19500 ms Resources configured 320 MB Max memory used 154 MB

I wonder whether there's something wrong with my setup/implementation, or whether this is the expected behaviour and you have seen similar cold start times. If it's the latter, is there a way to improve these horrible cold start times?

graemerocher commented 5 years ago

Hmm, there seems to have been a performance regression in 1.1.x.

Downgrading to 1.0.x results in 600ms cold start times.

Looking into it. Thanks for the report

graemerocher commented 5 years ago

Actually, I take that back: the behaviour is the same in 1.0.4 as in 1.1 RC1. It is interesting because the Groovy function approach executes in 600 ms from a cold start: https://github.com/micronaut-projects/micronaut-examples/tree/master/hello-world-lambda

So it seems to be something to do with the FunctionalApplication approach.

graemerocher commented 5 years ago

Seems this doesn't impact applications created with mn create-app hello-gateway --features aws-api-gateway either, as I get around 1 s cold start times for that. So it must be something specifically wrong with MicronautRequestStreamHandler, though what that is I haven't been able to establish yet.

graemerocher commented 5 years ago

So I still haven't gotten to the bottom of this. It appears to be something specific to using io.micronaut.function.aws.MicronautRequestStreamHandler that makes the whole function slower for some bizarre reason. The logs in CloudWatch show that the execution is just generally slower.

If I alter the function to instead subclass FunctionInitializer and change the deploy task:

    task deploy(type: jp.classmethod.aws.gradle.lambda.AWSLambdaMigrateFunctionTask, dependsOn: shadowJar) {
        functionName = "hello-lambda"
//        handler = "io.micronaut.function.aws.MicronautRequestStreamHandler"
        handler = "hello.lambda.HelloFunction::execute"
    }

And with a function such as the one below:


import io.micronaut.function.executor.FunctionInitializer;
import javax.inject.Inject;
import javax.inject.Singleton;

@Singleton
public class HelloFunction extends FunctionInitializer {

    @Inject GreetingService greetingService;

    public Hello execute(Hello msg) {
        return greetingService.hello(msg);
    }
}

This results in the following cold start times, even with only 192 MB of memory assigned:

Duration 247.03 ms Billed duration 300 ms Resources configured 192 MB Max memory used 144 MB

Switching back to MicronautRequestStreamHandler makes the cold start time jump to 20 s again, even though there is no difference in the code paths. For new applications created with create-function I may well change to the function initializer approach until this mystery is solved.

The weird thing is that the API Gateway support also uses a subclass of RequestStreamHandler and doesn't suffer from the cold start issues.

codependent commented 5 years ago

Wow, this is REALLY weird. I've had a quick look at this and seen that the two places where most of the time is spent are always:

For a 17s cold start:

return applicationContext
        .start()
        .getEnvironment();

return decoder.decode(arg, input);

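
To narrow down which of those two calls dominates, a rough timing wrapper could bracket each one. This is a hypothetical helper (the class and method names are mine, not part of Micronaut):

```java
import java.util.function.Supplier;

// Hypothetical helper to measure each suspect phase of the cold start.
public final class ColdStartTimer {

    private ColdStartTimer() {
    }

    // Runs the given work, logs how long it took, and returns its result.
    public static <T> T timed(String label, Supplier<T> work) {
        long start = System.nanoTime();
        T result = work.get();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(label + " took " + elapsedMs + " ms");
        return result;
    }
}
```

Wrapping the two calls, e.g. ColdStartTimer.timed("context start", () -> applicationContext.start()), would show in CloudWatch which phase eats the ~17 s.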
graemerocher commented 5 years ago

Yeah, it is really bizarre: if you enable logging, the function is just slower, with no indication why, and the only difference in the code paths I can see is that one uses MicronautRequestStreamHandler.

sapessi commented 5 years ago

Hi @codependent, is your function configured to connect to a VPC? I suspect a cold start that slow could be caused by the well-documented VPC cold start issues. The good news is that a new feature that eliminates this issue is rolling out as I write this comment - see announcement here. However, if you do not need your function to operate within a VPC - i.e. you don't need to access VPC resources such as a relational database or an ElastiCache instance - I would recommend you avoid configuring a VPC on your function at all.

msmsimondean commented 5 years ago

@sapessi I see the same slow start times when running a Micronaut Lambda function locally on SAM (https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-reference.html#serverless-sam-cli)

sapessi commented 5 years ago

@msmsimondean is it also in the tens of seconds range?

Looking through the code the only potential issue I could find is in the initialization strategy. My recommendation (for the Micronaut team) is to start the app, context – anything you know you’ll need for each invocation – at the same time as Lambda constructs your handler. Either use the class constructor or a static block and make sure the executor is created in the constructor of the handler. Lambda allows the sandbox a burst of CPU access during initialization. Initialization is the phase when the Lambda runtime loads your handler class and calls your constructor to initialize the handler singleton. After the initialization phase, when Lambda calls the handler, the CPU is throttled to an amount proportional to the memory allocated to the function. This can make a huge difference in cold start times.

Be mindful that the initialization time is capped at 10 seconds. If your constructor takes longer than 10 seconds to return, Lambda will assume that the sandbox is dead.
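
The pattern being recommended here can be sketched in plain Java. This is a hedged illustration only (the class and method names are hypothetical, and the real AWS SDK handler types are omitted to keep it self-contained):

```java
// Sketch of eager initialization: do all expensive setup in the
// constructor, which Lambda runs during the init phase with burst CPU,
// instead of lazily on the first invocation.
public class EagerHandler {

    private final String expensiveResource;

    public EagerHandler() {
        // Runs once, during Lambda initialization (burst CPU, 10 s cap).
        this.expensiveResource = buildExpensiveResource();
    }

    private static String buildExpensiveResource() {
        // Stand-in for starting the ApplicationContext, warming caches, etc.
        return "context-ready";
    }

    // Called per invocation; here the CPU is throttled in proportion
    // to the memory configured on the function.
    public String handleRequest(String input) {
        return expensiveResource + ":" + input;
    }
}
```

In a real function the constructor work would be the ApplicationContext startup, so that every invocation, including the first, only pays the cost of handleRequest itself.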

blazejkarmelita commented 3 years ago

> Yeah it is really bizarre, if you enable logging the function is just slower, with no indication why and the only difference in the code paths I can see is one uses MicronautRequestStreamHandler

Hi. Have you managed to find out what is going on with this one? I did a test with the Micronaut Launcher demo for a serverless function with the latest Micronaut. When I refactor the generated Book handler to extend MicronautRequestStreamHandler and introduce @FunctionBean, the Lambda execution takes about 8 s while the original one takes 100 ms (both including cold start).

vilunet commented 3 years ago

I have the same behaviour - can't believe Micronaut is not as great with Lambda as it claims.

surbhi65 commented 2 years ago

How is this issue closed when it still exists?

msailes commented 2 years ago

@surbhi65 maybe you could make a new issue with the problem you're having and details of what you're doing.