Closed Toyo-Daichi closed 1 year ago
Hi @Toyo-Daichi, thank you for reaching out :)
I understand that your function needs to run for almost 15 minutes every time, and this becomes a problem with `num >= 5` because the overall execution time for the entire workflow is limited to 15 minutes too.
Have you considered using the input parameter `"parallelInvocation": true`? That way, all your functions will be invoked in parallel, and each of them can run for as long as the overall workflow.
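For reference, the execution input would look something like this (the ARN and power values below are just placeholders for your own setup):

```json
{
  "lambdaARN": "arn:aws:lambda:us-east-1:123456789012:function:my-etl-function",
  "powerValues": [128, 256, 512, 1024, 1536, 3008],
  "num": 5,
  "payload": {},
  "parallelInvocation": true
}
```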
If the functions are operating on the same ETL data (input or output), I would suggest modifying the business logic to write into a "random" location to avoid conflicts between different invocations.
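As a sketch of what I mean by a "random" location (the key prefix and naming scheme here are just illustrative assumptions, not part of the tool):

```python
import uuid

def build_output_key(prefix: str) -> str:
    """Build a unique S3 key so concurrent tuning invocations don't clash."""
    # uuid4 gives each parallel invocation its own output location
    return f"{prefix}/output-{uuid.uuid4()}.csv"

# Each parallel invocation writes somewhere unique
key = build_output_key("power-tuning-runs")
```

Each invocation then reads/writes its own object instead of contending for a shared one.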
Would that work for your use case?
@alexcasalboni
Thank you very much for your prompt response, and I apologize for not double-checking the input parameters. I successfully executed with the `"parallelInvocation": true` setting and measured the optimal memory.
@Toyo-Daichi super 🚀 I'm glad we solved the problem quickly 😄
Out of curiosity, what was the optimal memory? Could you share the visualization URL with me?
@alexcasalboni Thanks for your interest! The result is this URL.
Thanks to your application, I verified that once we allocate memory above a certain threshold, performance does not change much; below that threshold, the job failed.
Thank you for sharing, @Toyo-Daichi 🙏 That makes sense!
Hello. Our team used your application with the following settings.
I have a question or feature request (I checked past issues, but I apologize if I missed a similar one). Our Lambda expects an execution time of about 15 minutes per run (it's an ETL job). I understand that this application also implements a Lambda function (`serverlessrepo-aws-lambda-power-tuning-initializer-XXXX`) that handles the number of invocations set by the `num` argument. Since the minimum value for `num` is 5, each Lambda execution must therefore finish within 3 minutes. What's the best solution for memory tuning when one Lambda execution lasts just under 15 minutes? If this is difficult today, I would appreciate it if you could consider it for a future update.