Closed: mcollina closed this issue 2 years ago
interesting... so we could just call `await fastify.ready()` outside of the handler function (user land) and it should work?
This will probably be a breaking change... I don't know if it will be possible to have esm and commonjs compatibility with the same source...
Are you already working on a POC?
Not yet, I just saw this today. However, it's a great improvement that matches the Fastify philosophy.
If I've enough time next week, I'll first migrate all our locize and localistars lambdas to ESM.
Afterwards, I'll try to do some investigations with calling .ready()
outside of the lambda handler...
I did a first very simple local test, without any changes to aws-lambda-fastify: https://gist.github.com/adrai/39fa41bda8249f0645c6087efdc5c789
what | without .ready() | with .ready() | difference |
---|---|---|---|
import 1) | 205ms | 1087ms | +430% |
handler 1) | 910ms | 57ms | -94% |
import 2) | 2ms | 2ms | ±0% |
handler 2) | 1ms | 1ms | ±0% |
handler 3) | 0ms | 0ms | ±0% |
total | 1119ms | 1147ms | +2% |
phase | without .ready() | with .ready() | difference |
---|---|---|---|
Init Duration | 205ms | 1087ms | +430% |
Duration | 913ms | 60ms | -94% |
total | 1119ms | 1147ms | +2% |
invocation | without .ready() | with .ready() | difference |
---|---|---|---|
cold | 1119ms | 1147ms | +2% |
cold (with provisioned concurrency) | 914ms | 60ms | -94% |
warm | 3ms | 3ms | ±0% |
So for cold starts using provisioned concurrency there would be a performance optimization of about 882ms (910ms - 57ms), approx. 80% of the total cold start. If not using provisioned concurrency, there would probably be no real difference (+2%).
So if I understand this correctly, we do not need to make any changes in aws-lambda-fastify.
If someone using provisioned concurrency wants to optimize it, the `.ready()` function can be called outside of the handler: https://gist.github.com/adrai/39fa41bda8249f0645c6087efdc5c789#file-lambda-js-L4
This is a great blog post BTW ^^.
I think we should only update the documentation.
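For reference, a minimal ESM sketch of that layout (adapted from the linked gist; the file and import names are illustrative, and this assumes a Node.js Lambda runtime with ESM/top-level-await support):

```javascript
// lambda.mjs: instantiate and warm up the app during the Init phase,
// not inside the handler
import awsLambdaFastify from '@fastify/aws-lambda';
import { app } from './app.js'; // your fastify instance

const proxy = awsLambdaFastify(app);
await app.ready(); // top-level await: paid once at init, not per invocation

export const handler = proxy;
```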
Will test this first on our production environments... and then, I'll update the readme...
On a project I'm working on I ran into problems using ES modules due to our tooling not supporting it yet (jest and ts-node specifically), and found that it's possible to make a CommonJS module look like an ES module to the Lambda runtime so you can do `await app.ready()` entirely at initialization time in a CommonJS module. The trick is to make the CommonJS module export a Promise that resolves to the exports (e.g. `{ handler }`) like so:
```js
const { app } = require('./app.js');
const awsLambdaFastify = require('@fastify/aws-lambda');

const main = async () => {
  const handler = awsLambdaFastify(app);
  await app.ready();
  return { handler };
};

// note we aren't exporting main here, but rather the result
// of calling main() which is a promise resolving to { handler }:
module.exports = main();
```
Posting this here in case it proves useful to anyone else.
I've modified the above for concurrent calls with async initialisation, again, posting for anyone else who might find it useful:
```js
const app = require("./app");
const awsLambdaFastify = require("@fastify/aws-lambda");

let lambdaHandler;
let isInitializing = false;

const main = async () => {
  if (!lambdaHandler && !isInitializing) {
    isInitializing = true;
    lambdaHandler = awsLambdaFastify(app);
    await app.ready();
    isInitializing = false;
  }
  // If initialization is in progress, wait until it's complete
  while (isInitializing) {
    await new Promise(resolve => setTimeout(resolve, 10));
  }
  return lambdaHandler;
};

module.exports.handler = async (event, context) => {
  const handler = await main();
  return handler(event, context);
};
```
### Prerequisites

### 🚀 Feature Proposal

Check out https://aws.amazon.com/blogs/compute/using-node-js-es-modules-and-top-level-await-in-aws-lambda/. By separating instantiation from execution we can significantly reduce cold starts.

### Motivation

_No response_

### Example

_No response_