Describe the bug
Vercel puts serverless functions to sleep after an unknown idle period. The first request after sleep triggers a cold start of 2-6 seconds, which is not an acceptable response time for an API.
When is this a problem?
Currently our API has very little traffic outside of our frontend, so most older releases will show a cold start on the first request. See for yourself by picking an old version, e.g. https://v2.0.4.api.carbonmark.com/
Each API release is deployed as a single serverless function. Consequently, as releases age and users migrate to newer versions, the probability increases that an API consumer will hit a cold start on an older release.
Solutions
See guide here: https://vercel.com/guides/how-can-i-improve-serverless-function-lambda-cold-start-performance-on-vercel
1. Reduce the size of the JS bundle.
Currently the Vercel dashboard reports a 4.3 MB function size and 1 GB of memory used.
In the deployment summary, it appears that we are uploading the .generated types files as static assets. I don't think this is right: types should be excluded by tsc at compile time. We should be able to deploy only the compiled .js dist files...
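As a sketch of what that might look like (assuming a conventional tsconfig.json at the repo root; our actual compiler options may differ), tsc can be told to emit plain JavaScript into dist/ with no declaration output, so only compiled files end up in the deployment:

```json
{
  "compilerOptions": {
    "outDir": "dist",
    "declaration": false,
    "sourceMap": false
  },
  "exclude": ["**/*.test.ts"]
}
```

If the generated types live under a .generated directory, pairing this with a .vercelignore entry for that directory would keep those files out of the upload entirely.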
2. Reduce the number of public releases
The idea here is to reduce the impact of cold starts on end consumers by concentrating traffic on a smaller number of releases. This is not especially relevant right now, since traffic is low, but it is worth considering.
In summary, we can switch to canary releases for every API change, instead of cutting public major/minor/patch versions. Our applications would consume the canary releases, and we could then schedule monthly or ad-hoc public releases for end consumers. Example:

| Day | Version | Note |
| --- | --- | --- |
| Monday | v6.0.0 | Public release |
| Tuesday | v7.0.0-beta.0.0.1 | Canary patch change, consumed internally or by early adopters |
| Wednesday | v7.0.0-beta.0.1.0 | Canary minor change |
| Thursday | v7.0.0-beta.1.0.0 | Canary major change |
| Friday | null | No Friday releases :P |
| Monday | v7.0.0 | Stable public release of all changes |
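The scheme hinges on semver prerelease suffixes: anything carrying a `-beta` style tag is a canary build, and bare major.minor.patch tags are public. A minimal TypeScript sketch (the function name and channel labels are illustrative, not part of our codebase):

```typescript
// Classify a release tag: a prerelease suffix (e.g. "-beta.0.1.0") marks a
// canary build; a bare major.minor.patch tag is a stable public release.
type Channel = "canary" | "stable";

function releaseChannel(version: string): Channel {
  // Strip an optional leading "v", then look for a "-prerelease" suffix.
  const normalized = version.replace(/^v/, "");
  return normalized.includes("-") ? "canary" : "stable";
}
```

Our frontend could pin to the latest canary, while published docs only ever point consumers at stable tags.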
3. Use UptimeRobot or another cron job to keep the function alive
Self explanatory. Increases hosting costs. Somewhat hacky.
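If we stay on Vercel, one stays-in-the-dashboard variant is Vercel's own cron feature, since a scheduled invocation doubles as a keep-alive ping. A minimal vercel.json sketch (the /api/keep-alive path is hypothetical, and schedule granularity depends on the plan):

```json
{
  "crons": [
    {
      "path": "/api/keep-alive",
      "schedule": "*/10 * * * *"
    }
  ]
}
```

Note this only warms the one deployment it targets, so it doesn't help the long tail of old versioned releases described above.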
4. Ditch serverless or find a new host.
As a Vercel fanboy I like having everything in one dashboard. But some day we will need to Grow Up and choose a more cost effective way to host this API...