Open algoflows opened 1 year ago
We do have a Lambda layer you can use!
https://github.com/oven-sh/bun/tree/main/packages/bun-lambda
We need to document it though, so will convert this issue into a docs request.
Then how about adding Lambda Response streaming?
Seems that the official package isn't working correctly on macOS
Ventura 13.4. Whether I try to publish or build the layer, I get the same error:
**Publish**
```
eric@Erics-MacBook-Pro bun-lambda $ bun run publish-layer --arch x64 --layer 'Bun v0.6.9 Lambda Test' --region eu-west-1 --release '0.6.9'
$ bun scripts/publish-layer.ts --arch x64 --layer Bun v0.6.9 Lambda Test --region eu-west-1 --release 0.6.9
4 | const settings_1 = require("./settings");
5 | function termwidth(stream) {
6 |     if (!stream.isTTY) {
7 |         return 80;
8 |     }
9 |     const width = stream.getWindowSize()[0];
                     ^
TypeError: stream.getWindowSize is not a function. (In 'stream.getWindowSize()', 'stream.getWindowSize' is undefined)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:9:18
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:19:34
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:20:58)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/errors/pretty-print.js:6:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/errors/pretty-print.js:48:18)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/handle.js:7:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/handle.js:41:17)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/index.js:4:4
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/index.js:62:15)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/cli-ux/index.js:4:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/cli-ux/index.js:183:48)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/command.js:7:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/command.js:238:26)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/index.js:5:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/index.js:57:17)
error: script "publish-layer" exited with code 1 (SIGHUP)
```
**Build**
```
eric@Erics-MacBook-Pro bun-lambda $ bun run build-layer --arch x64 --release '0.6.9' --output ./bun-lambda.zip
$ bun scripts/build-layer.ts --arch x64 --release 0.6.9 --output ./bun-lambda.zip
4 | const settings_1 = require("./settings");
5 | function termwidth(stream) {
6 |     if (!stream.isTTY) {
7 |         return 80;
8 |     }
9 |     const width = stream.getWindowSize()[0];
                     ^
TypeError: stream.getWindowSize is not a function. (In 'stream.getWindowSize()', 'stream.getWindowSize' is undefined)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:9:18
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:19:34
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/screen.js:20:58)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/errors/pretty-print.js:6:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/errors/pretty-print.js:48:18)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/handle.js:7:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/handle.js:41:17)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/index.js:4:4
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/errors/index.js:62:15)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/cli-ux/index.js:4:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/cli-ux/index.js:183:48)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/command.js:7:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/command.js:238:26)
      at /Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/index.js:5:6
      at (eval) (/Users/eric/code/temp/bun/packages/bun-lambda/node_modules/@oclif/core/lib/index.js:57:17)
error: script "build-layer" exited with code 1 (SIGHUP)
```
I experience the same with Ubuntu on WSL2. :(
Would it be possible to create a base Docker image for Lambdas?
Yep, this doesn't work as per the documented steps and examples.
The Lambda AL2 runtime fails to execute the bun binary with error code 126.
> Yep, this doesn't work as per the documented steps and examples.
> The Lambda AL2 runtime fails to execute the bun binary with error code 126.
It looks like this broke since the time we first released it, we'll try to fix this ASAP. Thanks for bringing to our attention
Is there any chance of getting that documentation sorted?
I'd love to give bun on Lambda a go, but there is literally no explanation of how to implement it, especially when using CloudFormation.
The current README just shows a bit of code using the fetch method, which wouldn't be used in a Lambda, so it's quite confusing.
> Is there any chance of getting that documentation sorted?
> I'd love to give bun on Lambda a go, but there is literally no explanation of how to implement it, especially when using CloudFormation.
> The current README just shows a bit of code using the fetch method, which wouldn't be used in a Lambda, so it's quite confusing.
I was just able to get this up and running on my end.
```ts
import type { Server } from "bun";

export default {
  async fetch(request: Request, server: Server): Promise<Response | undefined> {
    console.log("Request", {
      url: request.url,
      method: request.method,
      headers: request.headers.toJSON(),
      body: request.body ? await request.text() : null,
    });
    if (server.upgrade(request)) {
      console.log("WebSocket upgraded");
      return;
    }
    return new Response("Hello from Bun on Lambda!", {
      status: 200,
      headers: {
        "Content-Type": "text/plain;charset=utf-8",
      },
    });
  },
};
```
> Is there any chance of getting that documentation sorted?
> I'd love to give bun on Lambda a go, but there is literally no explanation of how to implement it, especially when using CloudFormation.
> The current README just shows a bit of code using the fetch method, which wouldn't be used in a Lambda, so it's quite confusing.
I was just able to get this up and running on my end.
- Cloned the GitHub repo
- Published the layer to AWS using the steps in packages/bun-lambda/README.md (I have the AWS CLI set up; this shows an error in the terminal but doesn't seem to stop it from working)
- Created a Lambda function, selecting "Provide your own runtime" with Amazon Linux 2 (bun builds the layer for aarch64 by default, so I had to select arm64 when creating the Lambda)
- After creating the Lambda function, attached the layer deployed by bun
- Created a file called handler.ts and pasted the code snippet for the Lambda from the documentation (I'll post it here too)
- Set the handler to handler.fetch
```ts
import type { Server } from "bun";

export default {
  async fetch(request: Request, server: Server): Promise<Response | undefined> {
    console.log("Request", {
      url: request.url,
      method: request.method,
      headers: request.headers.toJSON(),
      body: request.body ? await request.text() : null,
    });
    if (server.upgrade(request)) {
      console.log("WebSocket upgraded");
      return;
    }
    return new Response("Hello from Bun on Lambda!", {
      status: 200,
      headers: {
        "Content-Type": "text/plain;charset=utf-8",
      },
    });
  },
};
```
What do you set as the handler? handler.fetch?
I don't see why you'd use fetch in a lambda though, that's what's confusing me.
@paul-uz Yeah, I used handler.fetch. There's a sort of transformer in a runtime.ts present in the layer; I believe that's what actually handles the request and sends it down to this function. Also, the name of the function can be anything you want, actually.
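For anyone wondering what that transformer could look like: below is a purely illustrative sketch (not the layer's actual runtime.ts) of converting an API Gateway v2 proxy event into a standard `Request` and mapping a `Response` back into a proxy result. The event field names follow AWS's documented v2 payload format; everything else here is an assumption.

```typescript
import { Buffer } from "node:buffer";

// Hypothetical shape of the transformation the layer's runtime.ts performs.
// Field names follow the documented API Gateway v2 proxy payload; the code
// itself is a sketch, not the layer's real implementation.
interface ApiGatewayV2Event {
  rawPath: string;
  rawQueryString: string;
  headers: Record<string, string>;
  requestContext: { http: { method: string } };
  body?: string;
  isBase64Encoded?: boolean;
}

export function eventToRequest(event: ApiGatewayV2Event): Request {
  const query = event.rawQueryString ? `?${event.rawQueryString}` : "";
  const method = event.requestContext.http.method;
  const body =
    event.body && method !== "GET" && method !== "HEAD"
      ? event.isBase64Encoded
        ? Buffer.from(event.body, "base64")
        : event.body
      : undefined;
  // Node's fetch implementation requires `duplex` whenever a body is present.
  return new Request(`https://lambda${event.rawPath}${query}`, {
    method,
    headers: event.headers,
    body,
    ...(body ? { duplex: "half" } : {}),
  } as RequestInit);
}

export async function responseToResult(response: Response) {
  return {
    statusCode: response.status,
    headers: Object.fromEntries(response.headers.entries()),
    body: await response.text(),
    isBase64Encoded: false,
  };
}
```

With a pair of functions like this, the handler's `fetch` never needs to know it is running behind API Gateway at all.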
The error I keep getting:

```
101 | async load() {
102 |     this.type = this.options.type || 'core';
103 |     this.tag = this.options.tag;
104 |     const root = await findRoot(this.options.name, this.options.root);
105 |     if (!root)
106 |         throw new Error(`could not find package.json with ${(0, util_1.inspect)(this.options)}`);
              ^
warn: could not find package.json with { type: 'dev',
  root: '/Users/festus/Work/personal/bun/packages/bun-lambda/node_modules/@oclif/core',
  name: '@oclif/plugin-plugins' }
      at /Users/festus/Work/personal/bun/packages/bun-lambda/node_modules/@oclif/core/lib/config/plugin.js:106:18
      at processTicksAndRejections (:1:2602)
```
> @paul-uz Yeah, I used handler.fetch. There's a sort of transformer in a runtime.ts present in the layer; I believe that's what actually handles the request and sends it down to this function. Also, the name of the function can be anything you want, actually.
Does it have to be `fetch`? I typically use an index.js file with a `Handler` class with a `main` method, so the Lambda handler is `index.main`.
Would really like more real world examples of bun being used so I can wrap my head around it.
@paul-uz You can name both the file and function anything you want actually
@festusyuma Confirmed your steps work for me as well! I'm going to see if I can get a CDK project scaffolded and working for a full development environment workflow.
@festusyuma
> The error I keep getting (the `could not find package.json with ... name: '@oclif/plugin-plugins'` error, quoted in full above)
I had the same problem. After looking through the code, I found a missing dependency:

```sh
bun install @oclif/plugin-plugins
```
FYI I did a pull request to fix hopefully someone approves: https://github.com/oven-sh/bun/pull/4769
Would be good if the lambda layer was published somewhere that auto updates along with releases so everyone doesn't have to publish their own.
> Would be good if the lambda layer was published somewhere that auto updates along with releases so everyone doesn't have to publish their own.
I have set up automatic layer publishing, and made the layer public, for @napi-rs/canvas, which builds every week.
Have a look here - https://github.com/ShivamJoker/Canvas-Lambda-Layer
Anyone know what cold and hot boot times are like compared to using the built in NodeJS?
Would love to use bun (or even deno) but only if the boot times are similar or ideally, quicker
> Anyone know what cold and hot boot times are like compared to using the built in NodeJS?
> Would love to use bun (or even deno) but only if the boot times are similar or ideally, quicker
@paul-uz I spent the better part of yesterday answering this question for myself and will be publishing an article about it tomorrow morning. But tl;dr:
I also found that the CRUD API example saw an average 25% reduction in billed time. Applications with any amount of sustained load, where provisioned concurrency could make sense, may find it cheaper to use Bun with provisioned concurrency than Node without it, and end up with a faster application.
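The billed-time argument above can be sketched with some back-of-the-envelope arithmetic. All prices, memory sizes, and request counts below are placeholder assumptions, not real AWS pricing; the point is only the shape of the trade-off:

```typescript
// Placeholder numbers purely for illustration -- NOT real AWS prices.
const GB_SECOND_PRICE = 0.0000166667; // assumed on-demand price per GB-second
const REQUESTS_PER_MONTH = 1_000_000; // assumed traffic
const MEMORY_GB = 0.5;                // assumed function memory

// Compute-only monthly cost for a given average billed duration per request.
function monthlyComputeCost(billedMsPerRequest: number): number {
  const gbSeconds = (billedMsPerRequest / 1000) * MEMORY_GB * REQUESTS_PER_MONTH;
  return gbSeconds * GB_SECOND_PRICE;
}

const nodeCost = monthlyComputeCost(100);       // assumed 100 ms billed on Node
const bunCost = monthlyComputeCost(100 * 0.75); // 25% less billed time, per the thread
console.log({ nodeCost, bunCost, saving: nodeCost - bunCost });
```

Any real comparison would also have to add the provisioned-concurrency charge on the Bun side and use actual regional pricing.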
That said, I am personally holding off on jumping to Bun just yet, as I had to make several updates to the Lambda Layer because of bugs with handling V1 API Gateway events, so I plan to hold off until it's more hardened and battle-tested.
> Anyone know what cold and hot boot times are like compared to using the built in NodeJS?
> Would love to use bun (or even deno) but only if the boot times are similar or ideally, quicker
Cold start for a simple application with no dependencies on bun was >450ms, compared to <150ms on node. I will test out the difference in an existing node application I have with a cold start time of about 1s.
> Cold start for a simple application with no dependencies on bun was >450ms, compared to <150ms on node. I will test out the difference in an existing node application I have with a cold start time of about 1s.
How did you measure it? From the CloudWatch logs, what I saw was NodeJS taking around 2 seconds for a cold boot, whereas bun took <1s.
Here's the article with the benchmark tests: Serverless Bun vs Node: Benchmarking on AWS Lambda https://medium.com/@mitchellkossoris/serverless-bun-vs-node-benchmarking-on-aws-lambda-ecd4fe7c2fc2
> Cold start for a simple application with no dependencies on bun was >450ms, compared to <150ms on node. I will test out the difference in an existing node application I have with a cold start time of about 1s.

> How did you measure it? From the CloudWatch logs, what I saw was NodeJS taking around 2 seconds for a cold boot, whereas bun took <1s.
I looked at it from the CloudWatch logs; the duration is written there. Here are screenshots:
https://res.cloudinary.com/festusyumanew/image/upload/v1694462666/Screenshot_2023-09-11_at_21.04.19.png (NodeJS) https://res.cloudinary.com/festusyumanew/image/upload/v1694462696/Screenshot_2023-09-11_at_21.04.50.png (Bun)
> Here's the article with the benchmark tests: Serverless Bun vs Node: Benchmarking on AWS Lambda https://medium.com/@mitchellkossoris/serverless-bun-vs-node-benchmarking-on-aws-lambda-ecd4fe7c2fc2
There's definitely something off with how Bun's Lambda layer is implemented, based on those cold-start times. We're looking into this right now, along with other fixes.
> Here's the article with the benchmark tests: Serverless Bun vs Node: Benchmarking on AWS Lambda https://medium.com/@mitchellkossoris/serverless-bun-vs-node-benchmarking-on-aws-lambda-ecd4fe7c2fc2

> There's definitely something off with how Bun's Lambda layer is implemented, based on those cold-start times. We're looking into this right now, along with other fixes.
We also need to keep in mind that Lambda has first-class support for Node.js, which is very optimized there, while bun probably isn't. I can only imagine it will get better if it ever gets the full support Node.js has.
My suggestion is to look at aws-lambda-nodejs-runtime-interface-client and create a bun variant of it, ideally using bun's native APIs and/or bun:ffi; that is essentially what the Node.js runtime interface client does. Lastly, pulling in a layer will always add to the cold start time, so AWS needs to be involved here to either bring in an official runtime or use a Docker image.
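For reference, the Lambda custom runtime API that such a client talks to is a small, documented HTTP interface. The loop below is an illustrative sketch of it, not Bun's actual layer code; error handling and the event-to-Request transformation are elided.

```typescript
// Sketch of a custom runtime loop against the documented Lambda Runtime API,
// i.e. the same HTTP interface aws-lambda-nodejs-runtime-interface-client uses.
const API_VERSION = "2018-06-01";

export function runtimeUrl(host: string, path: string): string {
  // Inside the Lambda sandbox, AWS_LAMBDA_RUNTIME_API holds "host:port".
  return `http://${host}/${API_VERSION}/runtime/${path}`;
}

export async function runtimeLoop(
  handler: (event: unknown) => Promise<unknown>,
): Promise<never> {
  const host = process.env.AWS_LAMBDA_RUNTIME_API!;
  for (;;) {
    // Long-poll for the next invocation.
    const next = await fetch(runtimeUrl(host, "invocation/next"));
    const requestId = next.headers.get("Lambda-Runtime-Aws-Request-Id")!;
    const result = await handler(await next.json());
    // Report the handler's result for this invocation.
    await fetch(runtimeUrl(host, `invocation/${requestId}/response`), {
      method: "POST",
      body: JSON.stringify(result),
    });
  }
}
```

A Bun-native version of this loop is what the layer's bootstrap would run in place of the Node.js runtime client.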
@paul-uz I've published an article on: How to use Bun on AWS Lambda — A Step-By-Step Guide (AWS console)
Here is the code to do the same with CDK:
```ts
import { Stack, StackProps } from "aws-cdk-lib";
import {
  Architecture,
  Code,
  Function,
  LayerVersion,
  Runtime,
} from "aws-cdk-lib/aws-lambda";
import { Construct } from "constructs";

export class BunFnCdkStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    new Function(this, "hello-bun-fn", {
      code: Code.fromAsset("./lambda/"),
      handler: "index.fetch",
      runtime: Runtime.PROVIDED_AL2,
      architecture: Architecture.ARM_64,
      layers: [
        LayerVersion.fromLayerVersionArn(
          this,
          "bun-layer",
          "arn:aws:lambda:us-east-1:205979422636:layer:bun:1",
        ),
      ],
    });
  }
}
```
@ShivamJoker I recommend you remove your AWS account ID from the Layer ARN in the example. It's typically best not to publicly share your account IDs - not that it's a major concern, just a best practice security policy.
@mkossoris Thanks for your recommendation, but if I remove the account ID, people won't be able to use the Lambda layer without publishing their own. I want them to be able to try it out without building their own layer unless they need to. I've done the same for another Lambda layer.
The bun-lambda README says I can test the function locally using `bun run <handler>.ts`, but this doesn't work, even with the simple code below:

```ts
export default {
  async fetch() {
    console.log('hello world')
  },
};
```
@paul-uz please go through the article I shared above.
@ShivamJoker thanks. I guess it's better than the current docs, but it's still not a real way to test the Lambda layer as such, is it? The layer does the request transformation, but running it via a simple `Bun.serve()` doesn't do that.
If you really want to test the layer, you'll have to deploy your code to Lambda.
The best way would be to use the AWS CLI or CDK to keep deploying your local code to Lambda.
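One thing you can do locally, short of deploying, is invoke the handler's fetch directly with a synthetic Request. This only smoke-tests your own code, not the layer's event transformation; the handler below is a simplified stand-in with the same default-export shape as the snippet earlier in the thread.

```typescript
// Simplified handler with the same `default { fetch }` shape used on Lambda.
const handler = {
  async fetch(request: Request): Promise<Response> {
    const path = new URL(request.url).pathname;
    return new Response(`Hello, you asked for ${path}`);
  },
};

// Local smoke test: no server, no layer -- just call fetch() directly.
async function smokeTest(): Promise<string> {
  const res = await handler.fetch(new Request("http://localhost/hello"));
  return res.text();
}

smokeTest().then(console.log);
```

Because the layer's transformer never runs here, anything that depends on the original Lambda event shape still needs a real deployment to verify.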
Hi, when deploying the Lambda function using the bun layer, there is a message in the AWS console in the "Code Source" section that says: "The runtime is no longer supported. We recommend that you migrate your functions that use to a newer runtime as soon as possible"
Can you please let me know if this is something to be concerned about, or is there a fix available for this?
Below is the CDK code used to setup the function that uses the Bun Lambda Layer.
```ts
const BunLayer = lambda.LayerVersion.fromLayerVersionArn(
  this,
  "BunLayer",
  "arn:aws:lambda:eu-west-1:xxxxxxxxxxxx:layer:bun:1"
);

const handler = new lambda.Function(this, "functionname", {
  runtime: lambda.Runtime.PROVIDED_AL2,
  code: lambda.Code.fromAsset("lambda"),
  handler: "functionname.fetch",
  architecture: lambda.Architecture.ARM_64,
  layers: [BunLayer],
});
```
Thank you
I was facing a similar issue. Here is how I solved it https://github.com/hhimanshu/bun-cdk
**What is the problem this feature would solve?**
Be able to run bun on AWS Lambda.
**What is the feature you are proposing to solve the problem?**
Use bun on a serverless Lambda architecture.
**What alternatives have you considered?**
Lots of alternatives, but would love to see an officially supported solution.