CodeGenieApp / serverless-express

Run Express and other Node.js frameworks on AWS Serverless technologies such as Lambda, API Gateway, Lambda@Edge, and more.
https://codegenie.codes
Apache License 2.0

Unable to Stream Responses from AWS Lambda #655

Open sarthak-m-das opened 1 year ago

sarthak-m-das commented 1 year ago

I am facing difficulties in streaming responses from AWS Lambda. Initially, I attempted to use the serverless-http package for this purpose, but learned that it doesn't support streaming. I then switched to the @vendia/serverless-express package, hoping it would resolve the issue. However, even with this package, I still cannot achieve streaming functionality.

My code:

const serverless = require("serverless-http"),
  express = require("express"),
  morgan = require("morgan"),
  helmet = require("helmet"),
  bodyParser = require("body-parser"),
  awsServerlessExpress = require("@vendia/serverless-express");

const cors = require("cors");

const rootPrefix = ".",
  coreConstants = require(rootPrefix + "/config/coreConstants"),
  apiRoutes = require(rootPrefix + "/routes/index"),
  setResponseHeader = require(rootPrefix + "/middlewares/setResponseHeader");

// Set worker process title.
process.title = "";

const app = express();

// Use Morgan middleware to log requests
app.use(morgan("combined"));

// Enable CORS
app.use(cors());

// Helmet can help protect the app from some well-known web vulnerabilities by setting HTTP headers appropriately.
app.use(helmet());

// Node.js body parsing middleware. Default limit is 100kb
app.use(bodyParser.json({ limit: "2mb" }));

// Parsing the URL-encoded data with the qs library (extended: true). Default limit is 100kb
app.use(bodyParser.urlencoded({ extended: true, limit: "2mb" }));

app.get("/stream", async (req, res) => {
  res.setHeader("Content-Type", "text/plain");
  for (let i = 0; i < 10; i++) {
    res.write(`Line ${i}\n`);
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  res.end();
});

// Set response header to prevent response caching
app.use(setResponseHeader());

// If running in development mode, start the server on port 8080, else export handler for lambda
if (
  coreConstants.ENVIRONMENT === "development" ||
  coreConstants.ENVIRONMENT === "test"
) {
  console.log("Server running on 8080");
  app.listen(8080);
  module.exports = { handler: app };
} else {
  // Export the handler for Lambda on production
  module.exports.handler = awsServerlessExpress({ app });
}
metaskills commented 1 year ago

For those curious about streaming support: https://aaronstuyvenberg.com/posts/introducing-response-streaming

Is anyone looking into this for Express? I kind of assumed that, given its popularity, this had been done already. Totally willing to help if Express itself allows this.
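
For context, here is a minimal sketch of what the original poster's /stream route looks like written directly against Lambda's response streaming API (my own sketch, not taken from the post above; it assumes the function is behind a Function URL configured with InvokeMode RESPONSE_STREAM, and uses the global awslambda object provided by the Node.js managed runtime):

// Native Lambda response streaming, no Express involved.
exports.handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Wrap the raw stream so the status code and headers are sent before the body.
    responseStream = awslambda.HttpResponseStream.from(responseStream, {
      statusCode: 200,
      headers: { "Content-Type": "text/plain" },
    });

    for (let i = 0; i < 10; i++) {
      responseStream.write(`Line ${i}\n`);
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }

    responseStream.end();
  }
);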

metaskills commented 1 year ago

Maybe related.

brettstack commented 1 year ago

@metaskills happy to receive PRs for this. Express does support streaming responses: https://stackoverflow.com/a/38789462/436540
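
For reference, a minimal sketch of Express-side streaming (my own example, not the code from the linked answer): anything that incrementally writes to or pipes into res is streamed by Express itself; the part that prevents end-to-end streaming is the buffering that happens after Express hands the response back to the Lambda handler.

const express = require("express");
const { Readable } = require("stream");

const app = express();

app.get("/stream", (req, res) => {
  res.setHeader("Content-Type", "text/plain");

  // Any readable stream works here, e.g. fs.createReadStream(...) for files.
  const source = Readable.from(["chunk 1\n", "chunk 2\n", "chunk 3\n"]);
  source.pipe(res);
});

app.listen(8080);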

carontony commented 1 year ago

Hello, has anyone found a solution for streaming content?

paya-cz commented 10 months ago

So I found a temporary workaround: grab the entire payload from serverlessExpress, wrap it in a stream, and send it out via the streaming API. This makes it possible to send bigger payloads out of the Lambda, but it obviously somewhat defeats the purpose of streaming, since the entire response is buffered first. So make sure to give the Lambda enough memory to hold these buffers.

const serverlessExpress = require('@vendia/serverless-express');
const pipeline = require('util').promisify(require('stream').pipeline);
const { Readable } = require('stream');

// `app` is the Express application, created elsewhere in this module.
let serverlessExpressInstance;

const getServerlessExpress = async () => {
  if (!serverlessExpressInstance) {
    serverlessExpressInstance = serverlessExpress({
      app,
      resolutionMode: 'PROMISE',
    });
  }

  return serverlessExpressInstance;
}

// @ts-ignore: https://github.com/aws/aws-lambda-nodejs-runtime-interface-client/issues/74
exports.handler = awslambda.streamifyResponse(async (event, responseStream, context) => {
  const serverlessHandler = await getServerlessExpress();
  const result = await serverlessHandler(event, context, () => { });

  const metadata = {
    statusCode: result.statusCode,
    headers: result.headers,
    cookies: result.cookies,
  };

  let payload;
  if (result.body) {
    payload = Buffer.from(result.body, result.isBase64Encoded ? 'base64' : 'utf8');
  } else {
    payload = Buffer.alloc(0);
  }

  // Assign to the responseStream parameter to prevent accidental reuse of the non-wrapped stream.
  // @ts-ignore: https://github.com/aws/aws-lambda-nodejs-runtime-interface-client/issues/74
  responseStream = awslambda.HttpResponseStream.from(responseStream, metadata);

  await pipeline(
    Readable.from(payload),
    responseStream,
  );
});

I have also encountered some issues when there is no payload to send, only headers. So if your Express server returns a status code without any response body, you might encounter a dangling connection. This might be a bug in AWS infrastructure. More info: https://github.com/aws/aws-lambda-nodejs-runtime-interface-client/issues/95
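
A possible guard (an assumption on my part, not a confirmed fix for the linked runtime issue) is to skip the pipeline entirely when the payload is empty and close the wrapped stream explicitly, i.e. replace the final pipeline(...) call in the snippet above with:

// Hypothetical guard (assumption, not a confirmed fix): when there is no body,
// end the wrapped stream directly instead of piping an empty readable through it.
if (payload.length === 0) {
  responseStream.end();
} else {
  await pipeline(
    Readable.from(payload),
    responseStream,
  );
}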

Deanfei commented 4 months ago

Is there any native support in this library for Lambda response streaming?