dherault / serverless-offline

Emulate AWS λ and API Gateway locally when developing your Serverless project
MIT License

Rust is Unsupported runtime? #1463

Open orenbenya1 opened 2 years ago

orenbenya1 commented 2 years ago

Bug Report

After running the `sls offline start` command, I try to access API Gateway at http://localhost:3000/dev/health and get an HTTP 502 error, while the terminal shows that serverless-offline doesn't support Rust. Any idea how to fix this? I really want to work with Rust, and I don't want to deploy my Lambda function every time just to reach API Gateway (when I do deploy it, it works).

Current Behavior

Server ready: http://localhost:3000 🚀

Enter "rp" to replay the last request

GET /dev/health (λ: base_http)

Warning: Warning: found unsupported runtime 'rust' for function 'base_http'
✖ Unsupported runtime
✖ Error: Unsupported runtime
      at HandlerRunner._loadRunner (/usr/lib/node_modules/serverless-offline/dist/lambda/handler-runner/HandlerRunner.js:177:11)
      at HandlerRunner.run (/usr/lib/node_modules/serverless-offline/dist/lambda/handler-runner/HandlerRunner.js:215:72)
      at LambdaFunction.runHandler (/usr/lib/node_modules/serverless-offline/dist/lambda/LambdaFunction.js:369:92)
      at processTicksAndRejections (node:internal/process/task_queues:96:5)
      at async hapiHandler (/usr/lib/node_modules/serverless-offline/dist/events/http/HttpServer.js:740:18)
      at async exports.Manager.execute (/usr/lib/node_modules/serverless-offline/node_modules/@hapi/hapi/lib/toolkit.js:60:28)
      at async Object.internals.handler (/usr/lib/node_modules/serverless-offline/node_modules/@hapi/hapi/lib/handler.js:46:20)
      at async exports.execute (/usr/lib/node_modules/serverless-offline/node_modules/@hapi/hapi/lib/handler.js:31:20)
      at async Request._lifecycle (/usr/lib/node_modules/serverless-offline/node_modules/@hapi/hapi/lib/request.js:371:32)
      at async Request._execute (/usr/lib/node_modules/serverless-offline/node_modules/@hapi/hapi/lib/request.js:281:9)

Sample Code

service: my-service

frameworkVersion: "3"

plugins:
  - serverless-rust
  - serverless-offline

provider:
  name: aws
  runtime: rust
  stage: dev

package:
  individually: true

custom:
  rust:
    dockerless: true

functions:
  base_http:
    handler: base_http
    events:
      - http:
          method: get
          path: health
The Rust handler:

use lambda_http::{service_fn, Error, IntoResponse, Request, RequestExt};

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Start the Lambda runtime with `hello` as the request handler
    lambda_http::run(service_fn(hello)).await?;
    Ok(())
}

async fn hello(request: Request) -> Result<impl IntoResponse, Error> {
    let _context = request.lambda_context();

    // Greet the caller by the `name` query parameter, defaulting to "stranger"
    Ok(format!(
        "hello {}",
        request
            .query_string_parameters()
            .first("name")
            .unwrap_or("stranger")
    ))
}
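Aside: the fallback logic in `hello` can be exercised without the Lambda runtime at all. A minimal std-only sketch, where `greet` is a hypothetical helper introduced here just to mirror the handler's formatting:

```rust
// `greet` is a hypothetical helper, not part of the original handler;
// it mirrors the body of `hello`: greet by name, defaulting to "stranger"
// when no `name` query parameter is present.
fn greet(name: Option<&str>) -> String {
    format!("hello {}", name.unwrap_or("stranger"))
}

fn main() {
    // e.g. GET /dev/health?name=Oren
    assert_eq!(greet(Some("Oren")), "hello Oren");
    // e.g. GET /dev/health with no query string
    assert_eq!(greet(None), "hello stranger");
}
```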


dnalborczyk commented 2 years ago

the "problem" with your setup is that serverless itself and also aws lambda do not support a "rust" runtime. the serverless-rust plugin you are using is creating a binary called bootstrap and pushes that up to lambda via serverless instead, either as zipped-up binary or packaged in a docker image. (not 100% sure, only glanced at the code).

I'd definitely like serverless-offline to work with Rust binaries as well, but I haven't found the time yet to look into it. That said, I believe the Docker image approach should or could work, albeit not with the serverless-rust plugin. It might also be possible to make `runtime: provided`/`provided.al2` work locally without a Docker container.
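For reference, a sketch of what the custom-runtime route might look like, using serverless-offline's `useDocker` option (untested with Rust; it assumes the build step still produces a `bootstrap` binary in the deployment package):

```yaml
service: my-service

frameworkVersion: "3"

plugins:
  - serverless-rust
  - serverless-offline

provider:
  name: aws
  runtime: provided.al2   # custom runtime instead of the unrecognized "rust"

custom:
  serverless-offline:
    useDocker: true       # run the handler in a Lambda-like Docker container

functions:
  base_http:
    handler: base_http
    events:
      - http:
          method: get
          path: health
```

Whether serverless-rust cooperates with `provided.al2` set at the provider level is an open question; the point is that serverless-offline only recognizes official Lambda runtime identifiers.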