
serverless-rust

A ⚑ Serverless framework ⚑ plugin for Rustlang applications



πŸ“¦ Install

Install the plugin inside your serverless project with npm.

$ npm i -D serverless-rust

πŸ’‘ The -D flag adds it to your development dependencies, in npm speak.

πŸ’‘ This plugin assumes you are building Rustlang lambdas targeting the AWS Lambda "provided" runtime. The AWS Lambda Rust Runtime makes this easy.
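
For orientation, a handler targeting that runtime looks roughly like the sketch below. This is not plugin code; it assumes recent lambda_runtime, tokio, and serde_json crates declared in your Cargo.toml, and exact types and signatures vary between runtime versions.

use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde_json::{json, Value};

// receives the deserialized invocation payload and returns a JSON value
async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let (payload, _context) = event.into_parts();
    Ok(json!({ "echo": payload }))
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    // hand control to the runtime's event loop
    lambda_runtime::run(service_fn(handler)).await
}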

Add the following to your serverless project's serverless.yml file:

service: demo
provider:
  name: aws
  runtime: rust
plugins:
  # this registers the plugin
  # with serverless
  - serverless-rust
# creates one artifact for each function
package:
  individually: true
functions:
  test:
    # handler value syntax is `{cargo-package-name}.{bin-name}`
    # or `{cargo-package-name}` for short when you are building a
    # default bin for a given package (see the dotted example below).
    handler: your-cargo-package-name
    events:
      - http:
          path: /test
          method: GET
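
For example, with a hypothetical cargo package named my-lambdas that declares a bin target named hello, the dotted handler form would look like this:

functions:
  hello:
    # builds the `hello` bin of the `my-lambdas` cargo package
    handler: my-lambdas.hello
    events:
      - http:
          path: /hello
          method: GET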

πŸ’‘ The Rust Lambda runtime requires a binary named bootstrap. This plugin renames the binary cargo builds to bootstrap for you. You do not need to do this manually in your Cargo.toml configuration file.

The default behavior is to build your lambda inside a docker container. Make sure you have a docker daemon running if you are not opting into dockerless mode (described below).

πŸ–οΈ customize

You can optionally adjust the default settings of the dockerized build environment using a custom section of your serverless.yml configuration:

custom:
  # this section customizes the default
  # serverless-rust plugin settings
  rust:
    # flags passed to cargo
    cargoFlags: '--features enable-awesome'
    # custom docker tag
    dockerTag: 'some-custom-tag'
    # custom docker image
    dockerImage: 'dockerUser/dockerRepo'

πŸ₯Ό (experimental) local builds

While it's useful to have a build environment that matches your deployment environment, dockerized builds come with some notable tradeoffs.

The external dependency on docker itself often adds friction to your build.

Depending on a docker image limits which versions of rust you can build with. The default docker image tracks stable rust. Some users might wish to try unstable versions of rust before they stabilize. Local builds enable that.

If you wish to build lambdas locally, use the dockerless configuration setting.

custom:
  # this section allows for customization of the default
  # serverless-rust plugin settings
  rust:
    # flags passed to cargo
    cargoFlags: '--features enable-awesome'
    # experimental! when set to true, artifacts are built locally outside of docker
    dockerless: true

    # when using local builds (dockerless), optionally provide a different target and linker for the compiler
    # for example, allow local running on ARM macs
    target: aarch64-apple-darwin
    linker: clang

The following assumes that you have not specified a different target or linker. If you do, make sure that you have installed the specified target (via rustup) and linker.

This builds and links your lambda outside of a container as a static MUSL binary that can be deployed into the lambda execution environment. The aim is that in future releases this might become the default behavior.

In order to use this mode, it is expected that you install the x86_64-unknown-linux-musl target locally on all platforms with

$ rustup target add x86_64-unknown-linux-musl

On Linux platforms, you will need to install musl-tools

$ sudo apt-get update && sudo apt-get install -y musl-tools

On macOS, you will need to install a MUSL cross-compilation toolchain

$ brew install filosottile/musl-cross/musl-cross
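
Depending on your local setup, cargo may also need to be told which linker to use for the MUSL target. This is a general Rust cross-compilation note rather than a plugin feature; assuming the musl-cross toolchain above provides x86_64-linux-musl-gcc, one option is cargo's target-specific linker environment variable

$ export CARGO_TARGET_X86_64_UNKNOWN_LINUX_MUSL_LINKER=x86_64-linux-musl-gcc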

Using MUSL comes with some other notable tradeoffs, one of which is the complications that arise when depending on dynamically linked dependencies.

If you find other MUSL specific issues, please report them by opening an issue.

🎨 Per function customization

If your serverless project contains multiple functions, you may sometimes need to customize the options above at the function level. You can do this by defining a rust key with the same options inline in your function specification.

functions:
  test:
    rust:
      # function specific flags passed to cargo
      cargoFlags: '--features enable-awesome'
    # handler value syntax is `{cargo-package-name}.{bin-name}`
    # or `{cargo-package-name}` for short when you are building a
    # default bin for a given package.
    handler: your-cargo-package-name
    events:
      - http:
          path: /test
          method: GET

🀸 usage

Every serverless workflow command should work out of the box.

invoke your lambdas locally

$ npx serverless invoke local -f hello -d '{"hello":"world"}'

deploy your lambdas to the cloud

$ npx serverless deploy

invoke your lambdas in the cloud directly

$ npx serverless invoke -f hello -d '{"hello":"world"}'

view your lambdas logs

$ npx serverless logs -f hello

πŸ—οΈ serverless templates

^0.2.*

0.1.*

Older versions targeted the Python 3.6 AWS Lambda runtime and Rust applications built with crowbar and lando.

Doug Tangren (softprops) 2018-2019