dherault / serverless-offline

Emulate AWS λ and API Gateway locally when developing your Serverless project
MIT License

No support for Rust? :_( #677

Closed · brainstorm closed this issue 5 years ago

brainstorm commented 5 years ago

It does spin up the server, though... I'm new to your codebase, but could you outline how big the effort would be to add a new runtime to serverless-offline?

$ sls offline
Serverless: Starting Offline: dev/ap-southeast-2.

Serverless: Warning: found unsupported runtime 'rust'

Serverless: Offline listening on http://localhost:3000
dherault commented 5 years ago

Hello @brainstorm, it would not be that hard. You can reproduce what was done for the Python language and PR the result if you're satisfied with it!

brainstorm commented 5 years ago

Fair point.

By "what was done for the Python language", do you mean writing the equivalent of manual_test_python in rust and set the rust runtime on index.js? ... if that's all it takes, no worries, I can wrap it up.

OTOH, I'm having errors with route handling when proxy is enabled. My application relies on APIGW {proxy+} URL handling, and apparently serverless-offline sends all my routes to a 404? I've set the appropriate custom: directive as advised in your docs, but I still see this when visiting localhost:3000:

{"statusCode":404,"error":"Serverless-offline: route not found.","currentRoute":"get - /reads/chr1","existingRoutes":[]}

Whereas in the real APIGW response I see this:

$ sls invoke -f reads --path tests/rest/apigw_proxy_request.json 
{
    "statusCode": 200,
    "headers": {
        "content-type": "application/json"
    },
    "multiValueHeaders": {
        "content-type": [
            "application/json"
        ]
    },
    "body": "{\"message\":\"Reads: Your function executed successfully!\"}",
    "isBase64Encoded": false
}

Thanks for your attention @dherault!

dherault commented 5 years ago

First part: yes, that's it! And making sure it works, of course... 😇

Second part: that's another issue. You get "existingRoutes":[], which means serverless-offline has no route for you. Can you paste a safe version of your serverless.yml file?

brainstorm commented 5 years ago

Second part: Yeah, it should have a /dev/reads route, for which I can pass --path like this:

https://github.com/brainstorm/htsget-aws/blob/master/tests/rest/apigw_proxy_request.json

I'm not entirely sure what you mean by "safe version", but here's the only serverless.yml I'm using right now:

https://github.com/brainstorm/htsget-aws/blob/master/serverless.yml
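
In essence it boils down to something like this rough sketch (illustrative only, not a verbatim copy; the service name, events, and plugins are placeholders, so see the link above for the real thing):

# Sketch of the serverless.yml shape, not the actual linked file
service: htsget-aws            # placeholder service name

provider:
  name: aws
  runtime: rust                # the runtime serverless-offline warns is unsupported
  region: ap-southeast-2
  stage: dev

plugins:
  - serverless-rust            # softprops' Rust plugin (assumed)
  - serverless-offline

functions:
  reads:
    handler: htsget.reads      # matches the "Building native Rust htsget.reads func" log line
    events:
      - http:
          path: /{proxy+}      # APIGW catch-all proxy route mentioned above
          method: ANY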

First part: alright then, I'll take a look ;)

dherault commented 5 years ago

Erf, I knew it. Your runtime is rust! No wonder none of your routes are appearing. As you know, it is not supported yet.

brainstorm commented 5 years ago

Yeah, after adding the extra runtime in index.js:

https://github.com/dherault/serverless-offline/blob/master/src/index.js#L406

It works, but it seems to need some further changes, since Rust recompiles and builds a Docker container on each change... plus there's no AWS_IAM support either:

Serverless: WARNING: Serverless Offline does not support the AWS_IAM authorization type

Which adds to the list of things to be simulated/implemented... oh well, I think I'll just invoke against the dev stage on AWS proper instead of using anything locally for now:

Serverless: GET /reads/chr1 (λ: reads)
Proxy Handler could not detect JSON: Serverless: Building native Rust htsget.reads func...

Proxy Handler could not detect JSON:    Compiling htsget v0.1.0 (
Proxy Handler could not detect JSON: /code)

Serverless: Replying timeout after 30000ms
Proxy Handler could not detect JSON:     Finished release [optimized] target(s) in 35.31s

Proxy Handler could not detect JSON:   adding: bootstrap
Proxy Handler could not detect JSON:  (deflated 60%)

Proxy Handler could not detect JSON: Serverless: Packaging service...

Proxy Handler could not detect JSON: Serverless: Building Docker image...

Proxy Handler could not detect JSON: START RequestId: 52fdfc07-2182-154f-163f-5f0f9a621d72 Version: $LATEST

Proxy Handler could not detect JSON: chr1

Proxy Handler could not detect JSON: Error: Unknown(BufferedHttpResponse {status: 400, body: "{\"__type\":\"UnrecognizedClientException\",\"message\":\"The security token included in the request is invalid.\"}", headers: {"content-type": "application/x-amz-json-1.1", "date": "Mon, 03 Jun 2019 00:27:07 GMT", "x-amzn-requestid": "8db1220b-3fd3-43d3-9e56-4c74950e4f05", "content-length": "107", "connection": "keep-alive"} })

Proxy Handler could not detect JSON: END RequestId: 52fdfc07-2182-154f-163f-5f0f9a621d72
REPORT RequestId: 52fdfc07-2182-154f-163f-5f0f9a621d72  Init Duration: 71.84 ms Duration: 192.15 ms     Billed Duration: 200 ms Memory Size: 1536 MB    Max Memory Used: 13 MB

Serverless: Warning: handler 'reads' returned a promise and also uses a callback!
This is problematic and might cause issues in your lambda.
dnalborczyk commented 5 years ago

@brainstorm

I have zero knowledge where Rust is concerned. Are you using https://github.com/awslabs/aws-lambda-rust-runtime or similar? There's a custom runtime (provided runtime) PR, https://github.com/dherault/serverless-offline/pull/648, which does seem to work and might get you closer. I only did a quick check against a Node custom runtime (https://github.com/lambci/node-custom-lambda) some time ago.

I think it uses serverless invoke's Docker support internally, if I remember right - which, btw, you might be able to use as an alternative as well.

This https://github.com/dherault/serverless-offline/blob/master/src/index.js#L406 shouldn't be needed in that case.

You'd have to use:

# serverless.yml
runtime: provided
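
For context, a minimal sketch of where that setting sits (the function name is just an example):

# serverless.yml (sketch; function name is illustrative)
provider:
  name: aws
  runtime: provided         # custom runtime, which the PR above teaches serverless-offline about

functions:
  reads:
    handler: bootstrap      # with 'provided', the deployment package just needs a 'bootstrap' executable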
brainstorm commented 5 years ago

Yes, I'm using that runtime with the HTTP extensions, and yes, it uses Docker internally to recompile the payload, but it's pretty slow to do all that versus just re-deploying and invoking against AWS proper.

In addition, I'm currently testing the backend service (Athena) that my lambda is connected to, so there's no sensible local invoke emulation for that. Thanks for the provided explanation/detail; I didn't know about that one! :)

dnalborczyk commented 5 years ago

ah, cool. no problem.

but it is pretty slow to do all that

yeah, I was thinking the same initially - even with non-Rust provided runtimes, it didn't seem to offer a great developer experience as far as performance goes.

dnalborczyk commented 4 years ago

@brainstorm, if you ever want to give it a try: we just released initial Docker support with v6.0.0-alpha.54. Currently it supports only lambci containers, but we want to extend support to arbitrary containers, so it might not [yet] work for your scenario.
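
If memory serves, opting in looks roughly like this in serverless.yml (double-check the option name against the v6 docs, I might be misremembering it):

# serverless.yml (sketch; option name assumed to be useDocker)
provider:
  name: aws
  runtime: provided          # runs inside a lambci 'provided' container

custom:
  serverless-offline:
    useDocker: true          # assumed option: execute handlers in Docker containers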

brainstorm commented 4 years ago

Thanks @dnalborczyk, it does work, but it is indeed pretty slow (for my current lambda-rust project):

$ time npx serverless invoke local -f reads --path tests/rest/apigw_proxy_request.json
Serverless: Building native Rust reads func...
    Finished release [optimized] target(s) in 16.92s
objcopy: stVVO2Zb: debuglink section already exists
  adding: bootstrap (deflated 62%)
Serverless: Packaging service...
Serverless: Building Docker image...
START RequestId: 04c2b77c-ac8a-11cd-cd68-55184736a2c5 Version: $LATEST

END RequestId: 04c2b77c-ac8a-11cd-cd68-55184736a2c5

REPORT RequestId: 04c2b77c-ac8a-11cd-cd68-55184736a2c5  Init Duration: 210.87 ms    Duration: 12.08 ms  Billed Duration: 100 ms Memory Size: 1536 MB    Max Memory Used: 11 MB

{"statusCode":200,"headers":{"content-type":"application/json"},"multiValueHeaders":{"content-type":["application/json"]},"body":"{\"htsget\":{\"format\":\"BAM\",\"urls\":[{\"class\":\"body\",\"headers\":{\"auth\":\"Bearer: foo\",\"byte_range\":\"bytes = 1-100\"},\"url\":\"https://some_presigned_url\"}]}}","isBase64Encoded":false}

real    3m26.943s
user    0m14.713s
sys 0m22.085s

That 3-minute runtime is after I've run it a couple of times; cold starts are ~10 minutes.

I wonder if there is any trick you guys use to significantly reduce that wall time? At the moment, launching "real" lambdas is actually faster than serverless-offline on my current machine (MacBook 12" 2017, 1.4 GHz dual core).

/cc @softprops