Closed ChristophP closed 5 years ago
Hey @ChristophP , thanks for pointing this out 😄
Basically, one had to compile an executable/dynamic library and then wrap it in a Node.js module that performed a native call. Nowadays, one can create their own runtime, like this one, and provide a compiled executable directly. I don't know if this should be mentioned in the docs or not.
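To make the custom-runtime idea concrete, here is a minimal sketch of the event loop such a runtime performs. This is not the actual code of this library: the real runtime talks to AWS's documented Runtime API over HTTP (polling `/runtime/invocation/next` and posting to `.../response`); those HTTP calls are replaced here by injected `IO` actions so the loop is self-contained, and all names are hypothetical.

```haskell
-- Hypothetical sketch of a custom Lambda runtime's core loop.
-- The HTTP calls to the AWS Lambda Runtime API are simulated with
-- injected IO actions so this compiles with base alone.
import Data.IORef (newIORef, atomicModifyIORef')

type Event  = String
type Result = String

-- A stand-in for the user's compiled Haskell handler.
handle :: Event -> Result
handle ev = "handled " ++ ev

-- The loop: fetch the next invocation, run the handler, post the result.
runtimeLoop :: IO (Maybe Event)   -- simulates GET /runtime/invocation/next
            -> (Result -> IO ())  -- simulates POST .../invocation/<id>/response
            -> (Event -> IO Result)
            -> IO ()
runtimeLoop nextEvent postResponse handler = go
  where
    go = do
      mEvent <- nextEvent
      case mEvent of
        Nothing -> pure ()  -- queue drained; a real runtime blocks instead
        Just ev -> handler ev >>= postResponse >> go

main :: IO ()
main = do
  events <- newIORef ["event-1", "event-2"]
  let next = atomicModifyIORef' events $ \es -> case es of
               []       -> ([], Nothing)
               (e:rest) -> (rest, Just e)
  runtimeLoop next putStrLn (pure . handle)
```

The key point is that the compiled executable *is* the process AWS starts: there is no Node.js wrapper in between anymore.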
Yes, this should definitely be answered. One has to run `make` in their project root, and then upload the generated `build/function.zip`.
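For concreteness, here is a sketch of what that packaging step amounts to, assuming (as AWS requires for custom runtimes) that the zip contains a `bootstrap` executable at its root. The `printf` stands in for the real compiled binary, and the function name and role in the comment are placeholders.

```shell
# Sketch of the packaging that `make` automates: a zip whose root
# holds an executable named `bootstrap` (the entry point AWS expects
# for custom runtimes). A tiny shell script stands in for the binary.
mkdir -p build
printf '#!/bin/sh\necho "hello from the handler"\n' > build/bootstrap
chmod +x build/bootstrap
(cd build && zip -q function.zip bootstrap)
unzip -l build/function.zip

# Deploying would then look roughly like (name/role are placeholders):
#   aws lambda create-function --function-name my-haskell-fn \
#     --runtime provided --handler handler \
#     --zip-file fileb://build/function.zip --role <role-arn>
```

The `--runtime provided` flag is what tells AWS you are shipping your own runtime instead of using a managed one.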
To be honest, we don't have much of an idea. Our guess is that AWS does some fine-tuning for Node.js, perhaps by using V8 processes rather than containers 🤷♀️, but that's just a guess. We did some benchmarks and Haskell was slower. Perhaps there is room for optimization in the runtime.
I hope this clears up your questions ^_^
Awesome, thanks for getting to it so quickly. I left some comments on the PR.
I have a couple of open questions that aren't answered in the docs (https://theam.github.io/aws-lambda-haskell-runtime/index.html) or anywhere else I could find. I think it would be good to mention these in the docs as well.
The docs say this
How do I do deployment, and do I need a layer? Some time ago there was a Medium article with some deployment instructions, but I'm not sure whether it's still valid with version 2 of this runtime. What changed in version 2?
What's the reason for Haskell being slower than Node.js? It seems like a compiled language should be faster. Maybe this is hard to answer without some serious benchmarking.
Thanks for making the runtime.