fommil opened this issue 3 years ago
I would suggest using https://github.com/goccy/go-json instead.
In my benchmarking, ffjson gave very slow results; it seems to assume that the user will manage its buffer pool correctly. Also, ffjson development appears to have already stopped.
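For reference, go-json aims to be a drop-in replacement for encoding/json, so adopting it is essentially an import swap. A minimal sketch (the KinesisRecord struct here is only an illustration, not the real events type):

```go
package main

import (
	"fmt"

	// drop-in replacement: same Marshal/Unmarshal signatures as encoding/json
	json "github.com/goccy/go-json"
)

// KinesisRecord is a stand-in type used only to demonstrate the import swap.
type KinesisRecord struct {
	PartitionKey string `json:"partitionKey"`
	Data         []byte `json:"data"` // base64-encoded in the JSON, like the real event
}

func main() {
	payload := []byte(`{"partitionKey":"key-1","data":"aGVsbG8="}`)

	var rec KinesisRecord
	if err := json.Unmarshal(payload, &rec); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", rec)
}
```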
This was originally raised (accidentally) over at https://github.com/aws/aws-sdk-go-v2/issues/1312
Is your feature request related to a problem? Please describe.
As detailed in the fantastic article at https://yalantis.com/blog/speed-up-json-encoding-decoding/ the default json marshaling / unmarshaling in Go uses reflection and is quite slow compared to what is possible.
It is possible to provide a custom marshaler for a type by implementing a stdlib interface. However, it is quite tedious. Thankfully, the tool ffjson does all the codegen!
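For context, the stdlib hooks are the json.Marshaler / json.Unmarshaler interfaces. Hand-writing them looks roughly like the sketch below (MyEvent is a made-up type, and the decoder is only illustrative), which is why generated code is so attractive:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// MyEvent is a hypothetical event type used only to illustrate the interfaces.
type MyEvent struct {
	ID   string
	Size int
}

// MarshalJSON implements json.Marshaler, bypassing reflection for this type.
// Note: %q is fine for simple ASCII IDs; a real implementation must handle
// full JSON string escaping.
func (e MyEvent) MarshalJSON() ([]byte, error) {
	return []byte(fmt.Sprintf(`{"id":%q,"size":%d}`, e.ID, e.Size)), nil
}

// UnmarshalJSON implements json.Unmarshaler. A genuinely hand-written decoder
// would avoid delegating back to encoding/json; this only shows the API shape.
func (e *MyEvent) UnmarshalJSON(data []byte) error {
	var raw struct {
		ID   string `json:"id"`
		Size int    `json:"size"`
	}
	if err := json.Unmarshal(data, &raw); err != nil {
		return err
	}
	e.ID, e.Size = raw.ID, raw.Size
	return nil
}

func main() {
	b, _ := json.Marshal(MyEvent{ID: "abc", Size: 3})
	fmt.Println(string(b)) // {"id":"abc","size":3}
}
```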
Describe the solution you'd like
For the lambda library to ship optimised marshallers and unmarshallers for types that are sent over JSON, i.e. incoming / outgoing events.
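Until the library ships such marshallers, one workaround is a handler that takes json.RawMessage (which the runtime hands over untouched) and decodes it with whichever fast library wins the benchmarks. A rough sketch, assuming goccy/go-json as the stand-in decoder:

```go
package main

import (
	"context"
	"encoding/json"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	gojson "github.com/goccy/go-json" // assumption: any encoding/json-compatible decoder works here
)

// handler accepts the raw payload so we can bypass the runtime's
// reflection-based decoding and unmarshal the event ourselves.
func handler(ctx context.Context, raw json.RawMessage) error {
	var evt events.KinesisEvent
	if err := gojson.Unmarshal(raw, &evt); err != nil {
		return err
	}
	for _, rec := range evt.Records {
		_ = rec.Kinesis.Data // process the record payload here
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```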
Describe alternatives you've considered
easyjson is a similar alternative if for some reason ffjson doesn't do the job. The linked article also includes a summary of other alternatives, but the others are not drop-in replacements.
And of course you have your own Smithy framework.
Additional context
The benchmark results in the article conclude that easyjson and ffjson offer the biggest win for "large" and "extra large" objects, and are no worse than the stdlib for small objects.
However, as with everything, it would make sense to perform some benchmarks on typical lambda data. I am particularly interested in events.KinesisEvent, which could make for an excellent pilot study. The true test would be whether my lambda invocation durations actually shrink as a result of using a non-reflective decoder.
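A pilot benchmark along those lines could be as simple as the following sketch. Here testdata/kinesis_event.json is an assumed representative payload captured from a real invocation, and go-json stands in for whichever candidate library is under test:

```go
package bench

import (
	"encoding/json"
	"os"
	"testing"

	"github.com/aws/aws-lambda-go/events"
	gojson "github.com/goccy/go-json"
)

// loadPayload reads a captured Kinesis event from disk; the file path is an
// assumption for this sketch.
func loadPayload(b *testing.B) []byte {
	b.Helper()
	data, err := os.ReadFile("testdata/kinesis_event.json")
	if err != nil {
		b.Fatal(err)
	}
	return data
}

// BenchmarkStdlibUnmarshal measures the reflection-based encoding/json decoder.
func BenchmarkStdlibUnmarshal(b *testing.B) {
	payload := loadPayload(b)
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		var evt events.KinesisEvent
		if err := json.Unmarshal(payload, &evt); err != nil {
			b.Fatal(err)
		}
	}
}

// BenchmarkGoJSONUnmarshal measures the candidate non-reflective decoder.
func BenchmarkGoJSONUnmarshal(b *testing.B) {
	payload := loadPayload(b)
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		var evt events.KinesisEvent
		if err := gojson.Unmarshal(payload, &evt); err != nil {
			b.Fatal(err)
		}
	}
}
```

Running `go test -bench . -benchmem` against a realistic payload would show whether the difference is worth pursuing for the events package.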