nats-io / nats.go

Golang client for NATS, the cloud native messaging system.
https://nats.io
Apache License 2.0

Use faster json encoder/decoder #1222

Open arthurzam opened 1 year ago

arthurzam commented 1 year ago

Feature Request

After migrating from nats-streaming to JetStream, we have noticed higher CPU usage ("overhead") in microservices with heavy publishers and consumers. After many pprof runs, we traced much of that overhead to json.Unmarshal and json.Marshal.

The builtin encoding/json in Go is considered very slow, and there are many faster encoders and decoders, most with a nearly identical interface, along with benchmarks showing how much faster they are than the builtin encoding/json. Some links to benchmarks: link 1, link 2.

Further investigation led me to RegisterEncoder(JSON_ENCODER, &builtin.JsonEncoder{}), and we hoped it would let us define an Encoder and plug in our JSON encoder of choice. Sadly, the critical code paths, such as the following, don't use the encoder:

https://github.com/nats-io/nats.go/blob/7917595755782af46153b4d89c684efdc03a5259/js.go#L775

Use Case:

Decreasing the "overhead" of JetStream.

Proposed Change:

Use faster JSON encoder.

Who Benefits From The Change(s)?

Most users who have heavy consumers or publishers.

Alternative Approaches

Enable users to set the encoder.

derekcollison commented 1 year ago

We have done custom encoders for hot path items in the past.

Is the pubAckReply the main item that shows on the profiles?

arthurzam commented 1 year ago

> Is the pubAckReply the main item that shows on the profiles?

Yes. In our heaviest service, handleAsyncReply and pubAckReply are the main "offenders": together they account for around 15% of CPU time in pprof (1-minute samples, repeated 10 times).

derekcollison commented 1 year ago

Thanks, that is helpful. We can do a marshal interface for them, I would imagine.