Open arthurzam opened 1 year ago
We have done custom encoders for hot path items in the past.
Is the pubAckReply the main item that shows on the profiles?
Yes. In the heaviest service we have, handleAsyncReply and pubAckReply are the main "offenders": together they account for around 15% of CPU time in pprof (1-minute samples, repeated 10 times).
Thanks, that is helpful. We could do a marshal interface for them, I would imagine.
Feature Request
After migrating from nats-streaming to JetStream, we noticed higher CPU usage ("overhead") in our heavy publisher and consumer microservices. Many pprof runs showed that a large share of this overhead comes from json.Unmarshal and json.Marshal. The built-in encoding/json in Go is considered very slow, and there are many faster encoders and decoders, most with a nearly identical interface, along with benchmarks showing how much faster they are than the built-in encoding/json. Some links to benchmarks: link 1, link 2. Further investigation led me to
RegisterEncoder(JSON_ENCODER, &builtin.JsonEncoder{})
and we hoped that it would let us define an Encoder and use our JSON encoder of choice. Sadly, the critical places, such as the following code, do not use the encoder: https://github.com/nats-io/nats.go/blob/7917595755782af46153b4d89c684efdc03a5259/js.go#L775
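To illustrate why a pluggable encoder is cheap at the call sites, one common pattern is a package-level function alias that defaults to the standard library and can be pointed at a drop-in replacement (e.g. json-iterator's stdlib-compatible API). This is a sketch, not nats.go's actual mechanism; the pubAck type and its fields are invented for the example.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Package-level aliases: call sites use jsonMarshal/jsonUnmarshal, so
// swapping the implementation (e.g. to jsoniter.Marshal from a drop-in
// library) requires changing only these two assignments.
var (
	jsonMarshal   = json.Marshal
	jsonUnmarshal = json.Unmarshal
)

// pubAck is an invented example type, not the actual nats.go struct.
type pubAck struct {
	Stream   string `json:"stream"`
	Sequence uint64 `json:"seq"`
}

func main() {
	b, err := jsonMarshal(pubAck{Stream: "ORDERS", Sequence: 42})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // → {"stream":"ORDERS","seq":42}

	var ack pubAck
	if err := jsonUnmarshal(b, &ack); err != nil {
		panic(err)
	}
	fmt.Println(ack.Stream, ack.Sequence) // → ORDERS 42
}
```

Because many faster encoders deliberately mirror encoding/json's Marshal/Unmarshal signatures, this kind of indirection is the only change an application (or library) needs to make the encoder user-selectable.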
Use Case:
Decreasing the "overhead" of JetStream.
Proposed Change:
Use a faster JSON encoder.
Who Benefits From The Change(s)?
Most users who run heavy consumers or publishers.
Alternative Approaches
Enable users to set the encoder.