Open perhapsmaple opened 9 months ago
This issue is available for anyone to work on. Make sure to reference this issue in your pull request. :sparkles: Thank you for your contribution! :sparkles:
I don't think we'd end up depending on Boost libs, but just for the sake of it, I wanted to mention Boost::JSON as the natural "fast & clean" json library: https://230.jsondocs.prtest.cppalliance.org/libs/json/doc/html/json/benchmarks.html
One option could be for the otel SDK to provide an abstract interface for JSON serialization, with a default implementation based on nlohmann-json, and let the user bring their own implementation which could internally use RapidJSON or Boost::JSON. This would be similar to how we provide HttpClientFactory, with a default implementation based on curl.
This issue was marked as stale due to lack of activity.
I have been exploring different JSON libraries to optimize the performance of my custom pipeline and exporter, which writes traces and logs to files. My primary concern was serialization speed, which the `OtlpHttpExporter` was struggling with under heavy logging. While the `nlohmann::json` library is intuitive and easy to use, it is not very fast at serializing, which becomes particularly evident when writing a lot of logs. Switching to the `rapidjson` library brought a significant 40-50% performance improvement to my logging system.

To benchmark the serialization impact, I modified `OtlpHttpClient` to return `ExportResult::kSuccess` immediately after converting the proto message to JSON and dumping it to a string, and modified `example_otlp_http` accordingly.

Result for the `nlohmann::json` implementation: 127145 milliseconds
Result for the `rapidjson` implementation: 82763 milliseconds

The code is available at: https://github.com/perhapsmaple/opentelemetry-cpp/tree/json-benchmark

Not final; I think some more changes could be made to make it a little more efficient.
I think this is an easy avenue for improvement, and we should consider benchmarking more thoroughly with both libraries. Happy to hear your thoughts and feedback.