ipfs / protons

Protocol Buffers for Node.js and the browser without eval

Slow compared to protobufjs #51

Closed wemeetagain closed 2 years ago

wemeetagain commented 2 years ago

Running a simple benchmark against gossipsub RPC encoding/decoding from before and after switching from protobufjs to protons shows a significant degradation in performance: ~20x slower encoding, ~10x slower decoding.

The performance degradation is serious enough that js-libp2p-gossipsub may need to either switch away from this library or find a refactor here that achieves comparable performance.

with protobufjs:

  RPC
    ✔ encode                                                              401606.4 ops/s    2.490000 us/op        -   22116142 runs   60.8 s
    ✔ decode                                                              452284.0 ops/s    2.211000 us/op        -   24709315 runs   60.9 s

with protons:

  RPC
    ✔ encode                                                              24618.41 ops/s    40.62000 us/op        -    1458781 runs   60.1 s
    ✔ decode                                                              41540.31 ops/s    24.07300 us/op        -    2439588 runs   60.0 s
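For reference, numbers like these can be gathered with a tiny ops/s harness. This is a hypothetical sketch (the `opsPerSecond` helper and the stand-in workload are illustrative, not the actual benchmark suite used above); the real measurement would call the generated `RPC.encode`/`RPC.decode` from protons or protobufjs:

```javascript
// Hypothetical micro-benchmark harness, not the actual benchmark suite used
// above. Replace the stand-in workload with e.g. () => RPC.decode(bytes)
// from the generated protons or protobufjs code to reproduce the comparison.
function opsPerSecond (fn, durationMs = 200) {
  const end = Date.now() + durationMs
  let runs = 0
  while (Date.now() < end) {
    fn()
    runs++
  }
  return runs / (durationMs / 1000)
}

// Stand-in workload: copy a 1 KiB buffer.
const payload = new Uint8Array(1024).fill(7)
console.log(`${opsPerSecond(() => payload.slice()).toFixed(0)} ops/s`)
```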
BigLep commented 2 years ago

@wemeetagain : thanks for filing. We're curious in a real-world operation where there are network calls, does this performance hit from the encode/decode have much impact?

wemeetagain commented 2 years ago

It has a big impact, though there are some other things that have higher impacts.

Decoding gossipsub RPCs goes from 0.32% of runtime to 10.21% of runtime.

dapplion commented 2 years ago

@BigLep It has a very serious impact, to the point it's blocking our incorporation of the new libp2p version.

twoeths commented 2 years ago

A benchmark with this simple message:

syntax = "proto3";

message Message {
  optional bytes data = 2;
  optional string topic = 4;
}
message
    ✔ full message (data + topic)                                      1.285347e+7 ops/s    77.80000 ns/op   x1.056    3645492 runs   30.0 s
    ✔ only data                                                        8.025682e+7 ops/s    12.46000 ns/op   x0.984   19204647 runs   30.1 s
    ✔ only topic                                                       1.346801e+7 ops/s    74.25000 ns/op   x0.928    3699726 runs   30.1 s
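For context on what encode/decode actually do for this message, here is a hand-rolled sketch of its protobuf wire format. The `encodeVarint` and `encodeMessage` helpers are illustrative only, not the protons implementation (which generates equivalent but more general machinery):

```javascript
// Illustrative encoder for the Message above, writing the raw wire format:
// each present field is a tag byte (field number << 3 | wire type 2 for
// length-delimited), a varint length, then the payload bytes.
function encodeVarint (n, out) {
  while (n > 0x7f) {
    out.push((n & 0x7f) | 0x80)
    n >>>= 7
  }
  out.push(n)
}

function encodeMessage ({ data, topic }) {
  const out = []
  if (data != null) {
    out.push(0x12) // field 2, wire type 2 (length-delimited)
    encodeVarint(data.length, out)
    out.push(...data)
  }
  if (topic != null) {
    const topicBytes = new TextEncoder().encode(topic)
    out.push(0x22) // field 4, wire type 2 (length-delimited)
    encodeVarint(topicBytes.length, out)
    out.push(...topicBytes)
  }
  return Uint8Array.from(out)
}

const encoded = encodeMessage({ data: Uint8Array.from([1, 2, 3]), topic: 'news' })
// tag 0x12, len 3, data bytes, then tag 0x22, len 4, 'news' bytes
console.log(Array.from(encoded))
```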

It shows that most of the time is spent decoding the string, which is due to the performance of toString() in uint8arrays:

  // imports assumed: itBench from @dapplion/benchmark, toString from the
  // uint8arrays package, and utf8_read from protobuf.js's utf8 module
  itBench({
    id: 'string decode',
    fn: () => {
      toString(stringBytes)
    },
    runsFactor: 100
  })

  itBench({
    id: 'string decode using protobuf',
    fn: () => {
      utf8_read(stringBytes, 0, stringBytes.length)
    },
    runsFactor: 100
  })

The toString() function in uint8arrays is roughly 10x slower than protobuf's utf8 reader:

✔ string decode                                                    1.831166e+7 ops/s    54.61000 ns/op        -    5138306 runs   30.1 s
✔ string decode using protobuf                                     2.288330e+8 ops/s    4.370000 ns/op        -   41186303 runs   30.1 s
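A plausible explanation, sketched below under the assumption that uint8arrays routes through `TextDecoder` while protobuf.js hand-rolls the decode: `TextDecoder` carries noticeable per-call overhead that dominates on short strings, whereas a manual `String.fromCharCode` loop (the protobuf.js approach) avoids it. Both `decodeWithTextDecoder` and `decodeManually` are simplified illustrations, not the actual library code:

```javascript
// Strategy 1 (assumed uint8arrays path): delegate to TextDecoder.
const decoder = new TextDecoder()

function decodeWithTextDecoder (bytes) {
  return decoder.decode(bytes)
}

// Strategy 2 (simplified protobuf.js-style reader): walk the UTF-8 byte
// sequences manually and build the string via String.fromCharCode, avoiding
// TextDecoder's per-call overhead, which dominates on short strings.
function decodeManually (bytes) {
  const units = []
  let i = 0
  while (i < bytes.length) {
    const b = bytes[i++]
    if (b < 0x80) {
      units.push(b) // 1-byte ASCII
    } else if (b < 0xe0) {
      units.push(((b & 0x1f) << 6) | (bytes[i++] & 0x3f)) // 2-byte sequence
    } else if (b < 0xf0) {
      units.push(((b & 0x0f) << 12) | ((bytes[i++] & 0x3f) << 6) | (bytes[i++] & 0x3f)) // 3-byte
    } else {
      // 4-byte sequence: emit a UTF-16 surrogate pair
      const cp = (((b & 0x07) << 18) | ((bytes[i++] & 0x3f) << 12) |
                  ((bytes[i++] & 0x3f) << 6) | (bytes[i++] & 0x3f)) - 0x10000
      units.push(0xd800 + (cp >> 10), 0xdc00 + (cp & 0x3ff))
    }
  }
  return String.fromCharCode(...units)
}

const bytes = new TextEncoder().encode('gossipsub topic ✓')
console.log(decodeWithTextDecoder(bytes) === decodeManually(bytes)) // true
```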

We also need to figure out why decoding the bytes data is slow, as decoding the same message with protobuf takes ~5ns (vs the 77ns shown above).

github-actions[bot] commented 2 years ago

Oops, seems like we needed more information for this issue, please comment with more details or this issue will be closed in 7 days.

BigLep commented 2 years ago

To be clear, I removed the "need/author-input" label. I know this work is in progress.

achingbrain commented 2 years ago

This should have been resolved by https://github.com/ipfs/protons/pull/58 - please try again with protons@5.0.0

achingbrain commented 2 years ago

The benchmarks in this repo show that protons is comparable in performance to protobuf.js so I'm going to close this. Please submit a PR with updated benchmarks if you're still seeing this issue.