protobufjs / protobuf.js

Protocol Buffers for JavaScript & TypeScript.

Create a browser based benchmark and host it on jsPerf #642

Open robin-anil opened 7 years ago

robin-anil commented 7 years ago

Feature request

I would love to have some standardized protobuf.js benchmarks: a hosted version of the benchmarks on jsPerf.com, so we can understand the performance profile on other browsers and on low-powered devices.

For the following operations

Across Variations

Messages with int-only, string-only, and enum-only fields; mixed fields, arrays, mixed arrays, messages, mixed messages, extensions, and nested protos (a rough sketch of such variations follows at the end of this comment).

Across Sizes

Stretch goals

Happy to help and collaborate; we'd need to figure out a versioning and hosting story.
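To make the variation matrix concrete, here is a hypothetical sketch of two such test messages in the same protobuf.parse style used later in this thread; the names and field layouts are illustrative only, not part of the request:

// Hypothetical per-variation messages for the benchmark matrix; shapes are illustrative only.
var variations = protobuf.parse("syntax = \"proto3\";\
message IntsOnly {\
    int32  a = 1;\
    uint32 b = 2;\
    sint32 c = 3;\
}\
message Mixed {\
    string         name   = 1;\
    repeated int32 values = 2;\
    IntsOnly       inner  = 3;\
}").root;

var IntsOnly = variations.lookup("IntsOnly");
var Mixed    = variations.lookup("Mixed");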

dcodeIO commented 7 years ago

Tried porting what we currently have to jsPerf as a starting point, but I can't edit anything there. It seems they have a long-standing issue with "Not all tests inserted" errors when editing. IIRC I had this issue a month ago already.

For what it's worth, that's about what I started with:

Preparation code HTML

<script src="//cdn.rawgit.com/dcodeIO/long.js/3.2.0/dist/long.js"></script>
<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/master/dist/protobuf.js"></script>

Define setup for all tests

var data = {
    "string" : "Lorem ipsum dolor sit amet.",
    "uint32" : 9000,
    "inner" : {
        "int32" : 20161110,
        "innerInner" : {
            "long" : {
                "low": 1051,
                "high": 151234,
                "unsigned": false
            },
            "enum" : 1,
            "sint32": -42
        },
        "outer" : {
            "bool" : [ true, false, false, true, false, false, true ],
            "double": 204.8
        }
    },
    "float": 0.25
};

var root = protobuf.parse("syntax = \"proto3\";\
message Test {\
    string  string = 1;\
    uint32  uint32 = 2;\
    Inner   inner  = 3;\
    float   float  = 4;\
    message Inner {\
        int32      int32      = 1;\
        InnerInner innerInner = 2;\
        Outer      outer      = 3;\
        message InnerInner {\
            int64  long   = 1;\
            Enum   enum   = 2;\
            sint32 sint32 = 3;\
        }\
    }\
    enum Enum {\
        ONE   = 0;\
        TWO   = 1;\
        THREE = 2;\
        FOUR  = 3;\
        FIVE  = 4;\
    }\
}\
message Outer {\
    repeated bool bool = 1;\
    double double = 2;\
}").root;

var Test = root.lookup("Test");

var buf = Test.encode(data).finish();
var msg = Test.fromObject(data);
var obj = Test.toObject(msg);

Code snippet 1: Test.encode

Test.encode(data).finish();

Code snippet 2: Test.decode

Test.decode(buf);

Code snippet 3: Test.encode + decode

 Test.decode(Test.encode(data).finish());

Code snippet 4: Test.verify

Test.verify(data);

Code snippet 5: Test.fromObject

Test.fromObject(obj);

Code snippet 6: Test.toObject

Test.toObject(msg);
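As a fallback while jsPerf editing is broken, the same six snippets can also be run locally with Benchmark.js, the library jsPerf itself is built on. A minimal sketch, assuming the setup above has already run and Benchmark.js is loaded:

// Local harness with Benchmark.js, reusing data/buf/msg/obj/Test from the setup above.
var suite = new Benchmark.Suite();

suite
    .add("Test.encode", function() { Test.encode(data).finish(); })
    .add("Test.decode", function() { Test.decode(buf); })
    .add("Test.encode + decode", function() { Test.decode(Test.encode(data).finish()); })
    .add("Test.verify", function() { Test.verify(data); })
    .add("Test.fromObject", function() { Test.fromObject(obj); })
    .add("Test.toObject", function() { Test.toObject(msg); })
    .on("cycle", function(event) { console.log(String(event.target)); }) // e.g. "Test.decode x 1,234,567 ops/sec"
    .run();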
robin-anil commented 7 years ago

Thanks, I see the issue https://github.com/jsperf/jsperf.com/issues/236

I managed to save 2 tests here, https://jsperf.com/protobufjs-robin, just to get an initial performance profile, and it's the mobile performance that is lacking.

On Chrome 55 on a MacBook Pro (2013, 2.3 GHz Intel Core i7):

(screenshot of benchmark results)

On Chrome 55 on a Nexus 6:

(screenshot of benchmark results)

On Safari 10 on a MacBook Pro (2013, 2.3 GHz Intel Core i7):

(screenshot of benchmark results)

On Firefox 50 on a MacBook Pro (2013, 2.3 GHz Intel Core i7):

(screenshot of benchmark results)
dcodeIO commented 7 years ago

So, the Nexus 6 is at about 1/10th of a desktop. Not sure if that can even be improved, considering that mobile CPUs are quite a different beast. Still, that's 50-70k ops/s.

It would be helpful to see a comparison to JSON there, to know whether there's actually something to improve or whether this is just what to expect (really hope jsPerf gets their stuff together soon).
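Such a baseline could reuse the same `data` object by adding a pre-stringified variable to the setup and one more snippet; a rough sketch (the `json` variable is an assumption, not part of the setup above):

var json = JSON.stringify(data); // add to the setup alongside buf/msg/obj

// JSON round trip, directly comparable to code snippet 3 (Test.encode + decode)
JSON.parse(JSON.stringify(data));

// Or measure separately: JSON.stringify(data) vs. Test.encode, and JSON.parse(json) vs. Test.decode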

robin-anil commented 7 years ago

One of our apps downloads close to a megabyte of proto data in chunks (of about 100KB each) to the client, then decodes it and converts it to JSON. Our load time went from ~5-7s to ~1-1.5s with this upgrade 🎉 🤑 💰
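A minimal sketch of that decode-then-convert step using protobuf.js's conversion options; the message type and the option values shown are assumptions, not the app's actual code:

// Sketch: turn one downloaded chunk (a Uint8Array) into a plain, JSON-ready object.
function chunkToObject(chunkBytes) {
    var message = Test.decode(chunkBytes);  // wire format -> message instance
    return Test.toObject(message, {
        longs: String,                      // represent 64-bit values as decimal strings
        enums: String,                      // use enum names instead of numbers
        defaults: true                      // include default-valued fields
    });
}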

houfeng0923 commented 5 years ago

I put together a simple jsPerf test here:

https://jsperf.com/protobuf-vs-json-parse

Native JSON.parse is faster than protobuf decode in Chrome and Firefox, but protobuf decode is about 3x faster in a Node environment.
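To reproduce the Node side of that comparison, a rough timing sketch (assumes `Test` and `data` have been set up as in dcodeIO's earlier comment; the iteration count is arbitrary and exact numbers will vary by machine and Node version):

// Quick Node timing loop; not a rigorous benchmark like Benchmark.js.
var N = 100000;
var buf = Test.encode(data).finish();
var json = JSON.stringify(data);

console.time("Test.decode");
for (var i = 0; i < N; i++) Test.decode(buf);
console.timeEnd("Test.decode");

console.time("JSON.parse");
for (var i = 0; i < N; i++) JSON.parse(json);
console.timeEnd("JSON.parse");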