Hello @hirotakan, I am running an experiment comparing the speed of JSON, MessagePack, and Protobuf. The experiment is described below:
Preparing:
JSON: encoding and decoding with Apple's Codable
MessagePack: your library
Protobuf: Apple's swift-protobuf
3 sets of JSON data (small / medium / large)
What I have done:
Tested compressed storage size
Tested serializing / deserializing speed
Each test encodes and decodes the model 100 times and takes the average time.
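For reference, the timing loop looks roughly like this (a minimal sketch with a hypothetical `Model` type; only the JSON/Codable case is shown, using Foundation's `JSONEncoder`/`JSONDecoder` — the MessagePack and Protobuf cases would swap in their own coders):

```swift
import Foundation

// Hypothetical model standing in for the real benchmark data set.
struct Model: Codable {
    let id: Int
    let name: String
}

// Run `body` `iterations` times and return the average wall-clock time in seconds.
func averageTime(iterations: Int = 100, _ body: () throws -> Void) rethrows -> TimeInterval {
    let start = Date()
    for _ in 0..<iterations {
        try body()
    }
    return Date().timeIntervalSince(start) / Double(iterations)
}

let models = (0..<1_000).map { Model(id: $0, name: "item-\($0)") }
let encoder = JSONEncoder()
let decoder = JSONDecoder()

do {
    let data = try encoder.encode(models)
    let encodeAvg = try averageTime { _ = try encoder.encode(models) }
    let decodeAvg = try averageTime { _ = try decoder.decode([Model].self, from: data) }
    print("JSON encode avg: \(encodeAvg)s, decode avg: \(decodeAvg)s")
} catch {
    print("Benchmark failed: \(error)")
}
```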
For serializing and deserializing the medium and large sets, the results rank as:
MessagePack < Protobuf < JSON (MessagePack slowest)
In some tests MessagePack is more than 300% slower. This confuses me, since most articles on the internet say that JSON is the slowest on medium and large data sets. So I think either something is wrong with my experiment, or the data set is not large or complex enough. One of the articles I have seen:
https://medium.com/unbabel/the-need-for-speed-experimenting-with-message-serialization-93d7562b16e4
I have attached the repo. Could you take a look at it?
https://github.com/gloryluu/swift_serializing_benchmark
Thank you and have a nice day @hirotakan

@gloryluu
Sorry for the late reply, and thank you for checking.
If you change the optimization level of the Swift compiler to "-O", I think performance will improve.
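For what it's worth, the "-O" suggestion usually amounts to benchmarking a release build rather than a debug build; assuming the benchmark repo is a SwiftPM package, that would be:

```shell
# SwiftPM release builds compile with -O; debug builds use -Onone,
# which can make Codable/serialization benchmarks several times slower.
swift build -c release
swift run -c release

# Or when compiling a single file directly:
swiftc -O main.swift -o bench
```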