Open SerVB opened 4 years ago
Still relevant
Has the cause of the slow down been discovered yet?
@SerVB I'm curious if you know whether this is still relevant with the IR compiler on Kotlin 1.7.x
relevant even for 1.8.0-RC2
@ScottPierce, idk, I don't use K/JS atm. But you can try to compile the sample repo mentioned above with newer versions and check it yourself: https://github.com/SerVB/kx-serialization-js-benchmark
I updated it to the latest Kotlin - https://github.com/ScottPierce/kx-serialization-js-benchmark
The problem seems to have gotten worse, not better.
@sandwwraith This does seem to be odd. Is this something that can even be fixed, or is it just a result of some performance issues with Kotlin JS?
Kotlin/JS is not really fast, indeed. Besides, there's always a penalty for converting between JS objects and Kotlin classes, since most of them are not represented natively (e.g. List is a whole special class; only Array is mapped to a JS array).
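The Array-vs-List point above can be illustrated with a small Kotlin/JS sketch (function and field names here are illustrative, not from the issue's protocol):

```kotlin
// Kotlin/JS sketch of the conversion penalty mentioned above.
// An Array<T> is backed directly by a JS array, so reinterpreting the
// output of JSON.parse as an Array is free; a Kotlin List is an ordinary
// class, so producing one requires an extra O(n) copy step.
fun readNames(raw: String): List<String> {
    val parsed: dynamic = JSON.parse<dynamic>(raw)  // native JS parse
    val arr = parsed.unsafeCast<Array<String>>()    // no copy: same JS array
    return arr.toList()                             // the Kotlin-side cost
}
```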
@sandwwraith However, the manual Kotlin deserialization result is still in line with the plain-JS results: it's an order of magnitude faster.
That implies that matching performance is possible.
JSON deserialization is extremely slow, at least on the JS target. I've worked around it by writing the same deserializer by hand.
It is tedious to write and maintain, but it gives a boost of about 10x. Without the fix, deserialization took about 80% of the time spent processing a message from the server; after the fix it's only about 30%, which is acceptable for now.
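The issue's actual hand-written code is not shown here; a minimal sketch of the approach, with a hypothetical message shape and field names, could look like this. The idea is to call the browser's JSON.parse once and then read fields off the resulting dynamic object directly, instead of going through the generic decoder machinery:

```kotlin
// Hypothetical message type; the real protocol in the issue is larger.
data class ServerEvent(val id: Int, val kind: String, val payload: List<Double>)

fun parseServerEvent(raw: String): ServerEvent {
    val obj: dynamic = JSON.parse<dynamic>(raw)       // native, fast JS parse
    val arr = obj.payload.unsafeCast<Array<Double>>() // free reinterpret
    return ServerEvent(
        id = obj.id.unsafeCast<Int>(),
        kind = obj.kind.unsafeCast<String>(),
        payload = arr.toList(),                       // copy only where a List is required
    )
}
```

The tedium the comment mentions comes from repeating this field-by-field reading for every class in the protocol, with no compiler help when the protocol changes.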
To Reproduce

For convenience, I've extracted the protocol and deserializer parts from our app into a sample repo: https://github.com/SerVB/kx-serialization-js-benchmark.
I've taken a real JSON which is sent in our app with a size of 200 KB and tested the time of the following deserialization methods:
kotlinx.serialization.json.Json.parse
kotlinx.serialization.DynamicObjectParser.parse
JSON.parse
and then reading the result as plain JS objects.

It turns out that the manual variants are about 20x faster. On smaller JSON payloads the difference is smaller: in our app, it is 10x on average.
You can try it yourself: https://servb.github.io/kx-serialization-js-benchmark/index.html. Just click a button and it will print test results. You can find the results I've received in the README.
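A rough sketch of such a comparison on the JS target might look like the following (the data class and payload are illustrative, not the issue's real 200 KB protocol; note that in newer kotlinx.serialization releases Json.parse became Json.decodeFromString, and DynamicObjectParser was replaced by the Json.decodeFromDynamic extension):

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.decodeFromString
import kotlinx.serialization.json.Json
import kotlin.js.Date

@Serializable
data class Point(val x: Double, val y: Double)

// Times kotlinx deserialization against a bare native JSON.parse,
// which parses the text but does no mapping to Kotlin classes.
fun benchmark(raw: String, iterations: Int = 100) {
    var t = Date.now()
    repeat(iterations) { Json.decodeFromString<List<Point>>(raw) }
    println("kotlinx:    ${Date.now() - t} ms")

    t = Date.now()
    repeat(iterations) { JSON.parse<Array<dynamic>>(raw) }
    println("JSON.parse: ${Date.now() - t} ms")
}
```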
Possible solution

I think there should be an option to generate inlined code for the deserializer. It's what I've done by hand, but autogeneration is what I need here, because I believe there are still errors left in my manual code. Autogeneration would also keep the deserializer up to date whenever we change our protocol.
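To make the proposal concrete, this is a sketch of what such generated "inlined" output might look like for a simple class. This is hypothetical: kotlinx.serialization does not currently emit code like this for the JS target. Instead of walking a generic SerialDescriptor at runtime, the compiler plugin would emit straight-line field reads:

```kotlin
data class User(val id: Int, val name: String)

// Hypothetical generated function: one native JSON.parse,
// then direct property access with no descriptor lookups.
fun User_deserialize(raw: String): User {
    val d: dynamic = JSON.parse<dynamic>(raw)
    return User(
        id = d.id.unsafeCast<Int>(),
        name = d.name.unsafeCast<String>(),
    )
}
```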
Environment