Kotlin / kotlinx.serialization

Kotlin multiplatform / multi-format serialization
Apache License 2.0

Pain point: JSON deserialization speed is slow in browser #907

Open SerVB opened 4 years ago

SerVB commented 4 years ago

JSON deserialization is extremely slow, at least on the JS target. I've written the equivalent deserializer by hand, like this:

  override fun decode(string: String): List<ServerEvent> {
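    // kotlin.js.JSON.parse delegates to the browser's native JSON.parse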
    val jsonArray = JSON.parse<Array<Array<Any>>>(string)
    return jsonArray.map { it.toEvent() }
  }

  private fun Array<Any>.toEvent(): ServerEvent {
    val type = this[0] as String
    val content = this[1].unsafeCast<Json>()
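    // Json here is kotlin.js.Json: a plain JS object used as a string-keyed map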

    return when (type) {
      "a" -> ServerImageDataReplyEvent(
        content["a"].unsafeCast<Array<Any>>().toImageId(),
        content["b"].unsafeCast<Array<Any>>().toImageData()
      )
      "b" -> ServerPingReplyEvent(content["a"] as Int, content["b"] as Int)
      "c" -> ServerClipboardEvent(content["a"] as String)
// ...

It is tedious to write and maintain, but it gives a speedup of about 10x. Without this workaround, deserialization takes about 80% of the time spent processing a message from the server; with it, the share drops to about 30%, which is acceptable for now.
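
For comparison, a minimal sketch of the generated path this is measured against, assuming a plain @Serializable sealed hierarchy decoded with Json.decodeFromString. The class and field names are hypothetical stand-ins taken from the snippet above, and the real wire format (the compact ["a", {...}] arrays handled there) differs from the default polymorphic JSON this sketch would expect, so treat it purely as an illustration of the slow, generated path:

    import kotlinx.serialization.Serializable
    import kotlinx.serialization.builtins.ListSerializer
    import kotlinx.serialization.json.Json

    // Hypothetical stand-ins for the real protocol classes.
    @Serializable
    sealed class ServerEvent

    @Serializable
    data class ServerPingReplyEvent(val a: Int, val b: Int) : ServerEvent()

    @Serializable
    data class ServerClipboardEvent(val a: String) : ServerEvent()

    // The generated path: the compiler-derived serializers parse the whole string.
    fun decodeGenerated(string: String): List<ServerEvent> =
        Json.decodeFromString(ListSerializer(ServerEvent.serializer()), string)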

To Reproduce

For convenience, I've extracted the protocol and deserializer part from our app and created a sample repo: https://github.com/SerVB/kx-serialization-js-benchmark.

I've taken a real 200 KB JSON payload that our app sends and measured the time of several deserialization methods, both kotlinx.serialization-based and manual.

It turns out that the manual variants are about 20x faster. On smaller JSON payloads the difference shrinks: in our app it is about 10x on average.

You can try it yourself: https://servb.github.io/kx-serialization-js-benchmark/index.html. Just click a button and it will print test results. You can find the results I've received in the README.

Possible solution

I think there should be an option to generate inlined code for the deserializer. That is what I've done by hand, but I need autogeneration here because I believe my manual code still has errors left in it, and autogeneration would also keep the deserializer up to date whenever we change our protocol.
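
Not the compiler-generated inline code asked for above, but a related knob worth measuring: kotlinx-serialization-json ships an experimental decodeFromDynamic entry point for the JS target (assuming the version in use provides it). It lets the browser's native JSON.parse do the string parsing and only runs the generated serializers over the resulting JS object. A sketch, reusing the hypothetical ServerEvent hierarchy from the earlier snippet:

    import kotlinx.serialization.ExperimentalSerializationApi
    import kotlinx.serialization.builtins.ListSerializer
    import kotlinx.serialization.json.Json
    import kotlinx.serialization.json.decodeFromDynamic

    // JS target only: native JSON.parse first, then map the JS object into
    // Kotlin classes through the generated serializers. Whether this actually
    // closes the gap here would have to be benchmarked.
    @OptIn(ExperimentalSerializationApi::class)
    fun decodeViaDynamic(string: String): List<ServerEvent> {
        val parsed: dynamic = JSON.parse<Any>(string)
        return Json.decodeFromDynamic(ListSerializer(ServerEvent.serializer()), parsed)
    }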

Environment

Komdosh commented 2 years ago

Still relevant

ScottPierce commented 1 year ago

Has the cause of the slowdown been discovered yet?

@SerVB I'm curious if you know whether this is still relevant with the IR compiler on Kotlin 1.7.x.

Komdosh commented 1 year ago

> Has the cause of the slowdown been discovered yet?
>
> @SerVB I'm curious if you know whether this is still relevant with the IR compiler on Kotlin 1.7.x.

Still relevant even for 1.8.0-RC2.

SerVB commented 1 year ago

@ScottPierce, I don't know; I don't use Kotlin/JS at the moment. But you can compile the sample repo mentioned above with newer versions and check for yourself: https://github.com/SerVB/kx-serialization-js-benchmark

ScottPierce commented 1 year ago

I updated it to the latest Kotlin: https://github.com/ScottPierce/kx-serialization-js-benchmark

The problem seems to have gotten worse, not better.

@sandwwraith This does seem odd. Is this something that can even be fixed, or is it just a result of general Kotlin/JS performance issues?

sandwwraith commented 1 year ago

Kotlin/JS is indeed not particularly fast. Besides, there is always a penalty for converting between JS objects and Kotlin classes, since most Kotlin classes are not represented natively (e.g. List is a dedicated class, while only Array maps directly to a JS array).
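
To make that representation difference concrete, a small illustrative sketch for the JS target (not from the thread); the printed values reflect stdlib implementation details and may vary across Kotlin versions:

    // Kotlin/JS only.
    fun main() {
        val asArray: Array<Int> = arrayOf(1, 2, 3) // compiles to a plain JS array
        val asList: List<Int> = listOf(1, 2, 3)    // backed by a Kotlin collection class

        println(js("Array").isArray(asArray))        // expected: true, a native JS array
        println(js("Array").isArray(asList))         // expected: false, a Kotlin wrapper object
        println(asList.asDynamic().constructor.name) // some Kotlin class name, not "Array"
    }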

ScottPierce commented 1 year ago

@sandwwraith However, the manual Kotlin deserialization result is still in line with the JS results. It's still an order of magnitude faster.

That implies that matching performance is possible.