We currently use JSON and serde_json to serialize/deserialize models.
Experiments show that other formats might be better suited for the task. For instance, on a 7.5 MB Excel file:
Serialize to JSON: 604.852904 ms
Deserialize from JSON: 1.306669155 s
Serialize to binary: 112.957899 ms
Deserialize from binary: 225.933871 ms
Note that in a few places we will also need to use something other than JSON for serializing/deserializing, but that would remove the whole serde dependency.
It would be interesting to see the wasm file size after this change.
See the remote branch for some experiments:
https://github.com/ironcalc/IronCalc/tree/experiment/nicolas-bincode
Just run
cargo build --release
and then run the resulting binary on a large workbook.
See also:
https://github.com/djkoloski/rust_serialization_benchmark