apache / arrow-rs

Official Rust implementation of Apache Arrow
https://arrow.apache.org/
Apache License 2.0

Use custom thrift decoder to improve speed of parsing parquet metadata #5854

Open alamb opened 3 weeks ago

alamb commented 3 weeks ago

Is your feature request related to a problem or challenge? Please describe what you are trying to do.

Part of https://github.com/apache/arrow-rs/issues/5853

Parsing the parquet metadata takes substantial time and most of that time is spent in decoding the thrift format (@XiangpengHao is quantifying this in https://github.com/apache/arrow-rs/issues/5770)

Describe the solution you'd like

Improve the thrift decoder speed.

Describe alternatives you've considered

@jhorstmann reports on https://github.com/apache/arrow-rs/issues/5775 that he made a prototype of this:

          I had an attack of "not invented here" syndrome the last few days 😅 and worked on an alternative code generator for thrift, that would allow me to more easily try out some changes to the generated code. The repo can be found at <https://github.com/jhorstmann/compact-thrift/> and the output for `parquet.thrift` at <https://github.com/jhorstmann/compact-thrift/blob/main/src/main/rust/tests/parquet.rs>.

The current output is still doing allocations for string and binary, but running the benchmarks from https://github.com/tustvold/arrow-rs/tree/thrift-bench shows some nice improvements. This is the comparison with current arrow-rs code, so both versions should be doing the same amount of allocations:

decode metadata      time:   [32.592 ms 32.645 ms 32.702 ms]

decode metadata new  time:   [17.440 ms 17.476 ms 17.532 ms]

So incidentally very close to that 2x improvement.

The main difference in the code should be avoiding most of the abstractions from TInputProtocol and avoiding stack moves by directly writing into default-initialized structs instead of moving from local variables.

Originally posted by @jhorstmann in https://github.com/apache/arrow-rs/issues/5775#issuecomment-2131307588
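The quoted approach — bypassing the TInputProtocol abstraction and filling a default-initialized struct in place rather than assembling values in locals and moving them out — can be sketched roughly as follows. All names and the toy "field stream" below are illustrative, not the actual compact-thrift API:

```rust
// Illustrative only: names and the toy field stream are not the real
// compact-thrift API.
#[derive(Default, Debug, PartialEq)]
struct ColumnChunkMeta {
    num_values: i64,
    total_byte_size: i64,
}

// Fill a default-initialized struct in place instead of building a value in
// locals and returning it by move: for large metadata structs this avoids
// repeated stack-to-stack copies of hundreds of bytes per return.
fn fill(target: &mut ColumnChunkMeta, fields: &[(i16, i64)]) {
    for &(id, value) in fields {
        match id {
            1 => target.num_values = value,
            2 => target.total_byte_size = value,
            _ => {} // unknown field ids are skipped
        }
    }
}

fn main() {
    let mut meta = ColumnChunkMeta::default();
    fill(&mut meta, &[(1, 100), (2, 4096)]);
    assert_eq!(meta, ColumnChunkMeta { num_values: 100, total_byte_size: 4096 });
}
```

The caller allocates (or reuses) the target once and the decoder only writes individual fields, so the struct never transits the stack as a whole value.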


alamb commented 3 weeks ago

FWIW, another possibility is to hand-write a thrift decoder for the parquet metadata rather than relying on a code generator. That would likely result in the fastest decode time, but would also be the hardest to maintain.

jhorstmann commented 3 weeks ago

Thanks @alamb for creating this tracking issue. I've slowly continued working on my code at jhorstmann/compact-thrift and the benchmarks are looking good. So good, in fact, that after adapting it on top of #5777, the performance hotspot shifts to the conversion functions from the generated thrift types to the internal types.

I would love to get some feedback on the code, and whether there would be a preference to integrate the parquet definitions into the arrow-rs repo, or publish them separately.

The generated and runtime code is also structured in a way that it would not be too crazy to write bindings to custom types by hand.

Direct links to the generated code and to the runtime library.

XiangpengHao commented 3 weeks ago

FWIW, by simply moving this field to the heap (i.e., Option<Statistics> -> Option<Box<Statistics>>), we can get a 30% performance improvement (as will be shown in the blog post for #5770). https://github.com/apache/arrow-rs/blob/087f34b70e97ee85e1a54b3c45c5ed814f500b0a/parquet/src/format.rs#L3407

The Option<Statistics> occupies 136 bytes even if the file does not have stats at all (i.e., the field is None); this not only slows down decoding (due to poor memory locality) but also causes high memory consumption when decoding metadata (parquet-rs consumes 10MB memory per MB of metadata).
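The effect of boxing is easy to demonstrate with std::mem::size_of. The Stats struct below is a toy stand-in for the real Statistics (which the comment above measures at 136 bytes); the exact inline size depends on the actual field layout, but a boxed None always costs just one pointer thanks to Rust's niche optimization:

```rust
use std::mem::size_of;

// Toy stand-in for parquet's Statistics; hypothetical layout, chosen only to
// make the struct large like the real one.
#[allow(dead_code)]
struct Stats {
    min: Option<Vec<u8>>,
    max: Option<Vec<u8>>,
    min_value: Option<Vec<u8>>,
    max_value: Option<Vec<u8>>,
    null_count: Option<i64>,
    distinct_count: Option<i64>,
}

fn main() {
    // Inline: every enclosing struct pays the full size even when the value is None.
    println!("Option<Stats>      = {} bytes", size_of::<Option<Stats>>());
    // Boxed: None is represented as a null pointer (niche optimization),
    // so the unset case costs one machine word.
    println!("Option<Box<Stats>> = {} bytes", size_of::<Option<Box<Stats>>>());
    assert_eq!(size_of::<Option<Box<Stats>>>(), size_of::<usize>());
    assert!(size_of::<Option<Stats>>() > size_of::<Option<Box<Stats>>>());
}
```

The trade-off is one heap allocation and a pointer chase whenever the statistics are actually present, which the 30% figure suggests is well worth it for metadata where the field is often absent.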

I think this example motivates custom parquet type definitions and, thus, custom thrift decoder.

alamb commented 3 weeks ago

FWIW, by simply moving this field to the heap (i.e., Option<Statistics> -> Option<Box<Statistics>>), we can get a 30% performance improvement (as will be shown in the blog post for #5770).

This code is in https://github.com/apache/arrow-rs/pull/5856 for anyone who is curious

alamb commented 3 weeks ago

I would love to get some feedback on the code, and whether there would be a preference to integrate the parquet definitions into the arrow-rs repo, or publish them separately.

Hi @jhorstmann -- I had a look at https://github.com/jhorstmann/compact-thrift/tree/main (very cool)

Some initial reactions: writing a code generator in Kotlin is a neat idea, but I think it might make the barrier to contribution high (people would now need to know both Rust and Kotlin, plus the associated toolchains, etc.).

Also, I keep thinking: if we are going to have a parquet-rs specific implementation, why use a code generator at all? Maybe we could simply hand-code a decoder directly that uses the runtime library.

Given how infrequently the parquet spec changes, a hand-rolled parser might be reasonable (though I will admit that the generated format.rs is substantial 🤔). We can probably ensure compatibility with round-trip testing against the generated Rust code 🤔

jhorstmann commented 2 weeks ago

Writing a code generator in Kotlin is a neat idea, but I think it might make the barrier to contribution high (now people need to know Rust and Kotlin (and the associated toolchains, etc)

I agree; in the context of arrow-rs this is probably a bigger barrier to contribution than the existing C++ based thrift code generator.

Maybe the amount of code could be simplified and made easier to change by hand with the use of some macros. The trickiest part of the code generation, and the most difficult to replicate in a macro, might be deciding which structs require lifetime annotations.

alamb commented 2 weeks ago

The more I think about this, the more I am convinced that the fastest thing to do would be to decode directly from thrift into the parquet-rs structs. Perhaps we could follow the tape decoding model of the CSV or JSON parsers in this repo 🤔

Decoding to intermediate thrift structures that are then thrown away seems like an obvious source of improvement.
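A rough sketch of that direction, using a toy wire format rather than the real thrift compact protocol (which uses varints, zigzag encoding, and field-id deltas): the decoder emits a callback per field, and the caller writes straight into its own internal type, so no intermediate generated struct is ever materialized. All names here are illustrative, not arrow-rs API:

```rust
// Toy wire format: repeated [field_id: 1 byte][len: 1 byte][payload: len bytes].
// Only the callback-instead-of-intermediate-struct shape matters here.
fn decode(buf: &[u8], mut on_field: impl FnMut(u8, &[u8])) {
    let mut pos = 0;
    while pos + 2 <= buf.len() {
        let id = buf[pos];
        let len = buf[pos + 1] as usize;
        on_field(id, &buf[pos + 2..pos + 2 + len]);
        pos += 2 + len;
    }
}

// Hypothetical internal type the caller fills directly.
#[derive(Default, Debug, PartialEq)]
struct ColumnMeta {
    num_values: i64,
    max_def_level: i64,
}

fn main() {
    // field 1 = 100 (2 big-endian bytes), field 2 = 7 (1 byte)
    let wire = [1u8, 2, 0, 100, 2, 1, 7];
    let mut meta = ColumnMeta::default();
    decode(&wire, |id, payload| {
        // Interpret the payload as a big-endian integer.
        let v = payload.iter().fold(0i64, |acc, &b| (acc << 8) | b as i64);
        match id {
            1 => meta.num_values = v,
            2 => meta.max_def_level = v,
            _ => {} // unknown fields are skipped, as thrift requires
        }
    });
    assert_eq!(meta, ColumnMeta { num_values: 100, max_def_level: 7 });
}
```

Because the payload is borrowed from the input buffer, strings and binary could also be handled without allocation when the target type supports borrowing.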

jhorstmann commented 2 weeks ago

It occurred to me that the thrift definitions consist entirely of valid rust tokens, and so should be parseable using declarative macros. The result of that experiment can be seen in #5909, the complete macro can be found at https://github.com/jhorstmann/compact-thrift/blob/main/src/main/rust/runtime/src/macros.rs
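To illustrate the trick in a much-simplified form (the linked macro additionally handles optional fields, lifetimes, and real thrift field rules): a line like `1: required i32 num_values` tokenizes as a Rust integer literal, punctuation, and identifiers, so a `macro_rules!` matcher can pick it apart directly. The sketch below requires a `;` field terminator, which real thrift IDL does not:

```rust
// Simplified sketch of a declarative macro over thrift-IDL-shaped tokens.
macro_rules! thrift_struct {
    (struct $name:ident { $( $id:literal : $req:ident $ty:ident $field:ident ; )* }) => {
        #[derive(Debug, Default, PartialEq)]
        pub struct $name {
            $( pub $field: $ty, )*
        }
    };
}

// The body is (almost) verbatim thrift IDL, accepted as Rust tokens.
thrift_struct! {
    struct PageHeader {
        1: required i32 uncompressed_page_size;
        2: required i32 compressed_page_size;
    }
}

fn main() {
    let h = PageHeader { uncompressed_page_size: 1024, compressed_page_size: 512 };
    assert_eq!(h.uncompressed_page_size, 1024);
    println!("{:?}", h);
}
```

A fuller version would use the captured field ids and `required`/`optional` markers to generate the decode logic as well, which is what the linked macros.rs does.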

alamb commented 1 week ago

It occurred to me that the thrift definitions consist entirely of valid rust tokens, and so should be parseable using declarative macros.

That is really (really) cool @jhorstmann

Maybe we could even use the declarative macros to create a parser that avoids intermediates by providing callbacks rather than building structs 🤔