-
Is there a way to shoehorn this into standard R serialization to file, and reading back in a way that is blazingly fast -- from a compressed file?
Did some casual experiments (checked by @joshuaulric…
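The question above concerns R, but the save-compressed / read-back pattern it asks about can be sketched with Python's standard library as an analogue (pickle for serialization, gzip for compression). The helper names below are illustrative, not from the original post, and this is a sketch of the idea rather than the poster's R solution:

```python
import gzip
import pickle

def save_compressed(obj, path):
    # Serialize the object and write it through a gzip stream in one pass.
    with gzip.open(path, "wb") as f:
        pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_compressed(path):
    # Read back: gzip decompresses transparently while pickle deserializes,
    # so no intermediate uncompressed file is needed.
    with gzip.open(path, "rb") as f:
        return pickle.load(f)
```

The R equivalents (`saveRDS`/`readRDS` with a compressed connection) follow the same shape: the codec is layered on the file stream, not applied as a separate step.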
-
I'm looking for optimal compression with limited resources to decompress.
In my case there are only 16 KB of RAM available. Ilya Muravyov's LZ4x is easily tweakable for that case.
I couldn't figure out how to do…
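One way to picture decompression under a small fixed RAM budget is to cap the output produced per call. This is a minimal Python sketch using stdlib `zlib` as a stand-in for a tweaked LZ4 decoder; the 16 KB budget mirrors the post's constraint, and the function name is illustrative:

```python
import zlib

def decompress_bounded(compressed, out_budget=16 * 1024):
    # Decompress while never producing more than out_budget bytes per call.
    # A real 16 KB device would write each piece straight to its sink;
    # we accumulate here only so the result can be checked.
    d = zlib.decompressobj()
    pieces = []
    data = compressed
    while data:
        # max_length bounds the output; leftover input lands in unconsumed_tail.
        pieces.append(d.decompress(data, out_budget))
        data = d.unconsumed_tail
    pieces.append(d.flush())
    return b"".join(pieces)
```

The key design point carries over to LZ4: a decoder whose per-call output is bounded lets the working buffer be sized to the device, independent of the total payload.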
-
Is there no one who can look into replacing LZMA with LZ4?
EDIT: LZ4 is now implemented... :100:
Oxtie, updated 9 years ago
-
### Describe the bug, including details regarding any error messages, version, and platform.
500 MB of arrow data took a few hours to compress. A 5GB selection ran for a couple of days and did not c…
-
```
1.4 of the LZ4 Streaming format provides a sound stream interface, yet the
following enhancements would provide more flexibility for both client and server.
Possible use of 'header' reserved bit …
```
-
Is it possible to add a method that pretends to decompress data and thus works out buffer size needed for decompression?
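The "pretend decompression" idea can be sketched as a streaming dry run that counts output bytes without retaining them. This Python illustration uses stdlib `zlib` as a stand-in; it does not assume anything about LZ4's actual API, and the function name is hypothetical:

```python
import zlib

def decompressed_size(compressed, chunk=64 * 1024):
    # Stream through the compressed data, discarding output and only
    # counting its length, to learn the buffer size a real pass would need.
    d = zlib.decompressobj()
    total = 0
    for pos in range(0, len(compressed), chunk):
        total += len(d.decompress(compressed[pos:pos + chunk]))
    total += len(d.flush())
    return total
```

For formats whose frame header can record the content size up front (LZ4's frame format optionally does), the header answers this in O(1); the dry run above is the fallback when that field is absent.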
-
### Describe the bug, including details regarding any error messages, version, and platform.
With {arrow} 14.0.0, I was able to import a large number of CSVs, merge the schemas, establish a partiti…
-
### Describe the bug, including details regarding any error messages, version, and platform.
Any attempt at instantiating an S3Filesystem object in 15.0.1 crashes R in RStudio. You get the dialog box…
-
```
FAILED: src/arrow/CMakeFiles/arrow_objlib.dir/adapters/orc/adapter_util.cc.o
/usr/bin/c++ -DARROW_WITH_LZ4 -DARROW_WITH_SNAPPY -DARROW_WITH_TIMING_TESTS -DARROW_WITH_ZLIB -DARROW_WITH_ZSTD…
```