-
UPDATE:
I solved the problem by downgrading to PyTables 3.0.0. You can find an experiment that you can replicate here:
http://stackoverflow.com/questions/26197622/pandas-hdfstore-slow-on-query-for…
-
First of all, thank you for this amazing package!
Recently, I've been loading a lot of large files and it felt like Arrow.jl loading times are longer than in Python. I wanted to quantify this feeling…
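For the Python half of such a comparison, a minimal timing sketch could look like the following (the file name `data.arrow` is a placeholder; `pyarrow` is assumed as the Python baseline):
```
import time
import pyarrow as pa

# Time a read of an Arrow IPC file with pyarrow; the Julia side would
# time Arrow.Table("data.arrow") analogously.
t0 = time.perf_counter()
with pa.memory_map("data.arrow", "r") as source:
    table = pa.ipc.open_file(source).read_all()
print(f"pyarrow load: {time.perf_counter() - t0:.3f} s, {table.num_rows} rows")
```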
-
Hi,
thank you for this nice package!
Today I saw that varying `blosc` does not seem to make a difference,
while two weeks ago varying `blosc` had an impact.
Regards,
Stefan
P.S.:
Here is my code:
`u…
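Since the code above is cut off, here is a hedged sketch (file names mine) of how the compressor is typically varied in PyTables to check whether `blosc` still makes a difference, comparing on-disk sizes:
```
import os
import numpy as np
import tables

data = np.random.randint(0, 100, size=1_000_000)
for complib in ("zlib", "blosc", "blosc:lz4"):
    fname = f"test_{complib.replace(':', '_')}.h5"
    # Filters selects the compression library and level for the array.
    filters = tables.Filters(complevel=5, complib=complib)
    with tables.open_file(fname, "w") as f:
        f.create_carray("/", "data", obj=data, filters=filters)
    print(complib, os.path.getsize(fname), "bytes")
```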
-
And with the new features and changes in zstd 1.5, there is some interest in the results and possible regressions.
The documented benchmark is excellent, but perhaps too long to run for every test.
An autom…
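As a rough illustration only (this is not the project's benchmark suite), an automated smoke test could time a few levels on a fixed payload so that two zstd versions can be compared side by side; the `zstandard` Python bindings are an assumption here:
```
import time
import zstandard as zstd  # assumed bindings, not part of the original report

payload = b"some fairly compressible payload " * 100_000
for level in (1, 3, 9, 19):
    cctx = zstd.ZstdCompressor(level=level)
    t0 = time.perf_counter()
    out = cctx.compress(payload)
    dt = time.perf_counter() - t0
    print(f"level {level}: {len(payload) / dt / 1e6:.1f} MB/s, "
          f"ratio {len(payload) / len(out):.2f}")
```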
-
notice the difference between x.sum() and sum(x):
```
from __future__ import print_function
import numpy as np
import tables
dtype = [('x',np.float64), ('y',np.int32)]
data = np.array([(1,…
```
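The snippet is truncated, but the underlying gotcha stands on its own: `ndarray.sum()` uses NumPy's pairwise summation, while the builtin `sum()` accumulates left to right at Python level, so the two can disagree on float data. A minimal, self-contained sketch (values mine):
```
import numpy as np

x = np.full(10**6, 0.1, dtype=np.float32)
print(x.sum())  # pairwise summation inside NumPy: close to 100000.0
print(sum(x))   # sequential float32 accumulation: drifts noticeably
```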
-
Although Attic does a good job deduplicating data, it seems it uses a simple deflate method for compression (like the one used by zip and gzip).
However, for data that is going to be backed up once for the l…
-
Looks like when zram-generator is used, systems whose kernel defaults to enabling zswap do not get it disabled.
As a result, I presume that the system attempts to compress memory pages twice, since zsw…
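As an illustration (the sysfs path is the kernel's standard zswap interface; the check itself is my sketch, not part of zram-generator), one can verify whether zswap is still active alongside zram:
```
from pathlib import Path

# "Y" here means zswap is enabled, so zram pages would effectively be
# handled by two compression layers.
param = Path("/sys/module/zswap/parameters/enabled")
if param.exists() and param.read_text().strip() == "Y":
    print("zswap is enabled; memory pages may be compressed twice with zram")
```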
-
Blosc is one of the fastest (de)compressors out there and supports SIMD shuffling; I wonder if the licensing would permit bringing support for it into h5py? (I believe it's BSD.)
It's fairly simple,…
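For what it's worth, Blosc is already usable from h5py via the third-party `hdf5plugin` package, which registers it as an HDF5 filter; a minimal sketch:
```
import h5py
import hdf5plugin  # registers Blosc (and others) as HDF5 filters
import numpy as np

data = np.random.rand(1000, 1000)
with h5py.File("blosc_demo.h5", "w") as f:
    # Compression options are passed as ordinary dataset-creation kwargs.
    f.create_dataset("data", data=data,
                     **hdf5plugin.Blosc(cname="lz4", clevel=5,
                                        shuffle=hdf5plugin.Blosc.SHUFFLE))
```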
-
Hi, I'm running into package incompatibilities when running your setup. See below for the environment script and error messages.
Script (`dp.yml`):
```
name: dp
channels:
- conda-forge
- bioco…
```
-
When running out of disk space while writing to an HDFStore, an exception occurs with the following error message. In addition, the previously written h5 file is left corrupt.
"HDF5ExtError: Problems c…