Closed marklam closed 2 months ago
Thank you, I'll try to investigate this and your other issue this evening
It should work now with version 1.0.0-beta.12 (for both dependencies, PureHDF and PureHDF.Filters.Blosc2) :-)
There were two bugs:
I had another issue logged (#68), wondering why the file size was not reduced when using Blosc. The reason was that the Blosc target buffer has the size of a single chunk, and before it is returned it should be sliced to the actual compressed size. That was not the case, so the data was being compressed but the buffer size never changed.
The problem you encountered was due to the fact that with compression level 0, no compression takes place. But that does not mean that a buffer the size of a single chunk is large enough: Blosc adds an overhead of 32 bytes to the uncompressed data, which PureHDF.Filters.Blosc2 did not account for, so Blosc was complaining. The target buffer for compression is now always 32 bytes larger than the source buffer.
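To illustrate both fixes described above, here is a minimal sketch of the chunk-compression logic. It uses Python's stdlib `zlib` as a stand-in compressor (the real filter calls Blosc2, not zlib), and the 32-byte headroom constant mirrors Blosc2's documented `BLOSC2_MAX_OVERHEAD`; the function name and structure are illustrative, not PureHDF's actual code.

```python
import zlib

# Worst-case number of bytes Blosc2 may add to incompressible input
# (mirrors BLOSC2_MAX_OVERHEAD in the Blosc2 C API).
BLOSC2_MAX_OVERHEAD = 32

def compress_chunk(chunk: bytes, level: int) -> bytes:
    """Compress one chunk, sized and sliced the way the fixed filter does it."""
    # Fix 2: the target buffer must be larger than the chunk itself.
    # At level 0 no compression happens, but the codec still prepends a
    # header, so the output can exceed the input size.
    target = bytearray(len(chunk) + BLOSC2_MAX_OVERHEAD)

    compressed = zlib.compress(chunk, level)  # stand-in for the Blosc2 call
    n = len(compressed)
    assert n <= len(target)  # holds only because of the extra headroom
    target[:n] = compressed

    # Fix 1 (#68): return only the filled prefix, not the whole chunk-sized
    # buffer; otherwise the data written to the file never shrinks.
    return bytes(target[:n])
```

With a highly compressible chunk, `compress_chunk(data, 9)` returns far fewer bytes than the chunk, while `compress_chunk(data, 0)` returns slightly more than the chunk (stored data plus header), which is exactly why the undersized target buffer failed at level 0.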
Thanks, that no longer crashes.
It doesn't happen with the default compression level (at least with this chunk configuration).
I've created a branch with a repro case, in case it's specific to the chunk sizes, data type, etc.:
https://github.com/marklam/Roundtrip3DArrayOfStructList/tree/blosc