Closed: Xqua closed this issue 1 year ago.
Also, HDFView says it's Gzip too?
But why does HDF5.NET invoke Nbit?
HDFView also shows "SCALEOFFSET"; I think this is the nbit filter (so your data are filtered with two different methods). I have not yet been able to implement the nbit filter because I do not yet fully understand it (original code here: https://github.com/HDFGroup/hdf5/blob/develop/src/H5Znbit.c). To implement it, I need to find a generic approach that works for all data types, and even better would be a vectorized implementation for good performance. But of course a simple implementation would be better than nothing. I'll try to find a solution next week.
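To illustrate the idea behind the nbit filter discussed above: each value is stored with only its significant bits instead of a full machine word. The sketch below is a minimal, pure-Python illustration of that bit-packing concept; the function names are made up for this example and this is not HDF5's actual implementation (which also handles signed types, floats, and byte alignment).

```python
def pack_nbit(values, nbits):
    """Pack non-negative integers into a byte string, nbits bits each."""
    buf = 0
    for i, v in enumerate(values):
        assert 0 <= v < (1 << nbits), "value does not fit in nbits"
        buf |= v << (i * nbits)
    # Round the total bit count up to whole bytes.
    return buf.to_bytes((len(values) * nbits + 7) // 8, "little")

def unpack_nbit(data, nbits, count):
    """Inverse of pack_nbit: recover `count` values of nbits bits each."""
    buf = int.from_bytes(data, "little")
    mask = (1 << nbits) - 1
    return [(buf >> (i * nbits)) & mask for i in range(count)]

values = [3, 7, 0, 5]
packed = pack_nbit(values, 3)            # 4 values x 3 bits = 12 bits -> 2 bytes
assert len(packed) == 2
assert unpack_nbit(packed, 3, len(values)) == values
```

The space saving comes from `nbits` being much smaller than the stored type's width (here 3 bits per value instead of 32 or 64).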
That would be amazing! Let me know if you do!
So, if this helps, I got this snippet working using P/Invoke:
public void Main()
{
    // Open the file and dataset via HDF.PInvoke.
    var file = H5F.open("/windows/Documents and Settings/xqua/Documents/drosophila.h5", H5F.ACC_RDONLY);
    var dataset = H5D.open(file, "/t00000/s00/0/cells", H5P.DEFAULT);
    var space = H5D.get_space(dataset);
    var typeId = H5D.get_type(dataset);

    // Query the dataspace rank and dimensions.
    int ndims = H5S.get_simple_extent_ndims(space);
    ulong[] dims = new ulong[ndims];
    H5S.get_simple_extent_dims(space, dims, null);

    var elementSize = H5T.get_size(typeId).ToInt32();
    Console.WriteLine($"Element size: {elementSize} bytes");

    double elementCount = 1;
    for (int i = 0; i < ndims; i++)
    {
        elementCount *= dims[i];
        Console.WriteLine(elementCount);
        Console.WriteLine(dims[i]);
    }

    Console.WriteLine("Now testing data loading");
    int[,,] data = new int[dims[0], dims[1], dims[2]];
    Console.WriteLine($"Data at 3,3,3 should be 0: {data[3, 3, 3]}");

    // Pin the managed array so the native library can write into it,
    // and make sure the handle is released afterwards.
    GCHandle handle = GCHandle.Alloc(data, GCHandleType.Pinned);

    try
    {
        IntPtr pointer = handle.AddrOfPinnedObject();
        var status = H5D.read(dataset, H5T.NATIVE_UINT, H5S.ALL, H5S.ALL, H5P.DEFAULT, pointer);
        Console.WriteLine($"Read data, status: {status}");
    }
    finally
    {
        handle.Free();
    }

    Console.WriteLine($"Data at 3,3,3: {data[3, 3, 3]}");

    // Scan for the minimum and maximum values.
    int max = int.MinValue;
    int min = int.MaxValue;

    for (int i = 0; i < (int)dims[0]; i++)
    {
        for (int j = 0; j < (int)dims[1]; j++)
        {
            for (int k = 0; k < (int)dims[2]; k++)
            {
                if (data[i, j, k] > max)
                {
                    max = data[i, j, k];
                    Console.WriteLine($"new max at: {i}, {j}, {k}: {max}");
                }

                if (data[i, j, k] < min)
                {
                    min = data[i, j, k];
                    Console.WriteLine($"new min at: {i}, {j}, {k}: {min}");
                }
            }
        }
    }

    Console.WriteLine($"min: {min} max: {max}");

    // Release the HDF5 handles.
    H5T.close(typeId);
    H5S.close(space);
    H5D.close(dataset);
    H5F.close(file);
}
I found that the scale-offset filter really is the scale-offset filter and not nbit; the wrong method was being invoked. I have started implementing the scale-offset filter today and I think I will have finished and published it by Friday.
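For readers unfamiliar with the scale-offset filter: for integer data it stores the dataset minimum once and then encodes each value as its difference from that minimum, using the fewest bits that fit ("MinBits"). The sketch below is an illustrative pure-Python model of that encoding, with made-up function names; it is not the PureHDF or HDF5 source code, and it skips the float mode and the bit-packing step.

```python
def scaleoffset_encode(values):
    """Encode integers as (minimum, bits-per-difference, differences)."""
    minimum = min(values)
    diffs = [v - minimum for v in values]
    nbits = max(diffs).bit_length() or 1   # bits needed per difference
    return minimum, nbits, diffs

def scaleoffset_decode(minimum, nbits, diffs):
    """Invert the encoding: add the stored minimum back to each difference."""
    return [minimum + d for d in diffs]

minimum, nbits, diffs = scaleoffset_encode([1000, 1003, 1001])
assert minimum == 1000
assert nbits == 2                          # max difference 3 fits in 2 bits
assert scaleoffset_decode(minimum, nbits, diffs) == [1000, 1003, 1001]
```

In the real filter the differences would then be bit-packed at `nbits` per value, which is where the compression comes from when the data's range is small relative to its type.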
I have published a new prerelease (https://www.nuget.org/packages/HDF5.NET/1.0.0-alpha.12.final) with the ScaleOffset filter included. I hope it works for you now :-)
Nice, I'll try it next week!
This is a message I post to all recent issues: I have just renamed the project from HDF5.NET to PureHDF in preparation for an upcoming beta release. Please note that the NuGet package name has also changed; it can now be found here: https://www.nuget.org/packages/PureHDF.
Hi,
I am trying to read data that was compressed with Gzip, but I am getting this error:
Could you help me or guide me toward getting this solved?