SimulationTools / h5mma

GNU Lesser General Public License v2.1

Loading a large h5 file consumes all available memory #10

Closed barrywardell closed 2 years ago

barrywardell commented 2 years ago

I am trying to load an h5 file that is 800 MB in size.

Is it possible to implement a load function that could load a specific chunk of the data? See, for example, the Mathematica help on Import: Import["ExampleData/rose.gif", {"Data", 100, 100}]

Or simply import from position X to position Y?

It would be much appreciated, because I need to read from huge files (around 2 GB).

Thanks!

Imported from: BitBucket Issue #10 Original date: 2014-05-18 Original creator: Hananel Hazan

barrywardell commented 2 years ago

I have added support for specifying a specific hyperslab of a dataset to read. The syntax for doing so mimics Mathematica's Part[], for example:

ImportHDF5["file.h5", {"Datasets", {"dataset", 1;;100;;2, 4;;30;;3}}]
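For comparison, the same hyperslab selection can be expressed in Python with h5py, which pushes the slicing down to the HDF5 library so only the selected elements are read into memory. This is a sketch, not part of h5mma; the file and dataset names are hypothetical, and note that Python slices are 0-based and end-exclusive, whereas Mathematica's Span (1;;100;;2) is 1-based and end-inclusive:

```python
import h5py
import numpy as np

# Create a small example file (hypothetical names mirroring the thread).
with h5py.File("file.h5", "w") as f:
    f.create_dataset("dataset", data=np.arange(200 * 40).reshape(200, 40))

# Read only a hyperslab, the 0-based equivalent of the Mathematica call
# ImportHDF5["file.h5", {"Datasets", {"dataset", 1;;100;;2, 4;;30;;3}}]:
# rows 1..100 step 2 -> 0:100:2, columns 4..30 step 3 -> 3:30:3.
with h5py.File("file.h5", "r") as f:
    chunk = f["dataset"][0:100:2, 3:30:3]

print(chunk.shape)  # 50 rows, 9 columns
```

The key point in both interfaces is that the slice is applied inside the HDF5 read, not after loading the full dataset, so memory use scales with the selection rather than the file.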

It might be worth thinking a bit more about whether this is the best interface, or if there would be a better one.

Original date: 2014-05-19 Original author: Barry Wardell

barrywardell commented 2 years ago

Thanks. That worked. What do you mean by whether this is the best interface? Is there another interface?

Original date: 2014-05-20 Original author: Hananel Hazan

barrywardell commented 2 years ago

Thanks for confirming it works.

With regard to the interface, I meant that the syntax was just the simplest thing I could think of, and there might be something better. The current interface is quite similar to the example you pointed out from the help on Import, though, so maybe it is the best choice.

Original date: 2014-05-20 Original author: Barry Wardell

barrywardell commented 2 years ago

The fix works and I think the syntax is pretty consistent with Mathematica's Import.

Original date: 2014-05-22 Original author: Barry Wardell