titusjan / argos

Argos: a data viewer that can read HDF5, NetCDF4, and other file formats.
GNU General Public License v3.0

problems examining my HDF-5 file in argos #4

Closed · kdere closed this issue 7 years ago

kdere commented 7 years ago

I am trying to install argos in order to examine some HDF5 files I am trying to make. I am using a cloned version of 0.2.1 and have used my Linux (openSUSE) package manager to get all the dependencies. I am running Python 3.4.5. The problem seems to be with h5py 2.6.0. I am able to open an HDF5 file, but when I go to examine the tree (in the left panel), it crashes with the following message (sorry, it is long, but better to be complete):

      File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/_objects.c:2841)
      File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/_objects.c:2799)
      File "/usr/lib64/python3.4/site-packages/h5py/_hl/attrs.py", line 79, in __getitem__
        attr.read(arr, mtype=htype)
      File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/_objects.c:2841)
      File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/_objects.c:2799)
      File "h5py/h5a.pyx", line 355, in h5py.h5a.AttrID.read (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/h5a.c:5349)
      File "h5py/_proxy.pyx", line 36, in h5py._proxy.attr_rw (/home/abuild/rpmbuild/BUILD/h5py-2.6.0/h5py/_proxy.c:1142)
    OSError: Unable to read attribute (No appropriate function for conversion path)

titusjan commented 7 years ago

Are you sure this is the complete stack trace? I don't see any file or line number from the argos package; all the file names start with "h5py/".

My first guess is that there is something wrong with your HDF files. Are you able to read the file with h5py alone, i.e. from a small Python script?
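Something like this should be enough to test that (a minimal sketch; "myfile.h5" is a placeholder for your actual file path):

```python
import h5py

# Open the file with h5py alone and walk the tree, printing every
# group/dataset name. If this already fails, Argos is not to blame.
with h5py.File("myfile.h5", "r") as f:
    def visit(name, obj):
        print(name, type(obj).__name__)
    f.visititems(visit)
```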

Can you read HDF files with Argos that you didn't create yourself? E.g. one of the HDF-EOS files from this download page?

kdere commented 7 years ago

I downloaded the file used in your tutorial and that worked. I created my file with PyTables, and I notice that you use h5py to read files; maybe the problem lies there. It is my understanding that h5py does not provide complete access to write or read all of the data that you may want to put into an HDF5 file. However, looking at the documentation, it looks like I can do more than create arrays of data; you can also store attributes. So perhaps h5py can do the whole job. What I am trying to do is to convert ASCII files of atomic data (www.chiantidatabase.org), containing both data and scientific references, into HDF5 files for faster reading. Maybe you have some suggestions as to an approach, such as whether to use PyTables or h5py? Or some pointers. Thanks.

titusjan commented 7 years ago

> It is my understanding that h5py does not provide complete access to write or read all of the data that you may want to put into an HDF5 file.

I have never heard of this. Can you provide a link to this information?

In principle you should be able to generate HDF-5 files with PyTables and read them with H5Py, and vice versa. Apparently something went wrong and you created an invalid HDF-5 file. If there are doubts about the validity of an HDF file, it's best to try to open it with one of the official tools of the HDF Group: h5ls, HDFView, or even h5check. This way you can be (almost) certain that the issue lies in the HDF file and not in the viewer. If you just want to look at the data in your file, Argos is the better choice of course :-)
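If the official tools say the file itself is fine, a small h5py script can also help pinpoint which attribute triggers the conversion error. A rough sketch (again with a placeholder path, and written without f-strings so it runs on Python 3.4):

```python
import h5py

def check_attrs(name, obj):
    # Try to read every attribute and report the ones h5py cannot convert.
    for key in obj.attrs.keys():
        try:
            obj.attrs[key]
        except OSError as exc:
            print("unreadable attribute", name + "/@" + key, ":", exc)

with h5py.File("myfile.h5", "r") as f:
    check_attrs("/", f)        # attributes on the root group
    f.visititems(check_attrs)  # everything below the root
```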

I think the easiest way forward is to just use H5Py to generate your HDF files. This is an excellent package and very easy to use. I have never worked with PyTables so I don't know how good it is. Perhaps it's fine too, but it's hard to match H5Py in my opinion. There are also plans to merge the communities and refactor PyTables so that it uses H5Py under the hood. Read this article for instance.
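For the use case you describe (arrays of atomic data plus scientific references), a minimal H5Py sketch could look like this. All names and values below are placeholders, not taken from the actual CHIANTI data:

```python
import h5py
import numpy as np

# Placeholder array standing in for one of the CHIANTI ASCII tables.
wavelengths = np.array([171.073, 174.531, 177.240])

with h5py.File("atomic_data.h5", "w") as f:
    dset = f.create_dataset("wavelengths", data=wavelengths)
    dset.attrs["units"] = "Angstrom"
    # h5py stores Python strings as variable-length HDF5 strings,
    # so the references can simply go into attributes.
    f.attrs["reference"] = "see www.chiantidatabase.org"
```

Reading the file back uses the same attrs interface with h5py.File("atomic_data.h5", "r"), which is what Argos uses as well.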