picca opened this issue 11 years ago
On Fri, 12 Jul 2013 13:49:54 -0700 picca notifications@github.com wrote:
Hello Jerome,
I would like to know your plans for the integration of hdf5/NeXuS files in your tools.
Plans ... you have spotted some bugs induced by the introduction of code for HDF5.
This has a very low priority: no scientist is asking for it, so no project will be requested on this and no manpower allocated.
It can only be done in my spare time.
Maybe it would be nice to provide a sort of URI system to describe the data in these files, for example:
pyFAI-saxs -p Si.poni hdf5://:/path/to/the/data[slice]
looks interesting.
where slice allows extracting part of the data.
Are you aware of an hdf5 URI system?
In full-field we are using path.h5:/group and then rely on NeXus tags to retrieve the dataset, but we have no slicing issue.
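Just to make the proposal concrete, here is a minimal sketch of what such a URI could map to with h5py. The function name and the exact layout hdf5://&lt;file&gt;:&lt;dataset&gt;[&lt;index&gt;] are assumptions of mine, not an existing scheme in FabIO or pyFAI:

```python
# Hypothetical sketch: resolve an hdf5://<file>:<dataset>[<index>] URI
# by hand with h5py; this is NOT an existing fabio/pyFAI API.
import re

import h5py


def read_hdf5_uri(uri):
    """Parse a hypothetical hdf5:// URI and return the selected frame."""
    match = re.match(
        r"hdf5://(?P<file>[^:]+):(?P<dataset>[^\[]+)(\[(?P<index>\d+)\])?$", uri)
    if match is None:
        raise ValueError("Not a valid hdf5:// URI: %s" % uri)
    with h5py.File(match.group("file"), "r") as h5:
        dataset = h5[match.group("dataset")]
        index = match.group("index")
        # With an index, read only that frame; otherwise read the whole dataset.
        return dataset[int(index)] if index is not None else dataset[()]
```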
Cheers,
Jerome Kieffer Jerome.Kieffer@terre-adelie.org
The development version of FabIO now includes the ability to read hdf5:// URIs.
Hello Jerome, I just tested pyFAI with fabio 0.1.4 and HDF5 files.
Here is the command I am using:
./bootstrap.py pyFAI-calib -l 0.39 -w 0.652 -D Xpad_flat -S Si hdf5:///nfs/ruche-diffabs/diffabs-users/20120966/2013/Run3/2013-07-11/silicium_1298.nxs:/scan_1311/scan_data/data_15[0]
With the current pyFAI, I get this error message:
Traceback (most recent call last):
File "./bootstrap.py", line 99, in
Looking at the code, it seems that the problem is in the utils module, in the expand_args method:
def expand_args(args):
    """
    Takes an argv and expand it (under Windows, cmd does not convert *.tif
    into a list of files). Keeps only valid files (thanks to glob).

    @param args: list of files or wildcards
    @return: list of actual args
    """
    new = []
    for afile in args:
        print afile
        if os.path.exists(afile):
            new.append(afile)
        else:
            new += glob.glob(afile)
    return new
Indeed, afile is no longer a valid file when it is a URI. I am wondering whether this validation should not be delegated to fabio, which could say "Hey, this URI is valid for me."
Maybe fabio should contain a way to build a list of valid URIs from the command line: instead of doing this work in pyFAI, fabio should provide something that returns a list of valid URIs.
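To illustrate the idea, here is a minimal sketch of such a helper. The function name expand_args_with_uris and its behaviour are hypothetical, not an existing fabio or pyFAI function; it only assumes that fabio.open raises an exception on arguments it cannot handle:

```python
# Hypothetical sketch: expand command-line arguments into plain files or
# URIs that fabio itself can open; not an existing fabio/pyFAI function.
import glob
import os

import fabio


def expand_args_with_uris(args):
    """Keep existing files, expand wildcards, and keep any other argument
    that fabio can open (for instance an hdf5:// URI with a slice)."""
    new = []
    for arg in args:
        if os.path.exists(arg):
            new.append(arg)
            continue
        matches = glob.glob(arg)
        if matches:
            new += matches
            continue
        try:
            fabio.open(arg)  # let fabio decide whether the URI is valid
            new.append(arg)
        except Exception:
            pass  # drop arguments fabio does not understand
    return new
```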
Another problem observed with the hdf5 URI:
Traceback (most recent call last):
File "./bootstrap.py", line 99, in
Indeed, the peak picker tries to open an invalid .npt file.
It is now possible to use HDF5 files with most applications: average, calibration, integration
We are using two types of URLs:
foo.h5::blahblah
silx:// , fabio://
Sometimes it is one or the other, because it was easier to implement. Here is an overview: https://github.com/silx-kit/pyFAI/pull/1175
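A short usage sketch of the two flavours, with placeholder file and dataset paths; it assumes recent fabio and silx versions where fabio.open understands the :: separator and silx.io.get_data understands silx:// URLs:

```python
# Sketch: reading one frame through the two URL flavours mentioned above.
# File and dataset paths are placeholders, not real data.
import fabio
from silx.io import get_data
from silx.io.url import DataUrl

# fabio-style URL: file path and internal HDF5 dataset separated by "::"
img = fabio.open("foo.h5::/entry_0000/measurement/data")
frame = img.data

# silx-style URL: scheme, file path, data path and slice in one string
url = DataUrl("silx:///path/to/foo.h5?path=/entry_0000/measurement/data&slice=0")
frame2 = get_data(url)
```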
Let us know if it is enough to close this issue.