FXIhub / owl

CXI viewer written in python

Assume everything is a stack #27

Open FilipeMaia opened 8 years ago

FilipeMaia commented 8 years ago

From Kartik:

I hacked my copy to assume everything is a stack regardless of whether it has the experiment_identifier attribute. I guess this violates the CXI file format requirement, but that way I find it easier to just open arbitrary h5 files.

mhantke commented 8 years ago

When a user clicks on a dataset that is not a stack, we could ask whether the dataset should be converted into one (i.e. by adding the "axes" attribute).
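A minimal sketch of what that conversion could do, assuming h5py (the function name and dataset path are hypothetical; the "axes" value follows the convention used by the utility script later in this thread):

```python
import h5py

def make_stack(filename, dataset_path):
    # Hypothetical sketch: write the CXI-style "axes" attribute so the
    # viewer treats the first axis of a 3D dataset as the stack axis.
    with h5py.File(filename, 'r+') as f:
        f[dataset_path].attrs['axes'] = 'experiment_identifier:y:x'
```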


kartikayyer commented 8 years ago

Another option would be to ship a simple utility like the one below which stack-ifies all the datasets in an H5 file.

import h5py
import sys

if len(sys.argv) < 2:
    print('Format: %s <h5_file>' % sys.argv[0])
    sys.exit(1)

def attach_attribute(fp, node, rank):
    # Label the dataset axes so the viewer treats the first axis as the stack axis.
    if rank == 0:
        fp[node].attrs.modify('axes', ['experiment_identifier:value'])
    elif rank == 1:
        fp[node].attrs.modify('axes', ['experiment_identifier:x'])
    elif rank == 2:
        fp[node].attrs.modify('axes', ['experiment_identifier:y:x'])
    elif rank == 3:
        fp[node].attrs.modify('axes', ['experiment_identifier:z:y:x'])

def process_group(fp, group_name):
    for name in fp[group_name].keys():
        path = group_name + '/' + name
        if isinstance(fp[path], h5py.Dataset):
            attach_attribute(fp, path, len(fp[path].shape))
        else:
            # This is a group. Recursively process it.
            process_group(fp, path)

with h5py.File(sys.argv[1], 'r+') as f:
    for group_name in f.keys():
        process_group(f, group_name)
FilipeMaia commented 8 years ago

We should just treat 3D datasets as stacks. We should not assume we can write to the file.
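A read-only heuristic along those lines might look like this (a sketch, assuming h5py; the function name is hypothetical and not part of owl):

```python
import h5py

def is_stack(dataset):
    # Hypothetical sketch: decide whether to display a dataset as a stack
    # without writing anything back to the file.
    if 'axes' in dataset.attrs:
        # Explicitly labelled via the CXI-style "axes" attribute.
        return True
    # Fallback heuristic: treat any 3D dataset as a stack of 2D frames.
    return dataset.ndim == 3
```

This keeps files opened in read-only mode untouched while still handling arbitrary h5 files that lack the experiment_identifier convention.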