NeurodataWithoutBorders / matnwb

A Matlab interface for reading and writing NWB files

can't export expandable table with row vector #383

Closed. cechava closed this issue 2 years ago.

cechava commented 2 years ago

An error is produced when an expandable table contains a row vector. I am using the latest state of the MatNWB master branch.

There is no issue for column vectors or multidimensional matrices.

Minimal code snippet to reproduce:

start_col = types.hdmf_common.VectorData( ...
    'description', 'start times column', ...
    'data', types.untyped.DataPipe( ...
        'data', (1:10)', ... %(10,1)
        'maxSize', [Inf, 1], ...
        'axis', 1 ...
    ) ...
);
stop_col = types.hdmf_common.VectorData( ...
    'description', 'stop times column', ...
    'data', types.untyped.DataPipe( ...
        'data', (1:10)', ...  %(10,1)
        'maxSize', [Inf, 1], ...
        'axis', 1 ...
    ) ...
);
cond_col = types.hdmf_common.VectorData( ...
    'description', 'condition column', ...
    'data', types.untyped.DataPipe( ...
        'data', randi(2,1,10), ...  %(1,10)
        'maxSize', [1, Inf], ...
        'axis', 2 ...
    ) ...
);
ids_col = types.hdmf_common.ElementIdentifiers( ...
    'data', types.untyped.DataPipe( ...
        'data', (0:9)', ...  % (10,1)
        'maxSize', [Inf, 1], ...
        'axis', 1 ...
    ) ...
);
% Create table
trials_table = types.core.TimeIntervals( ...
    'description', 'test dynamic table column', ...
    'colnames', {'start_time', 'stop_time', 'conditions'}, ...
    'start_time', start_col, ...
    'stop_time', stop_col, ...
    'conditions', cond_col, ...
    'id', ids_col ...
);
% Create NwbFile object with required arguments
file = NwbFile( ...
    'session_start_time', '2022-01-01 00:00:00', ...
    'identifier', 'ident1', ...
    'session_description', 'test file' ...
);
% Assign to intervals_trials
file.intervals_trials = trials_table;
% Export
nwbExport(file, 'testFileWithDataPipes.nwb');

Error using hdf5lib2
The rank of the dataspace does not match the length of the extent arguments.

Error in H5S.create_simple (line 57)
space_id = H5ML.hdf5lib2('H5Screate_simple', rank, h5_dims, h5_maxdims);

Error in types.untyped.datapipe.BlueprintPipe>allocateSpace (line 212)
sid = H5S.create_simple(rank, h5_dims, h5_maxdims);

Error in types.untyped.datapipe.BlueprintPipe/write (line 172)
            sid = allocateSpace(maxSize);

Error in types.untyped.DataPipe/export (line 238)
            obj.internal = obj.internal.write(fid, fullpath);

Error in types.untyped.DataPipe/subsref (line 257)
                B = builtin('subsref', obj, S);

Error in types.untyped.MetaClass/write_base (line 21)
                    refs = obj.data.export(fid, fullpath, refs);

Error in types.untyped.MetaClass/export (line 66)
            refs = obj.write_base(fid, fullpath, refs);

Error in types.hdmf_common.Data/export (line 39)
        refs = export@types.untyped.MetaClass(obj, fid, fullpath, refs);

Error in types.hdmf_common.VectorData/export (line 53)
        refs = export@types.hdmf_common.Data(obj, fid, fullpath, refs);

Error in types.untyped.Set/export (line 181)
                    refs = v.export(fid, propfp, refs);

Error in types.hdmf_common.DynamicTable/export (line 113)
            refs = obj.vectordata.export(fid, fullpath, refs);

Error in types.core.TimeIntervals/export (line 91)
        refs = export@types.hdmf_common.DynamicTable(obj, fid, fullpath, refs);

Error in types.core.NWBFile/export (line 1028)
            refs = obj.intervals_trials.export(fid, [fullpath '/intervals/trials'], refs);

Error in NwbFile/export (line 62)
                refs = export@types.core.NWBFile(obj, output_file_id, '/', {});

Error in nwbExport (line 36)
    nwb(i).export(filename);

Error in test (line 50)
nwbExport(file, 'testFileWithDataPipes.nwb');
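
Since column vectors export without issue, a possible workaround in the meantime (a sketch only, following the same pattern as the other columns above) is to store the condition data as a column vector and swap the maxSize/axis accordingly:

% Possible workaround (sketch): transpose the row vector so the condition
% column follows the same (10,1) / maxSize [Inf, 1] pattern as the others.
cond_col = types.hdmf_common.VectorData( ...
    'description', 'condition column', ...
    'data', types.untyped.DataPipe( ...
        'data', randi(2,1,10)', ... % (10,1) after transposing
        'maxSize', [Inf, 1], ...
        'axis', 1 ...
    ) ...
);
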
cechava commented 2 years ago

The issue seems to be with these lines, which try to infer the rank of the HDF5 dataset. Because the DataPipe object holding a row vector has a maxSize of [1 Inf], the HDF5 dataset is wrongly inferred to have rank 1.
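
For illustration only, a sketch of how that inference can go wrong (hypothetical variable names, not the actual BlueprintPipe code): if the rank is taken from the number of non-singleton entries of maxSize, a row vector's [1 Inf] collapses to rank 1 while the extent arrays passed to H5S.create_simple still have length 2, producing the mismatch in the traceback above.

% Hypothetical sketch of the failing inference; not the actual matnwb code.
maxSize    = [1, Inf];            % row-vector DataPipe from the snippet above
h5_dims    = fliplr([1, 10]);     % current extent, flipped to HDF5 dimension order
h5_maxdims = fliplr(maxSize);     % maximum extent; Inf marks the unlimited axis
rank       = sum(maxSize > 1);    % drops the singleton dimension -> rank == 1
% length(h5_dims) is 2 but rank is 1, so the following call would fail with
% "The rank of the dataspace does not match the length of the extent arguments."
% sid = H5S.create_simple(rank, h5_dims, h5_maxdims);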

I remember being able to create such data sets a couple of weeks ago.

@ln-vidrio any thoughts?

lawrence-mbf commented 2 years ago

Removing that check and just using the rank directly fixes the issue. We may have to verify that our workaround for 1-dimensional vectors still works, though.
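
Presumably something along these lines, i.e. deriving the rank from the length of maxSize so it always matches the extent arrays (a sketch only, not the actual patch):

% Sketch of the suggested direction (not the actual patch): take the rank
% straight from maxSize so it always matches the extent arrays.
maxSize    = [1, Inf];
rank       = length(maxSize);     % 2 for the row-vector case
h5_dims    = fliplr([1, 10]);
h5_maxdims = fliplr(maxSize);
% rank == length(h5_dims) == length(h5_maxdims), so H5S.create_simple
% no longer rejects the row-vector dataspace.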

lawrence-mbf commented 2 years ago

Oh, actually that will be a separate PR. Yes, simply removing that check should resolve this issue.