cechava opened this issue 3 years ago
Digging in a bit further into this: it doesn't seem to be an issue with the file created by the ophys tutorial, and cross-version compatibility is not the problem either. The ophys tutorial file can be generated, read in, and re-exported with different matnwb versions. Perhaps it has to do with the fact that the original file has DataPipe objects while the ophys tutorial file does not. I'm going to try out another file from the DANDI archive, in case it's something funny about this particular file.
@cechava yes, this should be possible in general. The DataPipe causing problems is a good guess. Let us know if that ends up being the problem.
@bendichter indeed, it appears DataPipe is the issue. Specifically, an error appears when trying to export a file with an acquisition containing a DataPipe object as data. There is no issue exporting if the DataPipe is in another part of the file, such as the intervals_trials field.
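For concreteness, here is a minimal sketch of the configuration I mean, assuming matnwb's types.untyped.DataPipe; all names and values below are made up:
% A DataPipe-backed TimeSeries placed in acquisition (names made up).
nwb = NwbFile( ...
    'identifier', 'TEST', ...
    'session_description', 'DataPipe export test', ...
    'session_start_time', datetime());
pipe = types.untyped.DataPipe('data', rand(1000, 1));
ts = types.core.TimeSeries( ...
    'data', pipe, ...
    'data_unit', 'n/a', ...
    'starting_time', 0.0, ...
    'starting_time_rate', 30.0);
nwb.acquisition.set('rawSeries', ts);
nwbExport(nwb, 'test.nwb');
% Reading the file back and exporting it again is the step that errored in my case:
nwb2 = nwbRead('test.nwb');
nwbExport(nwb2, 'test2.nwb');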
@cechava got it. Could you change the title of this issue or create a new issue and put together a minimal snippet of code that completely reproduces the error?
Closing this out and creating a new issue.
There was an independent bug in my code that led me to believe the DataPipe object was the issue. It's not entirely clear what the issue is with re-exporting the DANDI file used in the original post. The error does seem to point at an issue with the data in the acquisition field. Moreover, going through the same process with another DANDI-downloaded file that did not have data in the acquisition field produced no issue.
Interesting. I suggest paring down the code section by section to get a minimal snippet that reproduces the error. If you can post that and the full traceback I might be able to help you figure out what is going on.
The following snippet of code replicates the error. The file path should be replaced with the local location of the file downloaded from here
% Read the downloaded file and immediately re-export it to the same path
inputFilePath = '/Users/cesar/Documents/CatalystNeuro/matNWB/EXTRACT-interface/sub-F3_ses-20190414T210000_obj-7vehp4_behavior+ophys.nwb';
nwb = nwbRead(inputFilePath);
nwbExport(nwb, inputFilePath);
The output error:
Warning: Attempted to change size of continuous dataset `/file_create_date`.
Skipping.
> In io.writeDataset (line 26)
In types.core/NWBFile/export (line 753)
In NwbFile/export (line 61)
In nwbExport (line 35)
Error using hdf5lib2
The HDF5 library encountered an unknown error.
Error in H5D.write (line 100)
H5ML.hdf5lib2('H5Dwrite', varargin{:});
Error in io.writeDataset (line 35)
H5D.write(did, tid, sid, sid, 'H5P_DEFAULT', data);
Error in types.core.NWBFile/export (line 753)
io.writeDataset(fid, [fullpath '/file_create_date'],
obj.file_create_date, 'forceChunking', 'forceArray');
Error in NwbFile/export (line 61)
refs = export@types.core.NWBFile(obj, output_file_id, '/', {});
Error in nwbExport (line 35)
export(nwb(i), filename);
Some further info on the DataPipe objects: data from this DANDIset (https://gui.dandiarchive.org/#/dandiset/000048) does have acquisition data, and the file can be read in and re-exported. I think the issue is specific to the files from the Plitt/Giocomo DANDIset.
You could try deleting components of the file with h5py; del dataset and del group both work. You could use that approach to pare the file down.
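If you'd rather stay in MATLAB, the same deletions can be done with the low-level HDF5 interface; a sketch with placeholder paths:
% Unlink a dataset or group in place, the MATLAB equivalent of
% h5py's `del f['/some/path']`. File name and object path are placeholders.
fid = H5F.open('file.nwb', 'H5F_ACC_RDWR', 'H5P_DEFAULT');
H5L.delete(fid, '/path/to/dataset_or_group', 'H5P_DEFAULT');
H5F.close(fid);
Note this only removes the link to the object; the file won't shrink unless it is repacked afterwards (e.g. with h5repack).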
Deleting the 'TwoPhotonSeries' acquisition group allows the file to be re-exported without issue. Do you know how this file was created? I'm wondering if there's something about how it was created that is causing this issue. I tried creating test files with expandable datasets in the acquisition group, but re-exporting those test files does not produce an error.
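For reference, a sketch of that paring-down step using the low-level HDF5 interface (inputFilePath as in the snippet above; outputFilePath is a placeholder):
% Unlink the TwoPhotonSeries acquisition group, then check that the
% file now survives a read/re-export round trip.
fid = H5F.open(inputFilePath, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
H5L.delete(fid, '/acquisition/TwoPhotonSeries', 'H5P_DEFAULT');
H5F.close(fid);
nwb = nwbRead(inputFilePath);
nwbExport(nwb, outputFilePath);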
I would like to do the following: 1) read in an NWB file, 2) do something with it, like creating a new segmentation plane attached to an imaging plane, and 3) export the file again.
However, it seems I am unable to do the simpler task of just reading in the file and exporting it again. The following MATLAB code produced this error:
I tried using the same file name for exporting, but that just throws a more ambiguous error.
I found this thread for PyNWB, and it seems the issue is resolved there with the addition of an .export method to NWBHDF5IO. Any thoughts on how to approach this in matnwb? Any tips appreciated.