Closed bendichter closed 6 years ago
1847893 You'll need to call generateCore again. MatNWB's export should have been enforcing required properties; it just wasn't throwing an error as assumed.
@ln-vidrio Now the ephys tutorial gives the following error:
Error using hdf5lib2
The HDF5 library encountered an error and produced the following stack trace information:
H5G_traverse_real component not found
H5G_traverse internal path traversal failed
H5G_loc_find can't find object
H5Dopen1 not found
Error in H5D.open (line 22)
dataset_id = H5ML.hdf5lib2('H5Dopen', varargin{:});
Error in io.getRefData (line 7)
did = H5D.open(fid, refpaths{i});
Error in io.writeDataset (line 22)
data = io.getRefData(fid, data);
Error in types.core.NWBData/export (line 53)
io.writeDataset(fid, fullpath, class(obj.data), obj.data);
Error in types.core.ElectrodeTableRegion/export (line 43)
refs = export@types.core.NWBData(obj, fid, fullpath, refs);
Error in types.core.ElectricalSeries/export (line 57)
refs = obj.electrodes.export(fid, [fullpath '/electrodes'], refs);
Error in types.untyped.Set/export (line 117)
refs = v.export(fid, propfp, refs);
Error in types.core.NWBFile/export (line 416)
refs = obj.acquisition.export(fid, [fullpath '/acquisition'], refs);
Error in nwbfile/export (line 41)
refs = export@types.core.NWBFile(obj, fid, '/', {});
Error in nwbExport (line 34)
export(nwb(i), fn);
Error in ecephys (line 235)
nwbExport(nwb, 'ecephys_tutorial.nwb')
Do you have a more updated version of the ephys tutorial? The one currently in the repo runs fine for me.
@ln-vidrio I just copy/pasted the version in the master branch and got the same error. What version of HDF5 are you using?
>> [majnum,minnum,relnum] = H5.get_libversion()
majnum =
1
minnum =
8
relnum =
12
1.8.12, same as yours.
What does the export function of types.core.NWBData look like? Lines 55-62 should look something like this:
catch ME
    if strcmp(ME.stack(2).name, 'getRefData') && endsWith(ME.stack(1).file, '+H5D\open.m')
        refs = [refs {fullpath}];
        return;
    else
        rethrow(ME);
    end
end
If it does, place a breakpoint where the rethrow is and let me know what ME
looks like.
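For anyone following along: the intent of that catch block is to defer object references whose target dataset has not been written yet (the H5D.open failure inside getRefData), collect their paths in `refs`, and resolve them in a later pass once every dataset exists. A minimal, language-agnostic sketch of that two-pass pattern in Python — not MatNWB's actual code, all names and paths here are illustrative:

```python
def export_all(objects):
    """Two-pass export sketch.

    objects: list of (path, ref_target) pairs in traversal order, where
    ref_target is None for ordinary datasets, or the path of the dataset
    an object reference points at.
    """
    written = set()
    deferred = []
    for path, target in objects:
        if target is not None and target not in written:
            # Analogous to catching the H5D.open error and appending
            # fullpath to `refs` instead of failing the whole export.
            deferred.append((path, target))
        else:
            written.add(path)
    # Second pass: every dataset now exists, so references can be resolved.
    for path, target in deferred:
        assert target in written, f"dangling reference to {target}"
        written.add(path)
    return written

# A reference dataset that is traversed before its target:
paths = export_all([
    ('/acquisition/es/electrodes', '/general/extracellular_ephys/electrodes'),
    ('/general/extracellular_ephys/electrodes', None),
])
```

The first pass would have failed on `/acquisition/es/electrodes` in a single-pass scheme; deferring it lets the export complete.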
@ln-vidrio yeah, it looks exactly like that. My master branch is "Already up to date." but I get the same error.
>> version
ans =
'9.3.0.713579 (R2017b)'
I'm at a loss for how else to help you figure out what about my computer's setup could be causing the problem. If it helps, I am also getting two out-of-place warnings: it says I am overwriting the file when I am not, and it says the file cannot be found during the write (see below).
ecephys
Warning: Overwriting ecephys_tutorial.nwb
> In nwbfile/export (line 20)
In nwbExport (line 34)
In ecephys (line 234)
Warning: File 'ecephys_tutorial.nwb' not found.
> In nwbfile/export (line 21)
In nwbExport (line 34)
In ecephys (line 234)
Error using hdf5lib2
The HDF5 library encountered an error and produced the following stack trace information:
H5G_traverse_real component not found
H5G_traverse internal path traversal failed
H5G_loc_find can't find object
H5Dopen1 not found
Error in H5D.open (line 22)
dataset_id = H5ML.hdf5lib2('H5Dopen', varargin{:});
Error in io.getRefData (line 7)
did = H5D.open(fid, refpaths{i});
Error in io.writeDataset (line 22)
data = io.getRefData(fid, data);
Error in types.core.NWBData/export (line 53)
io.writeDataset(fid, fullpath, class(obj.data), obj.data);
Error in types.core.ElectrodeTableRegion/export (line 43)
refs = export@types.core.NWBData(obj, fid, fullpath, refs);
Error in types.core.ElectricalSeries/export (line 57)
refs = obj.electrodes.export(fid, [fullpath '/electrodes'], refs);
Error in types.untyped.Set/export (line 117)
refs = v.export(fid, propfp, refs);
Error in types.core.NWBFile/export (line 416)
refs = obj.acquisition.export(fid, [fullpath '/acquisition'], refs);
Error in nwbfile/export (line 41)
refs = export@types.core.NWBFile(obj, fid, '/', {});
Error in nwbExport (line 34)
export(nwb(i), fn);
Error in ecephys (line 234)
nwbExport(nwb, 'ecephys_tutorial.nwb')
f2c6924 does this work?
You'll also need to run generateCore again.
YES
I am getting this error:
Error using types.core.Clustering/export (line 116)
Property `peak_over_rms` is required.
Error in types.untyped.Set/export (line 117)
refs = v.export(fid, propfp, refs);
Error in types.core.ProcessingModule/export (line 60)
refs = obj.nwbdatainterface.export(fid, fullpath, refs);
Error in types.untyped.Set/export (line 117)
refs = v.export(fid, propfp, refs);
Error in types.core.NWBFile/export (line 626)
refs = obj.processing.export(fid, [fullpath '/processing'], refs);
Error in nwbfile/export (line 41)
refs = export@types.core.NWBFile(obj, fid, '/', {});
Error in nwbExport (line 34)
export(nwb(i), fn);
which means you fixed the write issue and it looks like the required args checker is working as well!
An attribute that does not have a "required" field is required by default; however, this rule is not being enforced in matnwb. For instance, the line:
electrode_table_region = types.core.ElectrodeTableRegion('data', rv);
in ecephys.m
works without error, but it should require a description field (on write, at least).
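The rule in question comes from the NWB specification language: an attribute spec that omits the "required" key is required by default. A small illustrative check in Python (the spec dicts below are made up for the example, not taken from the real schema files):

```python
def required_attributes(attribute_specs):
    """Return the names of attributes that must be present.

    Per the NWB spec language, a spec with no 'required' key
    defaults to required, so .get() falls back to True.
    """
    return [spec['name'] for spec in attribute_specs
            if spec.get('required', True)]

# Hypothetical attribute specs: 'description' has no 'required'
# key, so it is required by default; 'comments' opts out.
specs = [
    {'name': 'description', 'dtype': 'text'},
    {'name': 'comments', 'dtype': 'text', 'required': False},
]
```

With these specs, only `description` comes back as required, which is exactly the case the ecephys.m line above should be failing on.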
@ln-vidrio
0ccda42 This should include attribute checks on write. Will require regenerating classes.
The following runs without error:
%% Cortical Surface
% generateExtension('/Users/bendichter/dev/nwbext_ecog/nwbext_ecog/ecog.namespace.yaml');
file = nwbfile( ...
'source', 'source', ...
'session_description', 'a test NWB File', ...
'identifier', 'blockname', ...
'session_start_time', datestr(now, 'yyyy-mm-dd HH:MM:SS'), ...
'file_create_date', datestr(now, 'yyyy-mm-dd HH:MM:SS'));
cortical_surfaces = types.ecog.CorticalSurfaces;
file.acquisition.set('CorticalSurfaces', cortical_surfaces);
nwbExport(file, 'test_surface')
nwb_read = nwbRead('test_surface');
However, since ecog.CorticalSurfaces inherits from NWBDataInterface (https://github.com/bendichter/nwbext_ecog/blob/master/nwbext_ecog/ecog.extensions.yaml), it should require a "source" arg. Failure to supply this arg results in a read error in pynwb, so it is important that we enforce this.
This is actually failing for me with the expected error.
Hmm I'll run it again. Might have been a branching mistake on my part
>> !git reset --hard origin/master
HEAD is now at bee98a4 Overhaul of checkDims. Fixed what should've been an error in PhotonSeriesIOTest
>> generateExtension('/Users/bendichter/dev/nwbext_ecog/nwbext_ecog/ecog.namespace.yaml');
>> file = nwbfile( ...
'source', 'source', ...
'session_description', 'a test NWB File', ...
'identifier', 'blockname', ...
'session_start_time', datestr(now, 'yyyy-mm-dd HH:MM:SS'), ...
'file_create_date', datestr(now, 'yyyy-mm-dd HH:MM:SS'));
cortical_surfaces = types.ecog.CorticalSurfaces;
file.acquisition.set('CorticalSurfaces', cortical_surfaces);
nwbExport(file, 'test_surface')
nwb_read = nwbRead('test_surface');
Warning: Overwriting test_surface
> In nwbfile/export (line 20)
In nwbExport (line 34)
No error. Am I doing something wrong here?
Run generateCore again. The actual source property check is in NWBContainer.
yes of course! face palm Thanks!
pynwb is strict about required arguments and is unable to read files that are missing them, whereas matnwb is pretty lax about required args and can write and read files with missing args as long as those missing arguments do not cause dead links. This is fine if you are staying within MATLAB or within Python, but it is causing an IO incompatibility where some files can be written in matnwb with no problem but then cannot be read in pynwb. Here is an example:
write in matlab:
try to read in python:
This issue is actually the result of 3 distinct problems:
1) It's not ideal that pynwb isn't able to read files where the intention of the file is clear, even if they are slightly out of spec. Ideally we could make the read functionality a little more flexible.
2) matnwb should really enforce the spec, or at least warn a user if they are missing required arguments.
3) A lot of these variables should really be optional in the schema.
I can work with the LBNL group on 1 and 3. Could you help out with 2?
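For problem 2, the shape of the check is simple; here is a hypothetical sketch in Python of what enforcing (or merely warning about) missing required properties at write time could look like. The function name and property names are illustrative, not part of either library's API:

```python
import warnings

def check_required(obj_name, props, required, strict=True):
    """Raise for required properties left unset, or warn when strict=False.

    props: mapping of property name -> value (None means unset).
    required: names that the schema marks (or defaults to) required.
    Returns the list of missing property names.
    """
    missing = [name for name in required if props.get(name) is None]
    for name in missing:
        msg = f"Property `{name}` of {obj_name} is required."
        if strict:
            # Mirrors errors like: Property `peak_over_rms` is required.
            raise ValueError(msg)
        warnings.warn(msg)
    return missing

# e.g. this would raise, matching the Clustering error earlier in the thread:
# check_required('types.core.Clustering', {'peak_over_rms': None},
#                ['peak_over_rms'])
```

A warn-only mode (`strict=False`) would cover the "at least warn a user" option without breaking existing scripts.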