Closed: bendichter closed this issue 6 years ago
Is this using the schema currently in the repo?
9fda8e07f91faa0c2f44e7471403b268094935d4 added a new test for UnitTimes. Roundtrip seems to work with just matnwb (the schema may need to be updated to the current nwb-schema). Will look at pynwb interop.
```
Error using hdf5lib2
Incorrect number of uint8 values passed in reference parameter.

Error in H5R.get_name (line 34)
name = H5ML.hdf5lib2('H5Rget_name', loc_id, ref_type, ref, useUtf8);

Error in io.parseReference (line 7)
target = H5R.get_name(did, reftype, data);

Error in io.parseDataset (line 25)
data = io.parseReference(did, tid, H5D.read(did));

Error in io.parseGroup (line 14)
ds = io.parseDataset(filename, ds_info, fp);

Error in io.parseGroup (line 26)
subg = io.parseGroup(filename, g_info);

Error in io.parseGroup (line 26)
subg = io.parseGroup(filename, g_info);

Error in io.parseGroup (line 26)
subg = io.parseGroup(filename, g_info);

Error in nwbRead (line 25)
nwb = io.parseGroup(filename, info);
```
when loading this file.
It looks like this could be another int8 vs. int16 type issue.
It's because it's an array of References, which I don't think is actually read in properly yet. I'll get this done by tomorrow.
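For context, a dataset holding an *array* of object references has to be dereferenced element by element, which is the step `io.parseReference` trips over above. A minimal sketch of that layout using h5py (file and dataset names are illustrative, not taken from the failing file):

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), 'refs.h5')

with h5py.File(path, 'w') as f:
    a = f.create_dataset('a', data=np.array([1, 2]))
    b = f.create_dataset('b', data=np.array([3, 4]))
    # A dataset whose elements are object references (not one scalar ref)
    refs = f.create_dataset('refs', (2,), dtype=h5py.ref_dtype)
    refs[0] = a.ref
    refs[1] = b.ref

with h5py.File(path, 'r') as f:
    # Each element must be dereferenced individually; this is the step
    # that corresponds to H5Rget_name in the MATLAB trace above
    names = [f[ref].name for ref in f['refs'][:]]
    print(names)  # → ['/a', '/b']
```

If the reader assumes a single scalar reference where the file actually contains an array of them, the raw bytes passed to `H5Rget_name` have the wrong length, which matches the "Incorrect number of uint8 values passed in reference parameter" message.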
sounds good. It's been great plowing through these issues with you!
Actually, see if this commit works: 944998ae980ebd352a1da28d184638ecb6a34808
Works!
```matlab
>> nwb = nwbRead('test.nwb');
>> spikes = nwb.processing.get('cellular').nwbdatainterface.get('UnitTimes').spike_times_index.data(1).refresh(nwb);
>> spikes(1:10)

ans =

    2.7670
    4.6878
    5.0803
    5.3623
    5.6367
    5.7636
    5.7712
    5.9959
    6.6608
    7.0330
```
I am having trouble reading the `UnitTimes` datatype. Specifically, when I try to import data with `UnitTimes`, I get the error shown above.

I have tracked this down to an error building the `spike_times_index` component of `UnitTimes`, which is a list of range references. This is data written using `pynwb`. Could this be due to the switch of dimensions mentioned in the README? I'll try to replicate the error using a file written in `matnwb` as well.
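For reference, the "list of range references" pattern can be sketched with h5py. This is an illustrative guess at the on-disk layout (dataset names and values are made up), where `spike_times_index` holds one HDF5 region reference per unit pointing into a flat `spike_times` dataset:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), 'units.h5')

with h5py.File(path, 'w') as f:
    # All units' spike times concatenated into one flat dataset
    st = f.create_dataset('spike_times',
                          data=np.array([0.1, 0.2, 0.3, 1.5, 1.9]))
    # One region reference per unit, selecting that unit's slice
    idx = f.create_dataset('spike_times_index', (2,),
                           dtype=h5py.regionref_dtype)
    idx[0] = st.regionref[0:3]
    idx[1] = st.regionref[3:5]

with h5py.File(path, 'r') as f:
    ref = f['spike_times_index'][0]   # region reference for the first unit
    # Dereference to the target dataset, then apply the region selection
    unit0 = f[ref][ref]
    print(unit0)  # → [0.1 0.2 0.3]
```

A reader has to do both steps, dereference *and* apply the region, for every element of the index; handling only scalar object references would fail on this layout.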