-
@rly and @bendichter have taken the time to provide another round of feedback on ecephys NWB files (generated from 32a6817) and have identified the following critical issues:
- [x] raw_running_wheel_ro…
(njmei, updated 4 years ago)
-
Is it possible to write a very large block of data that does not fit in memory all at once?
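The usual answer for oversized datasets is iterative (chunked) writing — in PyNWB this is handled by wrappers such as `DataChunkIterator`; the exact MatNWB mechanism is not covered by this excerpt. The underlying pattern, sketched here in plain Python with hypothetical names (`chunk_source`, `write_incrementally`) rather than any NWB API, is to stream fixed-size chunks so only one chunk is ever resident in memory:

```python
import os
import struct
import tempfile

def chunk_source(n_chunks, chunk_len):
    """Hypothetical data source: yields one chunk of samples at a time,
    so the full dataset never has to exist in memory at once."""
    for i in range(n_chunks):
        yield [float(i * chunk_len + j) for j in range(chunk_len)]

def write_incrementally(path, chunks):
    """Append each chunk to the file as it is produced; peak memory use
    is bounded by one chunk, not the whole dataset."""
    written = 0
    with open(path, "ab") as f:
        for chunk in chunks:
            f.write(struct.pack(f"{len(chunk)}d", *chunk))
            written += len(chunk)
    return written

path = os.path.join(tempfile.mkdtemp(), "big.dat")
total = write_incrementally(path, chunk_source(n_chunks=4, chunk_len=1000))
print(total)                  # 4000 samples written
print(os.path.getsize(path))  # 4000 * 8 bytes on disk
```

Real NWB backends do the same thing against an HDF5 dataset with a growable (`Inf`/`None`) dimension instead of a flat binary file.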
-
`generateCore` currently expects the paths `schema/core/nwb.namespace.yaml` and `schema/common/namespace.yaml`.
`git clone https://github.com/NeurodataWithoutBorders/nwb-schema.git`
creates filepat…
-
## 1) Bug
When downloading NWB files through the AllenSDK, I can't read them using the `io.read()` method after creating an `NWBHDF5IO` object. When I open the file with an external NWB re…
-
PyNWB's read function has an input that specifies whether you want to read the file with the intention of appending when writing, as opposed to overwriting. Does such a functionality exist i…
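For reference, in PyNWB this is the `mode` argument to `NWBHDF5IO` (e.g. `mode='a'` to append vs. `mode='w'` to overwrite). The distinction mirrors ordinary file modes, sketched here with stdlib file I/O rather than the NWB API:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

# Create the file with some initial content.
with open(path, "w") as f:
    f.write("original\n")

# Append mode ('a') preserves the existing content.
with open(path, "a") as f:
    f.write("appended\n")
with open(path) as f:
    after_append = f.read()
print(after_append)  # "original\nappended\n"

# Write mode ('w') truncates the file on open.
with open(path, "w") as f:
    f.write("overwritten\n")
with open(path) as f:
    after_write = f.read()
print(after_write)   # "overwritten\n"
```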
-
MatNWB does not currently support writing schema files or cached specs.
The reason for not writing cached specs is that cached specs are, by default, only written in J…
-
Tab completion is an important part of discovery (for me at least). However, with all the subsref and properties trickiness, tab completion no longer works when you're navigating deeply within a hiera…
-
I found a potential issue in `types.core.TimeIntervals` that could use a warning/check. Using it to set trial details will accept trial definitions with unequal column lengths, e.g.:
trials = types.core.…
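The kind of pre-flight check being requested is simple to state. This is not MatNWB code — just a language-neutral sketch in Python, with a hypothetical `check_column_lengths` helper, of validating that every column in a trials-style table has the same number of rows before accepting it:

```python
def check_column_lengths(columns):
    """Hypothetical pre-flight check: every column passed to a
    DynamicTable-like object (e.g. TimeIntervals) must have the same
    number of rows; raise instead of silently accepting a ragged table."""
    lengths = {name: len(vals) for name, vals in columns.items()}
    if len(set(lengths.values())) > 1:
        raise ValueError(f"unequal column lengths: {lengths}")
    return next(iter(lengths.values()), 0)

# Equal lengths pass and return the row count:
n_rows = check_column_lengths({
    "start_time": [0.0, 1.0],
    "stop_time": [0.5, 1.5],
})
print(n_rows)  # 2

# Unequal lengths are rejected:
try:
    check_column_lengths({"start_time": [0.0, 1.0], "stop_time": [0.5]})
    err = None
except ValueError as e:
    err = str(e)
print(err)
```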
-
For the Ophys pipeline, the raw data stored in TwoPhotonSeries is required to be an Nframes-by-Y-by-X matrix, while the ROI segmentation is an NCells-by-X-by-Y matrix. This will cause erro…
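The mismatch is a swap of the two spatial axes; in practice one fixes it with a transpose (`numpy.transpose` / MATLAB `permute`) over the X and Y dimensions of each mask before writing. A minimal plain-Python sketch of the idea, using a hypothetical `transpose_xy` helper on a single 2-D mask held as nested lists:

```python
def transpose_xy(mask_rows):
    """Swap the two spatial axes of one 2-D mask given as nested lists,
    i.e. convert an X-by-Y layout to Y-by-X (or vice versa)."""
    return [list(col) for col in zip(*mask_rows)]

# A hypothetical 2x3 (X-by-Y) ROI mask...
mask_xy = [[1, 0, 0],
           [0, 1, 1]]

# ...becomes 3x2 (Y-by-X), matching an Nframes-by-Y-by-X movie:
mask_yx = transpose_xy(mask_xy)
print(mask_yx)  # [[1, 0], [0, 1], [0, 1]]
```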
-
This is an issue that @JFPerkins and I ran into with image masks. The dtype is [specified as float](https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/src/pynwb/data/nwb.ophys.yaml#L4), which ma…
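When the schema pins the dtype to float, boolean masks have to be cast before writing. A float mask can also carry per-pixel weights rather than just 0/1, which is presumably why the spec allows it. A minimal sketch of the cast, with a hypothetical `as_float_mask` helper (in numpy this would just be `mask.astype(float)`):

```python
def as_float_mask(bool_mask):
    """Convert a boolean ROI mask (nested lists) to the float
    representation the schema specifies: True -> 1.0, False -> 0.0."""
    return [[float(v) for v in row] for row in bool_mask]

bool_mask = [[True, False],
             [False, True]]
float_mask = as_float_mask(bool_mask)
print(float_mask)  # [[1.0, 0.0], [0.0, 1.0]]
```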