-
We are not able to support numpy 2.0 right away, and it is probably not a good idea to try.
The ecosystem will take some time to adapt, and we should wait until most libraries that us…
-
### What happened?
Ran into this during testing on NeuroConv
### Steps to Reproduce
```python
from pynwb import NWBHDF5IO
from hdmf_zarr import NWBZarrIO
from pynwb.testing.mock.file import mock_N…
-
Brought here from https://github.com/NeurodataWithoutBorders/pynwb/pull/953#discussion_r286616836.
-
Relates to #264, which could possibly be avoided entirely by not fetching an .nwb file in full twice. Also might be of interest in the scope of the https://github.com/OpenSourceBrain/DANDIArchiveShowca…
-
In HDF5, there are a few different mechanisms used to refer to other data in an HDF5 file from a location other than where they are stored.
For example, HDF5 supports different types of links like …
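As a rough illustration of those link mechanisms, here is a minimal sketch using h5py (not mentioned above, but the standard Python HDF5 binding; the filenames are made up for the example):

```python
import h5py

# Create a file with a dataset and two kinds of links pointing at data
# stored somewhere other than the link's own location.
with h5py.File("main.h5", "w") as f:
    f.create_dataset("data", data=[1, 2, 3])
    # Soft link: refers to another object in the SAME file by path.
    f["soft"] = h5py.SoftLink("/data")
    # External link: refers to an object in a DIFFERENT file by (filename, path).
    f["ext"] = h5py.ExternalLink("other.h5", "/data")

with h5py.File("main.h5", "r") as f:
    print(f["soft"][:])  # resolves through the soft link to /data
```

Accessing `f["ext"]` would additionally open `other.h5` on demand, which is why broken external links only surface when the target file is actually read.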
-
Prior ref:
- https://github.com/dandi/dandisets-healthstatus/issues/19
So we do store the logs from each run combined across all assets, e.g.
```shell
(venv) dandi@drogon:~/cronlib/dandisets-h…
-
PyNWB contains both base.py and core.py. That organization seems logical. I propose doing the same for the YAML files. PyNWB also separates device.py into its own file. I propose doing the same for th…
-
## Description
The HDMF documentation shows the following process for validating a file: https://hdmf.readthedocs.io/en/stable/validation.html
According to @rly, this used to work prior to ex…
-
To complete NeurodataWithoutBorders/pynwb#1128, explicit gain docs should be added to `CurrentClampStimulusSeries` and `VoltageClampStimulusSeries` as well.
So the units should be:
`CurrentClampSti…
-
### What happened?
Trying to use the `set_dataio` method on a `VectorData` object to compress data that has been added row by row.
Using the trials table as a simple example, see code for my failed att…