Project Aeon's main library for interfacing with acquired data. Contains modules for raw data file I/O, data querying, data processing, data QC, database ingestion, and building computational data pipelines.
Generalizing the low-level data API for ephys data #371
Ephys data is timestamped with a different clock at acquisition time
There is a global correspondence table between clocks that can be used to build an interpolator, but it would be nice if the architecture could support this seamlessly
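To make the idea concrete, here is a minimal sketch of building an interpolator from such a correspondence table, assuming it can be loaded as paired (ephys clock, global clock) samples; the function and argument names are hypothetical, not part of the existing API.

```python
import numpy as np

def build_clock_interpolator(correspondence: np.ndarray):
    """Return a function mapping ephys-clock times onto the global clock.

    `correspondence` is assumed to be an (N, 2) array of paired readings:
    column 0 is the ephys clock, column 1 the matching global clock value.
    """
    ephys_clock = correspondence[:, 0]
    global_clock = correspondence[:, 1]

    def ephys_to_global(timestamps: np.ndarray) -> np.ndarray:
        # Piecewise-linear interpolation between the known clock pairs;
        # values outside the table are clamped to its endpoints.
        return np.interp(timestamps, ephys_clock, global_clock)

    return ephys_to_global

# Usage: aligned = build_clock_interpolator(table)(raw_sample_timestamps)
```

If the low-level reader accepted an optional converter like this, clock alignment could happen transparently at load time without each caller having to know about the correspondence table.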
File names are different since we don't have a chunk timestamp at acquisition time
Ephys data is too big to load as entire chunks (support for sub-chunking at read time would be useful)
Each chunk of Neuropixels probe data is 81 GB
Even just one chunk of per-sample timestamps is ~800 MB
np.fromfile supports chunking; we could also consider using memory mapping (e.g. np.memmap)
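For reference, a rough sketch of what sub-chunked reads could look like with either approach; the dtype, channel count, and sample layout below are assumptions about the flat binary format, not values taken from the acquisition software.

```python
import numpy as np

def read_subchunk(path, start_sample, n_samples, dtype=np.int16, n_channels=384):
    """Read only a slice of a large flat binary chunk using np.fromfile."""
    itemsize = np.dtype(dtype).itemsize
    offset = start_sample * n_channels * itemsize  # byte offset of the first requested sample
    flat = np.fromfile(path, dtype=dtype, count=n_samples * n_channels, offset=offset)
    return flat.reshape(-1, n_channels)

def memmap_chunk(path, dtype=np.int16, n_channels=384):
    """Memory-map the whole chunk; only the pages that are sliced get read from disk."""
    return np.memmap(path, dtype=dtype, mode="r").reshape(-1, n_channels)
```

Either way, the reader interface would need start/length (or time-range) arguments so callers can request a sub-chunk instead of the full 81 GB file.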
There are multiple devices inside the ephys headstage; how should we group them?
Consider whether they should be separate devices at the dataset level, or sub-devices inside a headstage device
It might be useful to keep the flat stream structure at the dataset level
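To make the grouping options concrete, here is a hypothetical sketch using plain dataclasses (not the existing schema API); the device and stream names are illustrative only.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Stream:
    name: str     # e.g. "spikes", "lfp", "sync"
    pattern: str  # file-name pattern used to locate the stream's files

@dataclass
class Device:
    name: str
    streams: list[Stream] = field(default_factory=list)
    sub_devices: list[Device] = field(default_factory=list)

# Option A: each piece of hardware is its own device, so streams stay flat at the dataset level.
flat = [
    Device("ProbeA", streams=[Stream("ap", "ProbeA_ap*"), Stream("lfp", "ProbeA_lfp*")]),
    Device("HeadstageImu", streams=[Stream("orientation", "HeadstageImu*")]),
]

# Option B: one headstage device that groups the same hardware as sub-devices.
nested = Device(
    "EphysHeadstage",
    sub_devices=[
        Device("ProbeA", streams=[Stream("ap", "ProbeA_ap*"), Stream("lfp", "ProbeA_lfp*")]),
        Device("HeadstageImu", streams=[Stream("orientation", "HeadstageImu*")]),
    ],
)
```

Option B keeps the acquisition hierarchy explicit, while Option A (or flattening B's sub-devices when registering streams) keeps the dataset-level view flat, which is the trade-off raised above.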