Specifications for pre-defined data structures provided by HDMF.

hdmf-common

Documentation of the HDMF Common data format specification is available at https://hdmf-common-schema.readthedocs.io

Citing HDMF Common Schema

Description

The HDMF Common specification defines a collection of common, reusable data structures that form the foundation for modeling advanced data formats, e.g., the Neurodata Without Borders (NWB) neurophysiology data standard. The HDMF Common schema is integrated with HDMF, which provides high-level APIs for reading, writing, and using HDMF-common data types.

The HDMF-common schema provides the following data structures:

The schema also provides the following base data structures:

Finally, HDMF-common is accompanied by experimental data structures. Before a new data type is added to the HDMF-common specification, it is first added to the HDMF-experimental schema so that users can experiment with it. Because these data structures are experimental, they are not guaranteed to maintain backward compatibility and may never be promoted to HDMF-common.

Current experimental data types are:

Generate documentation

```bash
pip install -r requirements-doc.txt
cd docs
make fulldoc
```