Glossary of Core NWB Tools
The glossary shown here provides a quick overview of the key software packages of the core NWB software stack. For a more general discussion of its overall organization, see the NWB Software Ecosystem page on the main NWB website.
Read/Write NWB File APIs
The NWB reference APIs provide full support for reading and writing all components of the NWB standard, including support for extensions. The APIs are interoperable, i.e., files created with PyNWB can be read in MatNWB and vice versa. Both PyNWB and MatNWB support advanced read/write for efficient interaction with very large data files (i.e., data too large for main memory) via lazy data loading, iterative data write, and data compression, among other features.
Converting Data to NWB
NWB Conversion Tools
Validating NWB Files
NWB provides tools both to check that files comply with the NWB standard schema and to check whether the data complies with NWB Best Practices. Validating compliance with the NWB schema ensures that files are structurally correct and can be read by NWB APIs. Validating compliance with best practices helps improve data quality and (re-)usability.
PyNWB: Validate schema compliance
NWB Inspector: Validate best practice
Neurodata Extensions (NDX) are used to extend the NWB data standard, e.g., to integrate new data types with NWB or to define standards for lab- or project-specific metadata. The collection of tools listed here is used to create, document, and publish extensions. To learn more about how to create extensions, see the Extending NWB section.
The NDX Template automatically sets up the documentation. As such, developers of extensions will commonly use hdmf-docutils as part of the standard setup of NDX code repositories without having to interact with the tool directly.
NWB Format Specification
The NWB data standard is governed by the NWB Format Specification. When creating new extensions, we typically build on and reuse existing neurodata_types already available in NWB. The NWB Format Specification provides a reference definition for all types available in NWB. The NWB schema itself builds on the HDMF Common Schema. Docs Source
HDMF Common Schema
The HDMF Common Schema defines the schema of common, general data structures, which are used throughout the NWB Standard Schema but which are not specific to neurophysiology. Examples of types defined in the HDMF Common Schema include DynamicTable and related types for defining data tables. Docs Source
Understanding core development tools (e.g., HDMF) is useful for developers, in particular when diving deeper into the core data infrastructure for NWB, e.g., when changing or creating new storage methods or when developing features for common data types (e.g., DynamicTable) that are defined in HDMF and used in NWB.
The Hierarchical Data Modeling Framework (HDMF) is a Python package for working with hierarchical data. It provides APIs for specifying data models, reading and writing data to different storage backends, and representing data with Python objects. HDMF forms the foundation of the PyNWB Python API for NWB. Docs Source