Glossary of Core NWB Tools

This glossary provides a quick overview of the key software packages in the core NWB software stack. For a more general discussion of how the stack is organized, see the NWB Software Ecosystem page on the main NWB website.

Read/Write NWB File APIs

The NWB reference APIs provide full support for reading and writing all components of the NWB standard, including support for extensions. The APIs are interoperable, i.e., files created with PyNWB can be read in MatNWB and vice versa. Both PyNWB and MatNWB support advanced read/write for efficient interaction with very large data files (i.e., data too large for main memory) via lazy data loading, iterative data write, and data compression, among other features.

PyNWB

PyNWB is the Python reference API for NWB. Docs Source
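As a minimal sketch (the file name, object names, and data shapes below are illustrative), writing a compressed TimeSeries and reading it back lazily with PyNWB might look like this:

    from datetime import datetime, timezone
    import numpy as np
    from pynwb import NWBFile, NWBHDF5IO, TimeSeries, H5DataIO

    # Assemble an in-memory NWBFile (identifiers are illustrative)
    nwbfile = NWBFile(
        session_description="example session",
        identifier="EXAMPLE_ID",
        session_start_time=datetime.now(timezone.utc),
    )

    # Wrap the array in H5DataIO to enable chunking and GZIP compression on write
    raw = TimeSeries(
        name="raw_signal",
        data=H5DataIO(np.random.rand(60000, 8), compression="gzip"),
        unit="volts",
        rate=30000.0,
    )
    nwbfile.add_acquisition(raw)

    with NWBHDF5IO("example.nwb", mode="w") as io:
        io.write(nwbfile)

    # Reading is lazy: the data stay on disk until explicitly sliced
    with NWBHDF5IO("example.nwb", mode="r") as io:
        nwbfile_in = io.read()
        first_second = nwbfile_in.acquisition["raw_signal"].data[:30000]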

MatNWB

MatNWB is a MATLAB library for reading and writing NWB files. Docs Source



Converting Data to NWB

NWB Conversion Tools

NWB Conversion Tools is a Python library for automating the conversion of data from proprietary formats to NWB. Docs Source
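As a hedged sketch of the library's converter pattern (the interface classes, their source_data arguments, and the file paths below are assumptions; consult the docs for the interfaces matching your acquisition system), converting a session with raw SpikeGLX recordings and Phy sorting results might look like this:

    from nwb_conversion_tools import (
        NWBConverter,
        SpikeGLXRecordingInterface,
        PhySortingInterface,
    )

    # Compose per-format data interfaces into a single converter for one session
    class ExampleSessionConverter(NWBConverter):
        data_interface_classes = dict(
            Recording=SpikeGLXRecordingInterface,
            Sorting=PhySortingInterface,
        )

    # Paths below are hypothetical
    source_data = dict(
        Recording=dict(file_path="session1_g0_t0.imec0.ap.bin"),
        Sorting=dict(folder_path="phy_output/"),
    )

    converter = ExampleSessionConverter(source_data=source_data)
    metadata = converter.get_metadata()  # auto-extracted metadata; edit before writing if needed
    converter.run_conversion(metadata=metadata, nwbfile_path="session1.nwb")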



Validating NWB Files

NWB provides tools both to check that files comply with the NWB standard schema and to check whether the data complies with NWB Best Practices. Validating compliance with the NWB schema ensures that files are structurally correct and can be read by the NWB APIs. Validating compliance with best practices helps improve data quality and (re-)usability.

PyNWB: Validate schema compliance

The PyNWB reference Python API includes classes and command line tools for validating compliance of files with the core NWB schema and the schema of NWB Neurodata Extensions (NDX). Validation Docs
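As a minimal sketch (the file name is illustrative), schema validation can be run from Python as shown below; the equivalent command-line check is python -m pynwb.validate:

    # Command line equivalent: python -m pynwb.validate mysession.nwb
    from pynwb import NWBHDF5IO, validate

    with NWBHDF5IO("mysession.nwb", mode="r") as io:
        errors = validate(io=io)  # list of schema violations; empty if the file is valid
        for error in errors:
            print(error)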

NWB Inspector: Validate best practices

The NWB Inspector is a Python library and command-line tool for inspecting NWB files for adherence to NWB Best Practices. Docs Source
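As a sketch (the file path is illustrative, and the Python function name reflects current releases; older releases expose the same functionality under inspect_nwb), inspection can be run from the command line as nwbinspector path/to/file_or_directory, or from Python:

    from nwbinspector import inspect_nwbfile

    # Iterate over best-practice violation messages for a single file
    for message in inspect_nwbfile(nwbfile_path="mysession.nwb"):
        print(message.importance, message.check_function_name, message.message)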



Extending NWB

Neurodata Extensions (NDX) are used to extend the NWB data standard, e.g., to integrate new data types with NWB or to define standards for lab- or project-specific metadata. The tools listed here are used to create, document, and publish extensions. To learn more about how to create extensions, see the Extending NWB section.

NDX Template

The NDX Template provides a template for creating Neurodata Extensions (NDX) for the NWB data standard. Source

When you create a new extension, the NDX Template generates a detailed NEXTSTEPS.md file describing how to develop the extension and how to submit it to the NDX Catalog.
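As a sketch of the kind of spec-generation script the template sets up (the extension name, new type, and author details below are illustrative), defining an extension with the pynwb.spec API might look like this:

    from pynwb.spec import NWBNamespaceBuilder, NWBGroupSpec, export_spec

    # Describe the (hypothetical) extension namespace
    ns_builder = NWBNamespaceBuilder(
        doc="An example extension for lab-specific metadata.",
        name="ndx-example",
        version="0.1.0",
        author="Jane Doe",
        contact="jane.doe@example.com",
    )

    # Reuse an existing neurodata_type from the core NWB schema
    ns_builder.include_type("TimeSeries", namespace="core")

    # Define a new neurodata_type that extends TimeSeries
    example_series = NWBGroupSpec(
        neurodata_type_def="ExampleTimeSeries",
        neurodata_type_inc="TimeSeries",
        doc="A TimeSeries extended with lab-specific metadata.",
    )

    # Write the .namespace.yaml and .extensions.yaml files to the spec/ directory
    export_spec(ns_builder, [example_series], "spec")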

NDX Catalog

The Neurodata Extensions Catalog (NDX Catalog) is a community-led catalog of Neurodata Extensions (NDX) to the NWB data standard. The NDX Catalog provides a central portal to search, publish, and review NDX. Catalog Source

Publishing NDX

The staged-extensions GitHub repository is used to register new extensions for publication in the Neurodata Extensions Catalog (NDX Catalog). Source

Documentation Utilities

The HDMF Documentation Utilities (hdmf-docutils) provide utility tools for creating documentation for extension schema defined using the NWB Schema Language. Source

The NDX Template automatically sets up the documentation. As such, extension developers commonly use hdmf-docutils as part of the standard setup of NDX code repositories without having to interact with the tool directly.

NWB Format Specification

The NWB data standard is governed by the NWB Format Specification. When creating new extensions, we typically build on and reuse existing neurodata_types already available in NWB. The NWB Format Specification provides a reference definition for all types available in NWB. The NWB schema itself builds on and includes the HDMF Common Schema. Docs Source

HDMF Common Schema

The HDMF Common Schema defines the schema of common, general data structures that are used throughout the NWB Standard Schema but are not specific to neurophysiology. Example types defined in the HDMF Common Schema include the DynamicTable family of types for defining data tables. Docs Source
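As a brief sketch (the table and column names are illustrative), a DynamicTable can be built and inspected with the HDMF Python API:

    from hdmf.common import DynamicTable

    # Build a small table with user-defined columns
    table = DynamicTable(name="trial_info", description="per-trial metadata")
    table.add_column(name="response_time", description="response time in seconds")
    table.add_column(name="correct", description="whether the response was correct")
    table.add_row(response_time=0.31, correct=True)
    table.add_row(response_time=0.45, correct=False)

    # DynamicTable converts to a pandas DataFrame for analysis
    df = table.to_dataframe()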



Core Development

Understanding the core development tools (e.g., HDMF) is useful for developers who need to dive deeper into the core data infrastructure for NWB, e.g., when changing or creating storage methods or when developing features for common data types (such as DynamicTable) that are defined in HDMF and used in NWB.

HDMF

The Hierarchical Data Modeling Framework (HDMF) is a Python package for working with hierarchical data. It provides APIs for specifying data models, reading and writing data to different storage backends, and representing data with Python objects. HDMF forms the foundation of the PyNWB Python API for NWB. Docs Source
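As a brief sketch of the model-specification side of the API (the type and field names below are illustrative assumptions, not part of any existing schema), a new container type could be specified with hdmf.spec like this:

    from hdmf.spec import GroupSpec, DatasetSpec, AttributeSpec

    # Specify a hypothetical container type describing the experimental rig
    rig_info = GroupSpec(
        doc="A container describing the experimental rig.",
        data_type_def="RigInfo",
        datasets=[
            DatasetSpec(doc="Name of the experimental rig.", name="rig_name", dtype="text"),
        ],
        attributes=[
            AttributeSpec(name="version", doc="Version of the rig description.", dtype="text"),
        ],
    )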