Add internal link checking to sphinx-build #1827

Merged · 20 commits · Feb 8, 2024
@@ -1,12 +1,12 @@
name: Check Sphinx external links
name: Check Sphinx links
on:
pull_request:
schedule:
- cron: '0 5 * * *' # once per day at midnight ET
workflow_dispatch:

jobs:
check-external-links:
check-sphinx-links:
runs-on: ubuntu-latest
steps:
- name: Cancel non-latest runs
@@ -31,5 +31,5 @@ jobs:
python -m pip install -r requirements-doc.txt
python -m pip install .

- name: Check Sphinx external links
run: sphinx-build -b linkcheck ./docs/source ./test_build
- name: Check Sphinx internal and external links
run: sphinx-build -W -b linkcheck ./docs/source ./test_build
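
The same check can be run locally before pushing. A minimal sketch, assuming the docs requirements are installed and the paths match the repository layout above:

```python
import subprocess

# Run the Sphinx linkcheck builder the same way the workflow does.
# -W promotes warnings (including unresolved internal cross-references
# reported under nitpicky mode) to errors, so broken internal links
# fail the build instead of passing silently.
result = subprocess.run(
    ["sphinx-build", "-W", "-b", "linkcheck", "./docs/source", "./test_build"],
    check=False,
)
print("link check passed" if result.returncode == 0 else "link check failed")
```
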
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -22,6 +22,7 @@
### Documentation and tutorial enhancements
- Add RemFile to streaming tutorial. @bendichter [#1761](https://github.com/NeurodataWithoutBorders/pynwb/pull/1761)
- Fix typos and improve clarify throughout tutorials. @zm711 [#1825](https://github.com/NeurodataWithoutBorders/pynwb/pull/1825)
- Fix internal links in docstrings and tutorials. @stephprince [#1827](https://github.com/NeurodataWithoutBorders/pynwb/pull/1827)
- Add Zarr IO tutorial @bendichter [#1834](https://github.com/NeurodataWithoutBorders/pynwb/pull/1834)

## PyNWB 2.5.0 (August 18, 2023)
2 changes: 1 addition & 1 deletion docs/Makefile
@@ -149,7 +149,7 @@ changes:
@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
$(SPHINXBUILD) -W -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
2 changes: 1 addition & 1 deletion docs/gallery/advanced_io/h5dataio.py
@@ -34,7 +34,7 @@


####################
# Normally if we create a :py:class:`~pynwb.file.TimeSeries` we would do
# Normally if we create a :py:class:`~pynwb.base.TimeSeries` we would do

import numpy as np

2 changes: 1 addition & 1 deletion docs/gallery/advanced_io/plot_editing.py
@@ -129,7 +129,7 @@
# Editing groups
# --------------
# Editing of groups is not yet supported in PyNWB.
# To edit the attributes of a group, open the file and edit it using :py:mod:`h5py`:
# To edit the attributes of a group, open the file and edit it using ``h5py``:

import h5py
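
(The rest of this snippet is truncated in the diff view; below is a hedged sketch of the idea, with the file path and group name assumed purely for illustration.)

```python
import h5py

# Reopen the NWB file in read/write mode and edit a group attribute in place.
with h5py.File("test_edit.nwb", "r+") as f:
    f["acquisition/synthetic_timeseries"].attrs["description"] = "An updated description"
```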

9 changes: 4 additions & 5 deletions docs/gallery/domain/images.py
@@ -190,7 +190,7 @@
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# :py:class:`~pynwb.image.RGBAImage` is for storing data of color image with transparency.
# :py:attr:`~pynwb.image.RGBAImage.data` must be 3D where the first and second dimensions
# ``RGBAImage.data`` must be 3D where the first and second dimensions
# represent x and y. The third dimension has length 4 and represents the RGBA value.
#
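
For reference, a minimal construction sketch; the array below is synthetic and purely illustrative:

```python
import numpy as np
from pynwb.image import RGBAImage

# data must be (x, y, 4); the last axis holds the R, G, B, and A channels
rgba_img = RGBAImage(
    name="rgba_image",
    data=np.random.rand(200, 200, 4),
)
```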

@@ -208,7 +208,7 @@
# ^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# :py:class:`~pynwb.image.RGBImage` is for storing data of RGB color image.
# :py:attr:`~pynwb.image.RGBImage.data` must be 3D where the first and second dimensions
# ``RGBImage.data`` must be 3D where the first and second dimensions
# represent x and y. The third dimension has length 3 and represents the RGB value.
#

@@ -224,8 +224,7 @@
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# :py:class:`~pynwb.image.GrayscaleImage` is for storing grayscale image data.
# :py:attr:`~pynwb.image.GrayscaleImage.data` must be 2D where the first and second dimensions
# represent x and y.
# ``GrayscaleImage.data`` must be 2D where the first and second dimensions represent x and y.
#

gs_logo = GrayscaleImage(
@@ -300,7 +299,7 @@

####################
# Here `data` contains the (0-indexed) index of the displayed image as they are ordered
# in the :py:class:`~pynwb.base.ImageReference`.
# in the :py:class:`~pynwb.base.ImageReferences`.
#
# Writing the images to an NWB File
# ---------------------------------------
4 changes: 2 additions & 2 deletions docs/gallery/domain/ophys.py
@@ -540,7 +540,7 @@
# Data arrays are read passively from the file.
# Calling the data attribute on a :py:class:`~pynwb.base.TimeSeries`
# such as a :py:class:`~pynwb.ophys.RoiResponseSeries` does not read the data
# values, but presents an :py:class:`~h5py` object that can be indexed to read data.
# values, but presents an ``h5py`` object that can be indexed to read data.
# You can use the ``[:]`` operator to read the entire data array into memory.
# Load and print all the data values of the :py:class:`~pynwb.ophys.RoiResponseSeries`
# object representing the fluorescence data.
@@ -558,7 +558,7 @@
#
# It is often preferable to read only a portion of the data. To do this, index
# or slice into the data attribute just like if you were indexing or slicing a
# :py:class:`~numpy` array.
# :py:mod:`numpy` array.
#
# The following code prints elements ``0:10`` in the first dimension (time)
# and ``0:3`` (ROIs) in the second dimension from the fluorescence data we have written.
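
The tutorial code itself is truncated in this view. A hedged sketch of both access patterns, assuming `nwbfile` was read with `NWBHDF5IO` and the object names follow this tutorial:

```python
# data is a lazy, h5py-backed dataset; nothing is read until it is indexed
roi_resp_series = nwbfile.processing["ophys"]["Fluorescence"]["RoiResponseSeries"]

all_values = roi_resp_series.data[:]      # read the full array into memory
subset = roi_resp_series.data[0:10, 0:3]  # first 10 time points for the first 3 ROIs
```
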
13 changes: 6 additions & 7 deletions docs/gallery/domain/plot_icephys.py
@@ -350,7 +350,7 @@
#####################################################################
# .. note:: Since :py:meth:`~pynwb.file.NWBFile.add_intracellular_recording` can automatically add
# the objects to the NWBFile we do not need to separately call
# :py:meth:`~pynwb.file.NWBFile.add_stimulus` and :py:meth:`~pynwb.file.NWBFile.add_acquistion`
# :py:meth:`~pynwb.file.NWBFile.add_stimulus` and :py:meth:`~pynwb.file.NWBFile.add_acquisition`
# to add our stimulus and response, but it is still fine to do so.
#
# .. note:: The ``id`` parameter in the call is optional and if the ``id`` is omitted then PyNWB will
@@ -495,8 +495,7 @@
# .. note:: The same process applies to all our other tables as well. We can use the
# corresponding :py:meth:`~pynwb.file.NWBFile.get_intracellular_recordings`,
# :py:meth:`~pynwb.file.NWBFile.get_icephys_sequential_recordings`,
# :py:meth:`~pynwb.file.NWBFile.get_icephys_repetitions`, and
# :py:meth:`~pynwb.file.NWBFile.get_icephys_conditions` functions instead.
# :py:meth:`~pynwb.file.NWBFile.get_icephys_repetitions` functions instead.
# In general, we can always use the get functions instead of accessing the property
# of the file.
#
@@ -507,7 +506,7 @@
#
# Add a single simultaneous recording consisting of a set of intracellular recordings.
# Again, setting the id for a simultaneous recording is optional. The recordings
# argument of the :py:meth:`~pynwb.file.NWBFile.add_simultaneous_recording` function
# argument of the :py:meth:`~pynwb.file.NWBFile.add_icephys_simultaneous_recording` function
# here is simply a list of ints with the indices of the corresponding rows in
# the :py:class:`~pynwb.icephys.IntracellularRecordingsTable`
#
@@ -564,7 +563,7 @@
# Add a single sequential recording consisting of a set of simultaneous recordings.
# Again, setting the id for a sequential recording is optional. Also this table is
# optional and will be created automatically by NWBFile. The ``simultaneous_recordings``
# argument of the :py:meth:`~pynwb.file.NWBFile.add_sequential_recording` function
# argument of the :py:meth:`~pynwb.file.NWBFile.add_icephys_sequential_recording` function
# here is simply a list of ints with the indices of the corresponding rows in
# the :py:class:`~pynwb.icephys.SimultaneousRecordingsTable`.

@@ -579,7 +578,7 @@
# Add a single repetition consisting of a set of sequential recordings. Again, setting
# the id for a repetition is optional. Also this table is optional and will be created
# automatically by NWBFile. The ``sequential_recordings argument`` of the
# :py:meth:`~pynwb.file.NWBFile.add_sequential_recording` function here is simply
# :py:meth:`~pynwb.file.NWBFile.add_icephys_repetition` function here is simply
# a list of ints with the indices of the corresponding rows in
# the :py:class:`~pynwb.icephys.SequentialRecordingsTable`.

@@ -592,7 +591,7 @@
# Add a single experimental condition consisting of a set of repetitions. Again,
# setting the id for a condition is optional. Also this table is optional and
# will be created automatically by NWBFile. The ``repetitions`` argument of
# the :py:meth:`~pynwb.file.NWBFile.add_icephys_condition` function again is
# the :py:meth:`~pynwb.file.NWBFile.add_icephys_experimental_condition` function again is
# simply a list of ints with the indices of the correspondingto rows in the
# :py:class:`~pynwb.icephys.RepetitionsTable`.
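
Taken together, the corrected `add_icephys_*` method names in this file build the table hierarchy bottom-up. A condensed sketch, assuming `nwbfile`, `electrode`, `stimulus`, and `response` already exist; treat the exact keyword values as illustrative:

```python
# one intracellular recording (stimulus + response on a single electrode)
rec_index = nwbfile.add_intracellular_recording(
    electrode=electrode, stimulus=stimulus, response=response
)

# group recordings -> simultaneous -> sequential -> repetition -> condition
sim_index = nwbfile.add_icephys_simultaneous_recording(recordings=[rec_index])
seq_index = nwbfile.add_icephys_sequential_recording(
    simultaneous_recordings=[sim_index], stimulus_type="square"
)
rep_index = nwbfile.add_icephys_repetition(sequential_recordings=[seq_index])
nwbfile.add_icephys_experimental_condition(repetitions=[rep_index])
```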

2 changes: 1 addition & 1 deletion docs/gallery/general/add_remove_containers.py
@@ -78,7 +78,7 @@
# have raw data and processed data in the same NWB file and you want to create a new NWB file with all the contents of
# the original file except for the raw data for sharing with collaborators.
#
# To remove existing containers, use the :py:class:`~hdmf.utils.LabelledDict.pop` method on any
# To remove existing containers, use the :py:meth:`~hdmf.utils.LabelledDict.pop` method on any
# :py:class:`~hdmf.utils.LabelledDict` object, such as ``NWBFile.acquisition``, ``NWBFile.processing``,
# ``NWBFile.analysis``, ``NWBFile.processing``, ``NWBFile.scratch``, ``NWBFile.devices``, ``NWBFile.stimulus``,
# ``NWBFile.stimulus_template``, ``NWBFile.electrode_groups``, ``NWBFile.imaging_planes``,
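
For example, a one-line sketch assuming `nwbfile` holds an acquisition named `"test_timeseries"` (the name is illustrative):

```python
# removes the container from the in-memory NWBFile and returns it
removed_ts = nwbfile.acquisition.pop("test_timeseries")
```
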
2 changes: 1 addition & 1 deletion docs/gallery/general/object_id.py
@@ -10,7 +10,7 @@
unique and used widely across computing platforms as if they are unique.

The object ID of an NWB container object can be accessed using the
:py:meth:`~hdmf.container.AbstractContainer.object_id` method.
:py:attr:`~hdmf.container.AbstractContainer.object_id` method.

.. _UUID: https://en.wikipedia.org/wiki/Universally_unique_identifier
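
A short usage sketch:

```python
from pynwb import TimeSeries

ts = TimeSeries(name="test_timeseries", data=[1, 2, 3], unit="m", rate=1.0)
print(ts.object_id)  # a UUID string generated when the object is created
```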

4 changes: 2 additions & 2 deletions docs/gallery/general/plot_file.py
@@ -24,7 +24,7 @@
* :ref:`modules_overview`, i.e., objects for storing and grouping analyses, and
* experiment metadata and other metadata related to data provenance.

The following sections describe the :py:class:`~pynwb.base.TimeSeries` and :py:class:`~pynwb.base.ProcessingModules`
The following sections describe the :py:class:`~pynwb.base.TimeSeries` and :py:class:`~pynwb.base.ProcessingModule`
classes in further detail.

.. _timeseries_overview:
@@ -569,7 +569,7 @@

####################
# :py:class:`~hdmf.common.table.DynamicTable` and its subclasses can be converted to a pandas
# :py:class:`~pandas.DataFrame` for convenient analysis using :py:meth:`.DynamicTable.to_dataframe`.
# :py:class:`~pandas.DataFrame` for convenient analysis using :py:meth:`~hdmf.common.table.DynamicTable.to_dataframe`.

nwbfile.trials.to_dataframe()

30 changes: 13 additions & 17 deletions docs/gallery/general/plot_timeintervals.py
@@ -9,16 +9,14 @@
:py:class:`~pynwb.epoch.TimeIntervals` type. The :py:class:`~pynwb.epoch.TimeIntervals` type is
a :py:class:`~hdmf.common.table.DynamicTable` with the following columns:

1. :py:meth:`~pynwb.epoch.TimeIntervals.start_time` and :py:meth:`~pynwb.epoch.TimeIntervals.stop_time`
describe the start and stop times of intervals as floating point offsets in seconds relative to the
:py:meth:`~pynwb.file.NWBFile.timestamps_reference_time` of the file. In addition,
2. :py:class:`~pynwb.epoch.TimeIntervals.tags` is an optional, indexed column used to associate user-defined string
tags with intervals (0 or more tags per time interval)
3. :py:class:`~pynwb.epoch.TimeIntervals.timeseries` is an optional, indexed
:py:class:`~pynwb.base.TimeSeriesReferenceVectorData` column to map intervals directly to ranges in select,
relevant :py:class:`~pynwb.base.TimeSeries` (0 or more per time interval)
1. ``start_time`` and ``stop_time`` describe the start and stop times of intervals as floating point offsets in seconds
relative to the :py:meth:`~pynwb.file.NWBFile.timestamps_reference_time` of the file. In addition,
2. ``tags`` is an optional, indexed column used to associate user-defined string tags with intervals (0 or more tags per
time interval)
3. ``timeseries`` is an optional, indexed :py:class:`~pynwb.base.TimeSeriesReferenceVectorData` column to map intervals
directly to ranges in select, relevant :py:class:`~pynwb.base.TimeSeries` (0 or more per time interval)
4. as a :py:class:`~hdmf.common.table.DynamicTable` user may add additional columns to
:py:meth:`~pynwb.epoch.TimeIntervals` via :py:class:`~hdmf.common.table.DynamicTable.add_column`
:py:meth:`~pynwb.epoch.TimeIntervals` via :py:meth:`~hdmf.common.table.DynamicTable.add_column`


.. hint:: :py:meth:`~pynwb.epoch.TimeIntervals` is intended for storing general annotations of time ranges.
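
As a concrete illustration of columns 1-3, an epoch can be added through `NWBFile.add_epoch`. A hedged sketch, assuming `nwbfile` and a `TimeSeries` named `test_ts` already exist:

```python
nwbfile.add_epoch(
    start_time=2.0,
    stop_time=4.0,
    tags=["first", "example"],
    timeseries=[test_ts],
)
```
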
@@ -84,12 +82,10 @@
# ^^^^^^
#
# Trials can be added to an NWB file using the methods :py:meth:`~pynwb.file.NWBFile.add_trial`
# By default, NWBFile only requires trial :py:meth:`~pynwb.file.NWBFile.add_trial.start_time`
# and :py:meth:`~pynwb.file.NWBFile.add_trial.end_time`. The :py:meth:`~pynwb.file.NWBFile.add_trial.tags`
# and :py:meth:`~pynwb.file.NWBFile.add_trial.timeseries` are optional. For
# :py:meth:`~pynwb.file.NWBFile.add_trial.timeseries` we only need to supply the :py:class:`~pynwb.base.TimeSeries`.
# By default, NWBFile only requires trial ``start_time`` and ``stop_time``. The ``tags`` and ``timeseries`` are
# optional. For ``timeseries`` we only need to supply the :py:class:`~pynwb.base.TimeSeries`.
# PyNWB automatically calculates the corresponding index range (described by ``idx_start`` and ``count``) for
# the supplied :py:class:`~pynwb.base.TimeSeries based on the given ``start_time`` and ``stop_time`` and
# the supplied :py:class:`~pynwb.base.TimeSeries` based on the given ``start_time`` and ``stop_time`` and
# the :py:meth:`~pynwb.base.TimeSeries.timestamps` (or :py:class:`~pynwb.base.TimeSeries.starting_time`
# and :py:meth:`~pynwb.base.TimeSeries.rate`) of the given :py:class:`~pynwb.base.TimeSeries`.
#
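
A minimal sketch of such a call, again assuming `nwbfile` and a `TimeSeries` named `test_ts` exist:

```python
nwbfile.add_trial(
    start_time=1.0,
    stop_time=5.0,
    timeseries=[test_ts],
)
```
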
@@ -199,7 +195,7 @@
#
# To define custom, experiment-specific :py:class:`~pynwb.epoch.TimeIntervals` we can add them
# either: 1) when creating the :py:class:`~pynwb.file.NWBFile` by defining the
# :py:meth:`~pynwb.file.NWBFile.__init__.intervals` constructor argument or 2) via the
# ``intervals`` constructor argument or 2) via the
# :py:meth:`~pynwb.file.NWBFile.add_time_intervals` or :py:meth:`~pynwb.file.NWBFile.create_time_intervals`
# after the :py:class:`~pynwb.file.NWBFile` has been created.
#
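
A brief sketch of the second route, assuming `nwbfile` already exists; the table and column names are illustrative:

```python
from pynwb.epoch import TimeIntervals

sleep_stages = TimeIntervals(
    name="sleep_stages",
    description="intervals for each sleep stage as determined by EEG",
)
sleep_stages.add_column(name="stage", description="stage of sleep")
sleep_stages.add_row(start_time=0.3, stop_time=0.5, stage=1)

nwbfile.add_time_intervals(sleep_stages)
```
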
@@ -286,9 +282,9 @@
# Adding TimeSeries references to other tables
# --------------------------------------------
#
# Since :py:class:`~pynwb.base.TimeSeriesReferenceVectorData` is a regular :py:class:`~hdmf.common.table.VectoData`
# Since :py:class:`~pynwb.base.TimeSeriesReferenceVectorData` is a regular :py:class:`~hdmf.common.table.VectorData`
# type, we can use it to add references to intervals in :py:class:`~pynwb.base.TimeSeries` to any
# :py:class:`~hdmf.common.table.DynamicTable`. In the :py:class:`~pynwb.icephys.IntracellularRecordingTable`, e.g.,
# :py:class:`~hdmf.common.table.DynamicTable`. In the :py:class:`~pynwb.icephys.IntracellularRecordingsTable`, e.g.,
# it is used to reference the recording of the stimulus and response associated with a particular intracellular
# electrophysiology recording.
#
2 changes: 1 addition & 1 deletion docs/make.bat
@@ -183,7 +183,7 @@ if "%1" == "changes" (
)

if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
%SPHINXBUILD% -W -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
5 changes: 5 additions & 0 deletions docs/source/conf.py
@@ -148,6 +148,7 @@ def __call__(self, filename):
'fsspec': ("https://filesystem-spec.readthedocs.io/en/latest/", None),
'nwbwidgets': ("https://nwb-widgets.readthedocs.io/en/latest/", None),
'nwb-overview': ("https://nwb-overview.readthedocs.io/en/latest/", None),
'zarr': ("https://zarr.readthedocs.io/en/stable/", None),
'hdmf-zarr': ("https://hdmf-zarr.readthedocs.io/en/latest/", None),
'numcodecs': ("https://numcodecs.readthedocs.io/en/latest/", None),
}
@@ -164,6 +165,10 @@
'hdmf-zarr': ('https://hdmf-zarr.readthedocs.io/en/latest/%s', '%s'),
}

nitpicky = True
nitpick_ignore = [('py:class', 'Intracomm'),
('py:class', 'BaseStorageSpec')]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

2 changes: 1 addition & 1 deletion environment-ros3.yml
@@ -6,7 +6,7 @@ channels:
dependencies:
- python==3.11
- h5py==3.8.0
- hdmf==3.5.4
- hdmf==3.12.1
- matplotlib==3.7.1
- numpy==1.24.2
- pandas==2.0.0
2 changes: 1 addition & 1 deletion requirements-min.txt
@@ -1,6 +1,6 @@
# minimum versions of package dependencies for installing PyNWB
h5py==2.10 # support for selection of datasets with list of indices added in 2.10
hdmf==3.12.0
hdmf==3.12.1
numpy==1.18
pandas==1.1.5
python-dateutil==2.7.3
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,6 +1,6 @@
# pinned dependencies to reproduce an entire development environment to use PyNWB
h5py==3.10.0
hdmf==3.12.0
hdmf==3.12.1
numpy==1.26.1
pandas==2.1.2
python-dateutil==2.8.2
2 changes: 1 addition & 1 deletion setup.py
@@ -20,7 +20,7 @@

reqs = [
'h5py>=2.10',
'hdmf>=3.12.0',
'hdmf>=3.12.1',
'numpy>=1.16',
'pandas>=1.1.5',
'python-dateutil>=2.7.3',
19 changes: 12 additions & 7 deletions src/pynwb/__init__.py
@@ -147,15 +147,19 @@ def _dec(cls):
_dec(container_cls)


def get_nwbfile_version(h5py_file: h5py.File):
@docval({'name': 'h5py_file', 'type': h5py.File, 'doc': 'An NWB file'}, rtype=tuple,
is_method=False,)
def get_nwbfile_version(**kwargs):
"""
Get the NWB version of the file if it is an NWB file.
:returns: Tuple consisting of: 1) the original version string as stored in the file and
2) a tuple with the parsed components of the version string, consisting of integers
and strings, e.g., (2, 5, 1, beta). (None, None) will be returned if the file is not a valid NWB file
or the nwb_version is missing, e.g., in the case when no data has been written to the file yet.

:Returns: Tuple consisting of: 1) the
original version string as stored in the file and 2) a tuple with the parsed components of the version string,
consisting of integers and strings, e.g., (2, 5, 1, beta). (None, None) will be returned if the file is not a
valid NWB file or the nwb_version is missing, e.g., in the case when no data has been written to the file yet.
"""
# Get the version string for the NWB file
h5py_file = getargs('h5py_file', kwargs)
try:
nwb_version_string = h5py_file.attrs['nwb_version']
# KeyError occurs when the file is empty (e.g., when creating a new file nothing has been written)
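
With the `docval` signature above, the function is called with an `h5py_file` keyword argument. A usage sketch, assuming an NWB file exists at the (illustrative) path below:

```python
import h5py
from pynwb import get_nwbfile_version

with h5py.File("sub-001.nwb", "r") as f:
    version_str, version_tuple = get_nwbfile_version(h5py_file=f)
    print(version_str, version_tuple)  # e.g., "2.5.0" and (2, 5, 0)
```
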
@@ -251,7 +255,7 @@ def can_read(path: str):
'doc': 'a path to a namespace, a TypeMap, or a list consisting paths to namespaces and TypeMaps',
'default': None},
{'name': 'file', 'type': [h5py.File, 'S3File'], 'doc': 'a pre-existing h5py.File object', 'default': None},
{'name': 'comm', 'type': "Intracomm", 'doc': 'the MPI communicator to use for parallel I/O',
{'name': 'comm', 'type': 'Intracomm', 'doc': 'the MPI communicator to use for parallel I/O',
'default': None},
{'name': 'driver', 'type': str, 'doc': 'driver for h5py to use when opening HDF5 file', 'default': None},
{'name': 'herd_path', 'type': str, 'doc': 'The path to the HERD',
@@ -327,7 +331,8 @@ def read(self, **kwargs):
{'name': 'nwbfile', 'type': 'NWBFile',
'doc': 'the NWBFile object to export. If None, then the entire contents of src_io will be exported',
'default': None},
{'name': 'write_args', 'type': dict, 'doc': 'arguments to pass to :py:meth:`write_builder`',
{'name': 'write_args', 'type': dict,
'doc': 'arguments to pass to :py:meth:`~hdmf.backends.io.HDMFIO.write_builder`',
'default': None})
def export(self, **kwargs):
"""