Commit c7402d8

Update applyChunkConfiguration and dependent functions to work with new template

ehennestad committed Jan 21, 2025
1 parent e5f9bc7 · commit c7402d8

Showing 8 changed files with 187 additions and 61 deletions.
9 changes: 7 additions & 2 deletions +io/+config/+internal/computeChunkSizeFromConfig.m
@@ -23,8 +23,13 @@
numDimensions = numel(dataSize);

% Extract relevant configuration parameters
chunkDimensions = chunkSpecification.data.chunk_dimensions;
defaultChunkSize = chunkSpecification.chunk_default_size; % in bytes
chunkDimensions = chunkSpecification.chunk_dimensions;
if iscell(chunkDimensions)
numChunkDimensions = cellfun(@numel, chunkDimensions);
chunkDimensions = chunkDimensions{numChunkDimensions == numDimensions};

end

defaultChunkSize = chunkSpecification.target_chunk_size.value; % in bytes
dataByteSize = io.config.internal.getDataByteSize(A);

% Initialize chunk size array
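The function above derives a chunk size from a configured target byte size (`target_chunk_size.value`) and rank-matched `chunk_dimensions`. A standalone sketch of the underlying idea — illustrative values only, not the actual implementation:

```matlab
% Sketch: shrink the largest dimension until a chunk fits the byte budget.
dataSize = [10000, 64];          % example dataset size
bytesPerElement = 8;             % double precision
targetChunkBytes = 1e6;          % plays the role of target_chunk_size.value

chunkSize = dataSize;
while prod(chunkSize) * bytesPerElement > targetChunkBytes
    [~, iMax] = max(chunkSize);
    chunkSize(iMax) = ceil(chunkSize(iMax) / 2);  % halve the largest dimension
end
disp(chunkSize)  % [1250 64] for the values above
```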
41 changes: 41 additions & 0 deletions +io/+config/+internal/configureDataPipeFromData.m
@@ -0,0 +1,41 @@
function dataPipe = configureDataPipeFromData(numericData, datasetConfig)
% configureDataPipeFromData - Configure a DataPipe from numeric data and dataset configuration

import io.config.internal.computeChunkSizeFromConfig
import types.untyped.datapipe.properties.DynamicFilter

chunkSize = computeChunkSizeFromConfig(numericData, datasetConfig);
maxSize = size(numericData);

dataPipeArgs = {...
"data", numericData, ...
"maxSize", maxSize, ...
"chunkSize", chunkSize };

hasShuffle = contains(datasetConfig.compression.prefilters, 'shuffle');

if strcmpi(datasetConfig.compression.algorithm, "Deflate")

% Use standard compression filters
dataPipeArgs = [ dataPipeArgs, ...
{'hasShuffle', hasShuffle, ...
'compressionLevel', datasetConfig.compression.level} ...
];

else
% Create property list of custom filters for dataset creation
compressionFilter = DynamicFilter( ...
datasetConfig.compression.algorithm, ...
datasetConfig.compression.level );

if hasShuffle
shuffleFilter = types.untyped.datapipe.properties.Shuffle();
filters = [shuffleFilter compressionFilter];

else
filters = compressionFilter;

end
dataPipeArgs = [ dataPipeArgs, ...
{'filters', filters} ];

end

% Create the datapipe.
dataPipe = types.untyped.DataPipe( dataPipeArgs{:} );

end
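Calling `configureDataPipeFromData` requires a `datasetConfig` struct carrying the fields the function reads (`compression.algorithm`, `compression.level`, `compression.prefilters`) plus the fields `computeChunkSizeFromConfig` expects. A hypothetical usage sketch — every field value below is assumed for illustration, not taken from the shipped configuration file:

```matlab
% Hypothetical configuration mirroring the fields read above.
datasetConfig = struct();
datasetConfig.compression = struct( ...
    'algorithm', 'Deflate', ...
    'level', 3, ...
    'prefilters', "shuffle");
datasetConfig.chunk_dimensions = { [1000, 1] };          % one entry per rank
datasetConfig.target_chunk_size = struct('value', 1e6);  % bytes

data = rand(1000, 100);
dataPipe = io.config.internal.configureDataPipeFromData(data, datasetConfig);
```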
4 changes: 4 additions & 0 deletions +io/+config/+internal/reconfigureDataPipe.m
@@ -0,0 +1,4 @@
function dataPipe = reconfigureDataPipe(dataPipe, datasetConfig)
% Todo: not yet implemented; currently returns the input dataPipe unchanged.
end

17 changes: 11 additions & 6 deletions +io/+config/+internal/resolveDataTypeChunkConfig.m
@@ -1,4 +1,4 @@
function resolvedOptions = resolveDataTypeChunkConfig(chunkSpecification, nwbObject)
function resolvedOptions = resolveDataTypeChunkConfig(chunkSpecification, nwbObject, datasetName)
% resolveDataTypeChunkConfig - Resolve the chunk options for individual datatypes
% This function resolves the chunk configuration options for a given NWB object
% by traversing the object hierarchy and combining options from the most specific
@@ -14,10 +14,11 @@
arguments
chunkSpecification (1,1) struct
nwbObject (1,1) types.untyped.MetaClass
datasetName (1,1) string
end

% Initialize resolvedOptions with an empty struct
resolvedOptions = struct();
% Initialize resolvedOptions with default options.
resolvedOptions = chunkSpecification.Default;

% Get the NWB object type hierarchy (from most specific to base type)
typeHierarchy = getTypeHierarchy(nwbObject);

@@ -26,12 +27,16 @@
for i = numel(typeHierarchy):-1:1
typeName = typeHierarchy{i};

% Check if the type has a chunkSpecification
% Check if the neurodata type has a chunkSpecification
if isfield(chunkSpecification, typeName)
typeOptions = chunkSpecification.(typeName);

% Merge options into resolvedOptions
resolvedOptions = mergeStructs(resolvedOptions, typeOptions);
% Is datasetName part of typeOptions?
if isfield(typeOptions, datasetName)

% Merge options into resolvedOptions
datasetOptions = typeOptions.(datasetName);
resolvedOptions = mergeStructs(resolvedOptions, datasetOptions);

end
end
end
end
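The resolution above starts from `chunkSpecification.Default` and overlays dataset-specific options from each type in the hierarchy. A minimal sketch of that merge precedence (hypothetical field names; `mergeStructs` itself is not shown in this diff, so the loop below is an assumed equivalent):

```matlab
% Sketch: later (more specific) options override earlier defaults.
defaults = struct('chunk_dimensions', {{[1000, 1]}}, 'compression_level', 3);
typeOptions = struct('compression_level', 5);

merged = defaults;
for f = fieldnames(typeOptions)'
    merged.(f{1}) = typeOptions.(f{1});   % overlay field by field
end
% merged.compression_level is now 5; chunk_dimensions keeps the default.
```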
119 changes: 80 additions & 39 deletions +io/+config/applyChunkConfiguration.m
@@ -1,49 +1,90 @@
function applyChunkConfiguration(nwbObject, chunkConfiguration)
function applyChunkConfiguration(nwbObject, chunkConfiguration, options)
% applyChunkConfiguration - Apply chunk configuration to datasets of an NWB object

arguments
nwbObject (1,1) NwbFile
chunkConfiguration (1,1) struct = io.config.readDefaultChunkConfiguration()
nwbObject (1,1) types.untyped.MetaClass
chunkConfiguration (1,1) struct = io.config.readDefaultChunkConfiguration() % Todo: class for this...?
options.OverrideExisting (1,1) logical = false

end

import io.config.internal.resolveDataTypeChunkConfig

objectMap = nwbObject.searchFor('');
objectKeys = objectMap.keys();

filteredObjectMap = containers.Map();
for i = 1:numel(objectKeys)
thisObjectKey = objectKeys{i};
thisNwbObject = objectMap(thisObjectKey);
if startsWith(class(thisNwbObject), "types.") && ~startsWith(class(thisNwbObject), "types.untyped")
filteredObjectMap(thisObjectKey) = thisNwbObject;
end
if isa(nwbObject, 'NwbFile')
neurodataObjects = getNeurodataObjectsFromNwbFile(nwbObject);

else
neurodataObjects = {nwbObject};

end
clear objectMap

for iNeurodataObject = 1:numel(neurodataObjects)
thisNeurodataObject = neurodataObjects{iNeurodataObject};
thisNeurodataClassName = class(thisNeurodataObject);

% Keep track of processed datasets. A dataset can be defined across
% multiple levels of the class hierarchy; the lowest (most specific)
% class should take precedence.
processedDatasets = string.empty;

isFinished = false;
while ~isFinished % Iterate over the type and its ancestor types (superclasses)

datasetNames = schemes.listDatasetsOfNeurodataType( thisNeurodataClassName );

for thisDatasetName = datasetNames % Iterate over all datasets of a type...

if ismember(thisDatasetName, processedDatasets)
continue

end

datasetConfig = resolveDataTypeChunkConfig(...
chunkConfiguration, ...
thisNeurodataObject, ...
thisDatasetName);

objectKeys = filteredObjectMap.keys();
for i = 1:numel(objectKeys)
thisObjectKey = objectKeys{i};
thisNwbObject = filteredObjectMap(thisObjectKey);

% Todo: Find dataset properties where it makes sense to do chunking
% I.e data, timestamps etc. Can this be determined automatically,
% or do we need a lookup?

dataTypeChunkOptions = io.config.internal.resolveDataTypeChunkConfig(chunkConfiguration, thisNwbObject);

if isprop(thisNwbObject, 'data')
if isnumeric(thisNwbObject.data)
% Create a datapipe object for the property value.
dataByteSize = io.config.internal.getDataByteSize(thisNwbObject.data);
if dataByteSize > dataTypeChunkOptions.chunk_default_size
chunkSize = io.config.internal.computeChunkSizeFromConfig(thisNwbObject.data, dataTypeChunkOptions);
maxSize = size(thisNwbObject.data);

dataPipe = types.untyped.DataPipe( ...
'data', thisNwbObject.data, ...
'maxSize', maxSize, ...
'chunkSize', chunkSize, ...
'compressionLevel', dataTypeChunkOptions.chunk_compression_args);
thisNwbObject.data = dataPipe;
datasetData = thisNeurodataObject.(thisDatasetName);

if isnumeric(datasetData)

% Create a datapipe object for a numeric dataset value.
dataByteSize = io.config.internal.getDataByteSize(datasetData);
if dataByteSize > datasetConfig.target_chunk_size.value
dataPipe = io.config.internal.configureDataPipeFromData(datasetData, datasetConfig);

end
elseif isa(datasetData, 'types.untyped.DataPipe')
if options.OverrideExisting
dataPipe = io.config.internal.reconfigureDataPipe(datasetData, datasetConfig);

end
elseif isa(datasetData, 'types.untyped.DataStub')

% pass
%error('Not implemented for files obtained by nwbRead')
else
disp( class(datasetData) )

end

if exist('dataPipe', 'var')
thisNeurodataObject.(thisDatasetName) = dataPipe;
processedDatasets = [processedDatasets, thisDatasetName]; %#ok<AGROW>
clear dataPipe

end
end

parentType = matnwb.common.getParentType(thisNeurodataClassName);

if isempty(parentType)
isFinished = true;

else
thisNeurodataClassName = parentType;

end
end
end
end

function neurodataObjects = getNeurodataObjectsFromNwbFile(nwbObject)
% getNeurodataObjectsFromNwbFile - Return all neurodata objects in an NwbFile object

objectMap = nwbObject.searchFor('types.');

neurodataObjects = objectMap.values();
neurodataClassNames = cellfun(@(c) class(c), neurodataObjects, 'uni', 0);

toIgnore = startsWith(neurodataClassNames, "types.untyped");
neurodataObjects(toIgnore) = [];

end
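End to end, the intended usage of the function above appears to be: build an `NwbFile`, read the default configuration, and apply it before export. A sketch under that assumption (the `OverrideExisting` option name comes from the arguments block above; the `NwbFile` property values are placeholders):

```matlab
% Sketch: apply the default chunk configuration to a file before writing.
nwbFile = NwbFile( ...
    'identifier', 'demo', ...
    'session_description', 'demo session', ...
    'session_start_time', datetime());

config = io.config.readDefaultChunkConfiguration();
io.config.applyChunkConfiguration(nwbFile, config, 'OverrideExisting', false);
```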
25 changes: 15 additions & 10 deletions +io/+config/readDefaultChunkConfiguration.m
@@ -1,19 +1,24 @@
function configObject = readDefaultChunkConfiguration()
% READDEFAULTCHUNKCONFIGURATION Reads the default chunking configuration from a JSON file.
%
% configObject = READDEFAULTCHUNKCONFIGURATION() loads the default chunking
% parameters from a JSON configuration file located in the 'configuration'
% directory within the MatNWB directory.
% Syntax:
% configObject = io.config.READDEFAULTCHUNKCONFIGURATION() loads the default
% chunking parameters from a JSON configuration file located in the
% "configuration" folder inside the MatNWB directory.
%
% Output:
% configObject - A MATLAB structure containing the chunking parameters
% Output Arguments:
% - configObject - A MATLAB structure containing the chunking parameters
% defined in the JSON configuration file.
%
% Example:
% % Load the default chunk configuration
% config = readDefaultChunkConfiguration();
% disp(config);
% Example 1 - Load default dataset configurations:
% % Load the default chunk configuration
% config = readDefaultChunkConfiguration();
% disp(config);

configFilePath = fullfile(...
misc.getMatnwbDir, ...
'configuration', ...
'cloud_dataset_configuration.json');

configFilePath = fullfile(misc.getMatnwbDir, 'configuration', 'chunk_params.json');
configObject = jsondecode(fileread(configFilePath));

end
7 changes: 7 additions & 0 deletions +matnwb/+common/getParentType.m
@@ -0,0 +1,7 @@
function parentTypeClassName = getParentType(typeClassName)
mc = meta.class.fromName(typeClassName);
parentTypeClassName = mc.SuperclassList(1).Name;
if strcmp(parentTypeClassName, "types.untyped.MetaClass")
parentTypeClassName = string.empty;

end
end
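`getParentType` makes it possible to walk a neurodata type's superclass chain until the base `types.untyped.MetaClass` is reached, which is how `applyChunkConfiguration` visits ancestor types. A small sketch (the starting type is an example):

```matlab
% Sketch: walk a neurodata type's superclass chain using getParentType.
typeName = "types.core.TimeSeries";   % example starting type
while ~isempty(typeName)
    disp(typeName)
    typeName = matnwb.common.getParentType(typeName);
end
```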
26 changes: 22 additions & 4 deletions +schemes/listDatasetsOfNeurodataType.m
@@ -18,15 +18,33 @@

assert(~isempty(typesIdx), 'Expected class name to contain "types"')
namespaceName = classNameSplit(typesIdx+1);
namespaceName = strrep(namespaceName, '_', '-');
namespace = schemes.loadNamespace(namespaceName, misc.getMatnwbDir);

neurodataTypeName = classNameSplit(typesIdx+2);
typeScheme = namespace.registry(neurodataTypeName);

datasetMaps = typeScheme('datasets');
switch typeScheme('class_type')
case 'groups'
if isKey(typeScheme, 'datasets')
datasetMaps = typeScheme('datasets');

datasetNames = repmat("", size(datasetMaps));
for i = 1:numel(datasetMaps)
if isKey(datasetMaps{i}, 'name')
datasetNames(i) = datasetMaps{i}('name');

else
% Unnamed dataset; leave the entry empty (filtered out below)

end
end
datasetNames(datasetNames=="") = [];

else
datasetNames = string.empty;

end

datasetNames = repmat("", size(datasetMaps));
for i = 1:numel(datasetMaps)
datasetNames(i) = datasetMaps{i}('name');
case 'datasets'
datasetNames = "data";

otherwise
error('Unexpected class type')

end
end
