
Modis l2 available datasets #913

Open · wants to merge 15 commits into base: main
52 changes: 28 additions & 24 deletions satpy/etc/readers/modis_l2.yaml
@@ -11,67 +11,71 @@ reader:
 file_types:
   mod05_hdf:
     file_patterns:
-      - 'M{platform_indicator:1s}D05_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf'
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod05.hdf'
+      - "M{platform_indicator:1s}D05_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod05.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mod35_hdf:
     file_patterns:
-      - 'M{platform_indicator:1s}D35_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf'
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod35.hdf'
+      - "M{platform_indicator:1s}D35_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod35.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mod06_hdf:
     file_patterns:
-      - 'M{platform_indicator:1s}D06_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf'
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod06.hdf'
+      - "M{platform_indicator:1s}D06_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod06.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mod06ct_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod06ct.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod06ct.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
+  modis_l2_product:
+    file_patterns:
+      - "M{platform_indicator:1s}D{product:2s}_L2.A{acquisition_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
+    file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   hdf_eos_geo:
     file_patterns:
-      - 'M{platform_indicator:1s}D03_A{start_time:%y%j_%H%M%S}_{processing_time:%Y%j%H%M%S}.hdf'
-      - 'M{platform_indicator:1s}D03.A{start_time:%Y%j.%H%M}.{collection:03d}.{processing_time:%Y%j%H%M%S}.hdf'
-      - 'M{platform_indicator:1s}D03.A{start_time:%Y%j.%H%M}.{collection:03d}{suffix}.hdf'
-      - 'M{platform_indicator:1s}D03.{start_time:%y%j%H%M%S}.hdf'
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.geo.hdf'
+      - "M{platform_indicator:1s}D03_A{start_time:%y%j_%H%M%S}_{processing_time:%Y%j%H%M%S}.hdf"
+      - "M{platform_indicator:1s}D03.A{start_time:%Y%j.%H%M}.{collection:03d}.{processing_time:%Y%j%H%M%S}.hdf"
+      - "M{platform_indicator:1s}D03.A{start_time:%Y%j.%H%M}.{collection:03d}{suffix}.hdf"
+      - "M{platform_indicator:1s}D03.{start_time:%y%j%H%M%S}.hdf"
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.geo.hdf"
     file_reader: !!python/name:satpy.readers.modis_l1b.HDFEOSGeoReader
   icecon_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.icecon.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.icecon.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   inversion_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.inversion.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.inversion.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   ist_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.ist.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.ist.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mask_byte1_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mask_byte1.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mask_byte1.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mod07_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod07.hdf'
-      - 'M{platform_indicator:1s}D07_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod07.hdf"
+      - "M{platform_indicator:1s}D07_L2.A{start_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   mod28_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod28.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.mod28.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   modlst_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.modlst.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.modlst.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   ndvi_1000m_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.ndvi.1000m.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.ndvi.1000m.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler
   snowmask_hdf:
     file_patterns:
-      - '{platform_indicator:1s}1.{start_time:%y%j.%H%M}.snowmask.hdf'
+      - "{platform_indicator:1s}1.{start_time:%y%j.%H%M}.snowmask.hdf"
     file_reader: !!python/name:satpy.readers.modis_l2.ModisL2HDFFileHandler

@@ -81,7 +85,7 @@ datasets:
       5000:
         file_type: [mod35_hdf, mod06_hdf, mod06ct_hdf, mod07_hdf, mod05_hdf]
       1000:
-        file_type: [hdf_eos_geo, mod35_hdf, mod06_hdf, mod05_hdf]
+        file_type: [hdf_eos_geo, mod35_hdf, mod06_hdf, mod05_hdf, modis_l2_product]
       500:
         file_type: hdf_eos_geo
       250:
@@ -96,7 +100,7 @@ datasets:
         # For EUM reduced (thinned) files
         file_type: [mod35_hdf, mod06_hdf, mod06ct_hdf, mod07_hdf, mod05_hdf]
       1000:
-        file_type: [hdf_eos_geo, mod35_hdf, mod06_hdf, mod05_hdf]
+        file_type: [hdf_eos_geo, mod35_hdf, mod06_hdf, mod05_hdf, modis_l2_product]
       500:
         file_type: hdf_eos_geo
       250:
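The new `modis_l2_product` file type covers any standard-named MODIS L2 granule with one generic pattern instead of one file type per product. As an illustration only (Satpy actually resolves these `{field}` patterns with trollsift), here is a minimal stand-in parser for that pattern built from the standard library; the function name, the `[OY]` platform restriction, and the `\d{2}` product restriction are assumptions made for this sketch:

```python
import re
from datetime import datetime

# Regex stand-in for the generic pattern
# "M{platform_indicator:1s}D{product:2s}_L2.A{acquisition_time:%Y%j.%H%M}.{collection:03d}.{production_time:%Y%j%H%M%S}.hdf"
PATTERN = re.compile(
    r"M(?P<platform_indicator>[OY])D(?P<product>\d{2})_L2"
    r"\.A(?P<acquisition_time>\d{7}\.\d{4})"   # %Y%j.%H%M
    r"\.(?P<collection>\d{3})"                 # zero-padded collection number
    r"\.(?P<production_time>\d{13})"           # %Y%j%H%M%S
    r"\.hdf$"
)

def parse_modis_l2_filename(filename):
    """Return the fields encoded in a generic MODIS L2 filename, or None."""
    match = PATTERN.match(filename)
    if match is None:
        return None
    fields = match.groupdict()
    # Decode the date fields the same way the pattern's strftime specs imply.
    fields["acquisition_time"] = datetime.strptime(fields["acquisition_time"], "%Y%j.%H%M")
    fields["production_time"] = datetime.strptime(fields["production_time"], "%Y%j%H%M%S")
    fields["collection"] = int(fields["collection"])
    return fields

info = parse_modis_l2_filename("MOD04_L2.A2019051.1225.061.2019051155356.hdf")
print(info["product"], info["collection"], info["acquisition_time"])
```

Because the `product` field is captured rather than hard-coded, one file type matches MOD04, MOD09, and any other product following the standard naming convention.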
10 changes: 8 additions & 2 deletions satpy/readers/file_handlers.py
@@ -273,10 +273,16 @@
     Example 2 - Add dynamic datasets from the file::
 
         def available_datasets(self, configured_datasets=None):
-            "Add information to configured datasets."
+            "Add datasets dynamically determined from the file."
             # pass along existing datasets
             for is_avail, ds_info in (configured_datasets or []):
-                yield is_avail, ds_info
+                if is_avail is not None:
+                    # some other file handler said it has this dataset
+                    # we don't know any more information than the previous
+                    # file handler so let's yield early
+                    yield is_avail, ds_info
+                    continue
+                yield self.file_type_matches(ds_info["file_type"]), ds_info
[CodeScene Delta Analysis / CodeScene Cloud Delta Analysis (main) — warning on line 285 in satpy/readers/file_handlers.py]
❌ Getting worse: Large Method. BaseFileHandler.available_datasets increases from 70 to 73 lines of code (threshold = 70). Large functions with many lines of code are generally harder to understand and lower the code health. Avoid adding more lines to this function.

             # get dynamic variables known to this file (that we created)
             for var_name, val in self.dynamic_variables.items():
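The docstring example above relies on `available_datasets` being a generator that chains across file handlers: each handler passes through any dataset a previous handler has already claimed (`is_avail is not None`) and only judges the still-undecided ones. A minimal sketch of that chaining, with a hypothetical `FakeHandler` class standing in for Satpy's real file handlers (not Satpy's actual classes):

```python
class FakeHandler:
    """Toy handler demonstrating the yield-early chaining pattern."""

    def __init__(self, file_type):
        self.file_type = file_type

    def file_type_matches(self, ds_file_types):
        # Satpy's convention: True if this handler's file type can provide
        # the dataset, None (undecided) otherwise.
        return True if self.file_type in ds_file_types else None

    def available_datasets(self, configured_datasets=None):
        for is_avail, ds_info in (configured_datasets or []):
            if is_avail is not None:
                # A previous handler already decided; yield early.
                yield is_avail, ds_info
                continue
            yield self.file_type_matches(ds_info["file_type"]), ds_info

configured = [
    (None, {"name": "longitude", "file_type": ["hdf_eos_geo"]}),
    (None, {"name": "cloud_mask", "file_type": ["mod35_hdf"]}),
]
# Chain two handlers, mirroring how the reader accumulates availability:
# the output of one handler's generator feeds the next.
step1 = list(FakeHandler("hdf_eos_geo").available_datasets(configured))
step2 = list(FakeHandler("mod35_hdf").available_datasets(step1))
print([(avail, info["name"]) for avail, info in step2])
```

After the first handler, `longitude` is claimed and `cloud_mask` is still undecided; the second handler passes `longitude` through untouched and claims `cloud_mask`.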
39 changes: 39 additions & 0 deletions satpy/readers/modis_l2.py
@@ -145,6 +145,45 @@
         self._add_satpy_metadata(dataset_id, dataset)
         return dataset
 
+    def available_datasets(self, configured_datasets=None):
+        """Add dataset information from arbitrary MODIS level 2 files.
+
+        Dynamically adds datasets that are present in the file but not
+        explicitly configured in the reader YAML to the available datasets.
+
+        Notes:
+            Currently only adds 2D datasets and does not decode bit-encoded
+            information.
+        """
+        # Pass along the YAML-configured (handled) datasets and collect their
+        # file keys to check dynamically collected variables against later on.
+        handled = set()
+        for is_avail, ds_info in (configured_datasets or []):
+            file_key = ds_info.get("file_key", ds_info["name"])
+            handled.add(file_key)
+
+            if is_avail is not None:
+                yield is_avail, ds_info
+                continue
+            yield self.file_type_matches(ds_info["file_type"]), ds_info
+
+        # Map the trailing (across-track) dimension size to a nominal resolution.
+        res_dict = {5416: 250, 2708: 500, 1354: 1000, 270: 5000, 135: 10000}
+
+        # Get variables from the file dynamically and only add those that are
+        # not already configured in the YAML.
+        for var_name, val in self.sd.datasets().items():
+            if var_name in handled:
+                continue
+            if len(val[0]) == 2:
+                resolution = res_dict.get(val[1][-1])
+                if resolution is not None:
+                    ds_info = {
+                        "file_type": self.filetype_info["file_type"],
+                        "resolution": resolution,
+                        "name": var_name,
+                        "file_key": var_name,
+                        "coordinates": ["longitude", "latitude"],
+                    }
+                    yield True, ds_info
[CodeScene Delta Analysis / CodeScene Cloud Delta Analysis (main) — warning on line 185 in satpy/readers/modis_l2.py]
❌ New issue: Bumpy Road Ahead. ModisL2HDFFileHandler.available_datasets has 3 blocks with nested conditional logic. Any nesting of 2 or deeper is considered; the threshold is one single nested block per function. The Bumpy Road code smell is a function that contains multiple chunks of nested conditional logic: the deeper the nesting and the more bumps, the lower the code health.
(mraspaud marked this conversation as resolved.)

     def _extract_and_mask_category_dataset(self, dataset_id, dataset_info, var_name):
         # what dimension is per-byte
         byte_dimension = None if self.is_imapp_mask_byte1 else dataset_info["byte_dimension"]
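The dynamic-discovery loop in `available_datasets` above keys off the structure that pyhdf's `SD.datasets()` returns: a dict mapping each variable name to `(dim_names, dim_sizes, data_type, index)`. A standalone sketch of just the resolution-inference step, using a hand-made fake of that mapping (the variable names and sizes below are illustrative, not taken from a real file):

```python
# Nominal MODIS resolutions keyed by across-track (column) pixel count:
# 5416/2708/1354 columns for full 250 m/500 m/1 km swaths; 270/135 for the
# 5 km/10 km sampled products.
RES_BY_WIDTH = {5416: 250, 2708: 500, 1354: 1000, 270: 5000, 135: 10000}

def infer_available(datasets, handled):
    """Yield (name, resolution) for 2D variables not already configured."""
    for var_name, (dim_names, dim_sizes, _dtype, _index) in datasets.items():
        if var_name in handled or len(dim_names) != 2:
            continue  # skip YAML-configured or non-2D variables
        resolution = RES_BY_WIDTH.get(dim_sizes[-1])
        if resolution is not None:
            yield var_name, resolution

# Fake of the mapping pyhdf's SD.datasets() returns:
# name -> (dim_names, dim_sizes, data_type, index).
fake_sd = {
    "Aerosol_Type_Land": (("Cell_Along_Swath", "Cell_Across_Swath"), (203, 135), 22, 0),
    "Optical_Depth_Land_And_Ocean": (("Cell_Along_Swath", "Cell_Across_Swath"), (2030, 1354), 5, 1),
    "Quality_Assurance": (("Cell_Along", "Cell_Across", "QA_Byte"), (203, 135, 5), 21, 2),
    "longitude": (("Cell_Along_Swath", "Cell_Across_Swath"), (2030, 1354), 5, 3),
}
print(dict(infer_available(fake_sd, handled={"longitude"})))
```

The 3D `Quality_Assurance` variable is skipped, matching the limitation stated in the docstring's Notes: only 2D datasets are added, and bit-encoded information is left undecoded.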
5 changes: 3 additions & 2 deletions satpy/readers/yaml_reader.py
@@ -623,8 +623,9 @@ def create_filehandlers(self, filenames, fh_kwargs=None):
             self.file_handlers.get(filetype, []) + filehandlers,
             key=lambda fhd: (fhd.start_time, fhd.filename))
 
-        # load any additional dataset IDs determined dynamically from the file
-        # and update any missing metadata that only the file knows
+        # Update dataset IDs with IDs determined dynamically from the file
+        # and/or update any missing metadata that only the file knows.
+        # Check if the dataset ID is loadable from that file.
         self.update_ds_ids_from_file_handlers()
         return created_fhs