
DSWx‐HLS Interface Acceptance Testing Instructions

Scott Collins edited this page Aug 5, 2024 · 1 revision

This page contains instructions for performing Acceptance Testing for the DSWx-HLS interface delivery from the OPERA-ADT team. These instructions assume the user has access to the JPL FN-Artifactory, and has Docker installed on their local machine.

Acquiring the DSWx-HLS Interface Docker Image

The image is currently hosted on JPL FN-Artifactory, which requires JPL VPN access and JPL credentials. You may also need to be added to the gov.nasa.jpl.opera.adt organization.

Once you have access, the container tarball delivery is available under general/gov/nasa/jpl/opera/adt/r1/interface/dockerimg_dswx_hls_interface.tar. Sample inputs and outputs are also available under general/gov/nasa/jpl/opera/adt/r1/interface/test_datasets.tar.gz.

Download both archives to a location on your local machine. This location will be referred to throughout these instructions as <DSWx_DIR>.

Loading the image into Docker

The first step in running the DSWx-HLS image is to load it into Docker via the following command:

docker load -i <DSWx_DIR>/dockerimg_dswx_hls_interface.tar

This should add the Docker image to your local repository with the name opera/dswx_hls and the tag interface.
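To confirm the load succeeded, you can query the local Docker registry. A quick sketch, assuming the `docker` CLI is on your PATH:

```shell
# Confirm the opera/dswx_hls image is present in the local registry.
if command -v docker >/dev/null 2>&1; then
    if [ -n "$(docker image ls -q opera/dswx_hls:interface)" ]; then
        echo "opera/dswx_hls:interface is loaded"
    else
        echo "image not found; re-run the docker load command"
    fi
else
    echo "docker not found on PATH"
fi
check_ran=yes
```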

Preparing the test data

Once the test_datasets.tar.gz file is downloaded to your local machine, unpack it to <DSWx_DIR>:

tar -xzf test_datasets.tar.gz --directory=<DSWx_DIR>

This will create a test_datasets directory within <DSWx_DIR> containing two test datasets, l30_greenland and s30_louisiana. Within each dataset are the following files/directories:

  • expected_output/
  • input_files_hls_v2.0/
  • README.txt
  • runconfig_dswx_hls.yaml

In order to execute the SAS, the input file directory, the runconfig, and an output location will be mounted into the container instance as Docker volumes. To help streamline this process, we recommend making the following changes to the test_datasets directory:

  1. Create a directory named runconfig within each dataset directory, and move the existing runconfig YAML file into it:

    mkdir -p <DSWx_DIR>/test_datasets/{l30_greenland,s30_louisiana}/runconfig

    mv <DSWx_DIR>/test_datasets/l30_greenland/runconfig_dswx_hls.yaml <DSWx_DIR>/test_datasets/l30_greenland/runconfig

    mv <DSWx_DIR>/test_datasets/s30_louisiana/runconfig_dswx_hls.yaml <DSWx_DIR>/test_datasets/s30_louisiana/runconfig

  2. Create a directory for the SAS container to write its outputs to, separate from the expected_output directory:

    mkdir -p <DSWx_DIR>/test_datasets/{l30_greenland,s30_louisiana}/output_dir
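The two preparation steps above can also be scripted for both datasets in one pass. A minimal sketch, assuming the DSWX_DIR shell variable holds your <DSWx_DIR> location (it falls back to a scratch directory for a dry run):

```shell
# Prepare runconfig/ and output_dir/ for both test datasets.
# DSWX_DIR is assumed to be set; fall back to a scratch dir for a dry run.
DSWX_DIR="${DSWX_DIR:-$(mktemp -d)}"

for ds in l30_greenland s30_louisiana; do
    base="$DSWX_DIR/test_datasets/$ds"
    mkdir -p "$base/runconfig" "$base/output_dir"
    # Move the runconfig YAML only if it is still at the dataset's top level.
    if [ -f "$base/runconfig_dswx_hls.yaml" ]; then
        mv "$base/runconfig_dswx_hls.yaml" "$base/runconfig/"
    fi
done
```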

Executing the DSWx-HLS container on the sample datasets

We're now ready to execute the DSWx-HLS Interface. Run the following command to kick off execution for each dataset:

docker run --rm -u conda:conda \
  -v <DSWx_DIR>/test_datasets/<DATASET>/runconfig:/home/conda/runconfig:ro \
  -v <DSWx_DIR>/test_datasets/<DATASET>/input_files_hls_v2.0:/home/conda/input_dir:ro \
  -v <DSWx_DIR>/test_datasets/<DATASET>/output_dir:/home/conda/output_dir \
  -i --tty opera/dswx_hls:interface \
  sh -ci "python3 DSWx-HLS-0.1/bin/dswx_hls.py runconfig/runconfig_dswx_hls.yaml --log output_dir/<DATASET>.log"

Where <DATASET> is one of l30_greenland or s30_louisiana.
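Both datasets can be kicked off sequentially by wrapping the invocation in a loop. A sketch, assuming DSWX_DIR is set; the DRY_RUN guard (on by default here) prints each docker command for review instead of executing it:

```shell
# Run the SAS on both datasets. DRY_RUN=1 (the default here) prints each
# docker command instead of executing it; set DRY_RUN=0 to actually run.
DSWX_DIR="${DSWX_DIR:-.}"
DRY_RUN="${DRY_RUN:-1}"

for ds in l30_greenland s30_louisiana; do
    cmd="docker run --rm -u conda:conda \
        -v $DSWX_DIR/test_datasets/$ds/runconfig:/home/conda/runconfig:ro \
        -v $DSWX_DIR/test_datasets/$ds/input_files_hls_v2.0:/home/conda/input_dir:ro \
        -v $DSWX_DIR/test_datasets/$ds/output_dir:/home/conda/output_dir \
        -i --tty opera/dswx_hls:interface \
        sh -ci \"python3 DSWx-HLS-0.1/bin/dswx_hls.py runconfig/runconfig_dswx_hls.yaml --log output_dir/$ds.log\""
    if [ "$DRY_RUN" = "1" ]; then
        echo "$cmd"
    else
        eval "$cmd"
    fi
done
```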

You should see console output from the executing SAS container. Execution should take roughly 20 minutes per dataset. Once execution is complete, you should see a dswx_hls.tif and a <DATASET>.log file within the dataset's output_dir directory.

Running the Quality Assurance test

Now that we've successfully executed the SAS container and generated outputs, the last step is to perform a QA check against the expected outputs.

The first step is to create a script that will compute difference images between the output and expected TIF files using GDAL utilities included with the SAS container delivery. Within the <DSWx_DIR>/test_datasets directory, create a script named qa_check.sh with the following contents:

#!/bin/bash
for i in {1..10}
do
    echo "Band $i"
    gdal_calc.py --calc="A-B" \
        -A output_dir/dswx_hls.tif --A_band "$i" \
        -B expected_output/dswx_hls.tif --B_band "$i" \
        --outfile "output_dir/dswx_hls_diff_band$i.tif"
    gdalinfo -stats "output_dir/dswx_hls_diff_band$i.tif" | grep STATISTICS_M
done

Make sure the script has execute permissions:

chmod 775 <DSWx_DIR>/test_datasets/qa_check.sh

Now execute the QA script on each dataset, via the DSWx-HLS Interface container:

docker run --rm -u conda:conda \
  -v <DSWx_DIR>/test_datasets/<DATASET>/expected_output:/home/conda/expected_output:ro \
  -v <DSWx_DIR>/test_datasets/<DATASET>/output_dir:/home/conda/output_dir \
  -v <DSWx_DIR>/test_datasets:/home/conda/scripts \
  -i --tty opera/dswx_hls:interface \
  sh -ci "/home/conda/scripts/qa_check.sh | tee /home/conda/output_dir/qa_check.log"

The QA console output should appear on screen, with statistics calculated for each of the 10 bands:

Band 1
0.. 6.. 12.. 18.. 25.. 31.. 37.. 43.. 50.. 56.. 62.. 68.. 75.. 81.. 87.. 93.. 100 - Done
    STATISTICS_MAXIMUM=0
    STATISTICS_MEAN=0
    STATISTICS_MINIMUM=0
Band 2
...

A copy of the output should also be available under <DSWx_DIR>/test_datasets/<DATASET>/output_dir/qa_check.log.

The expected results for this Acceptance Test are for all 10 bands to return values of 0 for STATISTICS_MAXIMUM, STATISTICS_MEAN, and STATISTICS_MINIMUM for both datasets. If the actual results differ, the output products from the Acceptance Test, as well as the diff files, should be provided to ADT for diagnostics.
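Rather than eyeballing the log, the pass/fail criterion can be checked mechanically by grepping for any nonzero statistic. A sketch; the sample log generated below is only a demonstration fixture, and in practice you would point LOG at <DSWx_DIR>/test_datasets/<DATASET>/output_dir/qa_check.log:

```shell
# Check a qa_check.log for nonzero STATISTICS_* values.
LOG="${LOG:-$(mktemp)}"
# Demonstration fixture: a minimal log in the expected gdalinfo format.
if [ ! -s "$LOG" ]; then
    printf 'Band 1\n    STATISTICS_MAXIMUM=0\n    STATISTICS_MEAN=0\n    STATISTICS_MINIMUM=0\n' > "$LOG"
fi

# Any STATISTICS_* line whose value is not exactly 0 indicates a mismatch.
if grep -E 'STATISTICS_(MAXIMUM|MEAN|MINIMUM)=' "$LOG" | grep -vq '=0$'; then
    result=fail
    echo "QA FAILED: nonzero statistics found in $LOG"
else
    result=pass
    echo "QA PASSED: all statistics are zero"
fi
```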
