Commit 87e0c8e

Merge pull request #104 from CosmoStat/develop
Minor Patch Release 2.0.1
jeipollack authored Dec 15, 2023
2 parents 146725f + 0cffc54 commit 87e0c8e
Showing 13 changed files with 64 additions and 38 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -7,7 +7,7 @@ on:
pull_request:
branches:
- main

- develop

jobs:
test-full:
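
For orientation, here is a minimal sketch of how the `pull_request` trigger in `ci.yml` reads with the added branch (indentation and surrounding keys are assumed, since the diff view strips formatting):

```
# Run the test workflow on pull requests targeting either branch
on:
  pull_request:
    branches:
      - main
      - develop
```

In effect, the `test-full` job now also runs for pull requests opened against `develop`, not only `main`.
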
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -13,7 +13,7 @@
project = "wf-psf"
copyright = "2023, CosmoStat"
author = "CosmoStat"
-release = "2.0.0"
+release = "2.0.1"

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
10 changes: 5 additions & 5 deletions docs/source/configuration.md
@@ -30,7 +30,7 @@ Next, we shall describe each configuration file.
(data_config)=
## Data Configuration

-The file [data_config.yaml](https://github.com/CosmoStat/wf-psf/blob/dummy_main/config/data_config.yaml) stores the metadata for generating training and test data sets or retrieving existing ones. A set of training and test data is provided in the `data/coherent_euclid_dataset` directory. `Wavediff` will automatically retrieve the dataset within its directory tree. In the field `data_dir`, the user should specify the sub-path to the data directory as in the example below. The actual name of the dataset file is provided as an entry to the field `file`. However, new training and test data sets can be produced with the parameters in the file, which *should be* provided to the `simPSF` code although not at present (implementation upgrade pending).
+The file [data_config.yaml](https://github.com/CosmoStat/wf-psf/blob/main/config/data_config.yaml) stores the metadata for generating training and test data sets or retrieving existing ones. A set of training and test data is provided in the `data/coherent_euclid_dataset` directory. `Wavediff` will automatically retrieve the dataset within its directory tree. In the field `data_dir`, the user should specify the sub-path to the data directory as in the example below. The actual name of the dataset file is provided as an entry to the field `file`. However, new training and test data sets can be produced with the parameters in the file, which *should be* provided to the `simPSF` code although not at present (implementation upgrade pending).

```
# Training and test data sets for training and/or metrics evaluation
@@ -54,7 +54,7 @@ data:
(training_config)=
## Training Configuration

-The file [training_config.yaml](https://github.com/CosmoStat/wf-psf/blob/dummy_main/config/training_config.yaml) is used to configure the settings for the training pipeline task. The first line contains the parent key `training`. All of the following child keys are treated as values of the `training` key. Above each child key a description is provided. Below is an abridged example of this:
+The file [training_config.yaml](https://github.com/CosmoStat/wf-psf/blob/main/config/training_config.yaml) is used to configure the settings for the training pipeline task. The first line contains the parent key `training`. All of the following child keys are treated as values of the `training` key. Above each child key a description is provided. Below is an abridged example of this:

```
training:
@@ -90,7 +90,7 @@ Training hyperparameters, defined by the parent key: `training_hparams`, include
(metrics_config)=
## Metrics Configuration

-The [metrics_config.yaml](https://github.com/CosmoStat/wf-psf/blob/dummy_main/config/metrics_config.yaml) file stores the configuration parameters for the WaveDiff pipeline to perform computations of the four metrics listed in the table on a trained PSF model, as applied in {cite:t}`Liaudat:23`.
+The [metrics_config.yaml](https://github.com/CosmoStat/wf-psf/blob/main/config/metrics_config.yaml) file stores the configuration parameters for the WaveDiff pipeline to perform computations of the four metrics listed in the table on a trained PSF model, as applied in {cite:t}`Liaudat:23`.

| Metric type | Description |
| --- | ----------- |
@@ -213,7 +213,7 @@ The `metrics` package is run using [TensorFlow](https://www.tensorflow.org) to r
(plotting_config)=
## Plot Configuration

-The [plotting_config.yaml](https://github.com/CosmoStat/wf-psf/blob/dummy_main/config/plotting_config.yaml) file stores the configuration parameters for the WaveDiff pipeline to generate plots for the metrics listed in the {ref}`metrics settings table <metrics_settings>` for each data set.
+The [plotting_config.yaml](https://github.com/CosmoStat/wf-psf/blob/main/config/plotting_config.yaml) file stores the configuration parameters for the WaveDiff pipeline to generate plots for the metrics listed in the {ref}`metrics settings table <metrics_settings>` for each data set.

An example of the contents of the `plotting_config.yaml` file is shown below.

@@ -283,7 +283,7 @@ plotting_params:
(master_config_file)=
## Master Configuration

-The `configs.yaml` file is the master configuration file that is used to define all of the pipeline tasks to be submitted and executed by `WaveDiff` during runtime. In this file, the user lists the processing tasks (one or more) to be performed by setting the values of the associated configuration variables `{pipeline_task}_conf` and the name of the configuration file `{pipeline_task}_config.yaml`. See an example below to configure `WaveDiff` to launch a sequence of runs to train models 1...n with their respective configurations given in the files `training_config_{id}.yaml`.
+The [configs.yaml](https://github.com/CosmoStat/wf-psf/blob/main/config/configs.yaml) file is the master configuration file that is used to define all of the pipeline tasks to be submitted and executed by `WaveDiff` during runtime. In this file, the user lists the processing tasks (one or more) to be performed by setting the values of the associated configuration variables `{pipeline_task}_conf` and the name of the configuration file `{pipeline_task}_config.yaml`. See an example below to configure `WaveDiff` to launch a sequence of runs to train models 1...n with their respective configurations given in the files `training_config_{id}.yaml`.

```
---
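
The `configs.yaml` example above is cut off at the diff hunk boundary. As a rough sketch only (the exact layout should be checked against the linked file, and treating each run as a separate `---` document is an assumption), a master configuration that queues two training runs might look like:

```
---
training_conf: training_config_1.yaml
---
training_conf: training_config_2.yaml
```

Here `training_config_1.yaml` and `training_config_2.yaml` are hypothetical file names following the `training_config_{id}.yaml` pattern described above.
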
2 changes: 2 additions & 0 deletions docs/source/dependencies.md
@@ -8,6 +8,7 @@ Third-party software packages required by WaveDiff are installed automatically (
|--------------|---------------------------------------------------|
| [numpy](https://numpy.org/) | {cite:t}`harris:20` |
| [scipy](https://scipy.org) | {cite:t}`SciPy-NMeth:20` |
| [keras](https://keras.io) | {cite:t}`chollet:2015keras`|
| [tensorflow](https://www.tensorflow.org) | {cite:t}`tensorflow:15` |
| [tensorflow-addons](https://www.tensorflow.org/addons) |{cite:t}`tensorflow:15` |
| [tensorflow-estimator](https://www.tensorflow.org/api_docs/python/tf/estimator) |{cite:t}`tensorflow:15` |
@@ -17,4 +18,5 @@ Third-party software packages required by WaveDiff are installed automatically (
| [galsim](http://galsim-developers.github.io/GalSim/_build/html/index.html#) | {cite:t}`rowe:15` |
| [astropy](https://www.astropy.org) | {cite:t}`astropy:13,astropy:18`, <br>{cite:t}`astropy:22` |
| [matplotlib](https://matplotlib.org) | {cite:t}`Hunter:07` |
| [pandas](https://pandas.pydata.org) | {cite:t}`mckinney:2010pandas` |
| [seaborn](https://seaborn.pydata.org) | {cite:t}`Waskom:21` |
2 changes: 2 additions & 0 deletions docs/source/installation.md
@@ -5,6 +5,8 @@ WaveDiff is a software written in Python and uses the [TensorFlow](https://www.t

Note: You may want to set up a dedicated Python environment for running WaveDiff using e.g. [Conda](https://docs.conda.io/en/latest/). You can use the minimal installer [Miniconda](https://docs.conda.io/projects/miniconda/en/latest/) to set up the environment in which we run the command below to install the subset of packages needed for running WaveDiff.

In the [WaveDiff repository](https://github.com/CosmoStat/wf-psf.git), we provide the specific environment used in testing `WaveDiff` in the file `environment.yml`. We recommend that users work with this environment for the current release.

## Installation Steps

Clone the repository:
18 changes: 18 additions & 0 deletions docs/source/refs.bib
@@ -126,6 +126,14 @@ @article{Liaudat:23
journal = {Inverse Problems},
}

@online{chollet:2015keras,
title={Keras},
author={Chollet, Francois and others},
year={2015},
publisher={GitHub},
url={https://github.com/fchollet/keras},
}

@misc{clark:15,
title={Pillow (PIL Fork) Documentation},
author={Clark, Alex},
@@ -196,6 +204,16 @@ @ARTICLE{SciPy-NMeth:20
doi = {10.1038/s41592-019-0686-2},
}

@inproceedings{mckinney:2010pandas,
title={Data structures for statistical computing in python},
author={McKinney, Wes and others},
booktitle={Proceedings of the 9th Python in Science Conference},
volume={445},
pages={51--56},
year={2010},
organization={Austin, TX}
}

@article{opencv_library:08,
author = {Bradski, G.},
citeulike-article-id = {2236121},
7 changes: 0 additions & 7 deletions docs/source/wf_psf.info.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/wf_psf.plotting.plot_optimisation_metrics.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/source/wf_psf.plotting.rst
@@ -12,5 +12,4 @@ Submodules
.. toctree::
:maxdepth: 4

wf_psf.plotting.plot_optimisation_metrics
wf_psf.plotting.plots_interface
3 changes: 1 addition & 2 deletions docs/source/wf_psf.rst
@@ -25,6 +25,5 @@ Submodules

.. toctree::
:maxdepth: 4

wf_psf.info

wf_psf.run
19 changes: 19 additions & 0 deletions environment.yml
@@ -0,0 +1,19 @@
name: wavediff-env
channels:
- conda-forge
dependencies:
- keras=2.9.0=pyhd8ed1ab_0
- scipy=1.11.4=py310h2b794db_0
- tensorflow=2.9.1=cpu_py310h6ecea76_0
- pip=23.3.1
- pip:
- astropy>=6.0.0
- galsim>=2.5.1
- matplotlib>=3.8.2
- opencv-python>=4.8.1.78
- pandas>=2.1.4
- pillow>=10.1.0
- seaborn>=0.13.0
- tensorflow-addons>=0.23.0
- tensorflow-estimator>=2.15.0
- zernike>=0.0.32
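
As a usage note: the new `environment.yml` above pins conda-forge builds of Keras, SciPy and TensorFlow and pulls the remaining packages from PyPI via pip. Assuming a standard Conda installation, the environment can be created with `conda env create -f environment.yml` and activated with `conda activate wavediff-env` (the name declared in the file).
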
27 changes: 14 additions & 13 deletions pyproject.toml
@@ -11,21 +11,22 @@ maintainers = [

description = 'A software framework to perform Differentiable wavefront-based PSF modelling.'
dependencies = [
-"numpy>=1.19.2",
-"scipy>=1.11.2",
-"tensorflow>=2.9.1",
-"tensorflow-addons>=0.12.1",
-"tensorflow-estimator>=2.9.0",
-"zernike==0.0.31",
-"opencv-python>=4.5.1.48",
-"pillow>=9.5.0",
-"galsim>=2.4.11",
-"astropy>=5.3.3",
-"matplotlib>=3.3.2",
-"seaborn>=0.12.2",
+"numpy",
+"scipy",
+"keras==2.9.0",
+"tensorflow==2.9.1",
+"tensorflow-addons>=0.23.0",
+"tensorflow-estimator",
+"zernike",
+"opencv-python",
+"pillow",
+"galsim",
+"astropy",
+"matplotlib",
+"seaborn",
]

-version = "2.0.0"
+version = "2.0.1"

[project.optional-dependencies]
docs = [
2 changes: 1 addition & 1 deletion src/wf_psf/training/train.py
@@ -362,7 +362,7 @@ def train(
learning_rate_non_param=training_handler.learning_rate_non_params[
current_cycle - 1
],
-n_epochs_param=training_handler.n_epochs_non_params[current_cycle - 1],
+n_epochs_param=training_handler.n_epochs_params[current_cycle - 1],
n_epochs_non_param=training_handler.n_epochs_non_params[current_cycle - 1],
param_optim=param_optim,
non_param_optim=non_param_optim,
