Merge branch 'release-1.0.1'
jkglasbrenner committed Oct 28, 2024
2 parents a661aaa + e33c3f1 commit 6078bce
Showing 193 changed files with 11,042 additions and 6,358 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/pip-compile.yml
@@ -91,7 +91,7 @@ jobs:
run: |
python3 -m tox run -e py311-linux-${{ matrix.architecture }}-${{ matrix.requirements }}
- name: run tox (MacOS, Python 3.11)
- name: run tox (macOS, Python 3.11)
if: ${{ (matrix.os == 'macos-13' || matrix.os == 'macos-latest') && matrix.python-version == '3.11' }}
run: |
python3 -m tox run -e py311-macos-${{ matrix.architecture }}-${{ matrix.requirements }}
1 change: 0 additions & 1 deletion .gitignore
@@ -398,7 +398,6 @@ vignettes/*.pdf

# Javascript
node_modules/
package-lock.json

# ---------------------------
# BEGIN Whitelisted Files
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -72,7 +72,7 @@ See the [Dioptra Commit Style Guide](./COMMIT_STYLE_GUIDE.md).

#### Squashing

All final commits will be squashed, therefore when squashing your branch, it’s important to make sure you update the commit message. If you’re using Github’s UI it will by default create a new commit message which is a combination of all commits and **does not follow the commit guidelines**.
All final commits will be squashed, therefore when squashing your branch, it’s important to make sure you update the commit message. If you’re using GitHub’s UI it will by default create a new commit message which is a combination of all commits and **does not follow the commit guidelines**.

If you’re working locally, it often can be useful to `--amend` a commit, or utilize `rebase -i` to reorder, squash, and reword your commits.
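As a sketch, that squash workflow can be scripted in a throwaway repository; `git reset --soft` is the non-interactive equivalent of squashing via `rebase -i` (the commit messages and identity below are made up for the demo):

```shell
# Throwaway-repo demo: squash the last two commits into one, then reword.
tmp="$(mktemp -d)"
cd "$tmp"
git init -q
git config user.email "dev@example.com"   # demo-only identity
git config user.name "Dev"

echo "base" > file.txt
git add file.txt
git commit -qm "chore: initial commit"

echo "a" >> file.txt
git commit -aqm "feat: add a"
echo "b" >> file.txt
git commit -aqm "fixup: also add b"

# Squash the last two commits: move HEAD back, keep the tree staged, recommit.
git reset --soft HEAD~2
git commit -qm "feat: add a and b"

git rev-list --count HEAD   # prints 2: the initial commit plus the squashed one
```

Because `--soft` leaves the index untouched, the combined changes are already staged when you recommit, which is exactly what an interactive squash produces.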

1 change: 1 addition & 0 deletions CONTRIBUTORS.md
@@ -18,3 +18,4 @@ lbarbMITRE
cminiter
pscemama-mitre
alexb1200
jsoref
86 changes: 73 additions & 13 deletions DEVELOPER.md
@@ -2,7 +2,7 @@

If you have not already, please review [CONTRIBUTING.md](CONTRIBUTING.md) for more complete information on expectations for contributions.

## Developer quickstart
## Developer Set-up

### Setting up the Python virtual environment

@@ -17,12 +17,12 @@ Ensure that you have Python 3.11 installed and that it is available in your PATH
| linux-arm64-py3.11-requirements-dev.txt | Linux | arm64 |||
| linux-arm64-py3.11-requirements-dev-tensorflow.txt | Linux | arm64 |||
| linux-arm64-py3.11-requirements-dev-pytorch.txt | Linux | arm64 |||
| macos-amd64-py3.11-requirements-dev.txt | MacOS | x86-64 |||
| macos-amd64-py3.11-requirements-dev-tensorflow.txt | MacOS | x86-64 |||
| macos-amd64-py3.11-requirements-dev-pytorch.txt | MacOS | x86-64 |||
| macos-arm64-py3.11-requirements-dev.txt | MacOS | arm64 |||
| macos-arm64-py3.11-requirements-dev-tensorflow.txt | MacOS | arm64 |||
| macos-arm64-py3.11-requirements-dev-pytorch.txt | MacOS | arm64 |||
| macos-amd64-py3.11-requirements-dev.txt | macOS | x86-64 |||
| macos-amd64-py3.11-requirements-dev-tensorflow.txt | macOS | x86-64 |||
| macos-amd64-py3.11-requirements-dev-pytorch.txt | macOS | x86-64 |||
| macos-arm64-py3.11-requirements-dev.txt | macOS | arm64 |||
| macos-arm64-py3.11-requirements-dev-tensorflow.txt | macOS | arm64 |||
| macos-arm64-py3.11-requirements-dev-pytorch.txt | macOS | arm64 |||
| win-amd64-py3.11-requirements-dev.txt | Windows | x86-64 |||
| win-amd64-py3.11-requirements-dev-tensorflow.txt | Windows | x86-64 |||
| win-amd64-py3.11-requirements-dev-pytorch.txt | Windows | x86-64 |||
@@ -34,7 +34,7 @@ python -m venv .venv
```

Activate the virtual environment after creating it.
To activate it on MacOS/Linux:
To activate it on macOS/Linux:

```sh
source .venv/bin/activate
@@ -53,7 +53,7 @@ python -m pip install --upgrade pip pip-tools
```

Finally, use `pip-sync` to install the dependencies in your chosen requirements file and install `dioptra` in development mode.
On MacOS/Linux:
On macOS/Linux:

```sh
# Replace "linux-amd64-py3.11-requirements-dev.txt" with your chosen file
@@ -70,9 +70,69 @@ pip-sync requirements\win-amd64-py3.11-requirements-dev.txt
If the requirements file you used is updated, or if you want to switch to another requirements file (you need access to the Tensorflow library, for example), just run `pip-sync` again using the appropriate filename.
It will install, upgrade, and uninstall all packages accordingly and ensure that you have a consistent environment.

### Frontend development setup

For instructions on how to prepare the frontend development environment, see the [src/frontend/README.md](src/frontend/README.md) file.
### Local Development setup (without containers)
- Clone the repository at https://github.com/usnistgov/dioptra:
```
git clone [email protected]:usnistgov/dioptra.git ~/dioptra/dev
```
or
```
git clone https://github.com/usnistgov/dioptra.git ~/dioptra/dev
```
- `cd ~/dioptra/dev`
- `git checkout dev`
- [Install redis](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/)
- Create a work directory for files `mkdir -p ~/dioptra/deployments/dev`
- [Create a python virtual environment](#setting-up-the-python-virtual-environment)

- The following describes commands to execute in four different terminal windows:
1. Flask Terminal
- Environment variables that must be set for Flask:
```
DIOPTRA_RESTAPI_DEV_DATABASE_URI="sqlite:////home/<username>/dioptra/deployments/dev/dioptra-dev.db"
DIOPTRA_RESTAPI_ENV=dev
DIOPTRA_RESTAPI_VERSION=v1
```
N.B.: replace `<username>` with your username. On some systems the home directory path may also differ; verify the expansion of `~` by running `pwd` in the appropriate directory.
- Activate the Python virtual environment set up in the prior steps
- `dioptra-db autoupgrade`
- `flask run`
2. Frontend UI Terminal
- Commands to get a Frontend running:
```bash
cd src/frontend
npm install
npm run dev
```
3. Redis Terminal
- `redis-server`
4. Dioptra Worker
- Starting a Dioptra Worker requires the following environment variables:
```
DIOPTRA_WORKER_USERNAME="dioptra-worker" # This must be a registered user in the Dioptra app
DIOPTRA_WORKER_PASSWORD="password" # Must match the username's password
DIOPTRA_API="http://localhost:5000" # This is the default API location when you run `flask run`
RQ_REDIS_URI="redis://localhost:6379/0" # This is the default URI when you run `redis-server`
MLFLOW_S3_ENDPOINT_URL="http://localhost:35000" # If you're running an MLflow Tracking server, update this to point at it. Otherwise, this is a placeholder.
OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES # macOS only, needed to make the RQ worker (i.e. the Dioptra Worker) work
```
- Activate the Python virtual environment set up in the prior steps (e.g. `source .venv/bin/activate`)
- With the environment variables above set, execute the following commands:
```bash
mkdir -p ~/dioptra/deployments/dev/workdir/
cd ~/dioptra/deployments/dev/workdir/
dioptra-worker-v1 'Tensorflow CPU' # Assumes 'Tensorflow CPU' is a registered Queue name
```
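Before launching the worker, the variable checklist above can be sanity-checked with a small shell sketch (the export values here are the same illustrative defaults quoted above, not real credentials; the check itself is illustrative and not part of Dioptra):

```shell
# Illustrative defaults from the list above -- replace with real values.
export DIOPTRA_WORKER_USERNAME="dioptra-worker"
export DIOPTRA_WORKER_PASSWORD="password"
export DIOPTRA_API="http://localhost:5000"
export RQ_REDIS_URI="redis://localhost:6379/0"

# Sketch: report any required variable that is unset or empty.
missing=0
for var in DIOPTRA_WORKER_USERNAME DIOPTRA_WORKER_PASSWORD DIOPTRA_API RQ_REDIS_URI; do
  eval "val=\${$var:-}"   # indirect lookup, POSIX-sh compatible
  if [ -z "$val" ]; then
    echo "missing: $var" >&2
    missing=$((missing + 1))
  fi
done
echo "missing_count=$missing"   # 0 when the environment is fully configured
```

Running this before `dioptra-worker-v1` turns a confusing startup failure into an explicit "missing: …" message.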
- The frontend app is available by default at http://localhost:5173 (the frontend terminal window also prints the URL to use)
- Create the Dioptra worker user in the frontend UI or through the API. A curl command for the API (assuming the environment variables from Step 4 are set) is:
```
curl http://localhost:5000/api/v1/users/ -X POST --data-raw "{\"username\": \"$DIOPTRA_WORKER_USERNAME\", \"email\": \"dioptra-worker@localhost\", \"password\": \"$DIOPTRA_WORKER_PASSWORD\", \"confirmPassword\": \"$DIOPTRA_WORKER_PASSWORD\"}"
```
- Create the 'Tensorflow CPU' queue; the name must match the queue name passed to `dioptra-worker-v1` in Step 4.

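A matching curl call for the queue can be sketched from the user-creation request above; note that the `/api/v1/queues/` path and the `name` field are assumptions here (and queue endpoints may require an authenticated session), so verify both against the API documentation:

```shell
# Hypothetical queue-creation request mirroring the user-creation call above.
# The /api/v1/queues/ path and the "name" field are assumptions; queue
# endpoints may also require an authenticated session cookie.
QUEUE_NAME="Tensorflow CPU"   # must match the name passed to dioptra-worker-v1
PAYLOAD="{\"name\": \"$QUEUE_NAME\"}"
echo "$PAYLOAD"
curl "http://localhost:5000/api/v1/queues/" -X POST \
  -H "Content-Type: application/json" \
  --data-raw "$PAYLOAD" \
  || echo "request failed: is the Flask server from Step 1 running?" >&2
```
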
### Building the documentation
@@ -133,7 +193,7 @@ make code-check

This project has a [commit style guide](./COMMIT_STYLE_GUIDE.md) that is enforced using the `gitlint` tool.
Developers are expected to run `gitlint` and validate their commit message before opening a Pull Request.
After commiting your contribution, activate your virtual environment if you haven't already and run:
After committing your contribution, activate your virtual environment if you haven't already and run:

```sh
python -m tox run -e gitlint
2 changes: 1 addition & 1 deletion Makefile
@@ -537,7 +537,7 @@ endif
$(call save_sentinel_file,$@)

#################################################################################
# AUTO-GENERATED PROJECT BUILD RECEIPES #
# AUTO-GENERATED PROJECT BUILD RECIPES #
#################################################################################

$(call generate_full_docker_image_recipe,MLFLOW_TRACKING,CONTAINER_IMAGE_TAG)
2 changes: 1 addition & 1 deletion README.md
@@ -24,7 +24,7 @@ Details are available in the project documentation available at <https://pages.nist.gov/dioptra>

## Current Release Status

Release 1.0.0 -- with on-going improvements and development
Release 1.0.1 -- with on-going improvements and development

## Use Cases

4 changes: 2 additions & 2 deletions RELEASE.md
@@ -14,9 +14,9 @@
- node
- redis

3. Edit `examples/scripts/venvs/examples-setup-requirements.txt` and set an upper bound constraint on each of the packages listed (if one isn't set already). The upper bounds can be determined by creating the a virtual environment using this file from the `dev` branch and testing that the instructions in `examples/README.md` work. Once the repo maintainer confirms that the environment works and the user can run the provided scripts and submit jobs from the Jupyter notebook, run `python -m pip freeze` to check what is currently installed. Use the known working versions to set the upper bound constraint.
3. Edit `examples/scripts/venvs/examples-setup-requirements.txt` and set an upper bound constraint on each of the packages listed (if one isn't set already). The upper bounds can be determined by creating a virtual environment using this file from the `dev` branch and testing that the instructions in `examples/README.md` work. Once the repo maintainer confirms that the environment works and the user can run the provided scripts and submit jobs from the Jupyter notebook, run `python -m pip freeze` to check what is currently installed. Use the known working versions to set the upper bound constraint.

4. Fetch the latest requirement files generated by GitHub Actions [here](https://github.com/usnistgov/dioptra/actions/workflows/pip-compile.yml). Download the `requirements-files` zip, unpack it, and move the files with `*requirements-dev*` into the `requirements/` folder, and the rest into the `docker/requirements` folder. In addition, get someone with an M1/M2 Mac to regenerate the MacOS ARM64 requirements files.
4. Fetch the latest requirement files generated by GitHub Actions [here](https://github.com/usnistgov/dioptra/actions/workflows/pip-compile.yml). Download the `requirements-files` zip, unpack it, and move the files with `*requirements-dev*` into the `requirements/` folder, and the rest into the `docker/requirements` folder. In addition, get someone with an M1/M2 Mac to regenerate the macOS ARM64 requirements files.

5. Commit the changes using the message `build: set container tags and package upper bounds for merge to main`

@@ -78,7 +78,7 @@
"dbadmin": {
"image": "pgadmin4",
"namespace": "dpage",
"tag": "8.9",
"tag": "8.12",
"registry": ""
},
"mc": {
@@ -102,7 +102,7 @@
"redis": {
"image": "redis",
"namespace": "",
"tag": "7.2.5",
"tag": "7.2.6",
"registry": ""
}
},
@@ -93,7 +93,7 @@ The following subsections explain how to:
- Assign GPUs to specific worker containers
- Integrate custom containers in the Dioptra deployment

In addition to the above, you may want to further customize the the Docker Compose configuration via the `docker-compose.override.yml` file to suit your needs, such as allocating explicit CPUs you want each container to use.
In addition to the above, you may want to further customize the Docker Compose configuration via the `docker-compose.override.yml` file to suit your needs, such as allocating explicit CPUs you want each container to use.
An example template file (`docker-compose.override.yml.template`) is provided as part of the deployment as a starting point.
This can be copied to `docker-compose.override.yml` and modified.
See the [Compose specification documentation](https://docs.docker.com/compose/compose-file/) for the full list of available options.
@@ -27,7 +27,7 @@
# configuration changes.
#
# A datasets directory is configured in the main docker-compose.yml file in the
# cookicutter deployment generation. It is recommended that datasets_directory
# cookiecutter deployment generation. It is recommended that datasets_directory
# be left blank if mounts are being configured here.
# ------------------------------------------------------------------------------

@@ -289,6 +289,7 @@ prepare_build_dir() {
"public"
"index.html"
"package.json"
"package-lock.json"
"tsconfig.json"
"tsconfig.app.json"
"tsconfig.node.json"
@@ -351,10 +352,10 @@ copy_dist_to_output() {
compile_vue_js_frontend() {
cd "${BUILD_DIR}"

log_info "Installing node packages using npm install"
log_info "Installing node packages using npm ci"

if ! npm install; then
log_error "Installing node modules using npm install failed, exiting..."
if ! npm ci; then
log_error "Installing node modules using npm ci failed, exiting..."
exit 1
fi

2 changes: 1 addition & 1 deletion docker/ca-certificates/README.md
@@ -20,4 +20,4 @@ There are some common situations where it is necessary to provide one or more ex
2. You are building the containers in a corporate environment that has its own certificate authority and the containers need access to resources or repository mirrors on the corporate network.

If these situations do not apply to you, or if you are unsure if they apply to you, then it is recommended that you try to build the containers without adding anything to this folder first.
If the build process fails due to an HTTPS or SSL error, then that is a a telltale sign that you need to add extra CA certificates to this folder.
If the build process fails due to an HTTPS or SSL error, then that is a telltale sign that you need to add extra CA certificates to this folder.