This repository has been archived by the owner on Dec 20, 2024. It is now read-only.

chore: Merge pull request #94 from ecmwf/develop
Release 0.4.1
JPXKQX authored Dec 19, 2024
2 parents ba37563 + 804db6d commit 7bed8fa
Showing 17 changed files with 1,188 additions and 300 deletions.
9 changes: 4 additions & 5 deletions .pre-commit-config.yaml
@@ -27,7 +27,7 @@ repos:
- id: python-check-blanket-noqa # Check for # noqa: all
- id: python-no-log-warn # Check for log.warn
- repo: https://github.com/psf/black-pre-commit-mirror
-  rev: 24.8.0
+  rev: 24.10.0
hooks:
- id: black
args: [--line-length=120]
@@ -40,7 +40,7 @@ repos:
- --force-single-line-imports
- --profile black
- repo: https://github.com/astral-sh/ruff-pre-commit
-  rev: v0.6.9
+  rev: v0.8.1
hooks:
- id: ruff
args:
@@ -60,11 +60,11 @@ repos:
- id: rstfmt
exclude: 'cli/.*' # Because we use argparse
- repo: https://github.com/tox-dev/pyproject-fmt
-  rev: "2.2.4"
+  rev: "v2.5.0"
hooks:
- id: pyproject-fmt
- repo: https://github.com/jshwi/docsig # Check docstrings against function sig
-  rev: v0.64.0
+  rev: v0.65.0
hooks:
- id: docsig
args:
@@ -74,6 +74,5 @@
- --check-protected # Check protected methods
- --check-class # Check class docstrings
- --disable=E113 # Disable empty docstrings
- --summary # Print a summary
ci:
autoupdate_schedule: monthly
14 changes: 12 additions & 2 deletions CHANGELOG.md
@@ -8,12 +8,22 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
Please add your functional changes to the appropriate section in the PR.
Keep it human-readable, your future self will thank you!

-## [Unreleased](https://github.com/ecmwf/anemoi-models/compare/0.3.0...HEAD)
+## [Unreleased](https://github.com/ecmwf/anemoi-models/compare/0.4.0...HEAD)

- Add synchronisation workflow
### Added

- New AnemoiModelEncProcDecHierarchical class available in models [#37](https://github.com/ecmwf/anemoi-models/pull/37)
- Mask NaN values in training loss function [#56](https://github.com/ecmwf/anemoi-models/pull/56)
- Added dynamic NaN masking for the imputer class with two new classes: DynamicInputImputer, DynamicConstantImputer [#89](https://github.com/ecmwf/anemoi-models/pull/89)
- Reduced memory usage when using chunking in the mapper [#84](https://github.com/ecmwf/anemoi-models/pull/84)
- Added `supporting_arrays` argument, which contains arrays to store in checkpoints. [#97](https://github.com/ecmwf/anemoi-models/pull/97)
- Add remappers, e.g. link functions to apply during training to facilitate learning of variables with a difficult distribution [#88](https://github.com/ecmwf/anemoi-models/pull/88)

## [0.4.0](https://github.com/ecmwf/anemoi-models/compare/0.3.0...0.4.0) - Improvements to Model Design

### Added

- Add synchronisation workflow [#60](https://github.com/ecmwf/anemoi-models/pull/60)
- Add anemoi-transform link to documentation
- Codeowners file
- Pygrep precommit hooks
26 changes: 26 additions & 0 deletions docs/modules/models.rst
@@ -13,3 +13,29 @@ encoder, processor, and decoder.
:members:
:no-undoc-members:
:show-inheritance:

**********************************************
Encoder Hierarchical Processor Decoder Model
**********************************************

This model extends the standard encoder-processor-decoder architecture
by introducing a **hierarchical processor**.

Compared to the AnemoiModelEncProcDec model, this architecture requires
a predefined list of hidden nodes, `[hidden_1, ..., hidden_n]`. These
nodes must be sorted to match the expected flow of information `data ->
hidden_1 -> ... -> hidden_n -> ... -> hidden_1 -> data`.

A new argument is added to the configuration file:
`enable_hierarchical_level_processing`. This argument determines whether
a processor is added at each hierarchy level or only at the final level.

By default, the number of channels for the mappers is defined as `2^n *
config.num_channels`, where `n` is the hierarchy level, so the
processing capacity grows with the depth of the hierarchy.
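
The scaling rule above can be sketched as follows (a hedged illustration: `mapper_channels` and `num_channels` are illustrative names standing in for the configuration described in the text, not the actual API):

```python
def mapper_channels(num_channels: int, level: int) -> int:
    # Channels at hierarchy level `level` (0-based): 2^n * num_channels.
    return (2 ** level) * num_channels

# For config.num_channels = 64 and hidden nodes [hidden_1, hidden_2, hidden_3]:
print([mapper_channels(64, n) for n in range(3)])  # → [64, 128, 256]
```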

.. automodule:: anemoi.models.models.hierarchical
:members:
:no-undoc-members:
:show-inheritance:
1 change: 1 addition & 0 deletions pyproject.toml
@@ -35,6 +35,7 @@ classifiers = [
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
]
12 changes: 11 additions & 1 deletion src/anemoi/models/interface/__init__.py
@@ -37,6 +37,8 @@ class AnemoiModelInterface(torch.nn.Module):
Statistics for the data.
metadata : dict
Metadata for the model.
supporting_arrays : dict
Numpy arrays to store in the checkpoint.
data_indices : dict
Indices for the data.
pre_processors : Processors
@@ -48,7 +50,14 @@
"""

def __init__(
-        self, *, config: DotDict, graph_data: HeteroData, statistics: dict, data_indices: dict, metadata: dict
+        self,
+        *,
+        config: DotDict,
+        graph_data: HeteroData,
+        statistics: dict,
+        data_indices: dict,
+        metadata: dict,
+        supporting_arrays: dict = None,
) -> None:
super().__init__()
self.config = config
@@ -57,6 +66,7 @@ def __init__
self.graph_data = graph_data
self.statistics = statistics
self.metadata = metadata
self.supporting_arrays = supporting_arrays if supporting_arrays is not None else {}
self.data_indices = data_indices
self._build_model()

6 changes: 2 additions & 4 deletions src/anemoi/models/layers/block.py
@@ -512,18 +512,16 @@ def forward(
edge_attr_list, edge_index_list = sort_edges_1hop_chunks(
num_nodes=size, edge_attr=edges, edge_index=edge_index, num_chunks=num_chunks
)
+            out = torch.zeros((x[1].shape[0], self.num_heads, self.out_channels_conv), device=x[1].device)
             for i in range(num_chunks):
-                out1 = self.conv(
+                out += self.conv(
                     query=query,
                     key=key,
                     value=value,
                     edge_attr=edge_attr_list[i],
                     edge_index=edge_index_list[i],
                     size=size,
                 )
-                if i == 0:
-                    out = torch.zeros_like(out1, device=out1.device)
-                out = out + out1
         else:
             out = self.conv(query=query, key=key, value=value, edge_attr=edges, edge_index=edge_index, size=size)

1 change: 1 addition & 0 deletions src/anemoi/models/layers/processor.py
@@ -323,6 +323,7 @@ def forward(
*args,
**kwargs,
) -> Tensor:

shape_nodes = change_channels_in_shape(shard_shapes, self.num_channels)
edge_attr = self.trainable(self.edge_attr, batch_size)

5 changes: 5 additions & 0 deletions src/anemoi/models/models/__init__.py
@@ -6,3 +6,8 @@
# In applying this licence, ECMWF does not waive the privileges and immunities
# granted to it by virtue of its status as an intergovernmental organisation
# nor does it submit to any jurisdiction.

from .encoder_processor_decoder import AnemoiModelEncProcDec
from .hierarchical import AnemoiModelEncProcDecHierarchical

__all__ = ["AnemoiModelEncProcDec", "AnemoiModelEncProcDecHierarchical"]