feat(pt): support loss plugin for external package #4248

Merged (6 commits) on Oct 26, 2024
25 changes: 24 additions & 1 deletion deepmd/pt/loss/loss.py
```diff
@@ -9,9 +9,12 @@
 from deepmd.utils.data import (
     DataRequirementItem,
 )
+from deepmd.utils.plugin import (
+    make_plugin_registry,
+)
 
 
-class TaskLoss(torch.nn.Module, ABC):
+class TaskLoss(torch.nn.Module, ABC, make_plugin_registry("loss")):
     def __init__(self, **kwargs):
         """Construct loss."""
         super().__init__()
@@ -38,3 +41,23 @@
         whether the property is found
         """
         return loss if bool(find_property) else torch.nan
+
+    @classmethod
+    def get_loss(cls, loss_params: dict) -> "TaskLoss":
+        """Get the loss module by the parameters.
+
+        By default, all the parameters are directly passed to the constructor.
+        If not, override this method.
+
+        Parameters
+        ----------
+        loss_params : dict
+            The loss parameters
+
+        Returns
+        -------
+        TaskLoss
+            The loss module
+        """
+        loss = cls(**loss_params)
+        return loss
```
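The `make_plugin_registry("loss")` mixin added above is what lets an external package register its own loss class under a string key and have it found later by that key. A minimal self-contained sketch of the pattern (an illustration only, not the actual `deepmd.utils.plugin` implementation):

```python
# Minimal sketch of the plugin-registry pattern (illustration only,
# not the actual deepmd.utils.plugin implementation).
def make_plugin_registry(name: str) -> type:
    """Return a base class that keeps a registry of named subclasses."""

    class PluginRegistry:
        _registry: dict = {}

        @classmethod
        def register(cls, key: str):
            """Decorator: register a subclass under `key`."""
            def decorator(subclass):
                cls._registry[key] = subclass
                return subclass
            return decorator

        @classmethod
        def get_class_by_type(cls, key: str) -> type:
            """Look up a previously registered subclass by its key."""
            return cls._registry[key]

    return PluginRegistry


# An external package can then self-register its own loss:
class TaskLoss(make_plugin_registry("loss")):
    pass


@TaskLoss.register("my_loss")
class MyLoss(TaskLoss):
    pass


assert TaskLoss.get_class_by_type("my_loss") is MyLoss
```

Because registration happens at class-definition time, simply importing the external package is enough to make its loss visible to the lookup.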
4 changes: 3 additions & 1 deletion deepmd/pt/train/training.py
```diff
@@ -28,6 +28,7 @@
     EnergySpinLoss,
     EnergyStdLoss,
     PropertyLoss,
+    TaskLoss,
     TensorLoss,
 )
 from deepmd.pt.model.model import (
@@ -1260,7 +1261,8 @@
         loss_params["task_dim"] = task_dim
         return PropertyLoss(**loss_params)
     else:
-        raise NotImplementedError
+        loss_params["starter_learning_rate"] = start_lr
+        return TaskLoss.get_class_by_type(loss_type).get_loss(loss_params)
```
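The `else` branch thus replaces the hard `NotImplementedError` with a registry lookup: any `loss_type` that is not built in is resolved to a registered plugin class and constructed through its `get_loss` classmethod. A self-contained sketch of that dispatch (stand-in classes and hypothetical names, not the real deepmd API):

```python
# Stand-in TaskLoss with an inline registry (hypothetical names;
# real code subclasses deepmd.pt.loss.loss.TaskLoss instead).
class TaskLoss:
    _registry = {}

    @classmethod
    def register(cls, key):
        def decorator(subclass):
            cls._registry[key] = subclass
            return subclass
        return decorator

    @classmethod
    def get_class_by_type(cls, key):
        try:
            return cls._registry[key]
        except KeyError:
            raise NotImplementedError(f"Unknown loss type: {key}") from None

    @classmethod
    def get_loss(cls, loss_params: dict) -> "TaskLoss":
        # Default behaviour from the PR: pass all parameters to the constructor.
        return cls(**loss_params)


@TaskLoss.register("external_loss")
class ExternalLoss(TaskLoss):
    def __init__(self, starter_learning_rate):
        self.starter_learning_rate = starter_learning_rate


# Mirrors the new fallback branch in training.py:
loss_params = {"starter_learning_rate": 0.01}
loss = TaskLoss.get_class_by_type("external_loss").get_loss(loss_params)
```

Unknown types still fail loudly, but now with the offending key in the message rather than a bare `NotImplementedError`.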


