
Manual loss weights adaptation in TF2.0 #1656

Open
wants to merge 18 commits into master

Conversation

haison19952013

What?

  • Supported and tested backend: tensorflow
  • Details:
    • Provide a way to change the loss weights dynamically via callbacks, without recompiling the model
    • Currently works for non-gradient-based adaptive loss weight schemes

Why?

  • Motivation: help DeepXDE users formulate their own non-gradient-based adaptive loss weight schemes

How?

  • In model.py:
    • Store loss_weights as an instance attribute used by the tensorflow-backend training functions, so the weights can be updated during training
  • In callbacks.py, give examples of how to define a callback that changes loss_weights (a minimal sketch follows this list)
    • Add ManualDynamicLossWeight: changes the loss weight at a specified index
    • Add PrintLossWeight: prints the loss weights at a specified period
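
The exact interface of the new callbacks lives in the PR's callbacks.py; the following is only a minimal sketch, assuming loss_weights is stored as a mutable container that the tensorflow train step reads on every iteration. The class name ScaleLossWeight and its arguments are illustrative, not the PR's API.

```python
import deepxde as dde


class ScaleLossWeight(dde.callbacks.Callback):
    """Hypothetical callback: multiply the loss weight at `index` by `factor`
    every `period` epochs (a sketch only; the PR's ManualDynamicLossWeight
    may use a different interface)."""

    def __init__(self, index, factor=2.0, period=1000):
        super().__init__()
        self.index = index
        self.factor = factor
        self.period = period

    def on_epoch_end(self):
        epoch = self.model.train_state.epoch
        if epoch > 0 and epoch % self.period == 0:
            # Assumes model.loss_weights supports item assignment and is read
            # by the backend's train step at each iteration.
            self.model.loss_weights[self.index] *= self.factor
```

Such a callback would then be passed to training in the usual way, e.g. model.train(iterations=20000, callbacks=[ScaleLossWeight(index=0)]).
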

Testing?

  • A working example is given in deepxde/examples/pinn_inverse/elliptic_inverse_field_manual_dynamic_loss_weights.py

Future work

  • Work on gradient-based adaptive loss weight schemes

@lululxvi
Owner

Format the code via black https://github.com/psf/black

@haison19952013
Author

Format the code via black https://github.com/psf/black

Updated

deepxde/callbacks.py: outdated review thread (resolved)
deepxde/model.py
@@ -119,7 +119,9 @@ def compile(
         print("Compiling model...")
         self.opt_name = optimizer
         loss_fn = losses_module.get(loss)
-        self.loss_weights = loss_weights
+        self.loss_weights = tf.convert_to_tensor(

@lululxvi
Owner

lululxvi commented Mar 3, 2024

  • What happens when loss_weights is None?
  • Using tf here will break other backends.
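
One way to address both review points, sketched here as an assumption rather than the PR's final code, is to keep the shared compile() backend-agnostic: pass None through unchanged and otherwise store a NumPy array, deferring any tf.convert_to_tensor call to the tensorflow-specific code path. The helper name below is hypothetical.

```python
import numpy as np


def normalize_loss_weights(loss_weights):
    """Hypothetical helper for the shared compile() path: returns None
    unchanged, otherwise a NumPy array that any backend can consume; the
    tensorflow backend can convert it to a tensor later if it needs to."""
    if loss_weights is None:
        return None
    return np.asarray(loss_weights, dtype=np.float32)
```

In compile() this would replace the direct tf.convert_to_tensor call with something like self.loss_weights = normalize_loss_weights(loss_weights).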

@pescap
Contributor

pescap commented Jun 5, 2024

@haison19952013, do you plan to keep working on this PR? If not, I will continue the work.
