
Pytest unit tests, new losses support, and normalization enhancement #22

Merged
50 commits merged into develop on Nov 27, 2024

Conversation

franckma31
Collaborator

The Torchlip library has undergone several updates to align more closely with the functionalities provided by deel-lip (TensorFlow). Here are the key changes:

  • Transition to pytest for unit tests, sharing a common test suite with deel-lip. The utils_framework.py file has been introduced to adapt names and parameters between the two frameworks.

  • Enhancements in the available loss functions:

    • Binary and multiclass KR and HKR losses: now support any target encoding (0, 1), (-1, 1), ... (any target > 0 is treated as the true class),
    • Multi-GPU support: binary and multiclass KR and HKR losses support multi-GPU mode with preprocessed targets,
    • Modified HKR loss: the $\alpha$ parameter (required in $[0, 1]$) now weights the combination $(1-\alpha)\,\mathrm{KR} + \alpha\,\mathrm{Hinge}$; for backward compatibility, a warning is raised when $\alpha > 1$ and the given value is replaced by $\frac{\alpha}{1+\alpha}$,
    • Tau cross-entropy losses: support added for binary and categorical cross-entropy losses with temperature scaling,
    • Reduction parameter: loss functions now accept the 'reduction' argument with options 'mean', 'none', or 'sum'.
  • Normalizers (spectral and Bjorck): updated with a stopping criterion based on an epsilon value (eps) rather than a fixed number of iterations.

  • Bjorck parenthesis trick added to reduce computational complexity.
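The modified HKR loss described above can be sketched as follows. This is a minimal illustration, not the torchlip implementation: the helper name `hkr_binary_loss` is hypothetical, and the KR term is written in its simplest empirical form (difference of score means over the two classes).

```python
import warnings

import torch
import torch.nn.functional as F


def hkr_binary_loss(input, target, alpha, min_margin=1.0):
    """Sketch of the (1 - alpha) * KR + alpha * Hinge combination.
    `input` holds raw scores; `target` may use any encoding, with
    target > 0 meaning the positive class. Hypothetical helper,
    not the torchlip API."""
    if alpha > 1.0:
        # Backward compatibility: remap the old "KR + alpha * Hinge"
        # parameterization into [0, 1] via alpha / (1 + alpha).
        new_alpha = alpha / (1.0 + alpha)
        warnings.warn(f"alpha > 1 is deprecated; using alpha={new_alpha:.3f}")
        alpha = new_alpha
    # Map any target encoding to {-1, +1}: target > 0 is the true class.
    sign = torch.where(target > 0, 1.0, -1.0)
    pos = sign > 0
    # KR term: E[f(x) | y=+1] - E[f(x) | y=-1], negated since we minimize.
    kr = -(input[pos].mean() - input[~pos].mean())
    # Hinge term with margin.
    hinge = F.relu(min_margin - sign * input).mean()
    return (1.0 - alpha) * kr + alpha * hinge
```

With `alpha=0` this reduces to the pure KR objective and with `alpha=1` to the pure hinge loss, which is what makes the `[0, 1]` range natural.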
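The tau cross-entropy item can be illustrated with a short sketch. The helper name and the exact rescaling convention are assumptions (check the torchlip source for the actual definition); the idea is that the logits are multiplied by a temperature `tau` before the softmax and the loss is rescaled by `1/tau`.

```python
import torch
import torch.nn.functional as F


def tau_categorical_crossentropy(logits, target, tau):
    """Sketch of a temperature-scaled categorical cross-entropy.
    Hypothetical helper, not the exact torchlip API."""
    # Sharpen (tau > 1) or soften (tau < 1) the softmax, then rescale.
    return F.cross_entropy(tau * logits, target) / tau
```

At `tau = 1` this is the ordinary cross-entropy; the temperature controls how hard the softmax is on the margin between classes.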
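The last two bullets (eps-based stopping and the parenthesis trick) can be sketched together. This is a simplified illustration under the usual assumption that the input has already been spectrally normalized; the helper name is hypothetical. The parenthesis trick is the observation that for a thin `rows x cols` matrix (`cols <= rows`), evaluating the Bjorck update as `w @ (c1 * I - c2 * (w.T @ w))` only ever forms `cols x cols` products, instead of the larger products implied by the unparenthesized `c1 * w - c2 * w @ w.T @ w`.

```python
import torch


def bjorck_normalize(w, eps=1e-6, beta=0.5, max_iters=100):
    """Sketch of Bjorck orthonormalization with an eps-based stopping
    criterion instead of a fixed iteration count. Assumes the spectral
    norm of `w` is already <= 1. Hypothetical helper, not the torchlip API."""
    transposed = w.shape[0] < w.shape[1]
    if transposed:
        w = w.t()  # work with the thin orientation
    eye = torch.eye(w.shape[1], dtype=w.dtype)
    for _ in range(max_iters):
        # Parenthesized update: w_{k+1} = w_k ((1 + beta) I - beta w_k^T w_k)
        w_next = w @ ((1.0 + beta) * eye - beta * (w.t() @ w))
        # Stop once the iterate moves by less than eps.
        if torch.norm(w_next - w) < eps:
            w = w_next
            break
        w = w_next
    return w.t() if transposed else w
```

Stopping on `eps` decouples accuracy from iteration count: well-conditioned matrices exit early, while harder ones keep iterating up to `max_iters`.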

Collaborator

@cofri cofri left a comment


So many valuable changes! Nice and big work

@franckma31 franckma31 force-pushed the feat/update_normalizers branch from 7592b90 to 9658d27 on November 7, 2024 16:41
@franckma31 franckma31 force-pushed the feat/update_normalizers branch from 47dcf7c to 9c2d4a1 on November 13, 2024 09:23
@cofri cofri merged commit 4360edf into develop Nov 27, 2024
12 checks passed