
Update baseline number for vit-base-patch16-224-in21k on G1 and G2 #1660

Open · wants to merge 1 commit into main

Conversation

deepak-gowda-narayana
Contributor

What does this PR do?

Resolves a test-case failure observed for the model vit-base-patch16-224-in21k with PyTorch 2.5.

This PR does the following:

  1. Updates the G1 and G2 baseline performance numbers for the model vit-base-patch16-224-in21k
  2. Adds an extra parameter, --sdp_on_bf16, to the tests

(The --sdp_on_bf16 usage follows the implementation in the related pull request #1555.)
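For context, a switch like --sdp_on_bf16 is typically wired into the example scripts as a boolean CLI flag and checked before the model runs. A minimal sketch follows; the flag name matches this PR, but the handler below is hypothetical and is not the actual optimum-habana implementation:

```python
import argparse


def build_parser():
    # Minimal parser sketch; only the flag relevant to this PR is shown.
    parser = argparse.ArgumentParser(description="Sketch of a test-args parser")
    # store_true flag: passing --sdp_on_bf16 on the command line enables it.
    parser.add_argument(
        "--sdp_on_bf16",
        action="store_true",
        help="Allow scaled dot-product attention to run in bf16 (hypothetical handler).",
    )
    return parser


def configure_sdp(args):
    # Hypothetical hook: a real test harness would enable the bf16 SDPA
    # path on the backend here, before the model is built.
    return "sdp-bf16-enabled" if args.sdp_on_bf16 else "sdp-default"


if __name__ == "__main__":
    args = build_parser().parse_args(["--sdp_on_bf16"])
    print(configure_sdp(args))  # sdp-bf16-enabled
```

Because it is a store_true flag, existing test invocations that omit it are unaffected, which keeps the change backward compatible.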

Test results:

tests/test_examples.py::ImageClassificationExampleTester::test_run_image_classification_vit-base-patch16-224-in21k_single_card
tests/test_examples.py::ImageClassificationExampleTester::test_run_image_classification_vit-base-patch16-224-in21k_multi_card

Gaudi 1 single and multi card
PASSED
1 passed, 33 deselected in 205.85s (0:03:25)
PASSED
1 passed, 33 deselected in 126.20s (0:02:06)

Gaudi 2 single and multi card
PASSED
1 passed, 69 deselected in 91.75s (0:01:31)
PASSED
1 passed, 69 deselected in 57.03s

make style
ruff check . setup.py --fix
All checks passed!
ruff format . setup.py
410 files left unchanged

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@deepak-gowda-narayana
Contributor Author

@libinta @yeonsily Requesting your review.
