Smaller simplifications of everest baserunmodel #9158
Conversation
Codecov Report

Attention: Patch coverage is incomplete.

Coverage Diff:

             main    #9158     +/-
==========================================
- Coverage   90.75%   90.75%   -0.01%
==========================================
  Files         351      351
  Lines       21901    21903       +2
==========================================
  Hits        19877    19877
- Misses       2024     2026       +2
Force-pushed from 6e27ae1 to 4f7f6e1
Force-pushed from 4f7f6e1 to d5f9a80
Looks good to me; I just have one comment/question.
optimizer = self._configure_optimizer(simulator)

# Before each batch evaluation we check if we should abort:
optimizer.add_observer(
So functionally the only difference is that now we add an observer when we create the optimizer (at the end) instead of outside that method?
Yep. The observer was always added after the optimizer was initialized, just not within that function.
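
The refactor under discussion can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not the actual everest/ropt API: the `Optimizer`, `add_observer`, and `notify` names, the `"START_EVALUATION"` event, and the `BaseRunModel` internals here are all hypothetical stand-ins. The point is that the abort-check observer is now registered at the end of `_configure_optimizer`, instead of by the caller after that method returns:

```python
class Optimizer:
    """Hypothetical minimal optimizer supporting observer callbacks."""

    def __init__(self) -> None:
        self._observers: list[tuple[str, callable]] = []

    def add_observer(self, event: str, callback) -> None:
        self._observers.append((event, callback))

    def notify(self, event: str) -> None:
        # Invoke every callback registered for this event.
        for registered_event, callback in self._observers:
            if registered_event == event:
                callback(event)


class BaseRunModel:
    """Hypothetical run model mirroring the structure in the diff."""

    def __init__(self) -> None:
        self.aborted = False

    def _check_abort(self, event: str) -> None:
        # Before each batch evaluation we check if we should abort:
        if self.aborted:
            raise RuntimeError("optimization aborted")

    def _configure_optimizer(self, simulator) -> Optimizer:
        optimizer = Optimizer()
        # ... other configuration would happen here ...
        # After the refactor, the observer is attached at the end of
        # configuration rather than by the caller:
        optimizer.add_observer("START_EVALUATION", self._check_abort)
        return optimizer

    def run(self, simulator) -> Optimizer:
        # The caller no longer needs to add the observer itself.
        optimizer = self._configure_optimizer(simulator)
        optimizer.notify("START_EVALUATION")
        return optimizer
```

Functionally nothing changes: the observer is still registered before any evaluation runs, it is just encapsulated inside the configuration method, which is the point confirmed in the reply above.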
No description provided.