
Fix use of TAO solution vector #28923

Open
wants to merge 11 commits into base: next

Conversation

lynnmunday
Contributor

closes #28922

We were modifying TAO's internal parameters vector, which we should not have been doing because it changed how the TAO optimization algorithms behaved.

We now have two parameter vectors: _parameters, which TAO takes ownership of, and _local_parameters, which remains owned by OptimizeSolve. We update _local_parameters with the data from TAO and compute our objective and gradient from it.

This change causes some poorly designed tests to diff and time out. These tests are cleaned up in the subsequent commits.
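The ownership fix described above can be sketched generically. This is a hypothetical Python illustration, not MOOSE code (the real implementation is C++ against PETSc's Tao API, and the class and method names below are invented for the sketch): the optimizer owns its internal parameter vector, and the solve object copies that data into its own local vector before evaluating the objective and gradient, instead of writing into the optimizer's vector in place.

```python
# Minimal sketch of the two-vector pattern (hypothetical names, not MOOSE
# code): the optimizer owns `_parameters`; the solve object keeps a separate
# `_local_parameters` and only ever copies data OUT of the optimizer's
# vector, never mutating it.

class Optimizer:
    """Stands in for TAO: it owns its internal parameter vector."""

    def __init__(self, parameters):
        self._parameters = parameters  # owned by the optimizer

    def step(self, objective_and_gradient):
        # A trivial gradient-descent step on the optimizer's OWN vector.
        _, grad = objective_and_gradient(self._parameters)
        self._parameters = [p - 0.1 * g for p, g in zip(self._parameters, grad)]


class OptimizeSolve:
    """Stands in for the application-side solve object."""

    def __init__(self, n):
        self._local_parameters = [0.0] * n  # owned by OptimizeSolve

    def objective_and_gradient(self, tao_parameters):
        # Correct pattern: copy the optimizer's data into our local vector
        # and work on the copy; do NOT write back into `tao_parameters`.
        self._local_parameters = list(tao_parameters)
        x = self._local_parameters
        # Quadratic objective f(x) = sum((x_i - 1)^2), gradient 2*(x_i - 1).
        f = sum((xi - 1.0) ** 2 for xi in x)
        grad = [2.0 * (xi - 1.0) for xi in x]
        return f, grad


solve = OptimizeSolve(3)
opt = Optimizer([0.0, 0.0, 0.0])
for _ in range(50):
    opt.step(solve.objective_and_gradient)
print([round(p, 3) for p in opt._parameters])  # converges toward [1, 1, 1]
```

The point of the sketch is the separation of ownership: the callback treats the optimizer's vector as read-only input, so the optimizer's internal state (line-search history, simplex geometry, etc.) is never corrupted between iterations.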
… mead algorithm

This test was diffing because it relied on the incorrect use of Nelder-Mead to converge
…s set extremely loose to make this solve complete in a reasonable amount of time. This loose tolerance will result in diffs in the solution. Nelder-Mead is not meant to be used with so many parameters. This problem is still being tested with gradient-based methods
… PETSc errors after fixing the parameters vector. This test optimizes a quadratic equation that converges in a single iteration. With the fix to the parameters vector, the line search algorithms now work more robustly.
… the per-iteration Exodus output diff. The test and input files were also cleaned up
…nce with slightly different optimized values than before. closes idaholab#28922

moosebuild commented Oct 24, 2024

Job Documentation, step Docs: sync website on e62b0fc wanted to post the following:

View the site here

This comment will be updated on new commits.


moosebuild commented Oct 24, 2024

Job Coverage, step Generate coverage on e62b0fc wanted to post the following:

Framework coverage

         b3924e (Total)   e62b0f (Total)   +/-      New
Rate     85.05%           85.05%           +0.00%   -
Hits     106298           106299           +1       0
Misses   18691            18690            -1       0

Diff coverage report

Full coverage report

Modules coverage

Optimization

         b3924e (Total)   e62b0f (Total)   +/-      New
Rate     88.55%           88.54%           -0.01%   100.00%
Hits     1972             1971             -1       8
Misses   255              255              -        0

Diff coverage report

Full coverage report

Full coverage reports

Reports

This comment will be updated on new commits.

@lynnmunday

@lindsayad Will you review this? I documented the changes to the tests in the commit messages.

@lindsayad lindsayad self-assigned this Oct 24, 2024
@moosebuild

The following jobs on e62b0fc were invalidated by @lindsayad: Apptainer moose-openmpi, Conda (Rocky), Conda (Ubuntu), Conda MOOSE (Linux), Conda build (ARM Mac), Conda build (Intel Mac), Python 3.11, Python 3.9, and Modules parallel.

Development

Successfully merging this pull request may close these issues.

Fix how the optimization module receives parameters from TAO