
Add multi-system to FEProblemSolve and deploy on Physics syntax #29021

Open · wants to merge 22 commits into next from PR_physics_multisys
Conversation

Contributor

@GiudGiud GiudGiud commented Nov 7, 2024

closes #29019

Includes:

  • support for setting system names in Physics, which can then be coded to assign their variables to the right systems. There is no direct matching of variables to systems from the input. I don't think all options are needed for the main class.
  • a reworked FEProblemSolve that includes the fixed-point iteration (or once-through loop), using a Convergence (CV) object for each system
  • fixes for residual debugging for multi-system
  • support for using the dynamic pressure for solving the NS equations. This is a small but consequential change, and I need it for the same project as this.

Next items we need (TODO, but not in this PR?):

  • Nonlinear convergence objects for each NL system that don't fight over the EquationSystems parameter (currently an issue).
    This requires this libMesh PR: Add a parameter object on the system, use it in derived classes libMesh/libmesh#4007
  • A decent convergence object for the multi-system fixed point. Right now, if you tried to use the nonlinear residual (as for MultiApp iterations), the solve would always be considered converged, because you just solved that nonlinear system but did not recompute its residual after solving the other nonlinear systems.
  • Relatedly, an execute_on for multi-system fixed-point convergence checks (needed to update any postprocessor during that convergence check) or for post-system1-solve (a la post-Executor style)

For these last two tasks, I think the solutions will be fairly close to what gets used for MultiAppFixedPointConvergence when that gets figured out.
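To illustrate the multi-system fixed-point loop and the residual pitfall described above, here is a toy standalone C++ sketch (this is not MOOSE code; ToySystem, its update rule, and all names are hypothetical). The key point is that convergence is assessed only after every system has been re-solved in the current sweep:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Toy stand-in for a solver system: each "solve" relaxes the system's value
// toward a target that depends on the other systems through a shared state.
struct ToySystem
{
  double coupling; // strength of coupling to the other systems' state
  double value;
  void solve(double coupled_state) { value = 0.5 * (value + coupling * coupled_state + 1.0); }
};

// Fixed-point iteration over all systems; returns the iteration count at
// which the sweep-to-sweep change dropped below tol (or max_its if it never did).
int fixedPointSolve(std::vector<ToySystem> & systems, double tol, int max_its)
{
  for (int it = 1; it <= max_its; ++it)
  {
    // Coupled state lags by one sweep, as in a Picard/fixed-point scheme.
    double state = 0.0;
    for (const auto & s : systems)
      state += s.value;

    double max_change = 0.0;
    for (auto & s : systems)
    {
      const double old = s.value;
      s.solve(state);
      max_change = std::max(max_change, std::fabs(s.value - old));
    }
    // Convergence is checked AFTER all systems were re-solved this sweep,
    // avoiding the "always converged" pitfall of reusing the residual of
    // the system that was just solved.
    if (max_change < tol)
      return it;
  }
  return max_its;
}
```

With two weakly coupled systems this converges geometrically; the returned iteration count is the number of full sweeps, not per-system solves.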

@GiudGiud GiudGiud self-assigned this Nov 7, 2024
@GiudGiud GiudGiud force-pushed the PR_physics_multisys branch 6 times, most recently from bda2b80 to 665b214 Compare November 12, 2024 17:05
Contributor

@joshuahansel joshuahansel left a comment

It's looking good!

Review threads (resolved) on:
  • framework/src/executioners/FEProblemSolve.C
  • framework/include/executioners/FEProblemSolve.h
  • framework/include/physics/PhysicsBase.h
  • framework/src/problems/FEProblemBase.C
  • framework/src/actions/AddDefaultConvergenceAction.C
  • framework/src/executioners/MultiSystemSolveObject.C
  • framework/src/physics/PhysicsBase.C (two threads)
@GiudGiud GiudGiud force-pushed the PR_physics_multisys branch 4 times, most recently from abbcc10 to b2069b1 Compare November 12, 2024 22:56
@moosebuild
Contributor

moosebuild commented Nov 13, 2024

Job Documentation, step Docs: sync website on 8f0ff16 wanted to post the following:

View the site here

This comment will be updated on new commits.

@idaholab idaholab deleted a comment from moosebuild Nov 13, 2024
@moosebuild
Contributor

Job Coverage, step Generate coverage on 4c83d80 wanted to post the following:

Framework coverage

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     85.13%          85.10%            -0.03%   74.00%
Hits     107285          107393            +108     185
Misses   18743           18800             +57      65

Diff coverage report

Full coverage report

Modules coverage

Contact

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     90.41%          90.38%            -0.03%   66.67%
Hits     4957            4959              +2       4
Misses   526             528               +2       2

Diff coverage report

Full coverage report

Heat transfer

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     88.81%          88.81%            +0.00%   100.00%
Hits     4404            4406              +2       2
Misses   555             555               -        0

Diff coverage report

Full coverage report

Navier stokes

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     84.64%          84.74%            +0.10%   96.71%
Hits     17081           17282             +201     353
Misses   3100            3112              +12      12

Diff coverage report

Full coverage report

Optimization

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     88.54%          88.55%            +0.01%   100.00%
Hits     1971            1973              +2       2
Misses   255             255               -        0

Diff coverage report

Full coverage report

Scalar transport

         61ed70 (base)   4c83d8 (#29021)   +/-      New
Rate     88.86%          88.89%            +0.03%   100.00%
Hits     367             368               +1       1
Misses   46              46                -        0

Diff coverage report

Full coverage report

Full coverage reports

Reports

Warnings

  • framework new line coverage rate 74.00% is less than the suggested 90.0%
  • contact new line coverage rate 66.67% is less than the suggested 90.0%

This comment will be updated on new commits.

@GiudGiud
Contributor Author

Got to add documentation and fix the odd straggler tests, but otherwise this is ready to be looked at.

- remove extra space on Solve  converged
- check for the existence of a single default NL conv
- make the minimalApp test more resilient to ordering of object creation
…ariable name

Re-order tasks to make adding variables before checking integrity in physics (checks variables & systems)
Add missing required tasks
…oners to function

They rely on solving only one system, then performing operations in their OTHER solveObject (not fixed point solve, not inner_solve)
then solving the second one
// We seek to prevent the MultiSystemSolveObject from solving both systems
// This is abusing input parameters, but SolveObjects do not have their own syntax
// and we need to send this parameter from the executioner to the default nested SolveObject
params.addPrivateParam<bool>("solve_only_first_system", true);
Contributor Author

@GiudGiud GiudGiud Nov 18, 2024

It could also be cleaner to set the system_names here and not make the parameter private. I have not made up my mind!
Either way, sending parameters like this is messy in my opinion.

Member

gonna be substantial merge conflicts for me 😢

/// @param index index of the nonlinear system
/// @return parameter for that nonlinear system
template <typename T>
T getParamFromNonlinearSystemVectorParam(std::string param_name, unsigned int index) const;
Member

Suggested change
T getParamFromNonlinearSystemVectorParam(std::string param_name, unsigned int index) const;
T getParamFromNonlinearSystemVectorParam(const std::string & param_name, unsigned int index) const;
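As an aside, a helper like this can be illustrated with a standalone sketch (this is not the MOOSE implementation; the map standing in for the parameter store, the broadcast convention, and all names are hypothetical guesses): it indexes into a vector-valued parameter per system, with a bounds check so a bad index fails loudly.

```cpp
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical sketch of a per-system vector-parameter accessor.
// A plain map stands in for the real parameter store.
template <typename T>
T getFromVectorParam(const std::map<std::string, std::vector<T>> & params,
                     const std::string & param_name, // const-ref, per the review suggestion
                     unsigned int index)
{
  const auto it = params.find(param_name);
  if (it == params.end())
    throw std::runtime_error("Unknown parameter: " + param_name);
  const auto & vec = it->second;
  // One plausible convention (a guess): a single supplied value is
  // broadcast to every system.
  if (vec.size() == 1)
    return vec[0];
  if (index >= vec.size())
    throw std::runtime_error("Index out of range for parameter: " + param_name);
  return vec[index];
}
```

Taking the parameter name by `const std::string &`, as the suggestion above proposes, avoids a string copy on every lookup.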

Comment on lines +47 to +52
_systems.push_back(&_problem.getSolverSystem(_problem.nlSysNum(sys_name)));
_num_nl_systems++;
}
else if (std::find(linear_sys_names.begin(), linear_sys_names.end(), sys_name) !=
linear_sys_names.end())
_systems.push_back(&_problem.getSolverSystem(_problem.nlSysNum(sys_name)));
Member

why not store these separately and avoid the dynamic cast in FEProblemSolve?

Contributor Author

Is that preferred? Is the dynamic cast a worse problem than having to double up any loop acting on the list of systems?

expect_out = "FEProblem\nDefaultNonlinearConvergence\nGeneratedMesh"
expect_out = "FEProblem\n*GeneratedMesh"
Member

You could just move the convergence object ahead of the problem

Contributor Author

I saw someone on the discussions board get a test failure there. I think this test is brittle.

Member

Yes it's because @joshuahansel must have done a git sed to change the name and then didn't change the order here

Comment on lines +67 to +68
/// Whether to only solve the first nonlinear system, which is the legacy behavior
const bool _solve_only_first_system;
Member

I don't like this data member. If this can be true, then this class is not truly a MultiSystemSolveObject. I think either this data member needs to go away or FEProblemSolve should not inherit from MultiSystemSolveObject

Contributor Author

Ok, that's fair. I did not need this until I tried to get the optimization module tests to pass. It uses two solve objects, one for each system.

I have a different solution you might like more.

Comment on lines +65 to +66
/// Convergence object to assess the convergence of the multi-system fixed point iteration
Convergence * _multi_sys_fp_convergence;
Member

If we had a class dedicated to multisystem solves then this could be a reference. Trying to retro-fit FEProblemSolve as a multi-system solve really is feeling like a poor fit.

On a different note, I do not know whether there are segregated solve options that do not fit into a fixed-point class of algorithms. If there are none, there may be no need for the _fp_ notation.

Contributor Author

The problem is that there is no dedicated solve object syntax.

So what you are asking means duplicating the executioners instead, adding a MultiSystemSteady and a MultiSystemTransient. This PR makes it work without doing that.

Member

This also takes it down the path of objects like IterationAdaptiveDT where we just keep bolting on pieces instead of having a modular system. What I am asking is that we have syntax for solve objects, e.g. executors

Member

Maybe @joshuahansel wants to work with @loganharbour to finish executors? He knocked down convergence objects

Contributor

Feels more like they knocked me down 😁 I'd be happy to try to help with executors, but Sockeye deadlines demand my attention for at least two months.

Contributor Author

@GiudGiud GiudGiud Nov 18, 2024

I see executors as removing the executioner/solve object separation. This separation is not that natural and is applied inconsistently.

This also takes it down the path of objects like IterationAdaptiveDT where we just keep bolting on pieces instead of having a modular system.

No, I don't think that's a great analogy, because I don't see how you would implement multi-system without the lines I added. I am replacing a single-system solve object with a multi-system solve object, not adding multi-system as an option to a single-system one.

Member

@lindsayad lindsayad Nov 18, 2024

I think in a world of executors and not this current solve object paradigm, it would be quite easy to implement multisystem. You just have a MultiSystem executor which logically would have some kind of convergence object that works naturally with multisystem. Additionally the MultiSystem or MultiXXX executor would have a parameter that takes a vector of other executors to execute. Some of these may be SingleSystem executors, for whom such a vector parameter would not make sense and for whom a multi-system or multi-executor convergence object also doesn't make sense. A MultiXXX object is fundamentally different than a SingleXXX object because the MultiXXX object immediately introduces the possibility of iteration. I think there should be a clear distinction between objects with MultiXXX functionality and SingleXXX functionality because the parameter set is logically much more complex for the former.
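To make the composition concrete, here is a minimal standalone sketch of the design described in this comment (none of these classes exist in MOOSE as written; the names and the required_sweeps stand-in for a real Convergence object are hypothetical): the MultiXXX executor owns both the vector of child executors and the iteration logic, while a SingleSystem executor has neither.

```cpp
#include <memory>
#include <vector>

// Base interface: one executable unit of the solve.
struct Executor
{
  virtual ~Executor() = default;
  virtual bool execute() = 0; // returns "converged"
};

// A single-system executor: one solve, no iteration, no child executors.
struct SingleSystemExecutor : Executor
{
  int solves = 0;
  bool execute() override
  {
    ++solves; // stand-in for an actual system solve
    return true;
  }
};

// A multi-system executor: composes children and iterates over them.
struct MultiSystemExecutor : Executor
{
  std::vector<std::shared_ptr<Executor>> children;
  int max_iterations = 10;
  int required_sweeps = 3; // stand-in for a real convergence object

  bool execute() override
  {
    for (int it = 0; it < max_iterations; ++it)
    {
      for (auto & child : children)
        child->execute();
      // A real implementation would consult a multi-system Convergence
      // object here, after ALL children were re-solved.
      if (it + 1 >= required_sweeps)
        return true;
    }
    return false; // hit the iteration cap without converging
  }
};
```

Only the MultiSystemExecutor carries the iteration parameters, which matches the point that the parameter set is logically much more complex for MultiXXX objects.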

Contributor Author

I think in a world of executors and not this current solve object paradigm, it would be quite easy to implement multisystem. You just have a MultiSystem executor which logically would have some kind of convergence object that works naturally with multisystem

Agree

Additionally the MultiSystem or MultiXXX executor would have a parameter that takes a vector of other executors to execute. Some of these may be SingleSystem executors, for whom such a vector parameter would not make sense and for whom a multi-system or multi-executor convergence object also doesn't make sense

Agree

A MultiXXX object is fundamentally different than a SingleXXX object because the MultiXXX object immediately introduces the possibility of iteration. I think there should be a clear distinction between objects with MultiXXX functionality and SingleXXX functionality because the parameter set is logically much more complex for the former.

Agree for executors. And for solveObjects I am moving FEProblemSolve from a singleXXX to a MultiXXX, and the parameters are adapted to be vectors.

Contributor

@lindsayad Seems like a good design at first glance - are you requesting that executors be finished before this PR? Or just discussing future refactoring?

Comment on lines +399 to +402
while (_multi_sys_fp_convergence
? (_multi_sys_fp_convergence->checkConvergence(num_fp_multisys_iters) ==
Convergence::MooseConvergenceStatus::ITERATING)
: !converged)
Member

If you don't have the convergence object and have multisystem could you iterate forever? I think that if you have multisystem, the convergence object should be a required parameter

Contributor Author

See the constructor, it is required

Member

@lindsayad lindsayad left a comment

I think this PR should be split into two. Can we change the title to highlight some of the more fundamental additions/changes in this PR? The things I'm concerned about are separate from the Physics classes.

I also do not like that we have hard-coded a certain type of fixed-point iteration (for which, conceptually, we would hope to have something like a FixedPointSolveObject selectable from the input file) into an FEProblemSolve object.

@grmnptr
Contributor

grmnptr commented Nov 18, 2024

The same issues will come up with this I think:
#28821

Maybe it has an excuse, in that there is currently no executioner in the framework that can run a linear system on its own.

@GiudGiud GiudGiud changed the title Add multi-system to Physics syntax Add multi-system to FEProblemSolve and deploy on Physics syntax Nov 18, 2024
Successfully merging this pull request may close these issues.

Add support for multi-system to physics classes
5 participants