idaholab / moose

Multiphysics Object Oriented Simulation Environment
https://www.mooseframework.org
GNU Lesser General Public License v2.1

Steady state detection tolerance criteria #26312

Open oanaoana opened 9 months ago

oanaoana commented 9 months ago

Bug Description

A steady state for a problem $\dfrac{du}{dt}=F(u)$ is defined by $F(u)=0$. If one time-marches to a solution, then the stopping criterion should be based on $|F(u)|$ (possibly in a relative norm if the values are small). The current criterion is based on the time step, as described in https://mooseframework.inl.gov/source/executioners/Transient.html. This assumes that a small change in the solution over a time step implies a small $|F(u)|$, which need not hold when the solution magnitude is very small or very large.
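
A minimal sketch of what the proposed check could look like, assuming access to the assembled nontime (steady-state) residual vector; the function and parameter names (`checkSteadyState`, `abs_tol`, `rel_tol`) are purely illustrative and not existing MOOSE API:

```cpp
#include <cmath>
#include <vector>

// Illustrative only: a steady-state test based on |F(u)| rather than on the
// change of the solution per time step. None of these names exist in MOOSE.
bool
checkSteadyState(const std::vector<double> & nontime_residual, // F(u) at the current solution
                 double initial_residual_norm,                 // |F(u_0)| at the start of the transient
                 double abs_tol,                               // absolute tolerance on |F(u)|
                 double rel_tol)                               // relative drop with respect to |F(u_0)|
{
  double norm_sq = 0.0;
  for (const double r : nontime_residual)
    norm_sq += r * r;
  const double norm = std::sqrt(norm_sq);

  // Absolute criterion: |F(u)| is small in its own right.
  if (norm < abs_tol)
    return true;

  // Relative criterion: |F(u)| has dropped sufficiently compared to its initial
  // value, useful when the natural scale of F is not known a priori.
  return norm < rel_tol * initial_residual_norm;
}
```

Whether the relative form is needed at all is part of the open question discussed below.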

Steps to Reproduce

Any steady state computation with a known solution.

Impact

Reduces the number of time steps for problems whose solution has a small magnitude; increases the number of time steps, but gives a more accurate steady state, in cases where the solution norm is large.

lindsayad commented 9 months ago

You could use the MP TA charge number for this work if you want. If you do use the MP TA charge number, let me know and I will file this under the MP TA GitHub project.

grmnptr commented 9 months ago

I think this is a great catch; it would add a nice feature to the steady-state detection, especially since we have access to the steady-state residual vector.

lindsayad commented 9 months ago

The residual vector, using the notation from the title, is equal to $\dfrac{du}{dt} - F(u)$, so you cannot get much information for determining a steady state just from inspecting the residual vector. You could look at the norm of the time or nontime vectors, though.
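
Restated with a formula (my paraphrase of the comment above; the sign convention on the nontime part may differ in the code):

$$
R(u) = \frac{du}{dt} - F(u), \qquad \lVert R(u) \rVert \lesssim \text{solver tolerance at every converged step},
$$

so a small $\lVert R(u) \rVert$ says nothing about whether $\lVert F(u) \rVert$ itself is small; that information has to come from the time part ($du/dt$) or the nontime part ($-F(u)$) separately.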

oanaoana commented 9 months ago

Yes, the nontime vector should be used.

lindsayad commented 9 months ago

The other criterion currently available, if you set `normalize_solution_diff_norm_by_dt = false`, is:

`(u - u_old).norm() / u.norm() < ss_tol`
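
Spelled out, and assuming I am reading the `normalize_solution_diff_norm_by_dt` option correctly (i.e. the default form divides the same relative norm by dt), the existing checks and the proposed residual-based one would be:

$$
\frac{\lVert u - u_{\text{old}} \rVert}{\lVert u \rVert \, \Delta t} < \texttt{ss\_tol} \;\;\text{(default)}, \qquad
\frac{\lVert u - u_{\text{old}} \rVert}{\lVert u \rVert} < \texttt{ss\_tol} \;\;\text{(not normalized by } \Delta t\text{)}, \qquad
\lVert F(u) \rVert < \texttt{ss\_tol} \;\;\text{(proposed)}.
$$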

lindsayad commented 9 months ago

Well, if your solver has done its job, then the nontime vector should be the negative of the time vector (within the solver tolerance).

oanaoana commented 9 months ago

I'm not sure what you mean by the time vector: the entire $\dfrac{du}{dt}-F(u)$? The only relation I see is that they should both be zero; the question is what kind of zero.

oanaoana commented 9 months ago

My main question is whether this should change the current tolerance criterion or add new parameter names, something like `steady_state_check` (absolute, relative, or just absolute; I haven't figured out yet how much the relative form is needed). I'd say introduce a new name first and switch over later, because it may affect old simulations, but I don't know the users.

grmnptr commented 9 months ago

> I'm not sure what you mean by the time vector: the entire $\dfrac{du}{dt}-F(u)$? The only relation I see is that they should both be zero; the question is what kind of zero.

I think what he is saying is that at the end of the Newton iteration the norm of F (the steady-state residual, or nontime vector in MOOSE) will be the same as the norm of the time derivative. We are, in a kinda-sorta way, using the norm of the time derivative to detect steady state instead of the norm of F. I haven't looked at the code, but we only use an implicit-Euler-like approximation, which would be inconsistent with time terms that are not discretized that way or that have material multipliers. I still like Oana's suggestion of testing either the time or nontime vector norms. I wonder if testing the time vector is cheaper.
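
In symbols, for a plain implicit Euler step (where $m$ is a placeholder for whatever multiplies the time derivative, e.g. a density or a heat capacity; $m = 1$ in the simple case discussed above):

$$
\left\lVert \frac{m\,(u - u_{\text{old}})}{\Delta t} - F(u) \right\rVert \lesssim \text{solver tolerance}
\;\;\Longrightarrow\;\;
\left\lVert \frac{m\,(u - u_{\text{old}})}{\Delta t} \right\rVert \approx \lVert F(u) \rVert,
$$

so the time and nontime vector norms agree up to the solver tolerance, while a check built only on $\lVert u - u_{\text{old}} \rVert / \Delta t$ tracks $\lVert F(u) \rVert$ only when $m = 1$ and the time discretization really is implicit Euler.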

GiudGiud commented 9 months ago

If you don't have a customer asking for this, I do not believe this is TA work.

I would rather see the Convergence system (which we have funding for) created than add more independent options.

GiudGiud commented 9 months ago

> I'd say introduce a new name first and switch over later, because it may affect old simulations, but I don't know the users.

If we do proceed with this, yes, that is the right approach.

grmnptr commented 9 months ago

Oh, I thought this already had the green light from the MP TA side. I remember a restart case from a year ago where the Griffin team restarted from a simulation with steady-state detection and it was not at steady state, due to switching the time-step size or the time discretization. This would likely solve all of those issues if the nontime vector is used.

GiudGiud commented 9 months ago

If considering the absolute norm instead of the relative norm fixes our restart issues, there's still something we don't understand, because varying the size of the final time step also linearly changes the restart residual (which we do understand and was definitely an issue too). I was under the impression that changing the automatic scaling factors was the main culprit after that.

In the end, if dt is very large, du/dt with an Euler scheme does get scaled down a lot. But since we are solving du/dt = F, the two norms are equal within solver tolerances, so using a different norm should not change the result much in that regard. Not dividing by dt (which is already an option) fits better what we need for the problem of restarting with different dts.

Note that neutronics does not use relaxation transients for reaching steady state, so it must have been a thermal solve on the Griffin team's side.

GiudGiud commented 9 months ago

Funding is not a taboo topic and there's no reason we can't discuss it here. We have to assign tickets to the right pot, though between two NEAMS projects it's much less of a worry than between direct and indirect funding. Sorry if it did not come across the right way.

oanaoana commented 9 months ago

> Funding is not a taboo topic and there's no reason we can't discuss it here. We have to assign tickets to the right pot, though between two NEAMS projects it's much less of a worry than between direct and indirect funding. Sorry if it did not come across the right way.

It came across as intended. Point taken.

lindsayad commented 8 months ago

> If you don't have a customer asking for this, I do not believe this is TA work.

It seemed like @grmnptr is interested in this. He's a customer. If it takes two hours and < 50 lines of code to add, then I think it's a fine investment while we do not have the convergence criteria system. If it is going to take much more work, then I would agree that it would be better to invest the time in the convergence criteria system.

grmnptr commented 8 months ago

During the FV meeting yesterday we discussed this topic, and it turns out neither Lise nor Mauricio uses the steady-state detection. If I remember correctly, the conclusion was that this will not be a priority for now unless there is more interest from the MP TA side. If I am not remembering it correctly, the people here can correct me!

oanaoana commented 8 months ago

Nobody is using it because they noticed it doesn't work, so it shouldn't be a criterion. I'll see when I do it and on what funding, if any; two hours can't be serious with all the overhead, even if it's a quick change.

lindsayad commented 8 months ago

I wouldn't say that no one uses it. I believe @joshuahansel uses it, and I've used it before.

lindsayad commented 8 months ago

I think we're all in agreement that implementing the new math is a good idea. The current thrust, though, seems to be to wait and add it in the Convergence system.