Closed by jcotela 5 years ago
:+1: for adding this for the next release!
Hi guys, I'm realizing that this is a very ambitious change to finish in a week (the time until the planned fork date), so I'm dropping it from the next release. I'd postpone it to the eventual release 5.3. :(
may I add #1090 to the list? :)
and maybe what I did in StructuralMechanics: having the possibility to define variables in the ProjectParameters that are to be added to the ModelPart, see #1113 (see auxiliary_variables_list)
Hi @philbucher I'll look into it. However, I'd prefer a solution in the line of #869, at least for utilities or processes that will always need the same variables.
hi @jcotela it is just a suggestion, it might not be relevant for the Fluids.
Just as an example of when I need it: some elements in the StructuralMechanicsApp can compute results for variables that are not added by the solver by default. Those I can now add through the ProjectParameters to request results in the output process.
@philbucher I agree that this is needed, also for the fluids. And your solution is the only one that works as the code is now. Still, I think it is the job of the output process to read those variables and take care that they have been added, as this would make the code more modular.
Now I understand, the output process is treated in the same way! Somehow I hadn't seen this before.
But yes, it makes absolute sense (I will revert my changes then once the adding is working)
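The pattern discussed above (an output process reading the extra variables it needs from the settings and making sure they are registered on the model part) could be sketched as follows. This is a minimal illustrative sketch: the class and key names (`MockModelPart`, `auxiliary_variables_list` handling) are stand-ins, not the actual Kratos API.

```python
# Hedged sketch: a process reads the variables it needs from its settings
# and registers them on the model part before the solve starts.
# MockModelPart is a stand-in for a real Kratos ModelPart.

class MockModelPart:
    """Tracks which nodal solution-step variables have been added."""
    def __init__(self):
        self.nodal_variables = set()

    def AddNodalSolutionStepVariable(self, variable_name):
        self.nodal_variables.add(variable_name)


def add_auxiliary_variables(model_part, settings):
    """Add every variable listed under 'auxiliary_variables_list' in the
    (JSON-like) settings dict to the model part."""
    for variable_name in settings.get("auxiliary_variables_list", []):
        model_part.AddNodalSolutionStepVariable(variable_name)


# Usage: a settings block as it might appear in ProjectParameters.json
settings = {"auxiliary_variables_list": ["REACTION", "NODAL_AREA"]}
model_part = MockModelPart()
add_auxiliary_variables(model_part, settings)
print(sorted(model_part.nodal_variables))  # ['NODAL_AREA', 'REACTION']
```

Keeping this logic inside the process (rather than the solver) matches the modularity argument above: whoever needs the variable is responsible for ensuring it exists.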
Just for future reference, this is the proposed list of solvers:

- FluidSolver. This is being implemented in #2262.
- NavierStokesSolverVMSMonolithic. It already supports other stabilizations, so the name should be changed. Ideally, it should also support different time schemes (BDF2/Bossak/"internal" element-based time integration).
- NavierStokesCompressibleSolver.
- NavierStokesEmbeddedMonolithicSolver, NavierStokesEmbeddedMonolithicAusasSolver and NavierStokesEmbeddedMonolithicFMALESolver. Ideally, we'd merge the different formulations with a trick similar to what we do to choose the stabilized formulation in the body-fitted monolithic solver, but we'll see how it can best be done.
- NavierStokesSolverFractionalStep.
- NavierStokesSolverVMSMonolithic (currently TrilinosNavierStokesSolverMonolithic).
- TrilinosNavierStokesEmbeddedMonolithicSolver and TrilinosNavierStokesEmbeddedMonolithicAusasSolver.
- adjoint_vmsmonolithic_solver, ported to the new framework.

All current solvers not on the list will be removed in the final stage of the clean-up. Most of them are outdated or duplicates, so no features should be lost.
Note to self/to-do: default settings for different solvers should have "wrong" values when they must be set from outside (domain_size is an obvious example, which is currently initialized to 2 in most solvers).
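The idea of deliberately invalid defaults can be sketched as follows: mandatory settings default to a value that validation rejects, so a forgotten `domain_size` fails fast instead of silently running a 2D case. This is a hypothetical sketch, not the actual Kratos settings-validation API.

```python
# Hedged sketch: give mandatory settings an obviously invalid default and
# fail fast during validation, instead of silently falling back to e.g.
# domain_size = 2. Names are illustrative, not the real Kratos API.

DEFAULTS = {
    "domain_size": -1,   # invalid on purpose: must be set by the user
    "echo_level": 0,
}

def validate_settings(user_settings):
    """Merge user settings over the defaults and reject missing mandatory values."""
    settings = {**DEFAULTS, **user_settings}
    if settings["domain_size"] not in (2, 3):
        raise ValueError(
            "'domain_size' must be set explicitly to 2 or 3 "
            f"(got {settings['domain_size']})")
    return settings

print(validate_settings({"domain_size": 3})["domain_size"])  # 3
```

With this scheme, a missing value surfaces immediately at setup time rather than as a subtly wrong simulation.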
I think it would be nice to have this for the release
Just to add that it might be worth changing the default MPI solver from Trilinos-MultiLevel to AMGCL (FYI @ddemidov), since @AndreasWinterstein found in his recent scaling studies that it is much faster.
Also, the same solver would then be used in serial and MPI.
+1 on my side (and kudos to @ddemidov)
kudos @ddemidov!
adding #4390 to the list
adding #4736 to the list (once #4736 is merged)
is this really closed? I think there are some things left to do
@philbucher kind of... @rubenzorrilla and I agreed not to rename the existing solvers for now. I believe that the only solver on that list that is really missing is the Trilinos compressible solver, but there is no movement on that front at the moment.
I meant #4390 and #4121 are not implemented yet
@philbucher I added them to the FluidDynamicsApplication project, I think it will be easier to keep track of them in this way.
We need to do a bit of clean-up of the FluidDynamicsApplication python solvers: right now we have a lot of them, with abundant duplication. I think this would also be a good opportunity to settle on a common solver class and a uniform way of dealing with processes or, at the very least, to have a set of "clean and nice" solvers that work from GiD.
Tasks:
[x] Identify the minimal set of needed python solvers for the different combinations: monolithic/embedded/fractional step, OpenMP/MPI...
[x] Ensure that all "clean" solvers derive from the base fluid_solver class, modifying it if needed.
[x] Ensure that this minimal set of solvers work properly from the GiD interface.
[x] Remove as many of the "old" solvers as possible.
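The "common solver class" task above could follow a template-method layout: a base `FluidSolver` owning the generic solution loop, with thin derived solvers overriding only what differs. A minimal sketch, with illustrative names and return strings that are not the real Kratos classes:

```python
# Hedged sketch of a common base solver class plus thin derived solvers.
# Method names mimic the Kratos convention but the classes are stand-ins.

class FluidSolver:
    """Base class: owns the generic per-step driver logic."""

    def SolveSolutionStep(self):
        self.InitializeSolutionStep()
        result = self._SolveSystem()
        self.FinalizeSolutionStep()
        return result

    def InitializeSolutionStep(self):
        pass  # shared pre-step work would go here

    def FinalizeSolutionStep(self):
        pass  # shared post-step work would go here

    def _SolveSystem(self):
        raise NotImplementedError("Derived solvers must implement this.")


class MonolithicSolver(FluidSolver):
    def _SolveSystem(self):
        return "monolithic step solved"


class FractionalStepSolver(FluidSolver):
    def _SolveSystem(self):
        return "fractional-step sweep solved"


print(MonolithicSolver().SolveSolutionStep())  # monolithic step solved
```

Deriving every "clean" solver from one base class like this is what keeps the driver scripts and the GiD interface uniform across monolithic, embedded and fractional-step variants.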