@yuecheng-yu and I want to implement the PETSc interface that @CZHU20 developed for svFSI in svFSIplus because we need a direct linear solver.
@ktbolt, do you already have anything that we should build on top of? @CZHU20, can we use your C code in svFSIplus?
Sure. No problem.
@mrp089 I have not spent any time thinking about this.
But off the top of my head, what we need is an interface to linear solvers, implemented something like this:
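A minimal sketch of what such an interface could look like (all names here, e.g. LinearSolverInterface, initialize, assemble, solve, create_linear_solver, are illustrative assumptions rather than existing svFSIplus code):

#include <memory>
#include <string>
#include <vector>

// Abstract interface that each backend (svFSILS, Trilinos, PETSc) would implement.
class LinearSolverInterface {
  public:
    virtual ~LinearSolverInterface() = default;

    // Allocate backend data structures for a system with the given number of unknowns.
    virtual void initialize(int num_unknowns) = 0;

    // Add an element's local matrix and right-hand side into the global system.
    virtual void assemble(const std::vector<int>& dof_ids,
                          const std::vector<double>& local_matrix,
                          const std::vector<double>& local_rhs) = 0;

    // Solve the assembled system and return the solution.
    virtual void solve(std::vector<double>& solution) = 0;
};

// A factory would create the backend selected in the solver input file.
std::unique_ptr<LinearSolverInterface> create_linear_solver(const std::string& name);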
We will first need to understand what data structures are needed to interface to Trilinos and PETSc, then prototype the interface.
@mrp089 @yuecheng-yu I've seen no discussions about the design of the interface.
Possible designs, implementation strategies, prototypes, etc. should be discussed before any code is implemented.
As stated above, #143 is a literal translation of the PETSc interface we have for svFSI (like the rest of svFSIplus). Once you implement the linear_solver interface you describe here, I'm happy to migrate PETSc. In the meantime, we (and others) would like to use PETSc for its wealth of linear solvers.
@ktbolt, please feel free to participate in our discussions in the lab at any time.
@yuecheng-yu I would like to implement the interface I outlined above. I can meet with you at any time if you would like to discuss possible designs and implementation.
@mrp089 I don't think anything more is going to be done with this, so go ahead and do a PR for the PETSc interface; it would be good to have this in when we release.
No activity here so closing.
@mrp089 I will create a linear solver interface to the svFSILS, Trilinos and PETSc linear solvers.
Shall I use your petsc_interface_30 branch for PETSc?
Awesome!! Yes, that's still the most up-to-date version. There is also an example in there that uses a direct solver.
I have built PETSc libs from source on a Mac.
After figuring out how to enable building with PETSc and resolving some problems with setting PETSC_DIR, I was able to compile petsc_interface_30.
I was then able to run tests/cases/fsi/pipe_3d, results seemed ok.
Looking at the petsc_interface_30 branch I see that petsc_linear_solver.c is essentially the code written for the Fortran svFSI solver and does just a linear solve, with no assembly interface like Trilinos.
My plan is to
1) learn how to use PETSc
2) implement an interface for svFSILS, Trilinos and PETSc
3) rewrite the PETSc interface code in petsc_linear_solver.c
4) add the new interface to the svFSIplus code, removing all of the #ifdefs and such
5) add XML commands to select a solver and read solver configuration files
6) add CMake commands to select building with PETSc (like Trilinos)
I've incorporated the PETSc interface into a branch of the latest svFSIplus code, compiled and tested, works.
I've created a LinearAlgebra base class that provides an interface to numerical linear algebra packages, just PETSc for now though. The PetscLinearAlgebra class derived from LinearAlgebra provides an interface to PETSc and contains a private PetscImpl class hiding all of the PETSc data structures and functions.
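A simplified sketch of that layout (only the names LinearAlgebra, PetscLinearAlgebra and PetscImpl come from the description above; the methods and members shown are assumptions):

#include <memory>

class ComMod;  // svFSIplus global data, forward-declared for this sketch

// Base class: the common interface to the numerical linear algebra packages.
class LinearAlgebra {
  public:
    virtual ~LinearAlgebra() = default;
    virtual void initialize(ComMod& com_mod) = 0;
    // assemble() and solve() would take the arguments shown in the usage example below.
};

// PETSc backend: all PETSc headers, Mat/Vec/KSP handles, etc. stay behind the
// private PetscImpl class so no PETSc types leak into the public interface.
class PetscLinearAlgebra : public LinearAlgebra {
  public:
    PetscLinearAlgebra();
    ~PetscLinearAlgebra();              // defined where PetscImpl is complete
    void initialize(ComMod& com_mod) override;

  private:
    class PetscImpl;                    // hides the PETSc data structures and functions
    std::unique_ptr<PetscImpl> impl;
};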
A PETSc interface is created like so
auto linear_algebra = LinearAlgebraFactory::create_interface(LinearAlgebraType::petsc);
linear_algebra->initialize(simulation->com_mod);
...
linear_algebra->solve(com_mod, eq, incL, res);
All of the PETSc-dependent code has been moved to petsc_impl.h,cpp, which is then conditionally included into PetscLinearAlgebra. I also removed class plsType from ComMod.h.
The PETSc interface petsc_linear_solver.c is a C code interface to Fortran: it declared extern functions with underscores, passed scalars as pointers, etc.; no attempt was made to make it a C++ code integrated with svFSIplus (e.g. using the enum classes defined in consts.h). I've cleaned up all of this so the code now looks like it is actually part of svFSIplus.
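For illustration only (the function name is hypothetical), the difference is roughly between a Fortran-callable declaration and a normal C++ one:

#include <vector>

// Old Fortran-interface style: trailing underscore, scalars passed as pointers.
extern "C" void petsc_solve_(int* dof, double* residual, double* tolerance);

// Cleaned-up C++ style integrated with svFSIplus conventions.
void petsc_solve(int dof, std::vector<double>& residual, double tolerance);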
I've added fsils and trilinos classes derived from LinearAlgebra to provide interfaces to the FSILS and Trilinos linear algebra packages; I tested both on fsi/pipe_3d and the results look good. All Trilinos solve code has been moved into trilinos_impl.h,cpp, and tslType has been removed from ComMod.
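Putting the three backends behind the factory shown in the earlier usage example might look roughly like this sketch (the fsils/trilinos enum values and class names are assumed; only LinearAlgebraFactory, LinearAlgebraType::petsc and PetscLinearAlgebra appear above):

#include <memory>
#include <stdexcept>

enum class LinearAlgebraType { fsils, petsc, trilinos };

// Stub classes standing in for the real derived classes in this sketch.
class LinearAlgebra { public: virtual ~LinearAlgebra() = default; };
class FsilsLinearAlgebra : public LinearAlgebra {};
class PetscLinearAlgebra : public LinearAlgebra {};
class TrilinosLinearAlgebra : public LinearAlgebra {};

class LinearAlgebraFactory {
  public:
    // Create the interface object for the requested linear algebra package.
    static std::unique_ptr<LinearAlgebra> create_interface(LinearAlgebraType type)
    {
      switch (type) {
        case LinearAlgebraType::fsils:    return std::make_unique<FsilsLinearAlgebra>();
        case LinearAlgebraType::petsc:    return std::make_unique<PetscLinearAlgebra>();
        case LinearAlgebraType::trilinos: return std::make_unique<TrilinosLinearAlgebra>();
      }
      throw std::runtime_error("Unknown linear algebra type.");
    }
};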
I have not yet implemented using Trilinos for assembly and preconditioning; it seems it can be used with the FSILS solve, but I need to understand how these work together.
I needed to reorganize the code a bit to get things to work in parallel; fsils and trilinos work, and there are still some things to do for petsc.
You can now replace
// Assembly
#ifdef WITH_TRILINOS
  if (eq.assmTLS) {
    if (cPhys == Equation_ustruct) {
      throw std::runtime_error("[construct_fsi] Cannot assemble USTRUCT using Trilinos");
    }
    trilinos_doassem_(const_cast<int&>(eNoN), ptr.data(), lK.data(), lR.data());
  } else {
#endif
    if (cPhys == Equation_ustruct) {
      throw std::runtime_error("[construct_fsi] USTRUCT_DOASSEM not implemented");
    } else {
      lhsa_ns::do_assem(com_mod, eNoN, ptr, lK, lR);
    }
#ifdef WITH_TRILINOS
  }
#endif
with
// Assembly
eq.linear_algebra->assemble(com_mod, eNoN, ptr, lK, lR);
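Roughly, the old block becomes the body of the Trilinos backend's assemble() method. A sketch under that assumption (the class, the method signature and the hard-wired flags are illustrative; only trilinos_doassem_ and lhsa_ns::do_assem come from the block above):

#include <stdexcept>
#include <vector>

// Stand-in declarations for the real svFSIplus/Trilinos symbols used above;
// their actual signatures live elsewhere in the code base.
struct ComMod;
void trilinos_doassem_(int& eNoN, const int* ptr, const double* lK, const double* lR);
namespace lhsa_ns {
  void do_assem(ComMod& com_mod, int eNoN, const std::vector<int>& ptr,
                const std::vector<double>& lK, const std::vector<double>& lR);
}

class TrilinosLinearAlgebra {
  public:
    void assemble(ComMod& com_mod, int eNoN, const std::vector<int>& ptr,
                  const std::vector<double>& lK, const std::vector<double>& lR)
    {
      // In the real code these would come from the current equation (eq.assmTLS, cPhys);
      // they are hard-wired here to keep the sketch short.
      const bool assemble_with_trilinos = true;
      const bool is_ustruct = false;

      if (assemble_with_trilinos) {
        if (is_ustruct) {
          throw std::runtime_error("Cannot assemble USTRUCT using Trilinos");
        }
        int n = eNoN;  // the Trilinos routine takes a non-const reference
        trilinos_doassem_(n, ptr.data(), lK.data(), lR.data());
      } else {
        // Fall back to the FSILS assembly used when Trilinos assembly is not enabled.
        lhsa_ns::do_assem(com_mod, eNoN, ptr, lK, lR);
      }
    }
};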
It just hit me that the petsc_interface_30 implementation hard-codes using PETSc for all equations. However, Trilinos or FSILS can be specified for each equation. It would be good to have PETSc work the same way.
I've modified the PETSc interface to support using PETSc on a per-equation basis.
I've also added error checking for all of the restrictions for preconditioning and assembly.
@MatteoSalvador @mrp089 I am going to remove support for the Preconditioner parameter under the LS section. For example,
<LS type="CG" >
<Preconditioner> FSILS </Preconditioner>
<Tolerance> 1e-4 </Tolerance>
</LS>
will no longer be valid. A Linear_algebra section is required to set the Preconditioner:
<Linear_algebra type="fsils" >
<Preconditioner> fsils </Preconditioner>
</Linear_algebra>
I would also like to make the Linear_algebra section required; currently, if it is not there, the linear algebra defaults to fsils.
I've added error checking and changed the LinearAlgebra classes a bit to reproduce what all the ifdefs were doing. I will now replace the remaining #ifdef WITH_TRILINOS statements and test.
I've replaced the remaining #ifdef WITH_TRILINOS statements and have tested on various svFSI-Test problems; everything seems to work. I also added a check for the ustruct equation, which requires fsils assembly. I've also added support for a couple of PETSc preconditioners (jacobi and rcs).
I've made all of the changes I think we need for now; the code for each of the linear algebra interfaces is mostly encapsulated in classes. There is still a bit of assembly code spread around for the ustruct equation but that is confined to a couple of files. The PreconditionerType enum and other related data structures (e.g. preconditioner_name_to_type) defined in consts.h are not very pleasing; I would like to have separate enums for each linear algebra interface but I will leave that for another day.
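A possible shape for that later cleanup, sketched with assumed names (none of these enums exist in consts.h; only jacobi, rcs and fsils are mentioned above):

// Sketch of per-interface preconditioner enums; all names are assumptions.
enum class FsilsPreconditioner    { fsils /* plus other FSILS-specific options */ };
enum class TrilinosPreconditioner { diagonal /* plus other Trilinos-specific options */ };
enum class PetscPreconditioner    { jacobi, rcs };

// Each LinearAlgebra subclass would then accept only its own enum instead of the
// single shared PreconditionerType defined in consts.h.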
@MatteoSalvador @mrp089 I was thinking of updating all of the svFSIplus/tests/cases files within my current branch so they will work with the code changes. Shall I do that, or should I create another Issue and branch for the file changes?
Thanks @ktbolt! I think you can update the tests within the same PR/issue, as this is strongly connected to the linear algebra modifications.
@ktbolt, yes, please update the tests so they work with your code changes (otherwise we can't merge it).
How will we test the PETSc and Trilinos interfaces on GitHub runners? I see two options:
@mrp089
1) Why would the pipelines run for hours?
2) Building Docker containers would be a good idea. We could then use those containers on an HPC cluster.
We do need to monitor memory and execution time, and make sure those don't change significantly with code modifications.
In my experience, building PETSc or Trilinos took a long time. But if you have a minimal configuration that builds quickly everything that we need for svFSI and nothing more, that would be great!
In this case, maybe add the build instructions as separate bash configure scripts that are then called in .github/workflows/test.yml. Then people can execute the same PETSc/Trilinos build instructions independently of the GitHub runners.
GitHub's limit is six hours. We can set our own time limit with timeout-minutes.
I didn't think PETSc took so long to build but maybe (probably) I was not building it in all of its glory.
I built Trilinos according to the svFSI instructions. The primary problem I had there was getting mumps installed.
Merged into main.
svFSIplus currently supports an interface to the Trilinos linear solver. The interface for matrix assembly (the #ifdef WITH_TRILINOS block shown earlier in this thread) is included in the assembly code for all equations (e.g., construct_fluid() for the fluid equation), about 20 files. A better design would be to replace all of these #ifdef blocks with a call to a linear solver interface, where linear_solver is an object created for a specific linear solver (e.g., Trilinos, FSILS, PETSc).