jltuhnu opened this issue 8 months ago
> but when I move it to the supercomputer

Just to be clear: you are using the same version of libmesh in both places, and by "move it" you mean that you re-compiled both your application and libmesh itself using the native compilers of each system?

Which version (or git hash) of libmesh are you using exactly?
> you mean that you re-compiled both your application and libmesh itself using the native compilers of each system?

Yes, I use the same version of libmesh, and I re-compiled both my application and libmesh in each environment. There is a significant version gap in MPI and GCC between the two systems.
> Which version (or git hash) of libmesh are you using exactly?

The git HEAD of libmesh:
commit cd2192759881961864dce40ecbc66af2b7c9c1fd (HEAD -> devel, origin/master, origin/devel, origin/HEAD)
Merge: 7729913f8 3905097b5
Author: John W. Peterson <jwpeterson@gmail.com>
Date: Thu Feb 29 17:59:17 2024 -0600
I want to emphasize that when I use equation_systems.get_system("twophase").solve(), everything works fine. But I need much more information from the solver, so I wrote my own inbsolve() function.
OK, well, I see some relevant-looking commented-out code in nonlinear_implicit_system.C:
// FIXME - this is necessary for petsc_auto_fieldsplit
// nonlinear_solver->init_names(*this);
The commented lines were added in 7a24259a92 by @roystgnr, but there's no hint as to why they are commented out. This doesn't necessarily explain why you would see different behavior from different versions of PETSc, though. I don't know much about how the petsc_auto_fieldsplit code works, and I'm not even sure whether we have tests or examples of using it with nonlinear solvers.
@jwpeterson It seems that this issue is not caused by petsc_auto_fieldsplit. Even after removing petsc_auto_fieldsplit(pc, system);, the error persists for non-fieldsplit preconditioners. I am wondering if there are any missing elements related to the DM type 'libmesh' in my code that could be causing this issue.
@jltuhnu Are you saying that you are not using a fieldsplit preconditioner (even via PETSc command-line args) and you are still getting that same error message (DM type 'libmesh' did not attach the DM to the matrix)? If so, I'm not sure how that's possible, since I don't think anything other than the fieldsplit preconditioner uses the DM...
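For context on where this message comes from: in some PETSc versions, DMCreateMatrix() verifies that the DM implementation attached itself to the matrix it just created, and raises exactly this "did not attach the DM to the matrix" error when it did not. A custom DM's CreateMatrix hook is expected to do roughly the following. This is a hedged sketch of the PETSc C API with illustrative names (DMCreateMatrix_MyShell is hypothetical), not libmesh's actual code:

```c
#include <petscdm.h>

/* Sketch of a custom DM's CreateMatrix implementation. */
static PetscErrorCode DMCreateMatrix_MyShell(DM dm, Mat *J)
{
  PetscFunctionBegin;
  PetscCall(MatCreate(PetscObjectComm((PetscObject)dm), J));
  PetscCall(MatSetSizes(*J, PETSC_DECIDE, PETSC_DECIDE, 10, 10));
  PetscCall(MatSetUp(*J));
  /* The step that matters here: attach the DM to the matrix.
     If this is omitted, PETSc versions that check MatGetDM() after
     DMCreateMatrix() fail with "DM type '...' did not attach the DM
     to the matrix" when a preconditioner later queries the DM. */
  PetscCall(MatSetDM(*J, dm));
  PetscFunctionReturn(0);
}
```

If the check (or when it is triggered) differs between PETSc releases, that could explain seeing the error on one version but not another.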
@jwpeterson
> are you saying that you are not using a fieldsplit preconditioner (even via PETSc command line args) and you are still getting that same error message (DM type 'libmesh' did not attach the DM to the matrix)?

Yes, I also get that error when I use the asm preconditioner. Although I resolved my issue by modifying the libmesh nonlinear solve() function, I am still curious about the cause of the error with different PETSc versions.
I want to use petsc_auto_fieldsplit(pc, system). The following code runs successfully on my Mac (libmesh + petsc-3.19.0), but when I move it to the supercomputer (libmesh + petsc-3.18.0), it throws the PETSc error.