Vahid-Galavi closed this issue 3 years ago.
Can you upload the matrix as a MatrixMarket file?
Have you tried the other solvers from the LinearSolversApplication?
also adding @armingeiser
I would also say that it is important to have a look at the matrix where this is happening. Only then can we figure out whether the matrix is actually singular and whether the error in the LinearSolversApplication (the error comes from a check inside the external library Eigen, not Kratos) is expected to appear.
Sorry for the late response. I was a little busy today. I will send you the matrices as soon as possible.
This is an example for which the linear solver returns "singular matrix" but the external solver solves it successfully. There are also some other cases that I am going to look for and share here.
I tested Eigen::SparseLU with your matrix (outside of Kratos) and it doesn't seem to have any problems with it. What is the "scaling" parameter doing?
I use "scaling" to scale matrices when I use a combination of elements with totally different stiffnesses. It should not change the behaviour. In this case, setting scaling to true or false does not change the behaviour, and I still get "singular matrix".
I also attached the matrix without scaling. For both cases, LinearSolversApplication returns singular matrix.
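For illustration only, the kind of diagonal scaling described here can be sketched with scipy. The helper name `scaled_solve` is hypothetical and this is not the Kratos implementation: it applies symmetric Jacobi scaling, solving (D^-1/2 A D^-1/2) y = D^-1/2 b and recovering x = D^-1/2 y.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def scaled_solve(A, b):
    """Sketch of symmetric Jacobi (diagonal) scaling before a sparse LU solve.
    Equilibrates a matrix that mixes very different stiffness magnitudes."""
    A = sp.csc_matrix(A)
    d = np.sqrt(np.abs(A.diagonal()))
    d[d == 0.0] = 1.0                      # leave zero-diagonal rows unscaled
    D_inv = sp.diags(1.0 / d)
    A_scaled = sp.csc_matrix(D_inv @ A @ D_inv)
    y = spla.splu(A_scaled).solve(D_inv @ b)
    return D_inv @ y                       # undo the scaling of the unknowns
```

For a diagonal matrix such as diag(1e8, 1) the scaled system is the identity, which illustrates why scaling should not change the solution of a solvable problem.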
By the way, I am getting this matrix by setting "echo_level" to 4. Maybe this matrix is not the last matrix before solving. Is it possible?
This is the message that I get (if it helps):
RuntimeError: Error: THE MATRIX IS STRUCTURALLY SINGULAR ... ZERO COLUMN AT 211
in D:/src/Kratos/applications/LinearSolversApplication/custom_solvers/eigen_sparse_lu_solver.h:53:EigenSparseLUSolver
scaling seems to create a wrapper around the actual linear solver, I was not aware this exists:
https://github.com/KratosMultiphysics/Kratos/blob/ebf673f520148a5e167c7bc4058c8f0b063c9a61/kratos/factories/standard_linear_solver_factory.h#L83
So it would be interesting to see if the same problem appears with scaling: false
BTW: numpy can also solve the matrix without problems
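A minimal way to reproduce this kind of cross-check outside Kratos, assuming scipy is available; the file name passed in is a placeholder for the exported MatrixMarket file:

```python
import numpy as np
import scipy.io
import scipy.sparse.linalg as spla

def check_matrix(mtx_path):
    """Load a MatrixMarket matrix and try a SuperLU-family factorization,
    the same algorithm family as Eigen::SparseLU."""
    A = scipy.io.mmread(mtx_path).tocsc()
    b = np.ones(A.shape[0])
    try:
        lu = spla.splu(A)                  # sparse LU factorization
        x = lu.solve(b)
        residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
        print(f"solved, relative residual = {residual:.2e}")
        return x
    except RuntimeError as err:            # scipy reports exact singularity this way
        print(f"factorization failed: {err}")
        return None
```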
@armingeiser I got the same error without scaling. See my comments above.
@Vahid-Galavi thanks, I had not seen this before submitting the comment.
By the way, I am getting this matrix by setting "echo_level" to 4. Maybe this matrix is not the last matrix before solving. Is it possible?
As far as I understand, this outputs the matrix after solving. So from my understanding it should be the scaled matrix if scaling: true
@armingeiser how can we print the matrix before sending to the linear solver?
This I don't know; @philbucher, is there such a possibility?
In the meantime I have tried to solve the matrix with Kratos (see script below); it works for me with both sparse_lu and pardiso_lu from the LinearSolversApplication.
(For the matrix without scaling the results are one order less accurate.)
@armingeiser thanks! But most probably the matrix that I sent is from one iteration before the error. I need to find how to print the matrix before sending it to the solver (the most recent matrix).
@armingeiser I tried to do it with pardiso_lu, but I get an error that this solver does not exist:
RuntimeError: Error: Trying to construct a Linear solver with solver_type: "pardiso_lu" which does not exist. The list of available options (for currently loaded applications) is: Kratos components amgcl amgcl_ns bicgstab cg deflated_cg monotonicity_preserving scaling skyline_lu_factorization sparse_cg sparse_lu sparse_qr tfqmr
Maybe you need to recompile Kratos with Intel MKL (see Readme of LinearSolversApp).
Keep in mind, that Pardiso performs a regularization. Therefore it might return a result even for a singular/bad conditioned matrix. However, this result may not be mechanically meaningful. Therefore it still makes sense to check if the final matrix is singular.
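A cheap post-solve sanity check of the kind suggested here can be sketched as follows; the helper name is hypothetical, and note that a small residual alone does not prove the matrix is well conditioned:

```python
import numpy as np

def solution_is_meaningful(A, x, b, rel_tol=1e-8):
    """A regularizing solver (e.g. Pardiso) may return some x even for a
    near-singular A; the relative residual exposes an x that does not
    actually solve A x = b."""
    residual = np.linalg.norm(A @ x - b)
    return residual <= rel_tol * max(np.linalg.norm(b), 1.0)
```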
I need to find how to print the matrix before sending to the solver (the most recent matrix).
I don't know a nice way, but I would probably just add a call to TSparseSpace::WriteMatrixMarketMatrix right before the SystemSolveWithPhysics call in the BuildAndSolve function of your builder and solver.
Hacky, but should work...
@armingeiser Thanks! I will check and share the singular matrix here.
Hi @armingeiser This is the matrix right before SystemSolveWithPhysics in BuildAndSolve.
Thanks for your help.
Now with this matrix:
Error: THE MATRIX IS STRUCTURALLY SINGULAR ... ZERO COLUMN AT 211
numpy.linalg.LinAlgError: Singular matrix
The condition number is 1.293e+20
I would close the issue because the current behavior of both solvers in the LinearSolversApp is reasonable.
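The singular-versus-ill-conditioned borderline discussed here can be illustrated with plain numpy; this is a sketch with a hypothetical helper, not Kratos code:

```python
import numpy as np

def diagnose(A):
    """Report the condition number and whether LAPACK still factorizes A."""
    cond = np.linalg.cond(A)   # SVD-based; inf for an exactly singular matrix
    try:
        np.linalg.solve(A, np.ones(A.shape[0]))
        status = "factorized"
    except np.linalg.LinAlgError:
        status = "singular"
    return cond, status
```

At condition numbers around 1e20 (as reported above for this matrix) the distinction between "ill-conditioned" and "singular" is below double-precision resolution, so different solvers can legitimately disagree.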
@oberbichler Can you check it with ExternalSolversApplication.super_lu? This is a direct solver and it solves the example. There are some other cases that super_lu solves successfully but LinearSolversApplication fails on.
you can print it out by raising echo_level of the strategy
@Vahid-Galavi Pardiso can also factorize this matrix. The "problem" is that (numerically) it is hard to distinguish between an ill-conditioned and a singular matrix. Actually, it is desirable to be able to solve badly conditioned matrices. However, this can also lead to incorrect results (see comment from @armingeiser). In these cases it would make sense to include a regularization (problem-dependent) or to check the condition of the matrix/results (expensive). If we built these checks into the solver, it would slow down all cases where the conditioning is good. In my opinion it makes more sense to implement such functions in the applications, outside the linear solver.
@RiccardoRossi I did. If I set echo_level to 4, then the matrix is written to a file, but this is the matrix that comes out of the solver. The first matrix that I posted here is that one, which surprisingly is NON-SINGULAR. If I set echo_level to 3, then a very big matrix is printed on screen.
I am facing this problem (singular matrix) more and more with LinearSolversApplication.sparse_lu, while the same example can be solved with ExternalSolversApplication.super_lu. @RiccardoRossi what are the differences between these two solvers? Are they both direct solvers?
@Vahid-Galavi Both are direct and both implement the same algorithm (small differences in regularization/zero check probably make the difference). The one from the ExternalSolversApp is just orders of magnitude slower.
@oberbichler Thanks for your response. I understand that the problems that I am solving might be ill-conditioned as we are combining elements with completely different stiffnesses (up to an order of 1e8). The thing is that I was quite happy with ExternalSolversApplication.super_lu and now this solver is going to be replaced by solvers from LinearSolversApplication, which is not (yet) suitable for my application.
@oberbichler Thanks for the confirmation.
@oberbichler is it possible to include super_lu in LinearSolversApplication?
I have tried with ExternalSolversApplication.super_lu: it does solve, but the results are completely wrong for the check in the python script above.
@Vahid-Galavi have you checked if, in the cases where sparse_lu gives an error but super_lu solves, the results of super_lu physically make sense?
@oberbichler These are the results when I use ExternalSolversApplication.super_lu. I simplified the example in order to post the stiffness matrix. Considering all the simplifications that I made, the results make sense to me.
And these are the results when I do not simplify the solution. They are quite OK.
I cannot tell you if the plots are usable or not.
It seems to be correct that SparseLU gives a warning for the singular matrix - SuperLU in scipy does the same. The results of the ExternalSolversApp also seem to be wrong. Of course, it is possible that these wrong results lead the Newton algorithm out of the singularity and that it subsequently converges to a useful solution. The RHS is still valid. However, the behavior is rather random.
ExternalSolversApplication.super_lu and now this solver is going to be replaced by solvers from LinearSolversApplication
You can always use ExternalSolversApplication.super_lu - it is 100% a user's choice which solver to use.
I would recommend compiling the LinearSolversApplication with MKL support and trying pardiso_lu. It does also solve your problem and is even much faster than both sparse_lu and super_lu.
But as @oberbichler says, it might be just luck that the results are reasonable in your case, given the ill-conditioned matrix.
There is nothing we can do about the fact that sparse_lu raises an error instead of returning questionable results in such a case, since it's from an external library (Eigen).
@armingeiser Thanks! I will try with Pardiso.
hi Vahid,
sparse_lu is the solver implementation by Eigen.
super_lu is the "superlu" solver.
I think that the difference is that superlu will do a sort of least-squares solution in the case the system is not solvable (for example if a rigid body motion is present), while sparse_lu apparently does nothing of the sort.
I am not sure of the details though... it may also be a minimum-norm solution if other options are not available. You will need to check the superlu documentation.
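For illustration, here is what such a least-squares / minimum-norm fallback looks like in numpy terms; this is an analogy to the behavior described above, not SuperLU's actual code path:

```python
import numpy as np

# A rank-deficient system, comparable to a model with a rigid-body mode:
# both equations say x0 + x1 = 2, so infinitely many solutions exist.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 2.0])

# lstsq does not raise for singular A; it returns the minimum-norm x
# among all least-squares solutions, plus the numerical rank of A.
x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
```

A plain LU factorization of the same A would fail on a zero pivot, which matches the difference in behavior between the two solvers.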
Thomas Oberbichler closed #9052 on Aug 20, 2021.
From the Eigen documentation:
This class [Eigen::SparseLU] implements the supernodal LU factorization for general matrices. It uses the main techniques from the sequential SuperLU package (http://crd-legacy.lbl.gov/~xiaoye/SuperLU/).
and
Unlike the initial SuperLU implementation, there is no step to equilibrate the matrix. [...] you can try the basic scaling method at "unsupported/Eigen/src/IterativeSolvers/Scaling.h"
but I assume that this scaling method is the same as the scaling within Kratos (to be sure you can check the source code of Eigen).
According to the test of @armingeiser the solution of SuperLU is completely wrong. No matter how SuperLU determines this solution: it makes no sense to implement this "feature". However, this does not mean that Newton's algorithm does not converge. It is just that at least one iteration is actually random. If you can live with that I would choose Pardiso or keep the SuperLU.
If all steps of the solution should be reproducible, I would include a regularization. But this should not be done within the linear solver. A simple variant would be a superposition with gradient descent. For this you could build a wrapper (similar to the scaling).
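As a rough illustration of such a wrapper built around the linear solver (analogous in spirit to the scaling wrapper): the sketch below uses Tikhonov damping as the regularization instead of the gradient-descent superposition mentioned above, and the damping value is an arbitrary placeholder, not a recommendation.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_with_fallback(A, b, eps=1e-8):
    """Try a plain sparse LU solve; if the factorization reports a
    singular matrix, retry on the Tikhonov-damped system (A + eps*I) x = b.
    The regularized result must then be checked for physical plausibility."""
    A = sp.csc_matrix(A)
    try:
        return spla.splu(A).solve(b)
    except RuntimeError:                   # "singular" reported by SuperLU
        damped = sp.csc_matrix(A + eps * sp.eye(A.shape[0]))
        return spla.splu(damped).solve(b)
```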
@oberbichler, @armingeiser and @RiccardoRossi Thanks a lot for checking the problem and suggestions!
Description: In some cases, the LinearSolversApplication reports a singular matrix while the same example can be solved by the ExternalSolversApplication.
For the linear solver, I am using:
and for the external solver, I am using: