Open · ValletRomain opened 2 years ago
There is no reason the solver shouldn't converge: if the PDE is the same as in the 40-material case where the solver works, the algebraic system is identical. There is probably a JSON error somewhere (otherwise there is a bug in cfpdes). As for performance, the time can indeed increase with the number of materials, but normally not by much, except if you use a material-property symbol without the matname.
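As an illustration, the difference is between a matname-qualified symbol and the generic one. This is only a sketch: the material and property names are made up, the exact `materials_*` symbol spelling should be checked against the cfpdes documentation, and the `//` comments may need to be stripped depending on the JSON parser.

```json
{
  "Materials": {
    "Helix1": { "sigma": "52.e6" },
    "Helix2": { "sigma": "52.e6" }
  },
  "PostProcess": {
    "cfpdes": {
      "Exports": {
        "expr": {
          // qualified symbol: the expression only involves Helix1
          "sigma_helix1": "materials_Helix1_sigma:materials_Helix1_sigma",
          // generic symbol: has to be expanded over every material,
          // which is where the cost can grow with ~300 materials
          "sigma_generic": "materials_sigma:materials_sigma"
        }
      }
    }
  }
}
```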
Hello @vincentchabannes,
I have put up new versions of the CFG and JSON files, with fewer variables in Materials and with the PostProcess section commented out.
The new results are:
- ~300 Materials: 4640.88 s for the solver and 5.921897e+03 s of execution time
- ~40 Materials: 88.3281 s for the solver and 1.209600e+02 s of execution time

My JSON and CFG files are here:
I am also giving you the mesh.
@feelpp/hifimagnet could you check this testcase again? @jermuzet?
The files don't seem to exist anymore, or I can't access them, but the elasto-thermo-magnetic model with the magnet M19061901 works well with v109 and v110. This model has 322 materials and converges in both sequential and parallel computations.
Hello @vincentchabannes, @romainhild, @prudhomm,
With the cfpdes toolbox, I am trying to solve coupled elasticity, heat, and magnetism equations on a magnet with 14 helices.
I have about 300 Materials and the mesh has 120000 nDof. I solve with a Newton solver and the LU method. My simulation does not converge and takes a lot of time: for 2 iterations on 1 core, it takes 1100 s -> 3 h!
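For reference, the Newton + LU setup looks roughly like this in the .cfg (only a sketch: the option names follow the usual Feel++/PETSc conventions and may not match my actual file exactly):

```
[cfpdes]
# non-linear solve with Newton iterations
solver=Newton
# direct LU factorization, no Krylov iterations on top of it
pc-type=lu
ksp-type=preonly
# print non-linear and linear residuals to follow convergence
snes-monitor=1
ksp-monitor=1
```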
I then tried the following: I reduced the number of materials from about 300 to about 40 without changing the problem, again solving with a Newton solver and the LU method. This simulation converges, gives good results (correct physical values), and takes about 200 s.
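For context, one way to get such a reduction is to list several mesh markers under a single Materials entry, so regions that share the same properties are covered by one material. This is only a sketch with made-up names and values, not the actual content of my files:

```json
{
  "Materials": {
    "Conductor": {
      "markers": ["Helix1", "Helix2", "Helix3"],
      "sigma": "52.e6",
      "k": "380"
    },
    "Insulator": {
      "markers": ["Isolant1", "Isolant2"],
      "sigma": "0.",
      "k": "1.2"
    }
  }
}
```

The PDE stays the same because the same properties end up on the same regions; only the number of Materials entries the toolbox has to iterate over goes down.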
I think the number of Materials changes the performance of the solver. Is that right?
My JSON and CFG files are here: