FreeFem / FreeFem-sources

FreeFEM source code
https://freefem.org/

Access TSConvergedReason with TSSolve plugin #305

Closed: cmd8 closed this issue 4 months ago

cmd8 commented 4 months ago

This commit doesn't produce any errors, but it also doesn't finish the job... Something is still missing. Any ideas? My goal is to get TSConvergedReason with

int ret;
TSSolve(..., reason = ret, ...);

prj- commented 4 months ago

What do you mean by "missing"? Is it not returning what you want? Could you share a MWE?

cmd8 commented 4 months ago

Here is an MWE based on the heat-TS-2d-PETSc.edp example. Following a successful run, we should have ret > 0.

//  run with MPI:  ff-mpirun -np 4 script.edp
// NBPROC 4

load "PETSc"                        // PETSc plugin
include "macro_ddm.idp"             // additional DDM functions

macro def(i)i// EOM                         // scalar field definition
macro init(i)i// EOM                        // scalar field initialization
macro grad(u)[dx(u), dy(u)]// EOM           // two-dimensional gradient
func Pk = P1;                               // finite element space

macro ThRefinementFactor()getARGV("-split", 1)//
int[int] l = [1, 2, 2, 2];
mesh Th = square(getARGV("-global", 40), getARGV("-global", 40), label = l);
fespace Wh(Th, Pk);           // local finite element space
Mat T;
MatCreate(Th, T, Pk);

real[int] rhs(Wh.ndof);                     // local right-hand side
matrix<real> Loc;                           // local operator
{                                           // local weak form
    fespace Ph(Th, P0);
    Ph kappa = x < 0.25 ? 10.0 : 1.0;
    varf vPb(u, v) = int2d(Th)(-1.0 * kappa * grad(u)' * grad(v)) + on(1, u = 0.0);
    Loc = vPb(Wh, Wh, tgv = -2);
    rhs = vPb(0, Wh, tgv = -2);
}

func real[int] funcRes(real t, real[int]& in, real[int]& inT) {
    real[int] out(in.n);
    T = Loc;
    MatMult(T, in, out);
    out = inT - out;
    return out;
}
real shift = 0;
matrix Id;
{
    real[int] D(Loc.n);
    D = 1.0;
    Id = D;
}
func int funcJ(real t, real[int]& in, real[int]& inT, real a) {
    matrix B = (-1.0) * Loc + a * Id;
    T = B;
    shift = a;
    return 0;
}
Wh w;
func int funcM(int s, real t, real[int]& u) {
    ChangeNumbering(T, w[], u, exchange = true, inverse = true);
    plotMPI(Th, w, Pk, def, real, cmm = "Global solution at step " + s + " (time " + t + ")");
    return 0;
}
w = (0.5 - x)^2 + (0.5 - y)^2 < 0.2 ? 1.0 : 0.0;
real[int] wPETSc;
ChangeNumbering(T, w[], wPETSc);
int ret;
TSSolve(T, funcJ, funcRes, wPETSc, reason = ret, sparams = "-ts_type beuler -ts_dt 0.1 -ts_max_time 100 -ts_exact_final_time interpolate -ts_max_snes_failures -1 -ts_view -pc_type lu -ts_adapt_type basic -ts_rtol 1e-3", monitor = funcM);
if(mpirank==0) cout << "ret = " << ret << endl;

prj- commented 4 months ago

I'm not sure I understand why you are saying it is not working. Here is what I get.

$ ff-mpirun -n 1 dump.edp -v 0 | grep "ret ="
ret = 1
$ ff-mpirun -n 1 dump.edp -v 0 -ts_max_steps 1  | grep "ret ="
ret = 2

This matches with https://petsc.org/release/manualpages/TS/TSConvergedReason/, doesn't it?
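
For reference, the two runs line up with the values on that page: TS_CONVERGED_TIME = 1 (final time reached) and TS_CONVERGED_ITS = 2 (maximum number of steps reached), with negative values indicating divergence. A minimal check one could append to the MWE (just a sketch following the PETSc convention, not part of the original example):

// sketch: interpret ret following PETSc's TSConvergedReason convention
// (> 0 converged, < 0 diverged, 0 still iterating)
if(mpirank == 0) {
    if(ret > 0)
        cout << "TS converged, reason = " << ret << endl;
    else if(ret < 0)
        cout << "TS diverged, reason = " << ret << endl;
    else
        cout << "TS is still iterating" << endl;
}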

cmd8 commented 4 months ago

Weird. This is not working for me, even after completely recompiling FreeFEM. Using the modification from the commit, I ran:

cd ${FF_DIR}
./reconfigure
make -j4

I tried this in addition to the more direct approach of:

cd ${FF_DIR}/plugin/mpi
../seq/ff-c++ -auto PETSc.cpp
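
In both cases, one can at least check that the build actually produced a fresh PETSc.so (the path below is from my layout, so treat it as an assumption):

ls -l ${FF_DIR}/plugin/mpi/PETSc.so   # timestamp should match the rebuild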

Either way, running the MWE I sent above, I get:

ff-mpirun -np 1 test.edp -v 0 | grep "ret ="
ret = 0

Am I doing something stupid here?

prj- commented 4 months ago

I'm guessing you are not using the proper PETSc.so. If you add a random cout, does it get printed?
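
A quick way to check for that (the ${FF_DIR} layout is an assumption, adjust to your install):

find ${FF_DIR} -name "PETSc.so"   # more than one hit means a stale copy may be shadowing your rebuild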

prj- commented 4 months ago

Any update @cmd8?

cmd8 commented 4 months ago

Sorry for the delay, I will look into this later this morning. But it sounds like the issue is clearly on my end, so the PR can be merged.

cmd8 commented 4 months ago

OK, the cout is getting printed. Furthermore, I ran the following MWE (from navier-stokes-2d-PETSc.edp) using SNESSolve, and I see that reason is being updated:

//  run with MPI:  ff-mpirun -np 4 script.edp
// NBPROC 4

load "PETSc"

mesh Th;
{
    real R = 5, L = 35;
    border a(t=0, 2*pi) {x=2.0+cos(t)/2; y=sin(t)/2; label=1;}
    border b(t=pi/2, 3*pi/2) {x=cos(t)*R; y=sin(t)*R; label=2;}
    border c(t=0, 1) {x=t^0.9*L; y=-R; label=3;}
    border d(t=1, 0) {x=t^0.9*L; y=R; label=3;}
    border e(t=-R, R) {x=L; y=t; label=0;}
    real ratio = 1.0;
    Th = buildmesh(a(-70*ratio) + b(30*ratio) + c(80*ratio) + d(80*ratio) + e(20*ratio));
    plot(Th);
}

macro dimension()2//
include "macro_ddm.idp"

macro def(i)[i, i#B, i#C]//
macro init(i)[i, i, i]//
func Pk = [P2, P2, P1];
macro grad(u)[dx(u), dy(u)]//
macro div(u)(dx(u#1) + dy(u#2))//
macro UgradV(u, v)[[u#1, u#2]' * [dx(v#1), dy(v#1)],
                   [u#1, u#2]' * [dx(v#2), dy(v#2)]]//
real Re = getARGV("-Re", 50.0);
real nu;
fespace Wh(Th, Pk); // complete space [u, v, p]
fespace Qh(Th, P1); // pressure space for Schur complement

Mat J;
int[int] n2o;       // need to know how to go from the local to the global mesh
mesh ThBackup = Th; // need to backup the original global mesh
macro ThN2O()n2o//
MatCreate(Th, J, Pk);
Wh [uc1, uc2, pc] = [1, 0, 0];
varf vRes([u1, u2, p], [v1, v2, q]) = int2d(Th)(
      nu * (grad(uc1)' * grad(v1) +
            grad(uc2)' * grad(v2))
    + UgradV(uc, uc)' * [v1, v2]
    - pc * div(v) - div(uc) * q)
    + on(3, u1 = uc1-1)
    + on(1, u1 = uc1-0, u2 = uc2-0)
    + on(2, u1 = uc1-1, u2 = uc2-0);
varf vJ([u1, u2, p], [v1, v2, q]) = int2d(Th)(
      (UgradV(uc, u) + UgradV(u, uc))' * [v1, v2] +
      nu * (grad(u1)' * grad(v1) +
            grad(u2)' * grad(v2))
    - p * div(v) - div(u) * q)
    + on(3, u1 = uc1-1)
    + on(1, u1 = uc1-0, u2 = uc2-0)
    + on(2, u1 = uc1-1, u2 = uc2-0);
set(J, sparams = "-pc_type lu");
func real[int] funcRes(real[int]& inPETSc) {
    ChangeNumbering(J, uc1[], inPETSc, inverse = true, exchange = true);
    real[int] out(Wh.ndof);
    out = vRes(0, Wh, tgv = -1);
    ChangeNumbering(J, out, inPETSc);
    return inPETSc;
}
func int funcJ(real[int]& inPETSc) {
    ChangeNumbering(J, uc1[], inPETSc, inverse = true, exchange = true);
    J = vJ(Wh, Wh, tgv = -1);
    return 0;
}
for(int i = 0; i < 50 && Re < 100.0; ++i) {
    real[int] xPETSc;
    ChangeNumbering(J, uc1[], xPETSc);
    nu = 1.0/Re;
    int ret;
    SNESSolve(J, funcJ, funcRes, xPETSc, reason = ret, sparams = "-snes_monitor ");
    if(mpirank==0) cout << "ret = " << ret << endl;
    ChangeNumbering(J, uc1[], xPETSc, inverse = true, exchange = true);
    if(!NoGraphicWindow) {
        Qh only = pc;
        macro def1(i)i//
        plotMPI(Th, only, P1, def1, real, cmm = "Pressure for Re = " + Re);
        fespace Zh(Th, [P2, P2]);
        Zh [onlyU, onlyV] = [uc1, uc2];
        macro def2(i)[i, i#B]//
        plotMPI(Th, [onlyU, onlyV], [P2, P2], def2, real, cmm = "Velocity for Re = " + Re);
    }
    Re *= 1.5;
    if(usedARGV("-adaptation") != -1) {
        fespace WhBackup(ThBackup, Pk);
        WhBackup def(uG), def(uReduce);
        uc1[] .*= J.D; // scale the solution by the partition of unity to avoid multiple summations on the overlap
        int[int] rest = restrict(Wh, WhBackup, n2o);
        for[i, v : rest] uReduce[][v] = uc1[][i]; // going from local to global
        mpiAllReduce(uReduce[], uG[], mpiCommWorld, mpiSUM);
        if(mpirank == 0) {
            ThBackup = adaptmesh(ThBackup, def(uG));
            plot(ThBackup, wait = 1);
        }
        broadcast(processor(0), ThBackup);
        def(uG) = def(uG);
        Th = ThBackup;
        Mat Adapt;
        MatCreate(Th, Adapt, Pk); // decompose the adapted mesh
        J = Adapt; // replace the old Jacobian
        rest = restrict(Wh, WhBackup, n2o);
        [uc1, uc2, pc] = [0, 0, 0]; // just to trigger a resize
        for[i, v : rest] uc1[][i] = uG[][v]; // going from global to local
    }
}
DmeshSave(Th, "navier-stokes-2d");
ofstream sol("navier-stokes-2d_" + mpirank + "_" + mpisize + ".sol");
sol << uc1[];

Running this, I get:

ff-mpirun -n 1 examples/hpddm/navier-stokes-2d-PETSc_mod.edp -v 0 | grep "ret ="
ret = 3
ret = 3
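
For what it's worth, my reading of https://petsc.org/release/manualpages/SNES/SNESConvergedReason/ is that ret = 3 corresponds to SNES_CONVERGED_FNORM_RELATIVE, i.e. convergence of the relative function norm, so a stricter success check in the script would be something like:

// sketch: 3 == SNES_CONVERGED_FNORM_RELATIVE in PETSc's SNESConvergedReason enum
if(mpirank == 0 && ret == 3)
    cout << "SNES converged on the relative function norm" << endl;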

prj- commented 4 months ago

How about the reason for TSSolve()? It is not the same code path, so I'm just double-checking that all is good on your end for TSSolve() as well (I believe it should be).

cmd8 commented 4 months ago

Strangely, reason is still not being updated for TSSolve() on my end. When I run the MWE above from heat-TS-2d-PETSc.edp, I get:

ff-mpirun -n 1 examples/hpddm/heat-TS-2d-PETSc_mod.edp -v 0 | grep "ret = " 
ret = 0

The only difference in the modified file is as follows:

diff examples/hpddm/heat-TS-2d-PETSc.edp examples/hpddm/heat-TS-2d-PETSc_mod.edp
57c57,59
< TSSolve(T, funcJ, funcRes, wPETSc, sparams = "-ts_type beuler -ts_dt 0.1 -ts_max_time 100 -ts_exact_final_time interpolate -ts_max_snes_failures -1 -ts_view -pc_type lu -ts_adapt_type basic -ts_rtol 1e-3", monitor = funcM);
---
> int ret;
> TSSolve(T, funcJ, funcRes, wPETSc, reason = ret, sparams = "-ts_type beuler -ts_dt 0.1 -ts_max_time 100 -ts_exact_final_time interpolate -ts_max_snes_failures -1 -ts_view -pc_type lu -ts_adapt_type basic -ts_rtol 1e-3", monitor = funcM);
> if(mpirank==0) cout << "ret = " << ret << endl;

prj- commented 4 months ago

And do you get the cout that you added?

cmd8 commented 4 months ago

I'm sorry, do you mean that I should add a cout in PETSc-code.hpp? The cout is working in the modified .edp file.

prj- commented 4 months ago

Yes, to make sure that the code you think you are using is the one FreeFEM is actually using (I suspect it is not). Just saw your edit; hm, that's puzzling.

cmd8 commented 4 months ago

It seems your hunch was correct. I must've messed something up when I was playing with other things. I'll do a from-scratch build and see if that fixes things. Sorry for the edits.

prj- commented 4 months ago

Most importantly, make sure you do not have multiple copies of PETSc.so lying around.
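
Running with a higher verbosity should also log which shared object actually gets loaded (the exact message format varies between versions, so this grep is only a sketch):

ff-mpirun -np 1 test.edp -v 1 2>&1 | grep -i petsc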

cmd8 commented 4 months ago

Thanks for your help, Pierre. All is now working on my end.

ff-mpirun -n 1 examples/hpddm/heat-TS-2d-PETSc_mod.edp -v 0 | grep "ret ="            
ret = 1
ff-mpirun -n 1 examples/hpddm/heat-TS-2d-PETSc_mod.edp -v 0 -ts_max_steps 1 | grep "ret ="
ret = 2
prj- commented 4 months ago

Phew, got scared there for a sec 😅