MPI-AMRVAC: A Parallel Adaptive Mesh Refinement Framework
https://amrvac.org/
GNU General Public License v3.0

problems with restarting in the newest git-version #38

Closed sshestov closed 4 years ago

sshestov commented 4 years ago

I installed a fresh (20 Oct 2019) version of MPI-AMRVAC from GitHub and observe a problem when restarting from a particular snapshot.

So I run a simulation (in my case based on solar_atmosphere_2.5), then kill it and try to restart from a given snapshot, either with the command-line parameter -if or by specifying "restart_from_file" in the par-file. In both cases the code cannot start, complaining about wrong parameters in the par-file. I'm pretty sure the older (but still v2.0) version did not behave this way. The output is below; of course the file test/test_mx2_0001.dat exists, and I did not change the geometry between the runs.
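For reference, the two restart methods mentioned above look like this. The command-line form is the one used in the session below, with the snapshot appended via -if:

```
mpirun -n 9 ./amrvac -i test.par -if test/test_mx2_0001.dat
```

The par-file form puts the same path into the &filelist namelist; a minimal sketch (base_filename is an assumption here, chosen to match the snapshot name in this report):

```
&filelist
    base_filename     = 'test/test_mx2_'
    restart_from_file = 'test/test_mx2_0001.dat'
/
```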

test_par_dat.zip

sergeis@sergeis-pc1850:~/Projects/Vertical-slab-heating$ mpirun -n 9 ./amrvac -i test.par
 -----------------------------------------------------------------------------
 -----------------------------------------------------------------------------
 |         __  __ ____ ___        _    __  __ ______ ___    ____         |
 |        |  \/  |  _ \_ _|      / \  |  \/  |  _ \ \   / / \  / ___|        |
 |        | |\/| | |_) | |_____ / _ \ | |\/| | |_) \ \ / / _ \| |            |
 |        | |  | |  __/| |_____/ ___ \| |  | |  _ < \ V / ___ \ |___         |
 |        |_|  |_|_|  |___|   /_/   \_\_|  |_|_| \_\ \_/_/   \_\____|        |
 -----------------------------------------------------------------------------
 -----------------------------------------------------------------------------
 Use Colgan & Feldman (2008) cooling curve
 This version only till 10000 K, beware for floor T treatment
 Reading test.par

 Output type | dtsave    | ditsave | itsave(1) | tsave(1)
         log | 0.100E+00 | ******  |      0    | *********
      normal | 0.200E+01 | ******  |      0    | *********
       slice | ********* | ******  | ******    | *********
   collapsed | ********* | ******  | ******    | *********
    analysis | ********* | ******  | ******    | *********

                 typelimited: predictor
         Domain size (cells): 24 48
                Level one dx: 0.833E-02 0.208E-01
           Refine estimation: Lohner's scheme
 Note, Grid is reconstructed once every           3 iterations
           restart_from_file:  test/test_mx2_0001.dat
                  converting: F

 minra  0.56631503347258905
 rhob   7277.0929422705231
 pb   58.216745042170217
 ERROR for processor           0 :
 change in geometry in par file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 128.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
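The "change in geometry in par file" abort suggests that, on restart, the code compares the grid setup in the par-file against what is stored in the snapshot and aborts on a mismatch. The settings involved live in the &meshlist namelist; a hedged sketch, with values inferred from the "Domain size (cells): 24 48" and "Level one dx" lines above (the xprob bounds are an assumption consistent with 24 x 0.833E-02 = 0.2 and 48 x 0.208E-01 = 1.0):

```
&meshlist
    domain_nx1 = 24      ! must match the snapshot: 24 x 48 level-one cells
    domain_nx2 = 48
    xprobmin1  = 0.0d0   ! assumed bounds, consistent with the printed dx
    xprobmax1  = 0.2d0
    xprobmin2  = 0.0d0
    xprobmax2  = 1.0d0
/
```

Since the reporter did not change these between runs, the check itself appears to have been triggered incorrectly, which is what the commit referenced below addresses.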
jannisteunissen commented 4 years ago

I think this should be fixed in the latest commit (on master), can you check?

sshestov commented 4 years ago

Looks like it works now! Thank you very much!

jannisteunissen commented 4 years ago

Great, thanks for reporting it!

neutrinoceros commented 4 years ago

Thanks for fixing this, Jannis; I wasn't able to look into it earlier.