CDAT / cdms


cdms issue with mpi4py #257

Closed: durack1 closed this issue 5 years ago

durack1 commented 6 years ago
On 7/4/18, 6:16 AM, "Nicolas Lebas" wrote:

    Hello,

    I'm working with Eric to try to optimize the DensityBining code. My idea was to add parallelism to the code while avoiding deep modifications as far as possible.

    I thought about using multiprocessing in the driver to do so, playing only with the "timeint" parameter so that each process executes the binDensity function on a sub-interval, but it doesn't work with the cdms lib.

    When I try to use MPI, an error occurs, but only when importing CdmsRegrid from cdms2. Below is a little test and the associated error:

    from mpi4py import MPI              # MPI is initialized on this import
    COMM = MPI.COMM_WORLD
    RANK = COMM.Get_rank()
    print 'First print RANK=',RANK     # reached on every rank
    from cdms2 import CdmsRegrid      # this import triggers the failure
    print 'Second print RANK=',RANK    # never reached

    >> mpirun -n 2 python TestMPI.py

    First print RANK= 0
    First print RANK= 1
    Fatal error in PMPI_Init_thread: Other MPI error, error stack:
    MPIR_Init_thread(474).................:
    MPID_Init(190)........................: channel initialization failed
    MPIDI_CH3_Init(89)....................:
    MPID_nem_init(272)....................:
    MPIDI_CH3I_Seg_commit(366)............:
    MPIU_SHMW_Hnd_deserialize(324)........:
    MPIU_SHMW_Seg_open(865)...............:
    MPIU_SHMW_Seg_create_attach_templ(637): open failed - No such file or directory

    If I comment out the cdms2 import, everything is fine. Do you have an idea of how I can use MPI with cdms2?
    Or maybe a suggestion for another way to split all the chunks in the driver (I tested Pool from multiprocessing.dummy, but there is also a conflict with cdms)?

    Thank you for your help.

    Nicolas

    P.S.: Versions used, from Anaconda:
    - cdms 2.12
    - esmf 7.0.0
    - esmp ESMF_6_3_0rp1_ESMP_01
    - python 2.7.13
    - mpich 3.2
    - mpi4py 2.0.0
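
One hedged sketch of an alternative way to split the chunks in the driver, as asked above: launch each timeint sub-interval in its own interpreter via subprocess, so every worker imports cdms2 independently and nothing is shared between them. The driver.py script and its --timeint flag here are hypothetical placeholders.

    import subprocess

    # one independent interpreter per time interval; each process
    # imports cdms2 on its own, avoiding the shared-state conflict
    # seen with multiprocessing.dummy threads
    intervals = ["0,24", "24,48", "48,72"]
    procs = [subprocess.Popen(["python", "driver.py", "--timeint", t])
             for t in intervals]
    for p in procs:
        p.wait()  # block until every chunk has finished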

@doutriaux1 @dnadeau4 would dask be an option here for parallelizing?

@eguil @lebasn
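
A rough sketch of what the dask route might look like, splitting the work by time interval with dask.delayed; the file name, variable name, interval bounds, and the process_interval helper are all placeholders, and the "processes" scheduler is chosen so workers don't share cdms2 state.

    import cdms2
    import dask

    def process_interval(path, varname, t0, t1):
        # each task opens the file itself, so no cdms2 state is shared
        f = cdms2.open(path)
        chunk = f(varname, time=(t0, t1))  # read one time sub-interval
        f.close()
        return float(chunk.mean())         # stand-in for the real binDensity work

    intervals = [("1900-1-1", "1950-1-1"), ("1950-1-1", "2000-1-1")]
    tasks = [dask.delayed(process_interval)("in.nc", "thetao", t0, t1)
             for t0, t1 in intervals]
    results = dask.compute(*tasks, scheduler="processes")

Because each chunk runs in a separate worker process and only plain floats cross the process boundary, the import-order conflict seen with mpi4py should not arise.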

dnadeau4 commented 6 years ago

@durack1 I don't think this is working in 2.12.

durack1 commented 5 years ago

@lebasn do you have a demo script/data here? If not, we'll close

dnadeau4 commented 5 years ago

Easier to cut/paste without indentation.

from mpi4py import MPI
COMM = MPI.COMM_WORLD
RANK = COMM.Get_rank()
print 'First print RANK=',RANK
from cdms2 import CdmsRegrid
print 'Second print RANK=',RANK

dnadeau4 commented 5 years ago

This works! Yeah!
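
For anyone re-running this check today: the snippets above are Python 2 (the reporter's environment was Python 2.7.13); the Python 3 equivalent just swaps the print statements for the print function.

    from mpi4py import MPI
    COMM = MPI.COMM_WORLD
    RANK = COMM.Get_rank()
    print('First print RANK=', RANK)
    from cdms2 import CdmsRegrid
    print('Second print RANK=', RANK)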