nextsimhub / nextsimdg

neXtSIM_DG : next generation sea-ice model with DG
https://nextsim-dg.readthedocs.io/en/latest/?badge=latest
Apache License 2.0

MPI parallelization of dynamics #120

Open athelaf opened 2 years ago

mondus commented 1 year ago

A decision on this work is required. Either

  1. Merge now and then add the Dynamics (Milestone) parallelisation at a later date, or
  2. Wait for the Dynamics merge, add the dynamics parallelisation, and merge later.

The preference is perhaps for option 1, with a new PR for the dynamics parallelisation later.

draenog commented 1 year ago

Requirements:

  1. MPI parallelization should be based on domain decomposition
  2. Decomposition should use rectangular boxes
  3. Decomposition should take the land mask into account for load balancing
  4. Decomposition will not change during the run
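To make requirements 1-3 concrete, here is a minimal sketch of one possible approach: a static decomposition into contiguous rectangular row bands, where band boundaries are chosen so each rank receives roughly the same number of ocean (non-land) cells. All names here are illustrative assumptions, not taken from the neXtSIM_DG code base, and a production scheme would likely use a proper 2-D partitioner rather than this 1-D greedy split.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical sketch: split grid rows into contiguous rectangular bands,
// balancing the number of ocean (non-land) cells per MPI rank.
// landMask[i * ncols + j] is true for land cells, which carry no work.
// Returns one [rowBegin, rowEnd) interval per rank.
std::vector<std::pair<std::size_t, std::size_t>>
decomposeRows(const std::vector<bool>& landMask,
              std::size_t nrows, std::size_t ncols, std::size_t nranks)
{
    // Count ocean cells per row and in total.
    std::vector<std::size_t> rowWork(nrows, 0);
    std::size_t totalWork = 0;
    for (std::size_t i = 0; i < nrows; ++i)
        for (std::size_t j = 0; j < ncols; ++j)
            if (!landMask[i * ncols + j]) {
                ++rowWork[i];
                ++totalWork;
            }

    // Greedy split: close a band once it holds the average work per rank,
    // while leaving at least one row for every remaining rank.
    std::vector<std::pair<std::size_t, std::size_t>> bands;
    const double target = static_cast<double>(totalWork) / nranks;
    std::size_t begin = 0;
    std::size_t acc = 0;
    for (std::size_t i = 0; i < nrows; ++i) {
        acc += rowWork[i];
        const std::size_t ranksLeft = nranks - bands.size();
        if (static_cast<double>(acc) >= target && bands.size() + 1 < nranks
            && nrows - i - 1 >= ranksLeft - 1) {
            bands.emplace_back(begin, i + 1);
            begin = i + 1;
            acc = 0;
        }
    }
    bands.emplace_back(begin, nrows); // last rank takes the remainder
    return bands;
}
```

Because the boundaries are weighted by ocean cells rather than raw rows, a rank whose rows are mostly land is handed more rows than one sitting in open ocean, which addresses requirement 3; computing the bands once at start-up satisfies requirement 4.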
MarionBWeinzierl commented 12 months ago

Info from @einola and @timspainNERSC about size of problems to be computed:

Hi! This is to follow up the discussion we had just now. The question was what the problem size is. So I went and did a bit of checking. The standard development test that I use has about 500k elements. Our coupled run has 300k - but we run it for much longer. A fully coupled climate run at the lowest resolution would have about 100k. We want to be able to run regional setups with much higher resolutions - 10 million would be nice!

So:

  • Standard coupled: 100k, maybe 1 million
  • Regional: 500k to 2 million
    • A dream scenario for regional: 10 million

For reference, the 25kmNH test run has 18k elements, and I don't think we would want the production version of the model to be much slower per core than that run is single-threaded. So 10k elements per core might be a good rule of thumb.
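Applying the 10k-elements-per-core rule of thumb to the problem sizes above gives a rough feel for the rank counts the decomposition needs to support. The helper name and the exact figure are assumptions for illustration only:

```cpp
// Back-of-envelope: MPI ranks needed at the ~10k-elements-per-core
// rule of thumb, rounded up. ranksNeeded is a hypothetical helper,
// not a function in the neXtSIM_DG code base.
constexpr long ranksNeeded(long elements, long elementsPerCore = 10'000)
{
    return (elements + elementsPerCore - 1) / elementsPerCore;
}

// Problem sizes quoted above, checked at compile time:
static_assert(ranksNeeded(100'000) == 10, "standard coupled climate run");
static_assert(ranksNeeded(500'000) == 50, "standard development test");
static_assert(ranksNeeded(2'000'000) == 200, "large regional setup");
static_assert(ranksNeeded(10'000'000) == 1000, "dream regional scenario");
```

So the decomposition would need to scale smoothly from order 10 ranks for the coupled climate case up to order 1000 ranks for the 10-million-element regional dream scenario.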