CliMA / ClimateMachine.jl

Climate Machine: an Earth System Model that automatically learns from data
https://clima.github.io/ClimateMachine.jl/latest/

ClimateMachine needs a more comprehensive set of experiments with verifiable output #1147

Closed thomasgibson closed 4 years ago

thomasgibson commented 4 years ago

Description

Just summarizing part of the conversation from today's sprint meeting.

I think we all came to the conclusion that verifying the output of the LES and GCM drivers is currently very difficult. There are several reasons for this, one being diagnostic output (i.e., not having easy access to perturbation fields or vorticity profiles). Right now it is hard for many of us to look at the prognostic variables (total energy, for instance) for a particular experiment and judge whether the dycore is behaving correctly. That is a separate issue being addressed elsewhere. (Could someone provide a reference for the discussion on revamping diagnostic output? There are multiple issues and I'm not sure which to reference.)
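To make the diagnostics point concrete: the perturbation and vorticity fields mentioned above are cheap to derive from prognostic output once it is accessible. A minimal NumPy sketch (the function names and grid layout are my own for illustration, not ClimateMachine's diagnostics API):

```python
import numpy as np

def perturbation(field, axis=(1, 2)):
    """Deviation of a 3D field shaped (z, y, x) from its
    horizontal mean at each vertical level."""
    return field - field.mean(axis=axis, keepdims=True)

def relative_vorticity(u, v, dx, dy):
    """Vertical component of relative vorticity,
    zeta = dv/dx - du/dy, via centered differences on a
    uniform 2D (y, x) grid."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy
```

For a sanity check, solid-body rotation (u = -y, v = x) gives a uniform vorticity of 2, which the centered differences recover exactly on linear fields.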

The main purpose of this issue is to help organize a list of desirable dycore experiments we can use to verify the efficacy of the code (in particular, the dycore and dycore + physics).

This should be an open discussion for all to participate!

Here is a link that I think we should seriously consult:

For those less familiar with it: DCMIP, the Dynamical Core Model Intercomparison Project, is a well-documented effort to design a standard set of test suites for atmospheric dynamical cores of varying complexity. Adopting it would allow us to confidently verify our model output against other atmosphere models.
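The verification workflow in a DCMIP-style intercomparison essentially reduces to computing normalized error norms against a reference solution and checking convergence rates under grid refinement. A minimal sketch of that idea (the helper names are hypothetical, not tied to any existing DCMIP tooling):

```python
import numpy as np

def l2_error(model, reference):
    """Normalized L2 error norm, as commonly reported in dycore
    intercomparisons: ||model - reference||_2 / ||reference||_2."""
    return np.linalg.norm(model - reference) / np.linalg.norm(reference)

def converges(errors, spacings, expected_order, tol=0.5):
    """Check that errors decay at roughly the expected order of
    accuracy as the grid spacing h is refined: e ~ C * h^p implies
    rate = log(e1/e2) / log(h1/h2) ~ p between successive levels."""
    rates = np.log(errors[:-1] / errors[1:]) / np.log(spacings[:-1] / spacings[1:])
    return np.all(rates > expected_order - tol)
```

For example, errors scaling exactly as h^2 over spacings 1.0, 0.5, 0.25 yield observed rates of 2.0 at every refinement step.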

A nice set of standard experiments collected together for dycore benchmarking in LES mode (i.e. 2D and 3D limited area) can be found in the following references:

@phdthesis{restelliPHD2007,
  author = {{Restelli}, M.},
  title  = {Semi-{L}agrangian and semi-implicit discontinuous {G}alerkin methods for atmospheric modeling applications},
  school = {Politecnico di Milano},
  year   = {2007},
}

Note: If you can't find this thesis online, ask Simone for a copy.


thomasgibson commented 4 years ago

This applies to the ocean dycore as well, so I think it's a good idea to do similar verification there. I am less familiar with the ocean literature, so I'll need the ocean folks to chime in with possible references for standard ocean dycore tests.

szy21 commented 4 years ago

For comparison, the results from all the models participating in DCMIP 2012 are here: https://www.earthsystemcog.org/projects/dcmip-2012/.

smarras79 commented 4 years ago

For LES-mode testing, the code currently contains the following LES tests, which are accepted as standard tests for assessing the dynamical core:

I set up and tested these two benchmarks to run to completion (timeend = 1000 s and timeend = 900 s, respectively) in only ~5 minutes of wall-clock time on 1 GPU on Central. This will allow these tests to be run routinely whenever changes are pushed across the code.
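A lightweight way to keep benchmarks like these honest in CI is to guard each run with a wall-clock budget. A sketch of that idea (the runner and the ~5 minute budget are assumptions based on the numbers above, not an existing ClimateMachine utility):

```python
import time

WALL_CLOCK_BUDGET_S = 5 * 60  # ~5 minutes, matching the two LES benchmarks above

def run_with_budget(benchmark, budget_s=WALL_CLOCK_BUDGET_S):
    """Run a benchmark callable to completion and fail loudly if it
    exceeds the wall-clock budget, so slowdowns surface in CI."""
    start = time.monotonic()
    result = benchmark()
    elapsed = time.monotonic() - start
    if elapsed > budget_s:
        raise RuntimeError(
            f"benchmark exceeded wall-clock budget: {elapsed:.1f}s > {budget_s}s"
        )
    return result, elapsed
```

A CI job could wrap each driver invocation this way, turning the "~5 minutes on 1 GPU" observation into an enforced regression check rather than a one-off measurement.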

We are still missing:

Moist dynamics: a precipitating-moisture test with a dynamically simple flow.

Note: DYCOMS and BOMEX are NOT standard benchmarks. We can use them, but even at coarse resolution either one requires > 30 minutes of wall-clock time to reach completion.

blallen commented 4 years ago

@christophernhill and @jm-c, is there anything like DCMIP for ocean models? I know we are waiting on split-explicit and implicit diffusion before we can make the test suite feasible, but it would be good to know what we are aiming for.