yaniyuval/Neural_nework_parameterization

Code used in: "Use of neural networks for stable, accurate and physically consistent parameterization of subgrid atmospheric processes with good performance at reduced precision"
MIT License

Use of neural networks for stable, accurate and physically consistent parameterization of subgrid atmospheric processes with good performance at reduced precision

This repository contains the code and processed data from the simulations and neural-network parameterizations used in the manuscript "Use of neural networks for stable, accurate and physically consistent parameterization of subgrid atmospheric processes with good performance at reduced precision".

code

The code is divided into three main directories:

  1. sam_code_NN: Fortran code for SAM, including all changes made for the simulations.
    • The subdirectory sam_cases contains the namelist (prm) and neural-network subroutines (nn_convection_flux.f90, nn_diffusion.f90) for the runs used in the manuscript. The subdirectories of sam_cases follow these naming conventions:
    • run_files_x8_N_layers: files for the x8-NN simulation (96 km grid spacing) with an N-layer neural-network parameterization
    • run_files_N_missing_bits_out_in_only: files for the x8-NN simulation (96 km grid spacing) with a 5-layer neural-network parameterization run at reduced precision, keeping 23-N bits of the mantissa (see the precision sketch after this list)
  2. NN_training: Python code used to create all neural networks used in the manuscript. It contains two subdirectories (a training sketch follows this list):
    • run_training: examples of the input files used to train the NNs (the files starting with 'build' were run first to create the training and test data sets, and the files starting with 'run' were then run to train the neural networks).
    • src: Python code to process the coarse-grained high-resolution data and to train the neural networks.
  3. high_res_processing_code: MATLAB code that uses the high-resolution data to calculate the coarse-grained and resolved tendencies, fluxes, diffusivity and input variables (a coarse-graining sketch follows this list). The entry point is main.m. The high-resolution simulation output and a readme.txt file describing it are available at this Google Drive.
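
For the reduced-precision runs above, "23-N bits in the mantissa" refers to the number of mantissa bits retained in an IEEE-754 single-precision value. A minimal sketch of that kind of reduction, assuming simple zeroing of the low-order bits (the actual reduction is applied inside the Fortran NN subroutines and may differ, e.g. by rounding):

```python
import numpy as np

def truncate_mantissa(x, bits_to_keep):
    """Zero out the low-order mantissa bits of float32 values.

    IEEE-754 single precision has 23 mantissa bits; this keeps the
    `bits_to_keep` most significant ones and zeroes the rest
    (simple truncation; the manuscript's reduction may differ).
    """
    x = np.asarray(x, dtype=np.float32)
    n_drop = 23 - bits_to_keep
    # Build a bit mask that clears the dropped trailing mantissa bits.
    mask = np.uint32((0xFFFFFFFF >> n_drop) << n_drop)
    return (x.view(np.uint32) & mask).view(np.float32)

# Keep only 10 of the 23 mantissa bits (i.e. 13 "missing bits")
print(truncate_mantissa([3.1415927, 2.7182818], bits_to_keep=10))
```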
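
The build-then-train sequence in NN_training can be pictured with the following sketch; scikit-learn is used here purely as a stand-in, and the data shapes and layer width are illustrative assumptions rather than the repository's actual interface:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# "build" stage (stand-in): assemble training and test sets from
# coarse-grained profiles; random data stands in for the real inputs.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(10_000, 96))   # e.g. stacked input profiles
y_train = rng.normal(size=(10_000, 60))   # e.g. subgrid tendencies/fluxes
X_test = rng.normal(size=(2_000, 96))
y_test = rng.normal(size=(2_000, 60))

# "run" stage (stand-in): train a fully connected network with a chosen
# number of hidden layers, mirroring the N-layer naming of the saved NNs.
n_hidden_layers = 5
model = MLPRegressor(hidden_layer_sizes=(128,) * n_hidden_layers,
                     activation='relu', max_iter=50, random_state=0)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```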
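
Coarse-graining in high_res_processing_code amounts to block-averaging high-resolution fields onto the coarse grid (x8 corresponds to 96 km grid spacing). A minimal numpy sketch of block averaging, leaving out the tendency, flux and diffusivity calculations the MATLAB code also performs:

```python
import numpy as np

def coarse_grain(field, factor):
    """Block-average a 2D horizontal field by `factor` in each direction."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    blocks = field.reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))

hi_res = np.random.rand(512, 512)   # stand-in for a high-resolution snapshot
x8 = coarse_grain(hi_res, 8)        # x8 coarse graining
print(x8.shape)                     # (64, 64)
```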

trained neural networks

All the neural networks used in this study are saved in the NNs directory. The number of layers in each neural network is indicated in its file name (the number accompanying "NN_layers").

processed data from simulations

The processed data from the simulations with neural-network parameterization is found in the data directory data_x8_x16_NN_log, which contains separate folders for the different simulations described in the manuscript.

Each folder contains a netcdf file with the following data:

    • the time and zonal averages (taken from 3-hourly snapshots over 500 days)
    • precipitation averaged over 500 days
    • precipitation frequency
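
As a hedged example of how these quantities could be read back or recomputed from one of the netcdf files using xarray; the file name and variable names below are assumptions, not the actual names in data_x8_x16_NN_log:

```python
import xarray as xr

# Hypothetical file/variable names; substitute the actual names from the
# netcdf files in data_x8_x16_NN_log.
ds = xr.open_dataset("x8_NN_output.nc")

# Time and zonal average of a 3-hourly field over the 500-day record
t_mean = ds["T"].mean(dim=("time", "x"))

# Time-mean precipitation and a simple precipitation-frequency estimate
precip_mean = ds["Prec"].mean(dim="time")
precip_freq = (ds["Prec"] > 0.05).mean(dim="time")  # fraction of snapshots above a threshold

print(t_mean.dims, precip_mean.dims, float(precip_freq.mean()))
```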