columncolab / EMC2

Earth Model Column Collaboratory
BSD 3-Clause "New" or "Revised" License

ADD: Support for NUWRF, NSSL scheme, variable particle density and > … #91

Closed rcjackson closed 2 years ago

rcjackson commented 2 years ago

In this PR, I am adding support for the NASA Unified WRF (NU-WRF) and the NSSL double-moment scheme with 6 hydrometeor species and variable graupel density.

Example EMC^2 output from a NU-WRF simulation over the TRACER domain, showing the average simulated CSAPR reflectivity for the June 2-4 TRACER sea-breeze convection case in Houston:

[image: average simulated CSAPR reflectivity over the TRACER domain]
rcjackson commented 2 years ago

@isilber Could you check whether my logic is sound for adding support for > 2 precipitation categories, as well as provide any comments you may have?

rcjackson commented 2 years ago

That is a very nice addition (I think this would also provide the basis for adding other microphysics schemes later, e.g., P3). Some general notes below:

  • Variable rho_hyd: I presume this is a diagnostic variable based on mean size, with the exact same dimensions as, for example, the r_e field?

Density is a prognostic variable in the NSSL scheme, but it has the same dimensions as r_e.

If so:

  1. I think the use of fi_factor should be OK in the allocation of the re_array for bulk processing in both lidar_moments.py and radar_moments.py.

  2. The variable density might raise some errors when calculating 'sub_col_alpha_p' using the empirical method (calc_lidar_empirical) due to inconsistent dimensions, and in the case of cl, if indeed there's a variable liquid density, we might have similar issues in 'calc_radar_empirical' when accumulating attenuation. I recommend checking that (a dimension-check sketch follows the reply below).

Looking at that routine, it only uses density for the liquid alpha_p calculation, which is constant (and which I assume remains constant in the modifications below), so I don't expect this to be affected by variable densities.
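As a side note on point 2, here is a minimal sketch (not EMC^2 code; the function and argument names are hypothetical) of the kind of consistency check that could run before the empirical alpha_p calculations, so that an array-valued rho_hyd entry with mismatched dimensions fails loudly instead of producing obscure broadcasting errors:

```python
import numpy as np

def check_rho_dims(rho_hyd, ref_field):
    """Flag array-valued rho_hyd entries that do not match ref_field.

    rho_hyd   : dict mapping hydrometeor class name -> scalar or array density
    ref_field : array with the dimensions expected by the empirical routines
                (e.g., the corresponding r_e field)
    """
    for cls, rho in rho_hyd.items():
        if np.ndim(rho) == 0:
            continue  # scalar density (e.g., constant liquid density) is fine
        if np.shape(rho) != np.shape(ref_field):
            raise ValueError(
                f"rho_hyd for class '{cls}' has shape {np.shape(rho)}, "
                f"expected {np.shape(ref_field)}")
```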

  • set_precip_sub_col_frac: I believe these additions of calling the method multiple times would work in most cases. However, this renders the ice hydrometeor classes blind to each other. In other words, the maximum random overlap applies between the ice hydrometeor classes, rain, and cloud hydrometeors, but it does not necessarily apply among the different precipitating ice classes themselves. I think this potential inconsistency could be addressed, for example, by adding an optional 'data_frac_ice_precip=None' variable (when not using NSSL, it would simply remain None and the entire processing would be identical to before), set in each grid cell to the maximum of, e.g., 'pi', 'sn', and 'ha'; exactly where to plug it in and where to leave data_frac2 as is would require some careful thinking. I would be able to assist in that generalization.

In this new version, I run the check for maximum random overlap across all categories and then distribute the remaining hydrometeors randomly. This reduces to the same behavior as before for the 2-species case, but it now checks for overlap in all classes simultaneously when there are more than 2. This required representing data_frac1 and data_frac2 as a list of data_frac arrays, one per species, so that the subcolumn generator can iterate through an arbitrary number of classes (a minimal sketch of the idea follows).
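To make the generalization concrete, below is a minimal, self-contained sketch of the overlap idea described above; it is not the actual set_precip_sub_col_frac implementation, and the function and variable names are illustrative. The per-class precipitation fractions are passed as a list, the maximally overlapped subcolumns are filled first for all classes, and each class's remaining occurrence is then distributed randomly:

```python
import numpy as np

def assign_subcolumns(data_frac_list, n_subcolumns, rng=None):
    """Assign hydrometeor occurrence to subcolumns for one grid cell.

    data_frac_list : list of per-class precip fractions (0-1), any length
    n_subcolumns   : number of subcolumns to generate
    Returns a boolean mask of shape (n_classes, n_subcolumns).
    """
    rng = np.random.default_rng() if rng is None else rng
    n_occ = [int(round(f * n_subcolumns)) for f in data_frac_list]
    mask = np.zeros((len(data_frac_list), n_subcolumns), dtype=bool)

    # Maximum overlap first: all classes co-occur in the first min(n_occ) subcolumns.
    n_overlap = min(n_occ) if n_occ else 0
    mask[:, :n_overlap] = True

    # Distribute each class's remaining occurrence randomly over its free subcolumns.
    for i, n in enumerate(n_occ):
        remaining = n - n_overlap
        if remaining > 0:
            free = np.flatnonzero(~mask[i, :])
            mask[i, rng.choice(free, size=remaining, replace=False)] = True
    return mask
```

For two classes this reproduces the earlier two-species behavior; for more classes, the overlap check is applied to all of them at once.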

  • fwd calculations: In the NSSL scheme, how are the RT calculations made given the variable densities? Currently, we assume spheres (as I suspect is also the case for NSSL), but we use the Maxwell Garnett equation to generate LUTs for each hydrometeor class density (a mixture of ice and air), and I'm not sure how this is done in WRF NSSL. Can you check it out? I suspect there's some pretty heavy hidden assumption concerning the variable density in RT calculations.

I looked in the WRF code and it uses the same Maxwell Garnett LUT for all of the ice species, regardless of whether it is snow, graupel, or hail (the standard Maxwell Garnett mixing form is sketched below).
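For reference, here is the standard Maxwell Garnett mixing rule for an ice-air mixture with variable particle bulk density, written as a small sketch. This is the textbook form for spherical ice inclusions in an air matrix, not a copy of the EMC^2 or WRF LUT-generation code; the function name and the solid-ice density of 917 kg m^-3 are assumptions:

```python
import numpy as np

def maxwell_garnett_eps(eps_ice, rho_hyd, rho_ice=917.0, eps_air=1.0 + 0j):
    """Effective complex permittivity of an air matrix with ice inclusions.

    eps_ice : complex permittivity of solid ice at the radar/lidar wavelength
    rho_hyd : particle bulk density (kg m^-3), scalar or array
    """
    f = np.asarray(rho_hyd) / rho_ice  # ice volume fraction of the particle
    num = eps_ice + 2.0 * eps_air + 2.0 * f * (eps_ice - eps_air)
    den = eps_ice + 2.0 * eps_air - f * (eps_ice - eps_air)
    return eps_air * num / den
```

Because the volume fraction f scales with the particle bulk density, a single mixing rule of this form can in principle cover snow, graupel, and hail by varying rho_hyd, which is consistent with WRF reusing one LUT across the ice species.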

isilber commented 2 years ago

There is still a minor issue with the redundant i variable (see my other comment). Other than that, I think we can merge, even though we would likely need to update the LUTs (single particle and bulk) at some point, as well as the fall velocity parameterization. I'll open an issue about that to keep it in mind.