ESCOMP / CTSM

Community Terrestrial Systems Model (includes the Community Land Model of CESM)
http://www.cesm.ucar.edu/models/cesm2.0/land/

Snow density unrealistically large for very thin snowpacks #1280

Open swensosc opened 3 years ago

swensosc commented 3 years ago

Brief summary of bug

Snow densities derived from CTSM history output of H2OSNO and SNOWDP can be unrealistically large (> 1000 kg/m3).

General bug information

Calculating snow density as rho=h2osno/snowdp from time step frequency history files shows very large values at times when the snowpack is very thin. This typically occurs when the snowpack is unresolved, i.e. snl is 0 but h2osno_no_layers > 0. The issue appears to happen because frac_sno (snow cover fraction) is held fixed throughout the timestep, while h2osno and snow_depth are updated due to phase change and frost deposition. I corrected the behavior by making changes to how snow_depth is calculated during phase change, and using an updated value of frac_sno in the calculation of snowdp.
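A minimal sketch of that diagnostic calculation, assuming a time-step-frequency history file containing H2OSNO (snow water equivalent, kg/m2) and SNOWDP (snow depth, m); the file name and the 1000 kg/m3 threshold are placeholders, not anything prescribed by CTSM:

```python
# Sketch only: diagnose bulk snow density from CTSM time-step history output.
# Assumes the file contains H2OSNO (kg/m2) and SNOWDP (m); the file name is
# a placeholder.
import xarray as xr

ds = xr.open_dataset("ctsm_history_h1.nc")

# Mask snow-free points to avoid dividing by zero.
rho = ds["H2OSNO"] / ds["SNOWDP"].where(ds["SNOWDP"] > 0.0)  # kg/m3

# Densities above that of liquid water (~1000 kg/m3) are unphysical and show
# up for very thin, unresolved snowpacks.
n_bad = int((rho > 1000.0).sum())
print(f"samples with rho > 1000 kg/m3: {n_bad}")
```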
CTSM version you are using:

Does this bug cause significantly incorrect results in the model's science? ?

Configurations affected:

wwieder commented 2 years ago

Looks like we're close to having a fix for this in CTSM5.1. What needs to happen to bring #1283 to main?

billsacks commented 2 years ago

Reading back through #1283 it looks like I had a number of concerns with how that was done, some of which were pretty fundamental – at least if we still want to support the ability to select between multiple snow cover fraction methods. So I think @swensosc was going to rethink the approach.

I wonder if we should just drop support for the old snow cover fraction method: This isn't the first time that changes / fixes have been difficult because of these two very different snow cover fraction methods. If I remember correctly, the big issue is that the different snow cover fraction methods operate very differently in terms of the data flow of how they fit in with the rest of the model (e.g., the order in which different terms are calculated that then need to feed into other calculations: what depends on what). Maybe that's not an acceptable solution, but I just wanted to pose that question....

wwieder commented 2 years ago

How old is the 'old' snow cover fraction? I'm OK to drop it, especially if it's a CLM4-era approach.

billsacks commented 2 years ago

It's the Niu & Yang 2007 as opposed to Swenson & Lawrence 2012. We use the Swenson & Lawrence 2012 approach for both CLM45 and CLM50 by default – so never use the old NiuYang2007 approach by default. But @swensosc and @dlawrenncar should weigh in on the importance of keeping the older approach.

swensosc commented 2 years ago

I think in principle we want to keep the ability to use different snow cover fraction methods. The main cause of the large densities is probably that snow is updated at multiple points within the time step, and if the snow cover fraction is not updated concurrently, the variables can be briefly inconsistent. But changing snow cover fraction within the time step causes problems with energy conservation accounting, which is why it is calculated once per time step. A robust solution would deal with the snow operator splitting issue, but we don't currently have the person-power to commit to that. Also, to some degree snow density is a diagnostic output, so maybe if we calculated the snow density explicitly at one point in the time step and output that value to history, it wouldn't be sensitive to these issues.
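A toy numerical illustration of that within-time-step inconsistency (invented values, not CTSM code): if depth and mass are updated by different processes within one step while the cover fraction is held fixed, the implied density can exceed that of liquid water.

```python
# Invented numbers illustrating the operator-splitting inconsistency for a
# thin, unresolved snowpack within a single time step.
h2osno = 0.5        # kg/m2 snow water equivalent at the start of the step
snow_depth = 0.005  # m, i.e. a bulk density of 100 kg/m3
print(h2osno / snow_depth)   # 100.0 kg/m3

# Phase change removes most of the depth while frost deposition adds mass;
# the snow cover fraction is not re-evaluated between these updates.
snow_depth -= 0.0045
h2osno += 0.05
print(h2osno / snow_depth)   # ~1100 kg/m3, denser than liquid water
```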

billsacks commented 2 years ago

I just looked back at the code. The difference between the two parameterizations that I alluded to vaguely is: In SnowCoverFractionNiuYang2007Mod.F90, first we need to calculate snow_depth, then we need to calculate frac_sno from that; in SnowCoverFractionSwensonLawrence2012Mod.F90, first we need to calculate frac_sno, then we need to calculate snow_depth from that. If I remember correctly (though I may not be remembering correctly here), that difference in ordering has been a significant cause of the challenges of maintaining this code: it forces us to calculate frac_sno and snow_depth at the same time, even if for other reasons it might be better to calculate them at different points.
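A sketch of that ordering difference, written in Python rather than the CTSM Fortran; the frac_sno formulas and function names below are stand-ins, and only the order of the calls is meant to reflect the two modules:

```python
# Illustrative only: the point is the ordering, not the formulas.

def frac_sno_niu_yang_2007(snow_depth):
    # stand-in for the NiuYang2007 frac_sno(depth) relationship
    return min(1.0, snow_depth / (snow_depth + 0.05))

def frac_sno_swenson_lawrence_2012(h2osno):
    # stand-in for the SwensonLawrence2012 frac_sno(SWE) relationship
    return min(1.0, h2osno / (h2osno + 1.0))

def step_niu_yang(h2osno, snow_depth, newsnow, bifall):
    # NiuYang2007 ordering: update snow_depth first, then diagnose frac_sno from it
    snow_depth += newsnow / bifall
    h2osno += newsnow
    frac_sno = frac_sno_niu_yang_2007(snow_depth)
    return h2osno, snow_depth, frac_sno

def step_swenson_lawrence(h2osno, snow_depth, newsnow, bifall):
    # SwensonLawrence2012 ordering: update frac_sno first, then derive snow_depth
    h2osno += newsnow
    frac_sno = frac_sno_swenson_lawrence_2012(h2osno)
    snow_depth += newsnow / (bifall * max(frac_sno, 1.0e-6))
    return h2osno, snow_depth, frac_sno
```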

billsacks commented 2 years ago

@swensosc do you feel we should do anything about this, or just close it as a wontfix?

ekluzek commented 2 years ago

I asked @swensosc about this and this is the response he gave me:

this is a diagnostic issue. It is an undesirable result of the multiple updates to snow variables during the time step ('operator splitting'). Because of this, the snow mass, depth, and fraction can be slightly inconsistent when one, but not all, are updated within the time step. The problem is most obvious for thin snowpacks, where a decrease in snow depth can cause the snow density (a diagnostic quantity) to become unrealistically large at the time step level. A while ago, I tried to see if I could ensure the consistency, but it proved difficult to do, especially in the context of supporting different snow cover fraction methods. One solution might be to create a diagnostic variable solely for the purpose of providing history output, and calculate that value at the point in the time step when it is known to be consistent with the snow state variables, but I didn't explore that option. I think this is low priority, and will really only show up when looking at time step level history output, such as for site level / flux tower type simulations.
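A minimal sketch of that suggested workaround, with hypothetical names (this helper does not exist in CTSM): compute the density once per time step, at a point where mass and depth are mutually consistent, and write only that value to history.

```python
# Hypothetical diagnostic-only helper; not an existing CTSM routine.

def diagnose_snow_density(h2osno, snow_depth, min_depth=1.0e-4, fill_value=1.0e36):
    """Bulk snow density (kg/m3) for history output.

    Returns fill_value for snow-free or very thin (unresolved) snowpacks,
    where the ratio h2osno/snow_depth is not meaningful.
    """
    if h2osno <= 0.0 or snow_depth <= min_depth:
        return fill_value
    return h2osno / snow_depth
```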