NGEET / fates

repository for the Functionally Assembled Terrestrial Ecosystem Simulator (FATES)

LAI explosion? #844

Open rgknox opened 2 years ago

rgknox commented 2 years ago

It's possible that the issue we are seeing in PR https://github.com/NGEET/fates/pull/800 is related to how SLA changes during demotion. This is also relevant to #828

To summarize the issue: I suspect that when a large plant with lots of leaf area is demoted from the canopy to the understory (which is possible with both stochastic and rank-ordered demotion), we will encounter an "LAI explosion". Note that in our default parameter file, several of our sla_max values are 10x larger than sla_top. Our algorithm assumes that when a plant is demoted to a lower layer, it is suddenly "under" all of these other leaves, which pushes its mean SLA from sla_top toward sla_max, thereby producing a many-fold increase in SLA (and hence in the leaf area the cohort reports).
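
To make the magnitude concrete, here is a toy sketch (not the FATES source; the exponential depth form and all numbers are made up for illustration) of how a cohort's mean SLA, and with it its LAI, jumps when the leaf area overhead goes from zero to a full canopy layer:

```python
# Toy illustration only (not the FATES code): SLA deepens toward sla_max as a
# function of the leaf area overhead, and a freshly demoted cohort suddenly
# sees the whole upper canopy above it. All values are hypothetical.
import math

sla_top, sla_max = 0.012, 0.120    # m2 leaf / gC; note sla_max = 10 x sla_top
k = 0.5                            # hypothetical depth-scaling coefficient
leafc_per_crown_area = 400.0       # gC leaf / m2 crown area, made up

def sla_at_depth(lai_above):
    """SLA at a given cumulative LAI overhead, capped at sla_max."""
    return min(sla_max, sla_top * math.exp(k * lai_above))

for lai_above in (0.0, 5.0):       # canopy position vs. freshly demoted
    lai = leafc_per_crown_area * sla_at_depth(lai_above)
    print(f"lai_above = {lai_above}: cohort LAI ~ {lai:.1f}")
# -> ~4.8 in the canopy layer, ~48 right after demotion
```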

Perhaps we should modify our scheme such that plants are not influenced by canopy position, and SLA depth scaling is instead based more on height?

@ckoven @glemieux @rosiealice @aswann @kovenock

Apologies if I'm re-hashing an old issue we've debated before!

jkshuman commented 2 years ago

@JoshuaRady this sounds similar to the issue you were describing to me.

rgknox commented 2 years ago

I was thinking the same thing.

JoshuaRady commented 2 years ago

Yes, this seems like the issue I'm currently working with. When my trees get large (approaching 50 cm) and one gets demoted, the per-tree LAI goes from a reasonable value of ~5 to close to 30, with VAI exceeding 30. This causes the associated error check to trip. I'm seeing demotion both due to filling of the upper canopy layer and from fates_comp_excln-type demotion.

I have been having trouble understanding what is happening in tree_lai(), and I had interpreted the LAI jump as being due to a crown area contraction, which didn't really make sense given that the spread factor was already maxed out or nearly so. I had not considered the SLA profile.

For me this is happening in particular under RCP 8.5 late-century forcing. I suspect this is because the foliage is not trimming much under this forcing and the trees are getting bigger faster. Moreover, my PFT's allometry has foliage density (leaf biomass per unit crown area) increasing with size, which exacerbates the problem.

I was looking at changing fates_vai_width_increase_factor to make room for more VAI, since this is rare, but I am not sure about sensible limits on this value.
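
For reference, here is my rough reading of that parameter (a sketch only, which may not match the actual implementation): if successive VAI bins widen geometrically by the increase factor, the total VAI the bins can represent grows quickly even for modest factors.

```python
# Rough sketch (an assumption about how the bin-width increase factor behaves,
# not the actual FATES code): successive VAI bins get geometrically wider.
def total_vai_capacity(n_bins, top_bin_width, increase_factor):
    """Sum of geometrically widening bin widths."""
    return sum(top_bin_width * increase_factor**i for i in range(n_bins))

for f in (1.0, 1.05, 1.1):
    print(f"factor {f}: max representable VAI ~ {total_vai_capacity(30, 1.0, f):.0f}")
# factor 1.0 -> 30, 1.05 -> ~66, 1.1 -> ~164 (30 bins of nominal width 1)
```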

rgknox commented 2 years ago

One solution would be to let the cohorts maintain their own lai_above tendency. Instead of inheriting the LAI above instantaneously from the plant's canopy position and using that to drive the SLA calculations, the lai_above term would be slowly pushed in the direction of the LAI in the canopy layer above.

JoshuaRady commented 2 years ago

That makes some sense. The lack of any subdominant canopy position in the current implementation is highlighted here. Something that would give demoted cohorts time to respond to their change in position would be good.

rgknox commented 2 years ago

This might be a nice use of the new running-mean feature. The cohort-level tracked leaf area above could follow an exponential moving average, and we could vary the window length (days) to test sensitivity.
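
A minimal sketch of what that could look like (hypothetical names and a simple daily update; not existing FATES code): the remembered LAI-above is nudged toward the instantaneous value implied by the cohort's current canopy layer, with the window length controlling how fast.

```python
# Hypothetical sketch of the proposed memory, not current FATES code: the
# cohort's effective "LAI above" follows an exponential moving average of the
# instantaneous value implied by its canopy layer, with a tunable window.
def update_lai_above_memory(lai_above_mem, lai_above_inst, window_days, dt_days=1.0):
    """Nudge the remembered LAI-above toward the instantaneous value."""
    alpha = min(1.0, dt_days / window_days)
    return lai_above_mem + alpha * (lai_above_inst - lai_above_mem)

lai_mem = 0.0                      # cohort starts in the top canopy layer
for day in range(30):              # 30 days after demotion under ~5 LAI overhead
    lai_mem = update_lai_above_memory(lai_mem, lai_above_inst=5.0, window_days=20.0)
print(f"remembered LAI above after 30 days: {lai_mem:.2f}")   # ~3.9, approaching 5
```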

ckoven commented 2 years ago

In principle, I don't think it should matter very much if we were to add a memory feature like you describe, because demoted cohorts tend to get absorbed very quickly into whatever cohort of approximately the same size is already there in the understory canopy position, and which will therefore have trimmed its leaves to be in carbon balance at that position. So I feel like this is most likely mainly a model stability problem of what to do in these weird transient edge cases?

But even still, that this is happening at all points to potential parameter issues -- if a plant has an LAI ~ 40, then that probably means some combination of the plant's crown area being too small, its leaf biomass allocation being too large, and/or its SLA being too large. If I remember correctly, we decided to leave this as a crasher in the code (rather than just ignoring all leaf area greater than some maximum valid value) specifically because if it is happening, then it is probably symptomatic of other problems in a simulation.

@JoshuaRady in your case, particularly since you are using a larger exponent on leaf biomass than on crown area, it is less surprising that this might happen. So some combination of either changing the parameters (particularly decreasing slamax) or else adding more LAI bins, widening the LAI bin width, or using an exponential grid might be needed to handle the high LAIs, if they are intended.
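
As a back-of-the-envelope check (all values here are hypothetical, just to show the arithmetic), a cohort's LAI is roughly its leaf carbon per unit crown area times its SLA, so getting anywhere near 40 already implies small crowns, heavy leaf allocation, large SLA, or some combination:

```python
# Back-of-the-envelope only; the numbers are made up to illustrate which
# parameter combinations push cohort LAI toward ~40.
def cohort_lai(leaf_c, crown_area, sla):
    """leaf_c in kgC, crown_area in m2, sla in m2 leaf / kgC -> LAI (m2/m2)."""
    return leaf_c / crown_area * sla

print(cohort_lai(leaf_c=10.0, crown_area=30.0, sla=12.0))   # ~4: plausible
print(cohort_lai(leaf_c=25.0, crown_area=10.0, sla=16.0))   # 40: suspect parameters
```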

rgknox commented 2 years ago

I think it's a win-win to have this memory. First, it seems more consistent with what would happen in nature: a tree is a manifestation of its environment, and if its environment changes drastically it doesn't immediately forget its past; it can't snap its fingers (twigs?) and change its SLA. The other win is that if this reduces our maximum LAIs, it simultaneously makes the model more robust and allows us to use the array space to capture LAI with higher granularity.

mpaiao commented 2 years ago

@rgknox I like the memory idea. Presumably this could be linked to the leaf turnover rates, so new leaves would be produced with the new SLA, but existing leaves would keep the pre-demotion values.
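
A sketch of how that could work (hypothetical, not existing FATES code): only newly produced leaves take on the post-demotion SLA, so the cohort-mean SLA relaxes toward the target at roughly the leaf turnover rate.

```python
# Hypothetical sketch of linking the SLA memory to leaf turnover: the fraction
# of leaves replaced each step carries the new SLA, the rest keep the old one.
def update_cohort_sla(sla_mean, sla_target, leaf_turnover_frac):
    """Blend old and new SLA by the fraction of leaves replaced this step."""
    return (1.0 - leaf_turnover_frac) * sla_mean + leaf_turnover_frac * sla_target

sla = 0.012                      # pre-demotion cohort-mean SLA (made-up units)
daily_turnover = 1.0 / 365.0     # ~1-year leaf lifespan, hypothetical
for day in range(365):
    sla = update_cohort_sla(sla, sla_target=0.120, leaf_turnover_frac=daily_turnover)
print(f"cohort-mean SLA after one year: {sla:.4f}")   # ~63% of the way to 0.120
```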