Closed: adrifoster closed this 2 months ago
Still working on this, but FYI I tracked it to something happening in `ed_ecosystem_dynamics`... I guess no surprises there.
I should note that I am using this test to diagnose:

```
/glade/derecho/scratch/afoster/tests_0913-092549de/ERS_D_Ld15.f10_f10_mg37.I2000Clm50FatesRs.derecho_gnu.clm-FatesColdSeedDisp.GC.0913-092549de_gnu
```

and it occurs; here is the debugging block I'm using to catch it:
```fortran
if (currentSite%lat == 10.0 .and. currentSite%lon == 120.0) then
   currentPatch => currentSite%oldest_patch
   do while(associated(currentPatch))
      if (currentPatch%patchno == 2) then
         litt_c => currentPatch%litter(element_pos(carbon12_element))
         write(fates_log(), *) 'leaf_fines_top', sum(litt_c%leaf_fines(:))
         write(fates_log(), *) 'all_fines_top', litt_c%leaf_fines(:)
         if (sum(litt_c%leaf_fines(:)) < 0.0_r8) then
            call fates_endrun('WARNING: negative leaf litter')
         end if
      end if
      currentPatch => currentPatch%younger
   end do
end if
```
Okay, so at least part of the problem originates here: `leaf_fines_frag` ends up higher than `leaf_fines`, so we get a negative value when it is subtracted later.
```fortran
do dcmpy = 1,ndcmpy
   litt%leaf_fines_frag(dcmpy) = litt%leaf_fines(dcmpy) * &
        years_per_day * SF_val_max_decomp(dl_sf) * fragmentation_scaler(soil_layer_index)
   do ilyr = 1,nlev_eff_decomp
      litt%root_fines_frag(dcmpy,ilyr) = litt%root_fines(dcmpy,ilyr) * &
           years_per_day * SF_val_max_decomp(dl_sf) * fragmentation_scaler(ilyr)
   end do
end do
```
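Worth noting: since the flux is pool × `years_per_day` × max_decomp × scaler, the subtraction can only drive the pool negative if max_decomp × scaler exceeds 365 yr⁻¹, which immediately points at a bad max_decomp (see below). Here's a minimal, self-contained sketch of the failure mode, with hypothetical names rather than the actual FATES flux routine:

```fortran
! Minimal sketch of the failure mode (hypothetical names; the real
! subtraction lives in the downstream litter flux accounting).
! pool = pool - pool*years_per_day*max_decomp*scaler can only go
! negative when years_per_day*max_decomp*scaler > 1, i.e. (assuming
! the scaler is <= 1) when max_decomp > 365 yr-1.
program frag_sketch
  implicit none
  integer, parameter :: r8 = selected_real_kind(12)
  real(r8), parameter :: years_per_day = 1.0_r8/365.0_r8
  real(r8) :: leaf_fines, max_decomp, frag_flux

  leaf_fines = 0.05_r8      ! small positive pool [kgC/m2]
  max_decomp = 999.0_r8     ! the bad dead-leaf value (should be O(1) yr-1)

  ! daily fragmentation flux, as in the loop above (scaler taken as 1 here)
  frag_flux = leaf_fines * years_per_day * max_decomp

  ! 999/365 ~ 2.7: ~270% of the pool is removed in one day -> negative pool
  leaf_fines = leaf_fines - frag_flux
  write(*,*) 'leaf_fines after update:', leaf_fines
end program frag_sketch
```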
The reason it's larger is that, as far as I can tell, `SF_val_max_decomp(dl_sf)` is 999.0. That is the value that is supposed to be for live grass, not dead leaves. I looked at the entire array and it is:

```
SF_val_max_decomp   0.52000000000000002   0.38300000000000001   0.38300000000000001   1.0000000000000000   999.00000000000000   0.19000000000000000
```

So somehow these array values are getting mixed around...
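A cheap guard at parameter-read time would catch a scrambled array like this immediately. A sketch in the spirit of the debug block above (the 10.0 threshold is arbitrary, and I'm assuming the same `dl_sf`, `fates_log`, and `fates_endrun` conventions):

```fortran
! Sketch of a parameter sanity check (assumed index name dl_sf; the 10.0
! threshold is arbitrary). Dead-leaf max decomposition should be O(1) yr-1;
! 999.0 is the live-grass value, so seeing it at dl_sf means the array
! read from the parameter file is scrambled.
if (SF_val_max_decomp(dl_sf) > 10.0_r8) then
   write(fates_log(), *) 'suspicious max_decomp for dead leaves:', &
        SF_val_max_decomp(dl_sf)
   write(fates_log(), *) 'full array:', SF_val_max_decomp(:)
   call fates_endrun('SF_val_max_decomp looks mis-ordered in the parameter file')
end if
```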
Again, this is on FATES `main`.
...
Okay, it's actually because the values in the parameter file it's pointing to are wrong. I believe this is the one we made for tests?
Discussed this briefly with @adrifoster during the CTSM-FATES stand-up this morning. The issue appears to show up in the tests in which we generate and modify the FATES parameter file on the fly, which suggests to me either an order-of-operations issue in the generation step or a bug in `modify_fates_paramfile.py`.
Note that Derecho is down for the next three days, but we could check this on a recent Perlmutter FATES test suite.
Actually, looking at this again, I think this must be something I did. I didn't realize that the parameter files were generated from the .cdl file in the source code; when I switched to main and rebuilt the test, it was still using my old .cdl file. I am going to close this for now because I think it's something that didn't get updated correctly on my end.
In testing PR1247 I found that leaf litter fines (`patch%litter%leaf_fines`) is a very small negative number, which at the very least propagates to SAV and bulk density values. This issue is in the `main` branch, and I'm currently working through when it occurs; it seemingly does not happen the first time `ed_ecosystem_dynamics` is called. I don't think it's due to the way litter is initialized, because I updated that (to 0.0 from unset) and the same issue occurred (with the same values).
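To illustrate why even a tiny negative pool matters downstream: fuel properties like SAV are (as I understand it) loading-weighted averages, and a negative weight can push the average outside the range of the per-class values entirely. A self-contained sketch with hypothetical names and numbers, not the actual fire-code calculation:

```fortran
! Sketch (hypothetical names and values) of how a tiny negative fuel
! loading corrupts loading-weighted fuel properties such as SAV: with a
! negative weight, the "weighted mean" can land outside the range of the
! per-class values.
program sav_sketch
  implicit none
  integer, parameter :: r8 = selected_real_kind(12)
  real(r8) :: loading(2), sav(2)

  sav     = [ 66.0_r8, 2000.0_r8 ]       ! per-class SAV, illustrative units
  loading = [ 1.0e-11_r8, -1.0e-10_r8 ]  ! dead-leaf loading slightly negative

  ! weighted mean is ~2215, i.e. above BOTH per-class values
  write(*,*) 'loading-weighted SAV:', sum(loading*sav) / sum(loading)
end program sav_sketch
```

So a negative value at round-off scale can yield out-of-range SAV and bulk density, not just slightly-off ones.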