EDmodel / ED2

Ecosystem Demography Model

Initialize new PFTs through xml file? #173

Closed fabeit closed 8 years ago

fabeit commented 8 years ago

@mdietze Marcos has told me that you are more familiar with the xml file procedure to set parameters for PFTs. What I would like to do is take three existing PFTs (e,m,l tropical), make an exact copy of each of them, and then change one or two parameters at the most. In total I will have 6-9 PFTs which are pseudo copies of the existing ones. Is this doable through the xml file or should I just add them to ed_params?

From a quick look it seems that the xml doesn't have all parameters, and doesn't have density-dependent mortality which I am interested in. But is it even possible to have additional PFTs this way?

mdietze commented 8 years ago

Yes, that's exactly the sort of thing that the XML is designed for, and it should be relatively easy. Do a test run with the PFTs you want to clone; that will create a history xml file in your outdir. Within that file, make copies of any PFTs you want to duplicate and then just modify any parameters. If any parameters are missing (at this point, I thought I'd got them all) then it's an easy edit to the xml code to add new parameters to the read/write. Just make sure to give your new, copied PFTs new numbers if they're going to be run at the same time as the old PFTs (it's fine to overwrite other PFTs completely).
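As a sketch of what that edit looks like in practice (tag names and layout are illustrative guesses based on this thread, not verified against the ED2 source; nonlocal_dispersal is the parameter discussed later), a cloned PFT entry in the history xml might look like:

```xml
<config>
  <pft>
    <num>12</num>                                <!-- new ID so it can coexist with PFT 2 -->
    <is_tropical>1</is_tropical>                 <!-- keep the clone tropical, like the original -->
    <nonlocal_dispersal>1.0</nonlocal_dispersal> <!-- the one parameter changed from the parent -->
  </pft>
</config>
```

Any parameter not listed for PFT 12 keeps that slot's default from ed_params, which is the pitfall discussed below.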

fabeit commented 8 years ago

Thanks @mdietze, I will play with that. I will double-check that it has all parameters, though; otherwise, if I overwrite other PFTs, I might end up using some of their default parameters. There is a problem with the wiki page https://github.com/EDmodel/ED2/wiki/Model-parameters-and-xml-parameter-files: I get this error message: The wiki page took too long to render. Please edit this wiki page's content so it renders faster.

I don't remember how exactly to run ED using the xml file.

mdietze commented 8 years ago

Wiki split up by section to render faster.

To run ED using the xml you just need to set the path to the file in the ED2IN.
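For reference, a minimal sketch of that ED2IN entry (the variable name here is from memory; double-check it against your own ED2IN template):

```
NL%IEDCNFGF = '/path/to/config.xml'   ! path to the xml parameter file
```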

fabeit commented 8 years ago

@mdietze thanks for fixing the wiki! And I got the run.xml file in the history path folder. Is it also possible to define a new PFT from here, using a PFT number beyond the current range (18, 19, 20)? This would be better than overwriting other PFTs, because if I miss a parameter the run will crash quite soon.

mdietze commented 8 years ago

In theory it's possible to define a new PFT outside the current range. In practice, there's a constant somewhere that sets the size of the PFT vector, so a PFT number too high will crash.

It's arguable whether you want to overwrite an existing PFT or create a new one. I don't think it's guaranteed that a PFT with 0 for the unset parameters will actually crash immediately. One test would be to feed ED its own PFT output back in, changing only the ID, since that should give you the exact same results -- if the results are different then something isn't being set. From there you could then modify the parameters.

mpaiao commented 8 years ago

As @mdietze said, if you go beyond 17, then you will need to increase n_pft in ED/src/memory/ed_max_dims.F90. In this case, you will probably want to initialize all variables with some sort of default value in ed_params.f90 too. I think it is a good idea to compile the model with very strict rules to make sure none of them are uninitialized.

If you are compiling with ifort, you could use "-fpe0 -no-ftz -ftrapuv" to make the model crash at the first occurrence of uninitialized variables (it works 99% of the time). For gfortran I once used "-finit-real=snan -ffpe-trap=invalid,zero,overflow,underflow" though I don't remember whether it worked or not.

Before increasing the number of PFTs, you may want to check whether anyone is using PFTs 12–15. In case no one is using them, you could turn them into generic placeholders for your tests (just to avoid increasing n_pft by a lot and creating unnecessary output).

fabeit commented 8 years ago

Thank you for the feedback. I have compared the output given by the xml to ed_params and it seems that all variables are in there. So before introducing these new PFTs via code I will do some testing with the xml file. I only found a few things that I would like to get your feedback on.

1) I did not find these parameters in the xml file: treefall_s_gtht, treefall_s_ltht, fire_s_gtht, and fire_s_ltht, but instead there are

<treefall_s_gt>
<treefall_s_lt>
<fire_s_gt>
<fire_s_lt>

Is this normal? How can the xml initialize a variable that has a different name in the code?

2) In init_pft_alloc_params() there are some sections of the code for tropical PFTs; in this case, as long as I use the same PFTs (early, mid, late) to create a copy, these values should not change, correct?

3) <init_laimax> is set to ****** inside the xml. Should I just delete it and let ED reinitialize? I think it's the same for every PFT.

4) These parameters in the xml file

<c_grn_leaf_dry>1162420224</c_grn_leaf_dry>
<wat_dry_ratio_grn>1072483533</wat_dry_ratio_grn>
<wat_dry_ratio_ngrn>1060320051</wat_dry_ratio_ngrn>
<c_ngrn_biom_dry>1150823432</c_ngrn_biom_dry>
<delta_c>-1017042576</delta_c>
<b1Cl>1050611999</b1Cl>
<b2Cl>1066175300</b2Cl>

have different values in ed_params: 3218, 1.85, 0.7, etc. Is this a notation difference?

5) Probably not so important, but I have noticed that some variables that are initialized in ed_params with only a few decimals have slightly different values in the xml, for example -0.3499999940 instead of -0.35. (OK, maybe I am worrying too much, but could this propagate and make some difference in long runs?)

mdietze commented 8 years ago

@fabeit could you update your last post -- the markdown formatting is really off and a lot of text is missing so it's hard to follow what's up.

fabeit commented 8 years ago

Sorry about that, didn't realize that with the tags the names disappeared. :-)

mpaiao commented 8 years ago

Partial answers...

1) There are a few typos in ed_xml_config.f90:

        call putConfigREAL("treefall_s_gt",treefall_s_gtht(i))
        call putConfigREAL("treefall_s_lt",treefall_s_ltht(i))
        call putConfigREAL("fire_s_gt",fire_s_gtht(i))
        call putConfigREAL("fire_s_lt",fire_s_ltht(i))

2) You must define the new PFT as tropical. If I understand correctly, you must specify is_tropical=1 in the xml file.

3/4) Not sure what it is; possibly a real/integer conversion issue? I quickly checked ed_xml_config.f90 and didn't see anything wrong.
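That guess is consistent with the numbers in question 4: the large integers behave like the raw bit patterns of single-precision floats written through an integer format. A quick check (illustrative Python, purely diagnostic, not part of ED2) reinterprets the bytes and recovers the ed_params values:

```python
import struct

def bits_to_float(i):
    """Reinterpret a 32-bit integer's bit pattern as an IEEE-754 single."""
    return struct.unpack('<f', struct.pack('<i', i))[0]

print(bits_to_float(1162420224))  # c_grn_leaf_dry     -> 3218.0
print(bits_to_float(1072483533))  # wat_dry_ratio_grn  -> ~1.85
print(bits_to_float(1060320051))  # wat_dry_ratio_ngrn -> ~0.7
```

These match the 3218, 1.85, 0.7 values quoted from ed_params above.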

5) It's probably just floating-point truncation error. Any variable that is single precision will have similar errors; I wouldn't worry about them.

fabeit commented 8 years ago

Thanks Marcos. @mdietze I fixed 1) in the code. I am still unsure how to deal with 4), some of those variables are the same for every PFT so I just deleted them from the XML, for the ones that are PFT specific, can I just copy the value as it is in pft_param in the xml? For 3) maybe we shouldn't write it to the xml file?

mdietze commented 8 years ago

For 4, yes you should be safe writing the correct default values into the xml from the f90. Wouldn't hurt to add a diagnostic print statement during testing to ensure they were read correctly. Same goes for 3. I suspect that has to do with some fortran quirks in writing output (e.g. number has more digits than write fcn is allowing). For both these you could also add some print statements during testing to verify that they have the correct value before they are written out. I probably won't have time to do any testing myself, but the code is pretty straightforward.

fabeit commented 8 years ago

I have been able to successfully add 6-9 PFTs just by using the XML. I still have to do some tests with the print statements to check that all values are read correctly. @mdietze in the wiki there is a parameter called 'max_dbh', but I don't see it anywhere in the ED code.

mdietze commented 8 years ago

I swear it existed in some branch at some point! Don't have time to trace back which version and whether it's safe to drop (or is something that does need merging in). Will drop it from the wiki in the meantime.

fabeit commented 8 years ago

@mdietze @mpaiao There is something that doesn't compute. My three custom PFTs (which overwrite 12, 13, 14) have only one parameter that differs from PFTs 2, 3, 4 (nonlocal_dispersal). For testing I inverted this parameter in the default PFTs and set the default value on the custom PFTs. I would expect to get exactly the same output as the specular run, just with opposite results for each PFT, but this does not happen. Another thing is that one of my custom PFTs (13) doesn't seem to respond to the init_density parameter. I also tried to invert only one custom PFT with one default PFT and I still get very different output. I am thinking either the xml file is not overwriting all parameters, or there is some other default parameter for the hard-coded PFTs that is not in the xml, but I am quite sure that I have checked all parameters in ed_params.f90.

Do you guys have any idea? Is there a quick way to print all pft parameters at run time?

mdietze commented 8 years ago

write_ed_xml_config writes out all the parameters to your output directory. You could check that file. You could also try moving that call later in the initialization to verify that no parameters are being changed after the file is written.
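One way to automate that check is to diff the parameter values in the input config against those written back by write_ed_xml_config. A sketch of such a helper (hypothetical, assuming a <config><pft><num>...</num>...</pft></config> layout; not part of ED2):

```python
import xml.etree.ElementTree as ET

def pft_params(xml_text):
    """Map (pft number, parameter tag) -> text value from a config xml string."""
    out = {}
    for pft in ET.fromstring(xml_text).iter('pft'):
        num = pft.findtext('num')
        for child in pft:
            if child.tag != 'num':
                out[(num, child.tag)] = (child.text or '').strip()
    return out

def diff_params(sent, written):
    """Parameters whose written-back value differs from what was sent in."""
    return {k: (v, written.get(k)) for k, v in sent.items() if written.get(k) != v}

# Usage: compare the file fed to ED against the one it wrote back, e.g.
# diff_params(pft_params(open('config.xml').read()),
#             pft_params(open('history-config.xml').read()))
```

Values are compared as strings, so precision round-trip problems (a theme later in this thread) also show up as differences.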

DanielNScott commented 8 years ago

For what it's worth, this may be an issue related to the PFT dependencies, which I think is implicit in what Mike is saying re: changing the call location. But I thought I would chime in that, if that is the case, then by moving the call rather than fixing any dependencies you might be making the model internally inconsistent. I don't know anything about the params you're talking about, though, so you may be fine.

fabeit commented 8 years ago

@mdietze In fact I forgot to mention that I already checked the xml that is generated by ED in the output directory, and it matches my input xml. I will try to move the call later; that's a good place to start. @DanielNScott not sure I understand: if I put the call to create the xml after all initializations have been done, I should not have any issues with dependencies. It only prints values, doesn't it?

DanielNScott commented 8 years ago

Oh, my mistake actually, you're totally right. I misunderstood mike's comment.

In any case, it sounds to me like your xml params are being read in and then potentially overwritten by the parameter-dependency mechanisms, and that Mike is suggesting you look for such issues by moving the output file write. I don't recall off the top of my head whether (after Ryan's somewhat inconclusive work on the matter) the dependency update happens before or after the read-in, but whatever the case, there's an order of operations among the read-in, the output, and the dependency adjustment which a) might not function correctly for your parameter and b) affects the internal consistency of the model. This is what I was trying to draw attention to, in case it could prove helpful or important for you.

fabeit commented 8 years ago

I have run further tests but I can't really find the cause of my inconsistent results. I moved the write_xml to the end of the simulation; it contains the same values as the input xml, so that's not the issue. Also, I tried to modify just one parameter, "initial_density", and checked the value in the first-year Y file, and it also matches the value in the xml. The small differences I am seeing in my plots are probably due to some calculations in the R script (I am using Marcos'). So I am still puzzled as to why I am not getting specular results when I invert PFT numbers.

fabeit commented 8 years ago

Let me post an example. Here I just switched the PFT numbers in the xml file; I am using 2, 3, 4, 12, 13, 14. (Attached plots: basal area and nplant by PFT.)

fabeit commented 8 years ago

Carbon balance and LAI, for example, show greater differences; PFTs 2 and 12 are less specular here than in the previous graphs. (Attached plots: relative carbon balance and LAI by PFT.)

fabeit commented 8 years ago

I have looked again at the xml that ED generates, and I think I have found some issues. Because some values are calculated outside ed_params, probably after the write_xml function, they come out as 0.00000 in the xml. wood_backscatter_nir/vis/tir, leaf_scatter_nir/vis, wood_scatter_nir/vis, and leaf_backscatter_vis/nir were all 0.0000000 in the xml template I have used to run all my simulations so far. So why are all these and other parameters in ed_params if they are calculated somewhere else? Some are just defined but never used. Now the question is whether these variables are assigned the values provided by the xml file or are overwritten because they are calculated afterwards, for example in radiate_driver.

Now that I have moved the write_xml to the end of the simulation, those values are correct. There is still a problem with orient_factor, which is written as -0.00000 but should be dble(orient_tree). I don't know if these values are responsible for the inconsistencies in my simulations.

I am thinking at this point it might just be easier to modify the values directly in ed_params and recompile. And at the very least we should move the write_xml later in the code; maybe not at the end of the simulation, but later.

mdietze commented 8 years ago

Thanks for identifying those issues. For the record, I disagreed with any parameter values being changed AFTER the xml read precisely because of the bugs you've been getting.

When I first made the XML system I understood the radiative transfer less than I do now, though I still find some of these variables confusing. I'd appreciate it if someone who's an expert in RTMs (e.g. @serbinsh or @ashiklom) could chime in about which parameters should be read in vs. calculated -- I think the compromise that @rgknox and I had reached years ago was that the only things calculated after the read would be true physical constraints (mass, energy), NOT statistical correlations.

As for what parameters are in the XML, we were just trying to make sure everything that's part of the module pft_coms is in the XML. If there are temporary variables that are just used for intermediate calculations and then never used in the code, then they should be removed BOTH from the XML and from pft_coms.

Yes, orient_factor should be dbl -- that should be a quick fix.

I'd really encourage you to not give in to modifying ed_params directly -- that's a short term fix, but still leaves everyone else with the same bugs you are hitting.

fabeit commented 8 years ago

I think this system, if it works, is a really good idea, especially for things such as sensitivity tests, so we should make it work, but I need some help.

I totally agree that we should remove from ed_params and the xml the variables that are calculated elsewhere, but we should also talk about how to handle the variables in ed_params that are used to calculate other variables in ed_params, because this leads to inconsistent values; one example is 'rD0'.

For some reason orient_factor always comes out as 0.0000 in the xml, but it should not be zero, for example dble(0.1) for tropical PFTs. I am running a few more tests and then I will get back to you.

In the meantime hopefully the others will chime in.

ashiklom commented 8 years ago

The leaf and wood optics parameters related to radiative transfer (at least in the optical domain; I know next to nothing about thermal stuff, especially in ED) are leaf_reflect_vis, leaf_reflect_nir, wood_reflect_vis, and wood_reflect_nir. In our past tests of the ED radiative transfer code in isolation, we got zero sensitivity to changes in the scatter and backscatter parameters, which suggests that they are calculated internally and therefore should not be parameters.

orient_factor and clumping_factor should be parameters -- they will definitely vary across different PFTs and cannot be calculated from other parameters, at least to my knowledge.

serbinsh commented 8 years ago

All

The scattering coefficients are calculated internally. However, they are calculated using the input leaf/wood refl & trans parameters so I am actually surprised there was no sensitivity; something to look into.

And yes clumping and orientation factor are important input parameters. There is a LOT of literature on clumping (shoot and whole canopy) factor out there so it should be easy to at least estimate what is a good value for your sites.


fabeit commented 8 years ago

Thanks for the input. I have definitely confirmed what I mentioned in my previous post: the scatter and backscatter parameters are all calculated and OVERWRITTEN after the xml is read (as are phi1, phi2, and mu_bar), which means it doesn't matter what value they are assigned in the xml file. Also, leaf_scatter_nir, leaf_reflect_nir, and leaf_trans_nir are written twice in the xml: once in each PFT and again under radiation. orient_factor always comes out as 0.000 in the xml, and I can't figure out why, because it is simply dble(0.1). So this leaves basically only orient_factor as a confounding factor in my simulations, and I doubt it's the cause of these inconsistencies because it was set to 0 for all PFTs.

Should I go ahead and remove those values from ed_params and the xml file? Another difference I got by putting the write xml at the end of the simulation was in the initcond values; they were all 1 or 0 depending on when the write xml happened.

mpaiao commented 8 years ago

@fabeit Scatter, backscatter, phi1, phi2 and mu_bar are derived parameters, so they should be calculated internally. If they are in the xml section, it's probably a good idea to remove them. Other derived parameters should be removed as well, but I agree that we should make the distinction between input parameters and derived parameters clearer in the code.

Orientation factor shouldn't make much difference unless you set it really far from 0, but the code shouldn't change the variable. The only thing I can imagine is that there is some single/double precision inconsistency in the calls. If you send a single (double) precision argument to a function or routine that expects double (single) precision, the routine will mess up the variable. Not sure if this is the case.

@ashiklom @serbinsh, back when I was running some sensitivity tests I would see a response if I changed the total scattering (sum of reflectivity and transmissivity, equivalent to 1 - absorptivity), but no response when I changed the partition between transmissivity and reflectivity while keeping the sum the same. Also, I agree that orientation_factor and clumping_factor must remain input parameters.

fabeit commented 8 years ago

@mdietze I have found some issues in ed_xml_config.f90 in both the read and write routines, which caused some incorrect values to be read from or written to the xml files. For example:

        call putConfigREAL8("leaf_trans_vis",leaf_reflect_vis(i))
        call putConfigREAL8("leaf_trans_nir",leaf_reflect_vis(i))
        call putConfigREAL8("wood_trans_vis",leaf_reflect_vis(i))
        call putConfigREAL8("wood_trans_nir",leaf_reflect_vis(i))
        call putConfigREAL8("leaf_scatter_vis",leaf_scatter_vis(i))
        call putConfigREAL8("leaf_scatter_nir",leaf_scatter_vis(i))
        call putConfigREAL8("wood_scatter_vis",leaf_scatter_vis(i))
        call putConfigREAL8("wood_scatter_nir",leaf_scatter_vis(i))

In any case I agree with @mpaiao and I think we should move a few things around. I would like to

  1. Delete from the xml routines the parameters that are calculated internally.
  2. The parameters that are internal and fixed for the whole execution should be calculated only once instead of being recalculated at each time step; for example, radiate_driver recalculates leaf_scatter_vis/nir and a bunch of other params at each time step even though their values never change. In sfcrad I would remove lines 337-390 and place them in a subroutine called after overwrite_with_xml_config.
  3. Move the call to write_ed_xml outside the if statement so it always executes and after all parameters are initialized. Does anybody have a suggestion where it might go? Right now I have it at the end of the run.

orient_factor is being read correctly from the xml file.

mpaiao commented 8 years ago

For 3, probably the best place is right after the first time step? This way it prints the file after all parameters have been read and initialized, but you don't need to wait until the simulation is finished to see that there was an error...

fabeit commented 8 years ago

I thought it could go in ed_driver, before the call to ed_model, but in that case we have to make sure that all parameters that are written to the xml file are calculated before that call. I see that ed_model does initialize other variables before starting the time integration, so another option might be to put it in ed_model right before the do while loop. I think it should be outside that loop, otherwise it will need its own flag and will add a condition checked at every iteration.

mdietze commented 8 years ago

Thanks everyone for a great thread, sorry for the slow reply (been fighting to get the next PEcAn release out).

I understand that scatter and backscatter and some of the other parameters mentioned are currently calculated internally after the XML read, but so far no one has stated definitively whether these are true physical-law calculations (i.e. there's only one possible way they could ever conceivably be done), which is what we need to know to decide whether to move the calculations to before the XML read or to delete the parameters from the XML and change ed_params. Lots of things calculated internally before the XML read are based just on empirical correlation, and it's important that we not hard-code such correlations in a way that prohibits one from using the XML to prescribe new, better relationships. If the physical nature is confirmed then I agree with everything that's been proposed.

Regardless, I definitely agree with the bug fixes, eliminating redundant calculations, and moving the XML write to just before the do while loop.

fabeit commented 8 years ago

I agree. In fact I have looked again at all the parameters in ed_params; many of them are calculated "internally" but still within ed_params, so to implement my original idea we would have to really discuss how to go about this. But again I am concerned about the mechanics of how the xml overwrites the values of derived parameters. We should be careful of a couple of situations in which we have a derived parameter, say c = a + b: 1) if the user changes only the value of c in the xml, and a or b are used somewhere later in the code, things become inconsistent (I have not checked whether this actually happens); or 2) if I change only a in the input xml, a will be overwritten but the value of c will not be updated, which is what I have run into. And this makes the output xml file inconsistent.

So given the current situation I will move all the leaf/wood calculations inside ed_params so we level the playing field and have all parameters calculated only once in the same subroutine, although I would prefer a more foolproof approach. I will also not remove any parameters from the xml until we decide how to proceed with this.

fabeit commented 8 years ago

@mdietze I have run several tests with this new code and have experimented with several setups for plant functional types 2, 3, 4:

a. no XML file
b. xml file with PFTs 2, 3, 4
c. xml file with PFTs 2, 3, 4, but where I erased the extra decimals for each parameter (for example 1.00000015 became 1.0)

What I have noticed is that a and b produce very similar results, almost identical for certain variables. c produces output that is also very similar to a and b, with just a few slightly different dynamics between PFTs but very similar total values compared to a/b. So it seems that somehow those very small decimals make some sort of difference. Across setups, I have noticed the most consistency in things such as BA, GPP, and NPP, and more variability in carbon balance, mortality, number of plants, LAI, and recruitment rate. My hypothesis is that these differences might be due to the more probabilistic nature of the latter processes, but I do not know the inside of the model well enough to be sure. However, there is still something that escapes the xml input, or something that gets propagated, because when I create clones of any run I get exactly the same output for each variable across runs, which is what I would expect. Interestingly enough, only once did initial density match the assigned value, which was case c.

I also tried to simply exchange PFT numbers in the XML files (for example, I assigned the values of PFT 2 to PFT 4, PFT 3 to PFT 2, etc.), and this produced results that are specular, exactly what I was hoping for, although the "probabilistic" variables I mentioned before are still not specular. What I will do next is overwrite completely different plant functional types, such as grass or the pine PFTs, and see if I get the same output. If these tests produce consistent results I would consider the xml input reliable; otherwise I do not think I will have time to figure out what is actually happening. If anybody is interested, please let me know and I can share the plots I have produced.

mdietze commented 8 years ago

Unfortunately there are no stochastic variables in ED2. So the variables you consider 'probabilistic' are either just more sensitive or involve parameters that we're not catching.

fabeit commented 8 years ago

That's what I thought, given the output I was getting from replicate runs... I don't think it's even worth running further tests given what you said. At this point I am not sure how to proceed: how will I know whether the responses I see when I change even just one parameter are due to the parameter itself or to this variability I am already detecting? Do you have any suggestions on how to proceed?

fabeit commented 8 years ago

So actually I have been able to recreate the same 'default' run by reducing the number of parameters initialized by the xml, and I am trying to pinpoint which parameters are not read correctly by the read_xml routine. For some reason, when I overwrite PFTs 12, 13, 14 I get the same results as when I overwrite 2, 3, 4. I have also fixed the write_xml routine and now it prints all values correctly, including orient and clumping factor.

fabeit commented 8 years ago

So for the record, the parameters that do not initialize well are all the scatter and backscatter ones, in addition to water conductance. By taking those out of the xml input file I was able to obtain the same output I get by running without an xml file, at least for PFTs 2, 3, 4. However, I am not able to overwrite other PFTs; there is still some mechanism that the xml doesn't catch. It's puzzling.

mdietze commented 8 years ago

Looks like these are all tropical PFTs. In addition to Marcos' offline email about looking for hardcoded PFT flags, you might also try following the is_tropical flag around. Taking a quick look at the repository, "tropical" affects the allometries, and there appear to be some hard-coded PFT numbers in R-utils/allometry.r.

fabeit commented 8 years ago

@mdietze @mpaiao OK, I have given up on trying to use the XML file: basically it cannot precisely initialize floating-point variables (single and double precision reals) unless the numbers can be represented exactly in base 2, like 1.0 for example. I think the xml file is also fine for initializing integers. The biggest problem is with variables that are calculated internally; those have the greatest error. These differences are quite small, but with time they build up and change the output of the run. You also get these errors if you overwrite PFTs without keeping the default order. These links helped me understand what the issue was: http://www.walkingrandomly.com/?p=5380 http://www.lahey.com/float.htm
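The round-off being described is easy to reproduce outside ED. This short check (plain Python, just to illustrate single-precision representability) pushes a few values through a 4-byte float and back:

```python
import struct

def roundtrip32(x):
    """Store a value in single precision (4 bytes) and read it back as a double."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

print(roundtrip32(1.0))    # exact: 1.0 is representable in base 2
print(roundtrip32(0.1))    # comes back as ~0.100000001, not 0.1
print(roundtrip32(-0.35))  # comes back as ~-0.349999994, like the xml values earlier in the thread
```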

For the moment I have overwritten PFTs 12, 13, 14 and it works fine; I get the same results as I do with PFTs 2, 3, 4. Now the only issue I have is that everything works fine on my local Linux machine, but on the cluster, when I compile with kind E, QSW for PFTs 3 and 13 don't match, while with kind A it works fine! This is quite puzzling. It's the same code I run on my local machine; what could be causing this? Which compiling options could cause this? I have posted on Stack Overflow: http://stackoverflow.com/questions/39414877/same-operands-different-results-within-same-code-same-machine

fabeit commented 8 years ago

I am closing this, but there is an important note: I had to add the compilation flag -fp-model source, because otherwise I would get inconsistent floating-point calculations within the same simulation. I think this is a non-trivial point, and this flag should be added to the standard ED compiling flags, at least when using -O2 or -O3; it does not seem to slow things down. I have added this flag to the Intel mk.include.opt. I also think we should add a big warning regarding the use of the XML; it really depends on how someone intends to use it, and it can still be useful, but there are some pitfalls. But I guess we can talk about this off-line. Thanks to everybody in the discussion for the assistance with this.
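For anyone reproducing this, a sketch of the change (the variable name below is an assumption; use whatever variable your mk.include collects Fortran flags in — ifort spells the option -fp-model source):

```make
# Intel mk.include: keep value-safe floating point even at -O2/-O3.
F_OPTS := -O3 -fp-model source
```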

mdietze commented 8 years ago

@fabeit thanks for digging into this so much!

Ultimately we need to get the XML approach to work so we can use it with PEcAn for calibration and analysis -- I'm tagging @istfer here so she knows to pick up where you left off when she gets ED2 parameter calibration running in PEcAn this fall. I think we should be able to increase the decimal precision of the XML read/write to get to true float and double standards. That said, with the exception of parameters that were getting rounded off to 0, I think our uncertainty in all these parameters is much worse than these numerical precision issues. But I agree that it's important that we be able to prove that we can swap PFTs dynamically using this approach.

fabeit commented 8 years ago

@mdietze I hope this work can be of use in the future, please let me know if you or someone else needs to discuss any of this. I am still interested because I think it is a clever idea. And it is already usable depending on the situation, in fact I am running a sensitivity analysis on a parameter and I am using the XML as input, in this case the error will be the same for every simulation and PFT so it does not matter.

fabeit commented 8 years ago

related pull request https://github.com/EDmodel/ED2/pull/180