ICB-DCM / pyPESTO

python Parameter EStimation TOolbox
https://pypesto.readthedocs.io
BSD 3-Clause "New" or "Revised" License

PetabImporter does not respect estimate=0 #1415

Closed dweindl closed 3 weeks ago

dweindl commented 3 weeks ago

When importing a PEtab problem, I would expect the resulting objective to have only the estimated parameters as parameters. However, currently it has all of them (instead of petab_problem.x_free_ids): https://github.com/ICB-DCM/pyPESTO/blob/f183b3c2674bac1efb26c5761e07c2e07ff30de5/pypesto/petab/importer.py#L480

For gradient-based optimization, this is not an immediate problem (just inefficient), I think, because the gradient w.r.t. those parameters will be 0, due to the way amici creates the model (although this is nothing we should rely on).

However, for gradient-free optimization, I think, the fixed parameters are also optimized, even though they shouldn't be.

Right?

As an example, take Boehm_JProteomeRes2014, where ratio and specC17 should not be estimated, but end up in the objective parameters.
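A minimal sketch of the observation, assuming the Boehm benchmark files are available locally (the path is illustrative; attribute names as mentioned in this issue):

```python
import petab
import pypesto.petab

# Illustrative path to the Boehm_JProteomeRes2014 PEtab files
petab_problem = petab.Problem.from_yaml(
    "Boehm_JProteomeRes2014/Boehm_JProteomeRes2014.yaml"
)
objective = pypesto.petab.PetabImporter(petab_problem).create_objective()

print(petab_problem.x_free_ids)  # excludes 'ratio' and 'specC17'
print(objective.x_ids)           # contains all parameters, incl. 'ratio', 'specC17'
```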

dweindl commented 3 weeks ago

False alarm, this seems to be handled correctly through the somewhat opaque parameter-fixing machinery, mostly here: https://github.com/ICB-DCM/pyPESTO/blob/38d91c47ee386712da523fe89a6585c11b65a071/pypesto/objective/base.py#L178
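For context, a minimal sketch of that machinery from the user side, with a dummy objective standing in for an AmiciObjective; dimensions, indices, and values are illustrative:

```python
import numpy as np
import pypesto

# Dummy objective over 11 parameters, standing in for an AmiciObjective
objective = pypesto.Objective(fun=lambda x: float(np.sum(x**2)))

# The Problem forwards the fixed indices/values to the objective, which then
# maps the reduced parameter vector back to full dimension before evaluation.
problem = pypesto.Problem(
    objective=objective,
    lb=np.full(11, -5.0),          # full-dimension bounds (illustrative)
    ub=np.full(11, 5.0),
    x_fixed_indices=[6, 10],       # e.g. positions of 'ratio', 'specC17' (illustrative)
    x_fixed_vals=[0.693, 0.107],   # illustrative fixed values
)
print(problem.dim, problem.dim_full)  # -> 9 11
```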

dweindl commented 3 weeks ago

Note that AmiciObjective.x_ids and AmiciObjective.x_names are of different lengths, although the documentation suggests they should be the same. x_names contains only the non-fixed parameter IDs, whereas x_ids contains all of them.
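Continuing the sketch from the first comment, the mismatch can be seen directly:

```python
# Lengths differ, contrary to what the documentation suggests
print(len(objective.x_ids), len(objective.x_names))
# x_names -> only the non-fixed parameter IDs; x_ids -> all of them
```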

dweindl commented 3 weeks ago

Well, this problem does exist if one uses PetabImporter.create_objective() and uses that Objective without passing it through PetabImporter.create_problem(). Computing the gradient of the returned objective will just yield zeros for the PEtab-fixed parameters.
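A hedged sketch of the pitfall (path illustrative, as before):

```python
import numpy as np
import petab
import pypesto.petab

petab_problem = petab.Problem.from_yaml(
    "Boehm_JProteomeRes2014/Boehm_JProteomeRes2014.yaml"
)
objective = pypesto.petab.PetabImporter(petab_problem).create_objective()

# Full parameter vector, including the PEtab-fixed parameters
x_full = np.asarray(petab_problem.x_nominal_scaled)
fval, grad = objective(x_full, sensi_orders=(0, 1))
# The entries of `grad` for 'ratio' and 'specC17' come out as 0, even though
# perturbing those parameters does change fval.
```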

Whether this affects optimization depends on the Problem into which the objective is put.

I guess one can argue whether this is a bug or not. At least it is a potential pitfall. Generally, if changing an objective function parameter changes the objective function value, I would expect that to be reflected in the gradient.
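For comparison, a sketch of the intended route, where create_problem() applies the parameter-fixing machinery so the fixed parameters never reach the optimizer (continuing the snippet above):

```python
# create_problem() wraps the objective in a Problem with the PEtab-fixed
# parameters already fixed
problem = pypesto.petab.PetabImporter(petab_problem).create_problem()
print(problem.x_free_indices)         # indices of the estimated parameters only
print(problem.dim, problem.dim_full)  # reduced vs. full dimension
```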